
Background: Cross-border trade has a gender dimension: women are heavily engaged in informal trade along border regions, and they frequently face obstruction and harassment from customs officials and other security agents while transporting their goods. Despite these difficulties, they continue to trade informally along these borders. This paper examines the coping strategies of women involved in informal cross-border trade. Methods: The study is based on Focus Group Discussions (FGDs) conducted with 50 informal traders, and in-depth interviews conducted with security agents and drivers along the Lagos-Seme border.
The muga silk industry of Assam has existed since time immemorial. Muga silk weaving in Assam is an ancient craft, though there is no definite and precise record of when it originated. In the absence of authentic contemporary historical accounts, different scholars have drawn different conclusions regarding the origin of muga culture. The Ahom regime (1228-1828) can be considered the golden period of muga culture in Assam, during which it prospered and became part of the social and economic life of the Assamese people. With strong support and initiative from the Ahom kings, rearers, reelers and weavers became skilful and the industry grew rapidly. This paper studies the historical perspectives of the muga silk industry in Assam and its present status.
The present investigation deals with the deformation caused by various sources in a fluid-saturated porous medium with incompressible fluid. Normal mode analysis is used to obtain the components of displacement, stress and pore pressure. The variations of normal stress, tangential stress and pore pressure with the distance x are shown graphically. A particular case of interest is also deduced from the present investigation.
Data-mining-based information processing in Wireless Sensor Networks (WSNs) is at a preliminary stage compared with traditional machine learning and WSN research. Current research mainly focuses on applying machine learning techniques to solve particular problems in WSNs. Different researchers adopt different assumptions, application scenarios and preferences when applying machine learning algorithms. These differences make it difficult for researchers to build upon each other's work and for results to accumulate in the community; a common architecture across the WSN machine learning community is therefore necessary. A major objective of much WSN research is to improve or optimize the performance of the entire network in terms of energy conservation and network lifetime. This paper surveys data mining in WSN applications from two perspectives: network-associated issues and application-associated issues. For network-associated issues, the machine learning algorithms applied in WSNs to enhance network performance are discussed; for application-associated issues, the machine learning methods used for information processing in WSNs are summarized.
It is important to deliver appropriate services to requesting users. When a user-requested composite service is unavailable, the system is forced to invoke service selection, which involves choosing individual concrete services for service composition. Services are selected on two criteria: (i) functional and (ii) non-functional. The former entails selecting services based on the functional property the service is dedicated to performing, while the latter entails selecting services based on QoS attributes such as reliability, availability, cost and response time. Several population-based and swarm-based optimization algorithms are widely used for web service selection. In this work, we employ a stochastic optimization algorithm called the Self-Organizing Migrating Algorithm (SOMA) and compare its performance with GA and PSO. The comparative study shows that SOMA produces promising results and is therefore able to select the user-requested service efficiently.
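To make the optimization step concrete, the following is a minimal sketch of SOMA in its AllToOne strategy, where each individual migrates in discrete steps toward the current best solution (the leader) under a random PRT perturbation. This is an illustrative implementation minimizing a toy sphere function, not the paper's actual QoS-selection code; the function name, parameter defaults and bounds are assumptions chosen for the example.

```python
import random

def soma_all_to_one(cost, dim, pop_size=20, path_length=3.0,
                    step=0.11, prt=0.1, migrations=50, bounds=(-5.0, 5.0)):
    """Minimise `cost` with SOMA (AllToOne): every individual walks in
    increments of `step` along the line toward the leader, each coordinate
    moving only when its PRT-vector entry is 1; the best point visited on
    the path replaces the individual."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(migrations):
        leader = min(pop, key=cost)
        for i, ind in enumerate(pop):
            if ind is leader:
                continue  # the leader does not migrate this round
            best, best_cost = ind, cost(ind)
            t = step
            while t <= path_length:
                prt_vec = [1 if random.random() < prt else 0 for _ in range(dim)]
                cand = [min(max(x + (l - x) * t * p, lo), hi)
                        for x, l, p in zip(ind, leader, prt_vec)]
                c = cost(cand)
                if c < best_cost:
                    best, best_cost = cand, c
                t += step
            pop[i] = best
    return min(pop, key=cost)

sphere = lambda v: sum(x * x for x in v)  # toy objective standing in for a QoS score
random.seed(1)  # fixed seed so the illustrative run is reproducible
best = soma_all_to_one(sphere, dim=4)
```

In a service-selection setting, the vector would encode candidate concrete services and `cost` would aggregate the QoS attributes (reliability, availability, cost, response time) the abstract lists.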
In this paper, artificial neural network, Angstrom-Prescott and multiple regression models are applied to estimate global solar radiation in Warri, Nigeria, over a period of seventeen years. A Multi-Layer Perceptron (MLP) artificial neural network was trained and tested using seventeen years (1991-2007) of meteorological data. The error results and statistical analysis show that the MLP network has the minimum forecasting error and can be considered a better model for estimating global solar radiation in Warri than the multiple regression and Angstrom-Prescott models.
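For reference, the Angstrom-Prescott model mentioned above relates global solar radiation H to extraterrestrial radiation H0 through relative sunshine duration: H/H0 = a + b(n/N). A minimal sketch, with the commonly used default coefficients a = 0.25, b = 0.50 (site-specific values are normally fitted by regression; the defaults here are an assumption, not the coefficients fitted in this study):

```python
def angstrom_prescott(h0, n, max_n, a=0.25, b=0.50):
    """Estimate daily global solar radiation H from extraterrestrial
    radiation h0 (same units as the result, e.g. MJ/m^2/day), actual
    sunshine hours n and maximum possible sunshine hours max_n,
    using the Angstrom-Prescott relation H/H0 = a + b * (n/N)."""
    return h0 * (a + b * n / max_n)

# Example: h0 = 30 MJ/m^2/day, 6 h of sunshine out of a possible 12 h
h = angstrom_prescott(30.0, 6.0, 12.0)  # 30 * (0.25 + 0.5 * 0.5) = 15.0
```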
Parallel computing concerns the application of many computers running in parallel to solve computationally intensive problems. One of the biggest issues in parallel computing is efficient task scheduling. In this paper, we survey algorithms that allocate a parallel program, represented by a directed acyclic graph (DAG), to a set of homogeneous processors with the objective of minimizing the completion time. We examine several such classes of algorithms and then compare the performance of a class known as the bounded number of processors (BNP) scheduling algorithms. The comparison is based on scheduling metrics such as makespan, speedup, processor utilization and scheduled length ratio. The main focus is on measuring the impact of increasing the number of tasks and processors on the performance of four BNP scheduling algorithms.
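To illustrate the BNP setting, the following is a minimal sketch of a list scheduler in the spirit of HLFET (one common BNP algorithm): tasks are prioritized by static b-level and each ready task is placed on whichever of the bounded set of processors allows the earliest start, paying a communication cost only when data crosses processors. The data structures and the tiny example DAG are assumptions for illustration, not the paper's benchmark graphs.

```python
def hlfet_schedule(tasks, edges, num_procs):
    """tasks: {name: computation cost}; edges: {(u, v): communication cost}.
    Returns (schedule, makespan), where schedule lists
    (task, processor, start, finish) tuples."""
    succ = {t: [] for t in tasks}
    pred = {t: [] for t in tasks}
    for (u, v) in edges:
        succ[u].append(v)
        pred[v].append(u)

    # Static b-level: longest computation path from the task to an exit node.
    blevel = {}
    def bl(t):
        if t not in blevel:
            blevel[t] = tasks[t] + max((bl(s) for s in succ[t]), default=0)
        return blevel[t]
    for t in tasks:
        bl(t)

    proc_free = [0.0] * num_procs
    finish, where, schedule = {}, {}, []
    unscheduled = set(tasks)
    while unscheduled:
        ready = [t for t in unscheduled if all(p in finish for p in pred[t])]
        t = max(ready, key=lambda x: blevel[x])  # highest b-level first
        best = None
        for p in range(num_procs):
            # A predecessor on another processor adds its edge's comm cost.
            data_ready = max((finish[u] + (0 if where[u] == p else edges[(u, t)])
                              for u in pred[t]), default=0.0)
            start = max(proc_free[p], data_ready)
            if best is None or start < best[1]:
                best = (p, start)
        p, start = best
        finish[t], where[t] = start + tasks[t], p
        proc_free[p] = finish[t]
        schedule.append((t, p, start, finish[t]))
        unscheduled.remove(t)
    return schedule, max(finish.values())

# Diamond-shaped example DAG on 2 processors.
tasks = {"a": 2, "b": 3, "c": 2, "d": 1}
edges = {("a", "b"): 1, ("a", "c"): 1, ("b", "d"): 1, ("c", "d"): 1}
schedule, makespan = hlfet_schedule(tasks, edges, 2)  # makespan is 7.0
```

The makespan returned here is the metric the survey compares directly; speedup, processor utilization and scheduled length ratio are all derived from it and the task costs.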
Time is the most important resource for any activity, and software development is no exception. With increasing competition in the market, it is essential for companies to release good-quality products as early as possible in order to earn a profit. To develop products early, companies adopt techniques such as Rapid Application Development and Agile methodologies like Scrum and Extreme Programming, so as to obtain tangible, useful features of the software at the earliest. It is not sufficient merely to develop the product; it is more important to develop a quality product. But how does one know whether the product is of good quality? This question is best answered by the Quality Assurance team, which tests the product end to end against requirements and standards. The team conducts various kinds of tests and checks the behaviour of the system or product under various conditions; if it passes all applicable tests (which depend on the kind of product or system), the QA team certifies that the product is fit for use. Hence it is very important that the QA team be given enough time to test the product or system before it is released to market. It is difficult, however, to judge how much time is required to test a product or system completely; an ideal answer would be "years". Time is a major constraint in any software activity, and many software projects are time-bound. How, then, does one ensure that within the given time the product or system is tested to a level at which confidence can be achieved? The answer is to adopt faster means of testing. Many testing techniques are available to help achieve this objective, but the major drawback of some proven techniques is that they depend on the kind of system being tested. What other ways are available to save testing time? The answer lies in the question itself: one can save time by optimizing the way of testing itself.