Search results for: kernel modules
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 751


601 Modelling the Choice of Global Systems of Mobile Networks in Nigeria Using the Analytical Hierarchy Process

Authors: Awal Liman Sale

Abstract:

The world is fast becoming a global village, and a necessary tool for this process is communication, in which telecommunication is a key player. Development in the sector is very rapid, as one innovation replaces another in a matter of weeks. Interconnected phone calls across the different Nigerian telecom service providers are often difficult to connect and frequently diverted, incurring unnecessary charges on the customers. This compels consumers to register and use multiple subscriber information modules (SIM) so that they can switch to another network if one fails. This study aims to identify and prioritize the key factors subscribers in Nigeria consider when selecting telecom service providers, using the Analytical Hierarchy Process (AHP) to match the factors with the GSM network providers and create a hierarchical structure. Opinions of 400 randomly selected subscribers of different service providers will be sought using a questionnaire. In total, four components and ten sub-components will be examined in this study. After determining the weights of these components, their importance in choosing a service provider in Nigeria will be prioritized.
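
As a rough illustration of the AHP computation outlined above, the sketch below derives priority weights and a consistency ratio from a pairwise comparison matrix; the four criteria and the judgement values are hypothetical placeholders, not the components surveyed in the study.

```python
# A minimal sketch of the AHP priority computation, with hypothetical criteria
# and pairwise judgements (not the survey data used in the study).
import numpy as np

# Pairwise comparison matrix for four illustrative criteria:
# call quality, tariff, network coverage, customer service.
A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
])

# Priority weights: normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio (RI = 0.90 for a 4x4 matrix, from Saaty's table).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```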

Keywords: analytical hierarchy process, global village, Nigerian telecommunication, subscriber information modules

Procedia PDF Downloads 224
600 A Formal Property Verification for Aspect-Oriented Programs in Software Development

Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb

Abstract:

Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of critical properties such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to ensure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach that checks a given property against the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied once the weaving is done.
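
The following is a minimal sketch of the kind of SMT check described above, using the Z3 solver's Python bindings; the state variables, path condition and property are invented for illustration and are not the authors' encoding of woven programs.

```python
# A minimal sketch (not the authors' encoding): checking whether a safety
# property can be violated along one CFG path of a woven program, using the
# Z3 SMT solver's Python bindings. Variables and constraints are illustrative.
from z3 import Int, Bool, Solver, And, Not, sat

balance = Int('balance')        # state variable touched by the base module
authorised = Bool('authorised') # flag set by a security aspect at the join point

# Path condition collected from one path of the woven program's CFG.
path = And(balance >= 100, Not(authorised))

# Property contributed by the aspect: every withdrawal path must be authorised.
prop = authorised

s = Solver()
s.add(path, Not(prop))          # look for a path that violates the property
if s.check() == sat:
    print("property violated, counterexample:", s.model())
else:
    print("property holds on this path")
```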

Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories

Procedia PDF Downloads 161
599 1H-NMR Spectra of Diesel-Biodiesel Blends to Evaluate the Quality and Determine the Adulteration of Biodiesel with Vegetable Oil

Authors: Luis F. Bianchessi, Gustavo G. Shimamoto, Matthieu Tubino

Abstract:

The use of biodiesel has spread in Brazil and all over the world through the trading of biodiesel (B100). In Brazil, the diesel oil currently being sold is a blend containing 7% biodiesel (B7). In this context, it is necessary to develop methods capable of identifying the blend composition, especially regarding the quality of the biodiesel used for making these blends. In this study, hydrogen nuclear magnetic resonance spectra (1H-NMR) are proposed as a means of identifying and confirming the quality of type B10 blends (10% biodiesel and 90% diesel). Furthermore, the presence of vegetable oils, which may result from fuel adulteration or indicate a low degree of transesterification conversion during the synthesis of B100, may also be identified. Mixtures of diesel, vegetable oils and their respective biodiesels were prepared. Soybean oil and macauba kernel oil were used as raw materials. The diesel proportion remained fixed at 90%, while the remaining 10% was varied between vegetable oil and biodiesel. The 1H-NMR spectrum was obtained for each mixture in order to find a correlation between the spectra and the amount of biodiesel, as well as the amount of residual vegetable oil. The ratio of the integral of the methylenic hydrogen H-2 of glycerol (exclusive to vegetable oil) to the integral of the olefinic hydrogens (present in both vegetable oil and biodiesel) was obtained. These ratios were correlated with the percentage of vegetable oil in each mixture, from 0% to 10%. The correlation could be described by linear relationships with R² of 0.9929 for soybean biodiesel and 0.9982 for macauba kernel biodiesel. Preliminary results show that the technique can be used to monitor biodiesel quality in commercial diesel-biodiesel blends, besides indicating possible adulteration.
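
A minimal sketch of the calibration step described above: fitting a straight line between the H-2/olefinic integral ratio and the vegetable-oil percentage, and inverting it for an unknown sample. The ratio values below are illustrative, not the measured NMR integrals.

```python
# A minimal sketch of the linear calibration described above, with illustrative
# (not measured) integral ratios.
import numpy as np

oil_percent = np.array([0, 2, 4, 6, 8, 10], dtype=float)      # known mixtures
ratio = np.array([0.000, 0.021, 0.043, 0.062, 0.085, 0.104])  # hypothetical 1H-NMR ratios

slope, intercept = np.polyfit(oil_percent, ratio, 1)
r2 = np.corrcoef(oil_percent, ratio)[0, 1] ** 2
print(f"ratio = {slope:.4f} * oil% + {intercept:.4f}, R2 = {r2:.4f}")

# Estimate the residual vegetable-oil content of an unknown B10 sample.
unknown_ratio = 0.052
print("estimated oil %:", (unknown_ratio - intercept) / slope)
```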

Keywords: biodiesel, diesel, biodiesel quality, adulteration

Procedia PDF Downloads 607
598 Computation of ΔV Requirements for Space Debris Removal Using Orbital Transfer

Authors: Sadhvi Gupta, Charulatha S.

Abstract:

Since the early 1950s, humans have launched numerous vehicles into space, from rockets to rovers, driving tremendous growth in the technology sector. While this has been largely beneficial, the one major downside that can no longer be ignored is the amount of junk it produces in space, i.e., space debris. This junk originates from objects launched from Earth that remain in orbit until they re-enter the atmosphere. Space debris comes in various sizes: the larger pieces are mainly dead satellites floating in space, while smaller pieces include items such as paint flecks, screwdrivers and bolts. Tracking debris smaller than 10 cm is practically impossible, and collisions with it can have severe consequences. As the amount of space debris increases, so do the chances of it hitting a functional satellite, and repairing or recovering a satellite struck by orbiting debris is extremely costly. The proposed solution is therefore to actively remove space debris while keeping space sustainability in mind. For this solution, a total of 8 modules will be launched into LEO and GEO; these modules will be placed in their desired orbits through Hohmann transfers, for which calculating the ΔV values is crucial. The modules are then placed in their designated positions in the STK software and a thorough analysis is conducted.
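
The ΔV budget for such a transfer follows from the vis-viva equation. The sketch below computes the two burns of a coplanar Hohmann transfer for an assumed 500 km parking orbit raised to GEO; the orbit altitudes are illustrative, since the paper does not quote them here.

```python
# A minimal sketch (assumptions: circular, coplanar initial and target orbits)
# of the delta-V computation for a Hohmann transfer such as the LEO-to-GEO
# raising described above.
import math

MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3             # mean Earth radius, m

def hohmann_delta_v(r1: float, r2: float) -> tuple:
    """Return (dv1, dv2, total) in m/s for a transfer from radius r1 to r2."""
    dv1 = math.sqrt(MU / r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1)
    dv2 = math.sqrt(MU / r2) * (1 - math.sqrt(2 * r1 / (r1 + r2)))
    return dv1, dv2, dv1 + dv2

# Example: 500 km LEO parking orbit to GEO (35786 km altitude).
dv1, dv2, total = hohmann_delta_v(R_EARTH + 500e3, R_EARTH + 35786e3)
print(f"dv1 = {dv1:.1f} m/s, dv2 = {dv2:.1f} m/s, total = {total:.1f} m/s")
```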

Keywords: space debris, Hohmann transfer, STK, delta-V

Procedia PDF Downloads 75
597 Optimal Maintenance Policy for a Three-Unit System

Authors: A. Abbou, V. Makis, N. Salari

Abstract:

We study the condition-based maintenance (CBM) problem of a system subject to stochastic deterioration. The system is composed of three units (or modules): (i) Module 1 deterioration follows a Markov process with two operational states and one failure state. The operational states are partially observable through periodic condition monitoring. (ii) Module 2 deterioration follows a Gamma process with a known failure threshold. The deterioration level of this module is fully observable through periodic inspections. (iii) Only operating age information is available for Module 3. The lifetime of this module has a general distribution. A CBM policy prescribes when to initiate a maintenance intervention and which modules to repair during the intervention. Our objective is to determine the optimal CBM policy minimizing the long-run expected average cost of operating the system. This is achieved by formulating a Markov decision process (MDP) and developing a value iteration algorithm for solving the MDP. We provide numerical examples illustrating the cost-effectiveness of the optimal CBM policy through a comparison with heuristic policies commonly found in the literature.
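
For readers unfamiliar with the solution method, the sketch below runs value iteration on a toy, fully observable, discounted-cost maintenance MDP; the states, transition probabilities and costs are assumed, whereas the authors' actual model is partially observed and optimizes the long-run average cost.

```python
# A simplified sketch of value iteration on a toy maintenance MDP (discounted
# cost, two actions: "continue" or "repair"); it only illustrates the recursion.
import numpy as np

states = ["good", "worn", "failed"]
actions = ["continue", "repair"]

# P[a][s, s']: transition probabilities, C[a][s]: immediate cost (assumed values).
P = {
    "continue": np.array([[0.90, 0.09, 0.01],
                          [0.00, 0.85, 0.15],
                          [0.00, 0.00, 1.00]]),
    "repair":   np.array([[1.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0]]),
}
C = {"continue": np.array([0.0, 2.0, 50.0]),
     "repair":   np.array([5.0, 5.0, 20.0])}

gamma, V = 0.95, np.zeros(len(states))
for _ in range(1000):
    Q = np.array([C[a] + gamma * P[a] @ V for a in actions])   # Bellman backup
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = [actions[i] for i in Q.argmin(axis=0)]
print(dict(zip(states, policy)), np.round(V, 2))
```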

Keywords: reliability, maintenance optimization, Markov decision process, heuristics

Procedia PDF Downloads 206
596 Development of a Methodology for Surgery Planning and Control: A Management Approach to Handle the Conflict of High Utilization and Low Overtime

Authors: Timo Miebach, Kirsten Hoeper, Carolin Felix

Abstract:

In times of competitive pressure and demographic change, hospitals have to reconsider their strategies as companies. Since operations are one of the main sources of income and, at the same time, one of the primary cost drivers, a process-oriented approach and an efficient use of resources seem to be the right way to secure a consistent market position. Efficient operating room occupancy planning is thus an important variable for the success and continued existence of these institutions. A high utilization of resources is essential. This means a very high, but nevertheless sensible, capacity-oriented utilization of working systems, which can be realized by avoiding downtimes and by thoughtful occupancy planning. This engineering approach should help hospitals to reach their break-even point. The first aim is to establish a strategy point, which can be used for the generation of a planned throughput time. The second aim is to facilitate and accurately implement surgery planning and control through the generation of time modules. More than 100,000 data records of the Hannover Medical School were analyzed. The records contain information about the type of operation conducted, the duration of the individual process steps, and other organization-specific data such as the operating room. Based on this data set, a generally valid model was developed to define a strategy point that takes the conflict between capacity utilization and low overtime into account. Furthermore, time modules were generated in this work, which allow simplified and flexible surgery planning and control for the operating manager. The time modules make it possible to reduce the high average idle times of the operating rooms, and this potential is also used to minimize the spread of idle times.

Keywords: capacity, operating room, surgery planning and control, utilization

Procedia PDF Downloads 243
595 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock suitable movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral and social information to predict their movie genre preference. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik-Chervonenkis (VC) dimensions of the machine learning algorithm to detect and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences with a small data set and to design prediction tools for these enterprises.
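
A minimal sketch of the Gaussian-kernel SVM classification step using scikit-learn; the features and labels below are synthetic stand-ins, since the customer data set is not reproduced here.

```python
# A minimal sketch of the RBF (Gaussian) kernel SVM step, on synthetic stand-ins
# for the demographic/behavioral/social customer features described above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                 # hypothetical customer features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # hypothetical genre-preference label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# C and gamma control model capacity, playing the role of the VC-dimension knob.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("in-sample accuracy    :", model.score(X_tr, y_tr))
print("out-of-sample accuracy:", model.score(X_te, y_te))
```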

Keywords: computational social science, movie preference, machine learning, SVM

Procedia PDF Downloads 249
594 Food Processing Technology and Packaging: A Case Study of Indian Cashew-Nut Industry

Authors: Parashram Jakappa Patil

Abstract:

India is the global leader in the world cashew business, and the cashew-nut industry is one of the important food processing industries in the world. India is the largest producer, processor, exporter and importer of cashew in the world; it supplies cashew to the rest of the world and meets world demand. India has a tremendous potential for cashew production and export to other countries, and every year it earns more than 2000 crore rupees through the cashew trade. The cashew industry is one of the important small-scale industries in the country and plays a significant role in rural development. It generates more than 400,000 jobs in remote areas, and 95% of cashew workers are women; it provides income to poor cashew farmers; the majority of cashew processing units are small and cottage-scale; it helps to stop the migration of young farmers in search of employment; it motivates rural entrepreneurship development; and it also contributes to environmental protection. Hence, the Indian cashew business is a very important agribusiness with the potential to deliver inclusive development. The World Bank and IMF have recognized the cashew-nut industry as an important tool for poverty eradication at the global level, which shows the importance of the cashew business and its strong presence in India. In spite of this huge potential, the cashew processing industry faces various problems, such as a lack of infrastructure, a shortage of raw cashew supply, limited access to finance, difficulties in the collection of raw cashew, the unavailability of warehouses, the marketing of cashew kernels, a lack of technical knowledge, and especially problems with processing technology and the packaging of finished products. The industry also has great prospects: scope for more cashew cultivation and production, employment generation, the formation of cashew processing units, alcohol production from cashew apples, shell oil production, rural development, poverty elimination, development of socially and economically backward classes, and environmental protection. It serves domestic as well as foreign markets, and India has tremendous potential in this regard. The cashew is a poor man's crop but a rich man's food; it is a source of income and livelihood for poor farmers, and the cashew-nut industry may play a very important role in the development of hilly regions. The objectives of this paper are to identify the problems of cashew processing and the use of processing technology, the problems of cashew kernel packaging, the evolution of cashew processing technology over the years and its impact on the final product, and the impact of good processing, achieved by adopting appropriate technology and packaging, on the international trade of cashew-nut. The most important problems of the cashew processing industry are processing and packaging. Poor processing greatly reduces the quality of cashew kernels, in particular by breaking them; broken kernels fetch a much lower price in the market than whole kernels and are not eligible for export. On the other hand, if cashew kernels are not packaged well, they absorb moisture, which destroys their taste. The international trade of cashew-nut therefore depends on two things: cashew processing and packaging. This study has strong relevance because the cashew-nut industry is labour oriented and processing technology has so far played a limited role, since 95% of the processing work is manual. Processing has therefore depended on the physical performance of workers, which makes the presence of a large workforce inevitable.
Many cashew processing units have closed because they cannot find a sufficient workforce. However, due to advancements in technology, this picture is slowly changing and processing work is improving. It is therefore interesting to explore all these aspects in the context of the processing and packaging sides of the cashew business.

Keywords: cashew, processing technology, packaging, international trade, change

Procedia PDF Downloads 409
593 Construction of Microbial Fuel Cells from Local Benthic Zones

Authors: Maria Luiza D. Ramiento, Maria Lissette D. Lucas

Abstract:

Electricity serves as the backbone of modern technology, and electricity consumption has grown dynamically due to continuous demand. Alternative producers of electrical energy must therefore be given attention. Microbial fuel cells represent a new method of renewable energy recovery: the direct conversion of organic matter to electricity using bacteria. Electricity is produced as fuel, or new food, is given to the bacteria. The study concentrated on determining the feasibility of electricity production from local benthic zones. Microbial fuel cells were constructed to harvest the possible electricity and to test for the presence of electricity-producing microorganisms. Soil samples were gathered from Calumpang River, Palawan Mangrove Forest, Rosario River and Batangas Port. Eleven modules were constructed for the different trials of the soil samples. These modules were made of cathode and anode chambers connected by a salt bridge. For 85 days, the harvested voltage was measured daily. No additional parameter was introduced for the first 24 days; for the next 61 days, acetic acid was added in the first and second trials of the modules. Each trial of the soil samples gave a positive result in electricity production, indicating that there are electricity-producing microbes in local benthic zones. It was observed that the higher the organic content of the soil sample, the higher the electricity harvested from it. It is recommended to identify the specific species of the electricity-producing microorganisms present in the local benthic zones. Complementary experiments are encouraged, such as determining the kind of soil particles in order to test their effect on the amount of electricity that can be harvested. Pursuing the development of microbial fuel cells by building a closed circuit is also suggested.

Keywords: microbial fuel cell, benthic zone, electricity, reduction-oxidation reaction, bacteria

Procedia PDF Downloads 383
592 RS Based SCADA System for Longer Distance Powered Devices

Authors: Harkishen Singh, Gavin Mangeni

Abstract:

This project aims at building an efficient and automatic power monitoring SCADA system capable of monitoring the electrical parameters of high-voltage powered devices in real time, such as RMS voltage and current, frequency, energy consumed, and power factor. The system uses the RS-485 serial communication interface to transfer data over longer distances. Embedded C programming is the platform used to develop two hardware modules, namely the RTU and Master Station modules, which both use the CC2540 BLE 4.0 microcontroller configured in slave/master mode. The galvanically isolated Si8900 microchip is used to perform ADC externally. The hardware communicates via a UART port and sends data to the user PC using the USB port. LabVIEW software is used to design a user interface that displays the current state of the power loads being monitored and logs data to an Excel spreadsheet file. An understanding of the Si8900's auto baud rate process is key to the successful implementation of this project.

Keywords: SCADA, RS485, CC2540, labview, Si8900

Procedia PDF Downloads 289
591 Efficiency Enhancement of Photovoltaic Panels Using an Optimised Air Cooled Heat Sink

Authors: Wisam K. Hussam, Ali Alfeeli, Gergory J. Sheard

Abstract:

Solar panels that use photovoltaic (PV) cells are popular for converting solar radiation into electricity. One of the major problems impacting the performance of PV panels is overheating caused by excessive solar radiation and high ambient temperatures, which markedly degrades their efficiency. To overcome this issue, an aluminum heat sink was used to dissipate unwanted heat from the PV cells. The dimensions of the heat sink were determined by considering the optimal fin spacing for hot climatic conditions. In this study, the effects of cooling on the efficiency and power output of a PV panel were studied experimentally. Two PV modules were used: one without and one with a heat sink. The experiments ran for 11 hours, from 6:00 a.m. to 5:30 p.m., with temperature readings at the rear and front of both PV modules recorded at 15-minute intervals using sensors and an Arduino microprocessor. Results were recorded for both panels simultaneously for analysis, temperature comparison, and power and efficiency calculations. A maximum increase of 35% in the solar-to-electrical conversion efficiency and almost 55% in the power output was achieved with the use of a heat sink, while temperatures at the front and back of the panel were reduced by 9% and 11%, respectively.

Keywords: photovoltaic cell, natural convection, heat sink, efficiency

Procedia PDF Downloads 139
590 Estimation of a Finite Population Mean under Random Non Response Using Improved Nadaraya and Watson Kernel Weights

Authors: Nelson Bii, Christopher Ouma, John Odhiambo

Abstract:

Non-response is a potential source of errors in sample surveys. It introduces bias and large variance in the estimation of finite population parameters. Regression models have been recognized as one of the techniques for reducing bias and variance due to random non-response using auxiliary data. In this study, it is assumed that random non-response occurs in the survey variable in the second stage of cluster sampling, with full auxiliary information available throughout. Auxiliary information is used at the estimation stage via a regression model to address the problem of random non-response. In particular, the auxiliary information is used via an improved Nadaraya-Watson kernel regression technique to compensate for random non-response. The asymptotic bias and mean squared error of the proposed estimator are derived. In addition, a simulation study indicates that the proposed estimator has smaller bias and smaller mean squared error values compared to existing estimators of the finite population mean. The proposed estimator is also shown to have tighter confidence interval lengths at a 95% coverage rate. The results obtained in this study are useful, for instance, in choosing efficient estimators of the finite population mean in demographic sample surveys.
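
For reference, the sketch below implements the standard Nadaraya-Watson estimator on synthetic data; the paper's improved weights and the two-stage non-response correction are not reproduced here.

```python
# A minimal sketch of the standard Nadaraya-Watson kernel regression estimator,
# run on synthetic auxiliary data (not the paper's improved weights).
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def nadaraya_watson(x_grid, x, y, bandwidth):
    """m_hat(x0) = sum_i K((x0 - x_i)/h) * y_i / sum_i K((x0 - x_i)/h)."""
    est = np.empty_like(x_grid, dtype=float)
    for j, x0 in enumerate(x_grid):
        w = gaussian_kernel((x0 - x) / bandwidth)
        est[j] = np.sum(w * y) / np.sum(w)
    return est

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)                       # auxiliary variable
y = 2.0 + np.sin(x) + rng.normal(0, 0.3, 200)     # survey variable (with noise)
grid = np.linspace(0, 10, 5)
print(np.round(nadaraya_watson(grid, x, y, bandwidth=0.8), 3))
```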

Keywords: mean squared error, random non-response, two-stage cluster sampling, confidence interval lengths

Procedia PDF Downloads 124
589 Performance Evaluation of Grid Connected Photovoltaic System

Authors: Abdulkadir Magaji

Abstract:

This study analyzes and compares the actual measured and simulated performance of a 3.2 kWp grid-connected photovoltaic system. The system is located at the outdoor facility of Government Day Secondary School, Katsina State, which lies at approximately 12°15′N, 7°30′E. The system consists of 14 monocrystalline silicon modules connected in two strings of 7 series-connected modules, each facing north at a fixed tilt of 34°. The data presented in this study were measured in 2015, during which the system supplied a total of 4628 kWh to the local electric utility grid. The performance of the system was simulated using PVsyst software with measured and Meteonorm-derived climate data sets (solar radiation, ambient temperature and wind speed). The comparison between measured and simulated energy yield is discussed. Although both simulation results were similar, better agreement between measured and predicted monthly energy yield is observed for the simulation performed using the weather data measured at the site. The measured performance ratio of 58.4% in the present study is higher than those reported elsewhere, as compared in the study.
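
The 58.4% figure is a performance ratio, i.e. the final yield divided by the reference yield. The sketch below shows the calculation with the reported annual energy and an assumed annual in-plane irradiation, so the printed value is only illustrative and will not match the measured figure exactly.

```python
# A minimal sketch of the performance-ratio calculation; the in-plane
# irradiation is an assumed placeholder, not the measured value at the site.
E_AC = 4628.0        # measured annual energy fed to the grid, kWh
P_STC = 3.2          # array rating, kWp
H_POA = 2150.0       # assumed annual in-plane irradiation, kWh/m^2 (illustrative)
G_STC = 1.0          # reference irradiance, kW/m^2

final_yield = E_AC / P_STC            # kWh/kWp
reference_yield = H_POA / G_STC       # equivalent hours of full sun
performance_ratio = final_yield / reference_yield
print(f"Y_f = {final_yield:.0f} kWh/kWp, Y_r = {reference_yield:.0f} h, PR = {performance_ratio:.1%}")
```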

Keywords: performance, evaluation, grid connection, photovoltaic system

Procedia PDF Downloads 172
588 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches

Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.

Abstract:

A closed-loop system comprises a controller, a response system, and an actuating system. The controller, which is the system under test here, excites the actuators based on feedback from the sensors in a periodic manner. The sensors should provide feedback to the System Under Test (SUT) within a deterministic time after the excitation of the actuators. Any delay or miss in the generation of the response or in the acquisition of excitation pulses may lead to control-loop computation errors, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available in the market may be the best solution for such simulations, but they pose limitations such as the lack of the X Window System, graphical interfaces, and other user tools. In this paper, we present strategies that can be used on a general purpose operating system (bare Linux kernel) to achieve deterministic deadlines and hence retain the advantages of a GPOS with real-time features. Techniques are discussed for making the time-critical application run at the highest priority in an uninterrupted manner, reducing network latency for distributed architectures, and handling real-time data acquisition, data storage and retrieval, user interactions, etc.
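
Two of the techniques alluded to above, pinning the time-critical process to a dedicated core and raising it into a real-time scheduling class, can be requested from a stock Linux kernel as in the sketch below (Linux-only, requires root or CAP_SYS_NICE); this is an illustration, not the authors' test-bed code.

```python
# A minimal sketch: CPU pinning plus SCHED_FIFO priority on a stock Linux kernel.
import os

def make_realtime(cpu_core: int = 3, priority: int = 80) -> None:
    # Pin the current process to one core to avoid migrations and cache misses.
    os.sched_setaffinity(0, {cpu_core})
    # Switch to the FIFO real-time class so ordinary tasks cannot preempt us.
    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))

if __name__ == "__main__":
    make_realtime()
    # ... periodic control-loop work with deterministic deadlines goes here ...
    print("running with scheduling policy", os.sched_getscheduler(0))
```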

Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency

Procedia PDF Downloads 135
587 A Study on the Performance of 2-PC-D Classification Model

Authors: Nurul Aini Abdul Wahab, Nor Syamim Halidin, Sayidatina Aisah Masnan, Nur Izzati Romli

Abstract:

There are many applications of the principal component method for reducing a large set of variables in various fields. Fisher's discriminant function is also a popular tool for classification. In this research, the focus is on studying the performance of the principal component-Fisher's discriminant function in classifying rice kernels into their defined classes. The data were collected on the smell, or odour, of the rice kernels using an odour-detection sensor, the Cyranose. 32 variables were captured by this electronic nose (e-nose). The objective of this research is to measure how well a combined model, between principal components and the linear discriminant, performs as a classification model. The principal component method was used to reduce the 32 variables to a smaller and more manageable set of components. The reduced components were then used to develop the Fisher's discriminant function. In this research, there are 4 defined classes of rice kernel: Aromatic, Brown, Ordinary and Others. Based on the output from the principal component method, the 32 variables were reduced to only 2 components. Based on the classification table from the discriminant analysis, 40.76% of the total observations were correctly classified into their classes by the PC-discriminant function. This indicates that the classification model developed misclassified more than 50% of the observations. In conclusion, the Fisher's discriminant function built on 2 components from PCA (2-PC-D) is not satisfactory for classifying the rice kernels into their defined classes.
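
A minimal sketch of the 2-PC-D pipeline with scikit-learn: standardize the 32 e-nose variables, keep two principal components, and classify with a linear (Fisher) discriminant. The data below are synthetic stand-ins for the Cyranose measurements.

```python
# A minimal sketch of the 2-PC-D pipeline on synthetic stand-ins for the
# 32 odour-sensor variables and the 4 rice-kernel classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 32))            # 32 e-nose variables (synthetic)
y = rng.integers(0, 4, size=400)          # classes: Aromatic, Brown, Ordinary, Others

model = make_pipeline(StandardScaler(), PCA(n_components=2), LinearDiscriminantAnalysis())
model.fit(X, y)

# Apparent (resubstitution) classification rate, analogous to the 40.76% reported.
print("correctly classified:", round(100 * model.score(X, y), 2), "%")
```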

Keywords: classification model, discriminant function, principal component analysis, variable reduction

Procedia PDF Downloads 324
586 Study of Lamination Quality of Semi-Flexible Solar Modules with Special Textile Materials

Authors: K. Drabczyk, Z. Starowicz, S. Maleczek, P. Zieba

Abstract:

The army, police and fire brigade commonly use dedicated equipment based on special textile materials. The properties of these textiles should ensure the protection of human life and health. Equally important is the ability to use electronic equipment, which requires access to a source of electricity. Photovoltaic cells integrated with such textiles can be a solution to this problem in most outdoor circumstances. One idea is to laminate the cells to the textile without changing its properties. The main goal of this work was to analyze the lamination quality of specially designed semi-flexible solar modules with special textile materials as a backsheet. In the first step of the investigation, the quality of lamination was determined using a device equipped with a dynamometer. In this work, crystalline silicon solar cells of 50 x 50 mm and thin chemically tempered glass - 62 x 62 mm and 0.8 mm thick - were used. The obtained results showed a correlation between the breaking force and the type of textile weave and fiber. The breaking force was in the ranges 4.5-5.5 N, 15-20 N and 30-33 N, depending on the type of weave and fiber. To verify these observations, microscopic and FTIR analyses of the fibers were performed. The studies showed that the special textile can be used as a backsheet for semi-flexible solar modules. This work presents a new composition of solar module with a special textile layer which, to the best of our knowledge, has not been published so far. Moreover, the work presents original investigations on the adhesion of EVA (ethylene-vinyl acetate) polymer to textiles with respect to the fiber structure of the laminated substrate. This work is realized for the GEKON project (No. GEKON2/O4/268473/23/2016) sponsored by The National Centre for Research and Development and The National Fund for Environmental Protection and Water Management.

Keywords: flexible solar modules, lamination process, solar cells, textile for photovoltaics

Procedia PDF Downloads 348
585 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest rate, equity, property, credit and FX factors, as well as concentration on counterparties. The market SCR is non-convex and non-differentiable, which does not make it a natural optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR in order to reduce non-invested capital, but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature by simplifying the standard formula into a quadratic function, but to our knowledge, this is the first time that the standard formula of the market SCR is used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It is shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the value of a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
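
The standard-formula aggregation behind the market SCR is a square-root-of-quadratic-form combination of the sub-module charges, which is part of what makes the criterion awkward as an optimization objective once the stresses inside each charge are taken into account. The sketch below shows the aggregation step only, with assumed charges and an illustrative correlation matrix rather than the exact regulatory coefficients.

```python
# A minimal sketch of SCR_mkt = sqrt(s' * Corr * s), with assumed sub-module
# charges and an illustrative (non-regulatory) correlation matrix.
import numpy as np

submodules = ["interest", "equity", "property", "spread", "currency", "concentration"]
s = np.array([120.0, 300.0, 80.0, 150.0, 60.0, 40.0])   # assumed stand-alone charges

corr = np.array([
    [1.00, 0.50, 0.50, 0.50, 0.25, 0.00],
    [0.50, 1.00, 0.75, 0.75, 0.25, 0.00],
    [0.50, 0.75, 1.00, 0.50, 0.25, 0.00],
    [0.50, 0.75, 0.50, 1.00, 0.25, 0.00],
    [0.25, 0.25, 0.25, 0.25, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])

scr_market = float(np.sqrt(s @ corr @ s))
diversification_benefit = s.sum() - scr_market
print(f"market SCR = {scr_market:.1f}, diversification benefit = {diversification_benefit:.1f}")
```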

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 107
584 An Assessment of Health Hazards in Urban Communities: A Study of Spatial-Temporal Variations of Dengue Epidemic in Colombo, Sri Lanka

Authors: U. Thisara G. Perera, C. M. Kanchana N. K. Chandrasekara

Abstract:

Dengue is an epidemic disease spread by Aedes aegypti and Aedes albopictus mosquitoes. Dengue cases show a dramatic growth rate in urban and semi-urban areas, especially in the tropical and sub-tropical regions of the world. Dengue has become a prominent cause of hospitalization and death in Asian countries, including Sri Lanka. During the last decade, the dengue epidemic began to spread from urban to semi-urban and then to rural settings of the country. The highest number of dengue-infected patients in Sri Lanka was recorded in 2016, and the highest number of patients was identified in Colombo district. Together with commercial, industrial, and other supporting services, the district suffers from rapid urbanization and high population density. The drainage and waste disposal practices of the people in this area thus exert additional pressure on the environment. The district is situated in the wet zone, and low-lying lands constitute the largest portion of it, which further facilitates mosquito breeding sites. Therefore, the purpose of the present study was to assess the spatial and temporal distribution patterns of the dengue epidemic in the Kolonnawa MOH (Medical Officer of Health) area in the district of Colombo. The study was carried out using 615 recorded dengue cases in the Kolonnawa MOH area during the south-east monsoon season from May to September 2016. Moran's I and kernel density estimation were used as analytical methods. The analysis of data was accomplished through the integrated use of the ArcGIS 10.1 software package along with Microsoft Excel as an analytical tool. Field observation was also carried out for verification purposes during the study period. Results of the Moran's I index indicate that the spatial distribution of dengue cases followed a clustered pattern across the area. Kernel density estimation shows that dengue cases are high where the population is concentrated, especially in areas comprising housing schemes. Results of the kernel density estimation further disclose that hot spots of the dengue epidemic are located in the western half of the Kolonnawa MOH area, close to the Colombo municipal boundary, and that there is a significant relationship with high population density and unplanned urban land use practices. Results of the field observation confirm that the drainage systems in these areas function poorly and that careless waste disposal further encourages mosquito breeding sites. This situation has evolved from a public health issue into a social problem, which ultimately impacts the economy and social life of the country.
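
As a reference for the clustering test mentioned above, the sketch below computes the global Moran's I on a small hypothetical lattice of ward-level case counts; the study's actual case locations and spatial weights are not reproduced here.

```python
# A minimal sketch of the global Moran's I statistic on hypothetical ward-level
# dengue counts with a binary contiguity weight matrix.
import numpy as np

def morans_i(x: np.ndarray, w: np.ndarray) -> float:
    """I = (n / sum(w)) * (z' W z) / (z' z), with z the mean-centred values."""
    n = len(x)
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Hypothetical dengue counts in 5 adjacent wards and their contiguity matrix.
cases = np.array([42.0, 38.0, 35.0, 6.0, 4.0])
W = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

print("Moran's I =", round(morans_i(cases, W), 3))   # > 0 suggests spatial clustering
```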

Keywords: Dengue epidemic, health hazards, Kernel density, Moran’s I, Sri Lanka

Procedia PDF Downloads 291
583 A Hierarchical Method for Multi-Class Probabilistic Classification Vector Machines

Authors: P. Byrnes, F. A. DiazDelaO

Abstract:

The Support Vector Machine (SVM) has become widely recognised as one of the leading algorithms in machine learning for both regression and binary classification. It expresses predictions in terms of a linear combination of kernel functions evaluated at a subset of the training points, referred to as support vectors. Despite its popularity amongst practitioners, SVM has some limitations, the most significant being that it generates point predictions rather than predictive distributions. Stemming from this issue, a probabilistic model, namely the Probabilistic Classification Vector Machine (PCVM), has been proposed, which respects the original functional form of SVM whilst also providing a predictive distribution. As physical system designs become more complex, an increasing number of classification tasks in industrial applications involve more than two classes. Consequently, this research proposes a framework which allows the extension of PCVM to a multi-class setting. Additionally, the original PCVM framework relies on type II maximum likelihood to provide estimates for both the kernel hyperparameters and the model evidence. In a high-dimensional multi-class setting, however, this approach has been shown to be ineffective due to poor scaling as the number of classes increases. Accordingly, we propose the application of Markov Chain Monte Carlo (MCMC) based methods to provide a posterior distribution over both parameters and hyperparameters. The proposed framework will be validated against current multi-class classifiers through synthetic and real-life implementations.

Keywords: probabilistic classification vector machines, multi class classification, MCMC, support vector machines

Procedia PDF Downloads 215
582 3D Plant Growth Measurement System Using Deep Learning Technology

Authors: Kazuaki Shiraishi, Narumitsu Asai, Tsukasa Kitahara, Sosuke Mieno, Takaharu Kameoka

Abstract:

The purpose of this research is to facilitate productivity advances in agriculture. To accomplish this, we developed an automatic three-dimensional (3D) recording system for the growth of field crops that consists of a number of inexpensive modules: a very low-cost stereo camera, a pair of ZigBee wireless modules, a Raspberry Pi single-board computer, and a third generation (3G) wireless communication module. Our system uses an inexpensive web stereo camera in order to keep total costs low. However, inexpensive video cameras record low-resolution images that are very noisy. To resolve these problems, we adopted a deep learning method. Based on the results of an extended operation test conducted without an external power supply, we found that by using the Super-Resolution Convolutional Neural Network method, our system could achieve a balance between the competing goals of low cost and superior performance. Our experimental results showed the effectiveness of our system.

Keywords: 3D plant data, automatic recording, stereo camera, deep learning, image processing

Procedia PDF Downloads 266
581 SiamMask++: More Accurate Object Tracking through Layer Wise Aggregation in Visual Object Tracking

Authors: Hyunbin Choi, Jihyeon Noh, Changwon Lim

Abstract:

In this paper, we propose SiamMask++, an architecture that performs layer-wise aggregation and depth-wise cross-correlation and introduces a multi-RPN module and a multi-MASK module to improve EAO (Expected Average Overlap), a representative performance evaluation metric for the Visual Object Tracking (VOT) challenge. The proposed architecture, SiamMask++, has two versions: bi_SiamMask++, which runs in real time (56 fps) on systems equipped with GPUs (Titan XP), and rf_SiamMask++, which adds mask refinement modules for EAO improvements. Tests are performed on VOT2016, VOT2018 and VOT2019, the representative Visual Object Tracking datasets labeled with rotated bounding boxes. SiamMask++ performs better than SiamMask on all three datasets tested. In particular, on the VOT2018 dataset, SiamMask++ achieves 62.6% accuracy, 26.2% robustness and 39.8% EAO. Compared to SiamMask, this is an improvement of 4.18%, 37.17% and 23.99%, respectively. In addition, we present an in-depth experimental analysis of how much the features extracted from the backbone and the introduced multi modules affect the performance of our model on the VOT task.

Keywords: visual object tracking, video, deep learning, layer wise aggregation, Siamese network

Procedia PDF Downloads 139
580 Understanding the Programming Techniques Using a Complex Case Study to Teach Advanced Object-Oriented Programming

Authors: M. Al-Jepoori, D. Bennett

Abstract:

Teaching Object-Oriented Programming (OOP) as part of a computing-related university degree is a very difficult task; the road to ensuring that students are actually learning object-oriented concepts is unclear, as students often find it difficult to understand the concept of objects and their behavior. This problem is especially obvious in advanced programming modules where design patterns and advanced programming features such as multi-threading and animated GUIs are introduced. Looking at students' performance in their final year of a university course, it was obvious that the level of students' understanding of OOP varies to a high degree from one student to another. Students who aim at the production of games do very well in the advanced programming module. However, the students' assessment results of the last few years were relatively low; for example, in 2016-2017, the first quartile of marks was as low as 24.5 and the third quartile was 63.5. It is obvious that many students were not confident or competent enough in their programming skills. In this paper, the reasons behind poor performance in advanced OOP modules are investigated, and a suggested practice for teaching OOP based on a complex case study is described and evaluated.

Keywords: complex programming case study, design pattern, learning advanced programming, object oriented programming

Procedia PDF Downloads 207
579 A Location-Based Search Approach According to Users’ Application Scenario

Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang

Abstract:

The global positioning system (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Take the example of finding a parking lot (with a parking app): a location-based service can offer immediate information about a nearby parking lot, including the number of remaining parking spaces. However, it cannot provide search results tailored to the user's situational requirements. For that reason, this paper develops a "Location-based Search Approach according to Users' Application Scenario", combining location-based search and demand determination to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), kNN (k-Nearest Neighbor) is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), a comparison is made between the number of users clicking on the information at different locations and the average number of users clicking on the information at a specific location, so as to evaluate the urgency of demand; a two-dimensional space is then used to estimate the application scenarios of users. In the last step, in the Location-based Search Module (LBSM), all search results are compared with the average number of characters of the search results, the results are categorized using the Manhattan distance, and results are selected according to the user's application scenario. Additionally, a Web-based system is developed according to this methodology to demonstrate its practical application. The application-scenario-based estimation and the location-based search are used to evaluate the type and abundance of the information expected by the public at a specific location, so that information demanders can obtain information consistent with their application situations at that location.

Keywords: data mining, knowledge management, location-based service, user application scenario

Procedia PDF Downloads 112
578 Worldwide Overview of Homologation for Radio Products

Authors: Nekzad R Doctor, Shubham Bhonde, Shashwat Gawande

Abstract:

Homologation, also known as "type approval", primarily describes the granting of approval by an official authority. For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, around the globe, homologation is necessary. Depending on country requirements or technical properties (e.g., frequency or transmission power), different approaches need to be followed. The requirements can vary in the form of certification requirements or exemptions, forbidden technologies, additional legal requirements, and type approval for manufacturing locations. This research gives an overview of the different types of approval and the technical requirements for countries worldwide. For many countries, information is not available, which is challenging for a newcomer to the field of homologation. Even where the information is available, there can be a language barrier, as countries sometimes publish their regulations only in the local language. There is also considerable uncertainty in many countries regarding type approval requirements (safety, EMC certification, second-factory certification). To provide a clear overview and understanding of type approval requirements, the countries of the world are divided in this document into 4 groups based on technology, after which region- and country-specific type approval requirements are examined in detail. This document will thus facilitate the provision of global homologation requirements.

Keywords: homologation, type approval, EMC, body control modules

Procedia PDF Downloads 84
577 Estimating the Power Output of Photovoltaics in Kuwait Using a Monte Carlo Approach

Authors: Mohammad Alshawaf, Rahmat Poudineh, Nawaf Alhajeri

Abstract:

The power generated by photovoltaic (PV) modules is non-dispatchable on demand due to the stochastic nature of solar radiation. The random variations in measured solar irradiance are due to clouds and, in the case of arid regions, dust storms, which decrease the intensity of the solar irradiance. Therefore, modelling PV power output using average, maximum, or minimum solar irradiance values is inadequate for predicting power generation reliably. The overall objective of this paper is to predict the power output of PV modules using a Monte Carlo approach based on the weather and solar conditions measured in Kuwait. For the 250 Wp PV module used in the study, the average daily energy output is 1021 Wh/day. The maximum output occurs in April and the minimum in January, at 1187 Wh/day and 823 Wh/day, respectively. The certainty of the daily predictions varies seasonally and according to the weather conditions. The output predictions are far more certain in the summer months; for example, the 80% certainty range for August is 89 Wh/day, whereas the 80% certainty range for April is 250 Wh/day.
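
A minimal sketch of the Monte Carlo idea: draw daily irradiation and cell temperature from assumed distributions, convert each draw to daily energy for a 250 Wp module, and summarize the spread. The distributions and loss factors below are illustrative placeholders, not the measured Kuwaiti data used by the authors.

```python
# A minimal sketch of a Monte Carlo estimate of daily PV energy for a 250 Wp
# module; irradiation/temperature distributions and losses are assumed.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                      # number of simulated days

# Daily plane-of-array insolation (kWh/m^2/day) and cell temperature (deg C);
# cloudy or dusty days pull the insolation toward the lower tail.
insolation = np.clip(rng.normal(5.8, 1.2, N), 0.5, 8.5)
cell_temp = rng.normal(45.0, 8.0, N)

P_STC = 250.0                   # module rating, W
GAMMA = -0.004                  # power temperature coefficient, 1/degC
derate = 0.85                   # soiling, wiring and inverter losses (assumed)

# Energy per day: rating x equivalent full-sun hours x temperature and loss factors.
energy_wh = P_STC * insolation * (1 + GAMMA * (cell_temp - 25.0)) * derate

lo, hi = np.percentile(energy_wh, [10, 90])
print(f"mean = {energy_wh.mean():.0f} Wh/day, 80% interval = [{lo:.0f}, {hi:.0f}] Wh/day")
```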

Keywords: Monte Carlo, solar energy, variable renewable energy, Kuwait

Procedia PDF Downloads 121
576 Analysis of the Use of a NAO Robot to Improve Social Skills in Children with Autism Spectrum Disorder in Saudi Arabia

Authors: Eman Alarfaj, Hissah Alabdullatif, Huda Alabdullatif, Ghazal Albakri, Nor Shahriza Abdul Karim

Abstract:

Autism Spectrum Disorder (ASD) is widespread among children; it affects their social, communication and interaction skills. Robotics technology has proven to be a significantly helpful tool enabling such individuals to overcome their disabilities, and it is increasingly used in ASD therapy. The purpose of this research is to show how NAO robots can improve the social skills of children with autism in Saudi Arabia by interacting with the autistic child and performing a number of tasks. The objective of this research is to identify, implement, and test the effectiveness of the modules for interacting with ASD children in an autism center in Saudi Arabia. The methodology in this study followed the ten layers of protocol that need to be followed during any human-robot interaction. In addition, the TEACCH Autism Program was adopted in order to elicit the scenario modules. Six different qualified interaction modules were elicited and designed in this study; the robot will be programmed to perform these modules in a series of controlled interaction sessions with the autistic children to enhance their social skills.

Keywords: humanoid robot Nao, ASD, human-robot interaction, social skills

Procedia PDF Downloads 251
575 Identification of Hub Genes in the Development of Atherosclerosis

Authors: Jie Lin, Yiwen Pan, Li Zhang, Zhangyong Xia

Abstract:

Atherosclerosis is a chronic inflammatory disease characterized by the accumulation of lipids, immune cells, and extracellular matrix in the arterial walls. This pathological process can lead to the formation of plaques that obstruct blood flow and trigger cardiovascular diseases such as heart attack and stroke. The underlying molecular mechanisms remain unclear, although many studies have revealed the dysfunction of endothelial cells, the recruitment and activation of monocytes and macrophages, and the production of pro-inflammatory cytokines and chemokines in atherosclerosis. This study aimed to identify hub genes involved in the progression of atherosclerosis and to analyze their biological function in silico, thereby enhancing our understanding of the disease's molecular mechanisms. Through the analysis of microarray data, we examined gene expression in the media and neo-intima from plaques, as well as in distant macroscopically intact tissue, across a cohort of 32 hypertensive patients. Initially, 112 differentially expressed genes (DEGs) were identified. Subsequent immune infiltration analysis indicated a predominant presence of 27 immune cell types in the atherosclerosis group, particularly noting an increase in monocytes and macrophages. In the weighted gene co-expression network analysis (WGCNA), 10 modules with a minimum of 30 genes were defined as key modules, with the blue, dark olive green and sky-blue modules being the most significant. These modules corresponded, respectively, to monocyte, activated B cell, and activated CD4 T cell gene patterns, revealing a strong morphological-genetic correlation. From these three gene patterns (module morphologies), a total of 2509 key genes (gene significance > 0.2, module membership > 0.8) were extracted. Six hub genes (CD36, DPP4, HMOX1, PLA2G7, PLN2, and ACADL) were then identified by intersecting the 2509 key genes and 102 DEGs with lipid-related genes from the GeneCards database. The biological function of the six hub genes was assessed with a robust classifier achieving an area under the curve (AUC) of 0.873 in the ROC plot, indicating excellent efficacy in differentiating between the disease and control groups. Moreover, PCA visualization demonstrated a clear separation between the groups based on these six hub genes, suggesting their potential utility as classification features in predictive models. Protein-protein interaction (PPI) analysis highlighted DPP4 as the most interconnected gene. Within the constructed key gene-drug network, 462 drugs were predicted, with ursodeoxycholic acid (UDCA) identified as a potential therapeutic agent for modulating DPP4 expression. In summary, our study identified critical hub genes implicated in the progression of atherosclerosis through comprehensive bioinformatic analyses. These findings not only advance our understanding of the disease but also pave the way for applying similar analytical frameworks and predictive models to other diseases, thereby broadening the potential for clinical applications and therapeutic discoveries.

Keywords: atherosclerosis, hub genes, drug prediction, bioinformatics

Procedia PDF Downloads 51
574 Optimal Design Solution in "The Small Module" Within the Possibilities of Ecology, Environmental Science/Engineering, and Economics

Authors: Hassan Wajid

Abstract:

We recommend an environmentally friendly architectural proposal that is extremely common in appearance but whose features make it a sustainable space. In this experiment, the natural and artificial built space is proposed in a way that addresses environmental, ecological, and economic criteria under different climatic conditions. These ecology-environment-economics criteria are reflected in the different modules, which are experimented with and analyzed by multiple research groups. The ecological, environmental, and economic services are used as units of production side by side, resulting in local job creation and the saving of resources, for instance, the conservation of rainwater, soil formation or protection, lower energy consumption to achieve net zero, and a stable climate as a whole. The synthesized results from the collected data suggest several aspects to consider when beginning the building design process under the supervision of instructors/directors who are responsible for developing curricula and sustainability goals. Hence, the results of the research and the suggestions will benefit sustainable design through multiple results, heat analyses of the different small modules, and comparisons. Otherwise, resources are depleted as they are either consumed or contaminated by pollution.

Keywords: optimization, ecology, environment, sustainable solution

Procedia PDF Downloads 54
573 Experimental and Numerical Analysis of Built-In Thermoelectric Generator Modules with Elliptical Pin-Fin Heat Sink

Authors: J. Y Jang, C. Y. Tseng

Abstract:

A three-dimensional numerical model of thermoelectric generator (TEG) modules attached to a large chimney plate is proposed and solved numerically using a control volume based finite difference formulation. The TEG module consists of a thermoelectric generator, an elliptical pin-fin heat sink, and a cold plate for water cooling. In the chimney, the temperature of flue gases is 450-650K. Therefore, the effects of convection and radiation heat transfer are considered. Although the TEG hot-side temperature and thus the electric power output can be increased by inserting an elliptical pin-fin heat sink into the chimney tunnel to increase the heat transfer area, the pin fin heat sink would cause extra pumping power at the same time. The main purpose of this study is to analyze the effects of geometrical parameters on the electric power output and chimney pressure drop characteristics. In addition, the effects of different operating conditions, including various inlet velocities (Vin = 1, 3, 5 m/s) and inlet temperatures (Tgas = 450, 550, 650K) are discussed in detail. The predicted numerical data for the power vs. current (P-I) curve are in good agreement (within 11%) with the experimental data.

Keywords: thermoelectric generator, waste heat recovery, pin-fin heat sink, experimental and numerical analysis

Procedia PDF Downloads 371
572 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits

Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena

Abstract:

Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the oil yield and oil quality of the palm oil produced. This research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January to May 2010. The bunches were divided into three regions (top, middle and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16 and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content and oil quality were performed using several computer software programs (MSTAT-C, SAS and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB software. The results showed that the mean mesocarp oil percentage increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content, of 10.09%. The lowest kernel oil percentage, of 0.03%, was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid made up more than 73% of the total mesocarp fatty acids at 8 weeks after anthesis, increasing to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fit model.
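
A minimal sketch of fitting a logistic curve to mesocarp oil accumulation with SciPy; the 8- and 20-week values are taken from the means quoted above, while the 12- and 16-week values are assumed for illustration.

```python
# A minimal sketch of fitting a logistic model to mesocarp oil accumulation.
# The 12- and 16-week oil percentages below are assumed, not measured values.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Oil % as a function of weeks after anthesis."""
    return K / (1.0 + np.exp(-r * (t - t0)))

weeks = np.array([8.0, 12.0, 16.0, 20.0])
oil_pct = np.array([1.24, 8.0, 22.0, 29.6])   # 12- and 16-week values are assumed

params, _ = curve_fit(logistic, weeks, oil_pct, p0=[30.0, 0.5, 14.0], maxfev=10000)
K, r, t0 = params
pred = logistic(weeks, *params)
rmse = float(np.sqrt(np.mean((oil_pct - pred) ** 2)))
print(f"K = {K:.1f} %, r = {r:.2f} /week, t0 = {t0:.1f} weeks, RMSE = {rmse:.2f}")
```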

Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling

Procedia PDF Downloads 299