Search results for: panel data method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38580

36660 A Dynamic Panel Model to Evaluate the Impact of Debt Relief on Poverty

Authors: Loujaina Abdelwahed

Abstract:

Debt relief granted to low- and middle-income countries effectively provides additional funds that governments can use to increase public investment in poverty-reducing services, thereby alleviating poverty and boosting economic growth. However, little is known about the extent to which the poor benefit from the increased public investment. This study aims to assess the impact of debt relief granted through multiple initiatives during the 1990s on poverty reduction. In particular, it assesses the impact on the level, depth and severity of poverty in 76 low- and middle-income countries over the period 1990-2011. Debt relief is found to have a significant impact on reducing the level, the depth and the severity of poverty. Analysis of the different types of debt relief reveals that debt service relief reduces poverty, whereas debt principal relief does not have a significant impact.
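
As an illustration of the estimator family named in the keywords, the sketch below sets up a just-identified, Anderson-Hsiao/difference-GMM style regression for a dynamic panel in Python: the lagged dependent variable is instrumented with a deeper lag after first-differencing. The data and variable names are synthetic assumptions, not the study's poverty panel, and the extra moment conditions of the full system GMM estimator are omitted.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 76, 22                        # countries x years (1990-2011), as in the study
    y = rng.normal(size=(N, T))          # stand-in for the poverty measure (synthetic)
    x = rng.normal(size=(N, T))          # stand-in for debt relief received (synthetic)

    # Dynamic panel y_it = a*y_i,t-1 + b*x_it + u_i + e_it, first-differenced
    # to remove the country fixed effect u_i.
    dy   = (y[:, 2:] - y[:, 1:-1]).ravel()     # dy_it
    dy_l = (y[:, 1:-1] - y[:, :-2]).ravel()    # dy_i,t-1 (endogenous regressor)
    dx   = (x[:, 2:] - x[:, 1:-1]).ravel()     # dx_it
    z    = y[:, :-2].ravel()                   # y_i,t-2 instruments dy_i,t-1

    X = np.column_stack([dy_l, dx])
    Z = np.column_stack([z, dx])               # dx acts as its own instrument
    beta = np.linalg.solve(Z.T @ X, Z.T @ dy)  # just-identified IV/GMM estimate
    print("alpha_hat, beta_hat:", beta)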

Keywords: debt relief, developing countries, HIPC, poverty, system GMM estimator

Procedia PDF Downloads 398
36659 A NoSQL-Based Approach for Real-Time Management of Robotics Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data and the new data management solutions that have emerged to handle it: NoSQL databases. These systems have spread across several areas, such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. The number of such database management systems keeps increasing. They store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. Intelligent machines, for instance in the e-learning sector, thrive on more data, so they can learn more and faster. Robotics is the use case on which our tests focus. Implementing NoSQL for robotics helps wrestle all the data robots acquire into usable form, because with conventional approaches we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated by experimental studies and a running example used as a use case.
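
A minimal sketch of the kind of document-store workflow argued for above, using MongoDB through pymongo; the collection, field names and the local MongoDB instance are assumptions for illustration, not details taken from the paper.

    from datetime import datetime, timezone
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # assumed local instance
    db = client["robotics"]

    # Schemaless insert: each sensor reading is one document (hypothetical fields).
    db.telemetry.insert_one({
        "robot_id": "arm-01",
        "ts": datetime.now(timezone.utc),
        "joints": [0.12, -0.53, 1.04],
        "battery_v": 23.7,
    })

    # An index on (robot_id, ts) keeps "latest state" queries fast as data grows.
    db.telemetry.create_index([("robot_id", 1), ("ts", -1)])
    latest = db.telemetry.find_one({"robot_id": "arm-01"}, sort=[("ts", -1)])
    print(latest)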

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 355
36658 Analysis of Municipal Solid Waste Management in Nigeria

Authors: Anisa Gumel

Abstract:

This study examines the present condition of solid waste management in Nigeria. The author explores the challenges and opportunities affecting municipal solid waste management in Nigeria and determines the most profound challenges by analysing the interdependence and interrelationship among identified variables. In this study, multiple stakeholders, including 15 waste management professionals interviewed online, were utilised to identify the difficulties and opportunities affecting municipal solid waste in Nigeria. The interviews were transcribed and coded using NVivo to produce pertinent variables. An online survey of Nigerian internet and social media users was done to validate statements made by experts on the identified variables. In addition, a panel of five experts participated in a focus group discussion to discover the most influential factors affecting municipal solid waste management in Nigeria by analysing the interrelationships as well as the driving and dependence power of the variables. The results show significant factors affecting municipal solid waste in Nigeria, including inadequate funding, lack of knowledge, and absence of legislation, as well as behavioural, financial, technological, and legal concerns grouped into five categories. Some claims stated by experts in the interviews are supported by the survey data, while others are not. In addition, the focus group reveals patterns, correlations, and driving forces between the variables analysed. This study will provide decision-makers with a roadmap for resolving important waste management concerns in Nigeria and managing scarce resources effectively. It will also help non-governmental organisations combat malaria in Nigeria and other underdeveloped nations. In addition, the work contributes to the literature for future scholars to consult.

Keywords: municipal solid waste, stakeholders, public, experts

Procedia PDF Downloads 80
36657 The Effect of CPU Location in Total Immersion of Microelectronics

Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson

Abstract:

Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead. Thus, energy consumption can be reduced by improving the cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid is recognised as a promising medium that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger and full immersion of the microelectronics. This study quantifies the improvements in heat transfer specifically for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection for the fixed enclosure filled with dielectric liquid and forced convection for the water that is pumped through the water jacket. The model in this study is validated with published numerical and experimental work and shows good agreement with previous work. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink on the bottom of the microelectronics enclosure.
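
For orientation, the snippet below estimates buoyancy-driven heat transfer at a heated vertical wall immersed in a dielectric liquid with the textbook Churchill-Chu correlation. The property values are illustrative assumptions and this is not the validated CFD model used in the study.

    import math

    g, L = 9.81, 0.10                 # gravity [m/s^2], enclosure height [m] (assumed)
    dT = 30.0                         # CPU wall minus bulk liquid temperature [K] (assumed)
    beta, nu, alpha, k = 1.3e-3, 1.0e-6, 6.0e-8, 0.065   # assumed dielectric-liquid properties
    Pr = nu / alpha

    Ra = g * beta * dT * L**3 / (nu * alpha)
    # Churchill-Chu correlation for natural convection on a vertical surface.
    Nu = (0.825 + 0.387 * Ra**(1 / 6) /
          (1 + (0.492 / Pr)**(9 / 16))**(8 / 27))**2
    h = Nu * k / L                    # average convection coefficient [W/m^2K]
    print(f"Ra = {Ra:.2e}, Nu = {Nu:.1f}, h = {h:.0f} W/m^2K")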

Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures

Procedia PDF Downloads 272
36656 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby

Authors: Jazim Sohail, Filipe Teixeira-Dias

Abstract:

Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted looking into head kinematics in non-helmeted contact sports utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with the dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III Dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it will require preprocessing of live data, which is currently being done by cross-referencing data timestamps to video footage. The machine learning technique focuses on eliminating the preprocessing aspect by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals established from mouthguards are converted to the frequency domain before using a clustering algorithm to cluster together similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. The same Hybrid III Dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish accurate impact location of impact signals that have already been labeled as true impacts and filtered out of the entire time series. However, the machine learning technique provides a method that can be implemented with long time series signal data but will provide impact location within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors saving additional time for data scientists using instrumented mouthguard kinematic data as validating true impacts with video footage would not be required.
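
A minimal sketch of the unsupervised branch described above: window a multi-axis mouthguard signal, move each window to the frequency domain, and cluster the windows into candidate impact-location bins. The sampling rate, window length, number of bins and the random signal are assumptions for illustration, not the study's data.

    import numpy as np
    from sklearn.cluster import KMeans

    fs, win = 1000, 256                      # assumed sample rate [Hz] and window length
    signal = np.random.default_rng(1).normal(size=(6, fs * 60))   # 6-axis, 60 s (synthetic)

    # Split the multi-axis time series into windows and take magnitude spectra.
    n_win = signal.shape[1] // win
    windows = signal[:, :n_win * win].reshape(6, n_win, win)
    spectra = np.abs(np.fft.rfft(windows, axis=2))       # (6, n_win, win//2 + 1)
    features = spectra.transpose(1, 0, 2).reshape(n_win, -1)

    # Cluster windows; each cluster is read as one predefined head-location bin.
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
    print(np.bincount(labels))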

Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI

Procedia PDF Downloads 217
36655 Trajectory Tracking of a Redundant Hybrid Manipulator Using a Switching Control Method

Authors: Atilla Bayram

Abstract:

This paper presents the trajectory tracking control of a spatial redundant hybrid manipulator. This manipulator consists of two parallel manipulators, each of which is a variable geometry truss (VGT) module. In fact, each VGT module with 3 degrees of freedom (DOF) is a planar parallel manipulator, and the operational planes of these VGT modules are arranged to be orthogonal to each other. Also, the manipulator contains a twist motion part attached to the top of the second VGT module to supply the missing orientation of the end-effector. These three modules constitute a 7-DOF hybrid (parallel-parallel) redundant spatial manipulator in total. The forward kinematics equations of this manipulator are obtained; then, according to these equations, the inverse kinematics is solved based on an optimization with joint limit avoidance. The dynamic equations are formed by using the virtual work method. In order to test the performance of the redundant manipulator and the controllers presented, two different desired trajectories are followed by using the computed force control method and a switching control method. The switching control method combines the computed force control method with a genetic algorithm. In the switching control method, the genetic algorithm is only used for fine tuning in the compensation of the trajectory tracking errors.
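
The computed force/torque control law referred to above can be sketched generically as follows; the placeholder dynamic terms M, C and g and the gain values are assumptions, not the VGT manipulator's actual model.

    import numpy as np

    def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
        """tau = M(q)[qdd_des + Kd(qd_des - qd) + Kp(q_des - q)] + C(q,qd)qd + g(q)."""
        e, ed = q_des - q, qd_des - qd
        return M @ (qdd_des + Kd @ ed + Kp @ e) + C @ qd + g

    n = 7                                              # 7-DOF redundant manipulator
    Kp, Kd = 100.0 * np.eye(n), 20.0 * np.eye(n)       # assumed gains
    M, C, g = np.eye(n), np.zeros((n, n)), np.zeros(n) # placeholder dynamic terms
    tau = computed_torque(np.zeros(n), np.zeros(n),
                          0.1 * np.ones(n), np.zeros(n), np.zeros(n),
                          M, C, g, Kp, Kd)
    print(tau)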

Keywords: computed force method, genetic algorithm, hybrid manipulator, inverse kinematics of redundant manipulators, variable geometry truss

Procedia PDF Downloads 347
36654 Predictive Modelling of Aircraft Component Replacement Using Imbalanced Learning and Ensemble Method

Authors: Dangut Maren David, Skaf Zakwan

Abstract:

Adequate monitoring of vehicle components in order to obtain high uptime is the goal of predictive maintenance; the major challenge faced by businesses in industry is the significant cost associated with a delay in service delivery due to system downtime. Most of these businesses are interested in predicting such problems and proactively preventing them before they occur, which is the core advantage of Prognostic Health Management (PHM) applications. The recent emergence of Industry 4.0, or the Industrial Internet of Things (IIoT), has led to the need for monitoring system activities and enhancing system-to-system or component-to-component interactions, which has resulted in the generation of large volumes of data known as big data. Analysis of big data is increasingly important; however, due to complexities inherent in the datasets, such as imbalanced classes, it becomes extremely difficult to build a model with high precision. Data-driven predictive modeling for condition-based maintenance (CBM) has recently drawn research interest, with growing attention from both academia and industry. The large data generated from industrial processes inherently come with different degrees of complexity, which poses a challenge for analytics. Thus, the imbalanced classification problem is pervasive in industrial datasets and can affect the performance of learning algorithms, yielding poor classifier accuracy in model development. Misclassification of faults can result in unplanned breakdowns leading to economic loss. In this paper, an advanced approach for handling the imbalanced classification problem is proposed, and a prognostic model for predicting aircraft component replacement in advance is developed by exploring aircraft historical data. The approach is based on a hybrid ensemble-based method which improves the prediction of the minority class during learning; we also investigate the impact of our approach on the multiclass imbalance problem. We validate the feasibility and effectiveness of our approach, in terms of performance, using real-world aircraft operation and maintenance datasets spanning over 7 years. Our approach shows better performance compared to other similar approaches. We also validate its strength for handling multiclass imbalanced datasets, where our results also show good performance compared to other baseline classifiers.
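
One generic way to realise an ensemble that boosts the minority class is sketched below with imbalanced-learn's balanced random forest; the synthetic data stand in for the aircraft maintenance records, and this names a standard technique rather than the exact hybrid ensemble developed in the paper.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report
    from imblearn.ensemble import BalancedRandomForestClassifier

    # Synthetic stand-in: ~2% of records are "component replaced within horizon".
    X, y = make_classification(n_samples=20000, n_features=30,
                               weights=[0.98, 0.02], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    # Each tree is trained on a bootstrap sample rebalanced toward the minority class.
    clf = BalancedRandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te), digits=3))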

Keywords: prognostics, data-driven, imbalance classification, deep learning

Procedia PDF Downloads 174
36653 Nursing Documentation of Patients' Information at Selected Primary Health Care Facilities in Limpopo Province, South Africa: Implications for Professional Practice

Authors: Maria Sonto Maputle, Rhulani C. Shihundla, Rachel T. Lebese

Abstract:

Background: Patients’ information must be complete and accurately documented in order to foster quality and continuity of care. The multidisciplinary health care members use patients’ documentation to communicate about health status, preventive health services, treatment, planning and delivery of care. The purpose of this study was to determine the practice of nursing documentation of patients’ information at selected Primary Health Care (PHC) facilities in Vhembe District, Limpopo Province, South Africa. Methods: The research approach adopted was qualitative, and an exploratory and descriptive design was used. The study was conducted at selected PHC facilities. The population included twelve professional nurses. A non-probability purposive sampling method was used to sample professional nurses who were willing to participate in the study. The criteria included participants whose daily work and activities involved creating, keeping and updating nursing documentation of patients’ information. Qualitative data collection was through unstructured in-depth interviews until no new information emerged. Data were analysed through open coding using Tesch’s eight-step method. Results: Following data analysis, it was found that professional nurses had a knowledge deficit related to insufficient training on updates, and that rendering multiple services daily had a negative impact on accurate documentation of patients’ information. Conclusion: The study recommended the standardization of the registers, books and forms used at PHC facilities, and the reorganization of PHC services into an open-day system.

Keywords: documentation, knowledge, patient care, patient’s information, training

Procedia PDF Downloads 190
36652 Visco-Acoustic Full Wave Inversion in the Frequency Domain with Mixed Grids

Authors: Sheryl Avendaño, Miguel Ospina, Hebert Montegranario

Abstract:

Full Wave Inversion (FWI) is a variant of seismic tomography for obtaining velocity profiles by an optimization process that combines forward modelling (the solution of the wave equation) with the misfit between synthetic and observed data. In this research we model wave propagation in a visco-acoustic medium in the frequency domain. We apply finite differences for the numerical solution of the wave equation with a mix between usual and rotated grids, where density depends on velocity and there exists a damping function associated with a linear dissipative medium. The velocity profiles are obtained from an initial one, and the data have been modeled for a frequency range of 0-120 Hz. By an iterative procedure we obtain an estimated velocity profile that reproduces the remarkable features of the profile from which the synthetic data were generated, showing promising results for our method.
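
A toy illustration of the FWI loop described above: a forward model maps a velocity to synthetic frequency-domain data, and the velocity is updated to reduce the least-squares misfit. The single-plane-wave forward model, the offset and the low-frequency band are assumptions kept deliberately simple; the paper's visco-acoustic mixed-grid solver is far richer.

    import numpy as np

    freqs = np.linspace(1.0, 10.0, 60)       # low band first, to limit cycle skipping
    offset = 100.0                           # assumed source-receiver offset [m]

    def forward(v):
        # Plane-wave forward model: phase delay of offset/v at each frequency.
        return np.exp(-2j * np.pi * freqs * offset / v)

    v_true, v = 2500.0, 2200.0               # "observed" vs. starting velocity
    d_obs = forward(v_true)

    for _ in range(200):
        d_syn = forward(v)
        residual = d_syn - d_obs
        # Gradient of the misfit 0.5*||d_syn - d_obs||^2 with respect to v.
        dd_dv = d_syn * (2j * np.pi * freqs * offset / v**2)
        grad = np.real(np.sum(np.conj(residual) * dd_dv))
        v -= 1.0e4 * grad                    # simple gradient-descent update
    print(f"recovered velocity ~ {v:.0f} m/s (true: {v_true:.0f})")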

Keywords: seismic inversion, full wave inversion, visco-acoustic wave equation, finite difference methods

Procedia PDF Downloads 461
36651 Non-Invasive Imaging of Tissue Using Near Infrared Radiations

Authors: Ashwani Kumar Aggarwal

Abstract:

NIR light is non-ionizing and can pass easily through living tissues, such as the breast, without any harmful effects. Therefore, the use of NIR light to image biological tissue and quantify its optical properties is a good choice over other, invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of transmitted light through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, such as filtered back projection and algebraic reconstruction methods. These methods cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid algorithm for reconstruction has been implemented in this work which takes into account the highly scattered path taken by photons while back projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function. This blurred reconstructed image has been enhanced using a digital filter which is optimal in the mean square sense.
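
A minimal sketch of the final deblurring step: a Wiener filter (optimal in the mean-square sense) applied to an image blurred by a known point spread function. The phantom image and Gaussian PSF below are assumptions for illustration, not the reconstructed tissue images of the study.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.restoration import wiener

    rng = np.random.default_rng(0)
    true_img = np.zeros((64, 64)); true_img[20:40, 25:45] = 1.0   # simple phantom

    # Approximate the point spread function with a small Gaussian kernel (assumed).
    psf = np.zeros((9, 9)); psf[4, 4] = 1.0
    psf = gaussian_filter(psf, sigma=1.5); psf /= psf.sum()

    blurred = gaussian_filter(true_img, sigma=1.5) + 0.01 * rng.normal(size=true_img.shape)
    restored = wiener(blurred, psf, balance=0.1)    # mean-square-optimal deconvolution
    print(restored.shape, float(restored.min()), float(restored.max()))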

Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering

Procedia PDF Downloads 316
36650 Petra: Simplified, Scalable Verification Using an Object-Oriented, Compositional Process Calculus

Authors: Aran Hakki, Corina Cirstea, Julian Rathke

Abstract:

Formal methods are yet to be utilized in mainstream software development due to issues in scaling and implementation costs. This work is about developing a scalable, simplified, pragmatic, formal software development method with strong correctness properties and guarantees that are easy to prove. The method aims to be easy to learn, use and apply without extensive training and experience in formal methods. Petra is proposed as an object-oriented process calculus with composable data types and sequential/parallel processes. Petra has a simple denotational semantics, which includes a definition of Correct by Construction. The aim is for Petra to be a standard that can be implemented to execute on various mainstream programming platforms such as Java. Work towards an implementation of Petra as a Java EDSL (Embedded Domain Specific Language) is also discussed.

Keywords: compositionality, formal method, software verification, Java, denotational semantics, rewriting systems, rewriting semantics, parallel processing, object-oriented programming, OOP, programming language, correct by construction

Procedia PDF Downloads 145
36649 The Development of the Website Learning the Local Wisdom in Phra Nakhon Si Ayutthaya Province

Authors: Bunthida Chunngam, Thanyanan Worasesthaphong

Abstract:

The objectives of this research were to develop a website for learning the local wisdom of Phra Nakhon Si Ayutthaya province and to study the satisfaction of system users. The research sample was a multistage sample of 100 questionnaires; the data were analyzed and the reliability value was calculated with Cronbach’s alpha coefficient method (α=0.82). The system was evaluated on three aspects: system usability, system features and system accuracy. The statistics used for data analysis were descriptive statistics, namely frequency, percentage, mean and standard deviation, to describe the sample features. The analysis found that satisfaction with the quality of system usability was at a good level (mean 4.44), satisfaction with the system features was at a good level (mean 4.11) and satisfaction with system accuracy was at a good level (mean 3.74).
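
For reference, a Cronbach's alpha of the kind reported above (α=0.82) is computed from questionnaire item scores as sketched below; the response matrix is simulated, with one common factor driving the items.

    import numpy as np

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(100, 1))                   # common "satisfaction" factor (synthetic)
    scores = latent + 0.8 * rng.normal(size=(100, 10))   # 100 respondents x 10 items

    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()          # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)           # variance of the total score
    alpha = k / (k - 1) * (1 - item_var / total_var)     # Cronbach's alpha
    print(f"Cronbach's alpha = {alpha:.2f}")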

Keywords: website, learning, local wisdom, Phra Nakhon Si Ayutthaya province

Procedia PDF Downloads 120
36648 Solutions of Fuzzy Transportation Problem Using Best Candidates Method and Different Ranking Techniques

Authors: M. S. Annie Christi

Abstract:

The Transportation Problem (TP) is based on the supply and demand of commodities transported from one source to different destinations. Usual methods for finding solutions of TPs are the North-West Corner Rule, the Least Cost Method, Vogel’s Approximation Method, etc. The transportation costs tend to vary over time. We can use fuzzy numbers, which give solutions that account for this situation. In this study the Best Candidate Method (BCM) is applied. For ranking, the Centroid Ranking Technique (CRT) and the Robust Ranking Technique have been adopted to transform the fuzzy TP, and the above methods are applied to the EDWARDS Vacuum Company, Crawley, West Sussex, in the United Kingdom. A comparative study is also given. We see that the transportation cost can be minimized by the application of CRT under BCM.
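
A minimal sketch of the two-step idea: defuzzify triangular fuzzy costs with a centroid ranking, (a+b+c)/3, then solve the resulting crisp transportation problem as a linear program. The costs, supplies and demands are illustrative, not the EDWARDS case-study data, and the LP solver stands in for the Best Candidate Method.

    import numpy as np
    from scipy.optimize import linprog

    fuzzy_costs = np.array([       # triangular fuzzy unit costs (a, b, c), assumed values
        [[1, 2, 3], [3, 4, 5], [6, 7, 8]],
        [[2, 3, 4], [4, 5, 6], [1, 2, 3]],
    ])
    crisp = fuzzy_costs.mean(axis=2)            # centroid ranking of each cell

    supply, demand = np.array([30, 50]), np.array([20, 35, 25])
    m, n = crisp.shape
    A_eq, b_eq = [], []
    for i in range(m):                          # each source ships out its supply
        row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
        A_eq.append(row); b_eq.append(supply[i])
    for j in range(n):                          # each destination receives its demand
        col = np.zeros(m * n); col[j::n] = 1
        A_eq.append(col); b_eq.append(demand[j])

    res = linprog(crisp.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * (m * n), method="highs")
    print(res.x.reshape(m, n), res.fun)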

Keywords: best candidate method, centroid ranking technique, fuzzy transportation problem, robust ranking technique, transportation problem

Procedia PDF Downloads 294
36647 Terrestrial Laser Scans to Assess Aerial LiDAR Data

Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani

Abstract:

The quality of DEMs may depend on several factors, such as the data source, the capture method, the type of processing used to derive them, or the cell size of the DEM. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the superficial nature of the DEM and whose sampling is therefore superficial and not punctual. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", in which it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the postprocessing tasks carried out to obtain the point cloud that will be used as the reference (PCref) to evaluate the quality of the PCpro. Each PCref consists of a 50x50 m patch obtained from the registration of 4 different scan stations. The area studied was the Spanish region of Navarra, which covers 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the scanner was mounted in an inverted position to avoid the characteristic shadow circle that occurs when the scanner is in the direct position. To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positioning accuracy was better than 4 cm; this is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The kind of DEM of interest is the one corresponding to the bare earth, so it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the postprocessing tasks the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process.
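
A minimal sketch of a cloud-to-cloud comparison of the kind mentioned at the end of the abstract: for each point of the evaluated cloud (PCpro), find the nearest reference point (PCref) and summarise the distances. The synthetic clouds below stand in for a 50x50 m patch; the real workflow uses the registered, filtered scans.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    pc_ref = rng.uniform([0, 0, 0], [50, 50, 2], size=(200_000, 3))    # TLS patch (synthetic)
    pc_pro = pc_ref[::10] + rng.normal(scale=0.05, size=(20_000, 3))   # airborne LiDAR (synthetic)

    # Nearest-neighbour (cloud-to-cloud) distances from PCpro to PCref.
    dist, _ = cKDTree(pc_ref).query(pc_pro, k=1)
    print(f"mean C2C distance: {dist.mean():.3f} m, RMS: {np.sqrt((dist**2).mean()):.3f} m")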

Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy

Procedia PDF Downloads 100
36646 Use the Null Space to Create Starting Point for Stochastic Programming

Authors: Ghussoun Al-Jeiroudi

Abstract:

Stochastic programming is one of the powerful techniques used to solve real-life problems, whose data are subject to significant uncertainty. Uncertainty is well studied and modeled by stochastic programming. Each day, problems become bigger and bigger, and the need for a tool that deals with large-scale problems increases. The interior point method is a perfect tool to solve such problems and is widely employed to solve the programs that arise from stochastic programming. It is an iterative technique, so it requires a starting point. A well-designed starting point plays an important role in improving the convergence speed. In this paper, we propose a starting point for the interior point method for multistage stochastic programming. Usually, the optimal solution of stage k+1 is used as the starting point for stage k. This point has the advantage of being close to the solution of the current program. However, it has a disadvantage: it is not in the feasible region of the current program. We therefore suggest taking this point and modifying it by adding to it a vector in the null space of the matrix of the unchanged constraints, because the solution will change only in the null space of this matrix.
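
The core idea can be sketched as follows: shift the previous stage's optimal point by a vector lying in the null space of the unchanged constraint matrix, so the unchanged equalities Ax = b stay satisfied. The matrices and the particular shift below are illustrative assumptions; the actual method would choose the shift to suit the new program.

    import numpy as np
    from scipy.linalg import null_space

    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 6))          # unchanged equality constraints A x = b (assumed)
    x_prev = rng.uniform(1, 2, size=6)   # optimal point of the previous stage (assumed)
    b = A @ x_prev

    N = null_space(A)                    # basis of {z : A z = 0}, here of shape (6, 3)
    shift = N @ rng.normal(size=N.shape[1])   # any combination keeps A x = b
    x_start = x_prev + 0.1 * shift

    print("residual of unchanged constraints:", np.linalg.norm(A @ x_start - b))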

Keywords: interior point methods, stochastic programming, null space, starting points

Procedia PDF Downloads 418
36645 Numerical Simulation of High Strength Steel Hot-Finished Elliptical Hollow Section Subjected to Uniaxial Eccentric Compression

Authors: Zhengyi Kong, Xueqing Wang, Quang-Viet Vu

Abstract:

In this study, the structural behavior of high strength steel (HSS) hot-finished elliptical hollow section (EHS) subjected to uniaxial eccentric compression is investigated. A finite element method for predicting the cross-section resistance of HSS hot-finished EHS is developed using ABAQUS software, which is then verified by comparison with previous experiments. The validated finite element method is employed to carry out parametric studies for investigating the structural behavior of HSS hot-finished EHS under uniaxial eccentric compression and to evaluate the current design guidance for HSS hot-finished EHS. Different parameters, such as the larger and smaller outer radii of the EHS, the thickness of the EHS, the eccentricity, and the material properties, are considered. The resulting data from 84 finite element models are used to obtain the relationship between the cross-section resistance of HSS hot-finished EHS and the cross-section slenderness. It is concluded that current design provisions, such as EN 1993-1-1, BS 5950-1, AS4100, and Gardner et al., are conservative for predicting the resistance of HSS hot-finished EHS under uniaxial eccentric compression.

Keywords: hot-finished, elliptical hollow section, uniaxial eccentric compression, finite element method

Procedia PDF Downloads 138
36644 Efficient Antenna Array Beamforming with Robustness against Random Steering Mismatch

Authors: Ju-Hong Lee, Ching-Wei Liao, Kun-Che Lee

Abstract:

This paper deals with the problem of using antenna sensors for adaptive beamforming in the presence of random steering mismatch. We present an efficient adaptive array beamformer with robustness to deal with the considered problem. The robustness of the proposed beamformer comes from the efficient designation of the steering vector. Using the received array data vector, we construct an appropriate correlation matrix associated with the received array data vector and a correlation matrix associated with signal sources. Then, the eigenvector associated with the largest eigenvalue of the constructed signal correlation matrix is designated as an appropriate estimate of the steering vector. Finally, the adaptive weight vector required for adaptive beamforming is obtained by using the estimated steering vector and the constructed correlation matrix of the array data vector. Simulation results confirm the effectiveness of the proposed method.
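
A minimal sketch of the estimation chain described above: build the array correlation matrix from snapshots, take the principal eigenvector of a signal-correlation estimate as the steering vector, and form MVDR-type weights w = R^-1 a / (a^H R^-1 a). The 8-sensor, single-source scenario and the crude noise-subtraction step are assumptions for illustration, not the paper's exact construction.

    import numpy as np

    rng = np.random.default_rng(0)
    M, K = 8, 500                                     # sensors, snapshots (assumed)
    a_true = np.exp(1j * np.pi * np.arange(M) * np.sin(np.deg2rad(12)))
    s = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)
    noise = 0.1 * (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K)))
    x = np.outer(a_true, s) + noise                   # received array data

    R = x @ x.conj().T / K                            # data correlation matrix
    Rs = R - 0.01 * np.eye(M)                         # crude signal-correlation estimate (assumed noise level)
    eigval, eigvec = np.linalg.eigh(Rs)
    a_hat = np.sqrt(M) * eigvec[:, -1]                # principal eigenvector as steering vector

    w = np.linalg.solve(R, a_hat)
    w /= a_hat.conj() @ w                             # MVDR normalisation
    print("response toward the true steering vector:", abs(w.conj() @ a_true))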

Keywords: adaptive beamforming, antenna array, linearly constrained minimum variance, robustness, steering vector

Procedia PDF Downloads 199
36643 Behaviour of Reinforced Concrete Infilled Frames under Seismic Loads

Authors: W. Badla

Abstract:

A significant portion of the buildings constructed in Algeria consists of structural frames with infill panels, which are usually considered as non-structural components and are neglected in the analysis. However, these masonry panels tend to influence the structural response. Thus, these structures can be regarded as seismic-risk buildings, although the Algerian seismic code gives little guidance on the seismic evaluation of infilled frame buildings. In this study, three RC frames with 2, 4, and 8 stories, subjected to three recorded Algerian accelerograms, are studied. The diagonal strut approach is adopted for modeling the infill panels and a fiber model is used to model RC members. This paper reports on the seismic evaluation of RC frames with brick infill panels. The results obtained show that the masonry panels enhance the lateral load capacity of the buildings and that the infill panel configuration influences the response of the structures.
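
For context, one common way to size the equivalent diagonal strut for an infill panel is the Mainstone-type expression used in documents such as FEMA 356, a = 0.175*(lambda1*h_col)**-0.4*r_inf; the numerical values below are illustrative assumptions and this is not necessarily the strut model calibrated in the study.

    import math

    E_frame, E_inf = 25_000.0, 4_000.0              # concrete frame / masonry moduli [MPa] (assumed)
    t_inf, h_inf, L_inf = 200.0, 2800.0, 4000.0     # panel thickness/height/length [mm] (assumed)
    h_col, I_col = 3000.0, 300.0 * 300.0**3 / 12.0  # column height [mm] and inertia [mm^4] (assumed)

    theta = math.atan(h_inf / L_inf)                # strut inclination
    lam1 = (E_inf * t_inf * math.sin(2 * theta) /
            (4.0 * E_frame * I_col * h_inf)) ** 0.25
    r_inf = math.hypot(h_inf, L_inf)                # panel diagonal length
    a = 0.175 * (lam1 * h_col) ** -0.4 * r_inf      # equivalent strut width
    print(f"equivalent strut width a = {a:.0f} mm (diagonal length {r_inf:.0f} mm)")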

Keywords: seismic design, RC frames, infill panels, nonlinear dynamic analysis

Procedia PDF Downloads 546
36642 Influence of Socio-Economic Factors on Crime Perpetuation Among Inmates of Correctional Facilities in South-West Nigeria

Authors: Ebenezer Bayode Agboola

Abstract:

The study investigated the influence of socioeconomic factors on crime perpetuation among inmates of correctional facilities in South West Nigeria. A sample size of two hundred and forty-four inmates was drawn from the Ado, Akure and Ilesha correctional facilities. The sample consisted of both male and female inmates. Individual inmates were drawn through systematic sampling with the use of the inmates’ register at the correctional facilities. The study employed a mixed design, which allowed a blend of both quantitative and qualitative methods. For the quantitative method, data were collected through the use of a questionnaire, and for the qualitative method, data were collected with the aid of in-depth interviews (IDI). Four research questions were raised for the study and analysed descriptively using simple frequency counts and percentages. Five research hypotheses were formulated for the study and tested using Analysis of Variance (ANOVA) and Multiple Regression. Based on the data analysis, findings revealed that there was a significant relationship between family history and the perpetuation of crime among inmates. Though no significant relationship was found between employment and the perpetuation of crime, the rate of crime perpetuation by individuals was found to be significantly related to peer pressure. Also, the study further found that there was a significant relationship between substance use and the perpetuation of crime. Lastly, it was found that there was a significant relationship between family history, employment, and peer pressure. The study recommended that parents should pay adequate attention to their children, especially during the adolescent stage, and that the government should enact relevant laws that will checkmate the rising involvement of young people in cybercrime or internet fraud.

Keywords: crime, socio-economic factors, inmates, correctional facilities, South-West Nigeria

Procedia PDF Downloads 88
36641 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many aspects. The automatic lane lines extraction and modeling are the most essential steps for the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, the multi-region Otsu thresholding method is applied, which calculates the intensity value of laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected to the raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangle. To ensure the storage efficiency of the map, the lane lines are approximated to cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested on urban and expressway conditions in Hefei, China. The experimental results on the datasets show that our method can achieve excellent extraction and clustering effect, and the fitted lines can reach a high position accuracy with an error of less than 10 cm.
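
A minimal sketch of the extraction chain: threshold laser intensity to keep the bright road markings, cluster the retained points, and fit a cubic polynomial to each cluster. Single-region Otsu thresholding and DBSCAN stand in for the paper's multi-region thresholding and two-stage clustering, and the points below are synthetic.

    import numpy as np
    from skimage.filters import threshold_otsu
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    x = np.linspace(0, 50, 400)
    lane = np.column_stack([x, 0.002 * x**2 + 0.05 * rng.normal(size=x.size)])  # bright marking
    ground = rng.uniform([0, -5], [50, 5], size=(2000, 2))                       # asphalt returns
    pts = np.vstack([lane, ground])
    intensity = np.r_[rng.normal(200, 10, lane.shape[0]),     # paint is high-intensity
                      rng.normal(60, 15, ground.shape[0])]    # asphalt is low-intensity

    marking_pts = pts[intensity > threshold_otsu(intensity)]  # intensity thresholding
    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(marking_pts)

    for lab in set(labels) - {-1}:
        cluster = marking_pts[labels == lab]
        coeffs = np.polyfit(cluster[:, 0], cluster[:, 1], deg=3)   # cubic lane-line model
        print(f"cluster {lab}: {len(cluster)} pts, cubic coeffs {np.round(coeffs, 4)}")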

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 128
36640 Terrorist Financing through Illegal Fintech Hacking: Case Study of Rizki Gunawan

Authors: Ishna Indika Jusi, Rifana Meika

Abstract:

Terrorism financing methods in Indonesia are developing at an alarming rate, to the point that they are now more complex than before. Terrorists traditionally use conventional methods like robberies, charities, and courier services to fund their activities; today terrorists are able to utilize modern methods in financing their activities due to the rapid development of financial technology, one example being the hacking of an illegal fintech company. Therefore, this research is conducted in order to explain and analyze the considerations behind the use of an illegal fintech company to finance terrorism activities and how to prevent it. The analysis in this research uses the theory coined by Michael Freeman about the reasoning of terrorists when choosing their financing method. The method used in this research is a case study, and the case used is the terrorism financing hacking of speedline.com in 2011 by Rizki Gunawan. Research data are acquired from interviews with the perpetrators, experts from INTRAC (PPATK), Special Detachment 88, reports, and journals that are relevant to the research. As a result, this study found that the priority aspects in terms of terrorist financing are security, quantity, and simplicity when obtaining funds.

Keywords: Fintech, illegal, Indonesia, technology, terrorism financing

Procedia PDF Downloads 170
36639 An Analytical Method for Solving General Riccati Equation

Authors: Y. Pala, M. O. Ertas

Abstract:

In this paper, the general Riccati equation is analytically solved by a new transformation. By the method developed, looking at the transformed equation, whether or not an explicit solution can be obtained is readily determined. Since the present method does not require a proper solution for the general solution, it is especially suitable for equations whose proper solutions cannot be seen at first glance. Since the transformed second order linear equation obtained by the present transformation has the simplest form that it can have, it is immediately seen whether or not the original equation can be solved analytically. The present method is exemplified by several examples.
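
For reference, the classical reduction (not necessarily the new transformation proposed in this paper) substitutes y = -u'/(q2 u) and turns the Riccati equation into a second-order linear ODE:

    y' = q_0(x) + q_1(x)\, y + q_2(x)\, y^2, \qquad y = -\frac{u'}{q_2 u}
    \;\Longrightarrow\; u'' - \left( q_1 + \frac{q_2'}{q_2} \right) u' + q_0\, q_2\, u = 0 .

Whether the resulting linear equation admits a closed-form solution is then immediately visible, which is the spirit of the criterion described above.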

Keywords: Riccati equation, analytical solution, proper solution, nonlinear

Procedia PDF Downloads 354
36638 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date

Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian

Abstract:

To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) for each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor’s judgment. This can be difficult for cases which are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would be able to facilitate the doctors to make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients, at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best performing model was found to be able to predict EDD with an accuracy improvement in Average Squared Error (ASE) by -38% as compared to the first EDD determined by the present method. It was found that important predictors of the EDD include the provisional diagnosis code, patient’s age, attending doctor at admission, medical specialty at admission, accommodation type, and the mean length of stay of the patient in the past year. The predictive model can be used as a tool to accurately predict the EDD.

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven

Procedia PDF Downloads 174
36637 The Guaranteed Detection of the Seismoacoustic Emission Source in the C-OTDR Systems

Authors: Andrey V. Timofeev

Abstract:

A method is proposed for stable detection of seismoacoustic sources in C-OTDR systems that guarantees given upper bounds on the probabilities of type I and type II errors. Properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented in this paper.

Keywords: guaranteed detection, C-OTDR systems, change point, interval estimation

Procedia PDF Downloads 256
36636 Improved Image Retrieval for Efficient Localization in Urban Areas Using Location Uncertainty Data

Authors: Mahdi Salarian, Xi Xu, Rashid Ansari

Abstract:

Accurate localization of mobile devices based on camera-acquired visual media information usually requires a search over a very large GPS-referenced image database. This paper proposes an efficient method for limiting the search space of the image retrieval engine by extracting and leveraging additional media information about the Estimated Positional Error (EPE) to address complexity and accuracy issues in the search, especially for compensating GPS location inaccuracy in dense urban areas. The improved performance is achieved by up to a hundred-fold reduction in the search area used in available reference methods while providing improved accuracy. To test our procedure we created a database by acquiring Google Street View (GSV) images for downtown Chicago. Other available databases are not suitable for our approach due to the lack of EPE for the query images. We tested the procedure using more than 200 query images, along with their EPE, acquired mostly in the densest areas of Chicago with different phones and under different conditions such as low illumination and from under rail tracks. The effectiveness of our approach and the effect of the size and sector angle of the search area are discussed, and experimental results demonstrate how our proposed method can improve performance just by utilizing data that are readily available on mobile systems such as smartphones.
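
A minimal sketch of the search-space pruning step: keep only database images whose GPS position lies within a radius derived from the query's EPE before running the more expensive bag-of-words retrieval. The coordinates and the radius rule below are illustrative assumptions, not the paper's exact parameters.

    import numpy as np

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in metres between (lat1, lon1) and arrays (lat2, lon2).
        R = 6_371_000.0
        p1, p2 = np.radians(lat1), np.radians(lat2)
        dp, dl = p2 - p1, np.radians(lon2 - lon1)
        a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
        return 2 * R * np.arcsin(np.sqrt(a))

    rng = np.random.default_rng(0)
    db_lat = 41.8781 + 0.01 * rng.normal(size=10_000)   # GSV reference positions (synthetic)
    db_lon = -87.6298 + 0.01 * rng.normal(size=10_000)

    query_lat, query_lon, epe = 41.8790, -87.6300, 35.0  # query GPS fix and EPE [m] (assumed)
    radius = 2.0 * epe                                   # assumed rule: search radius from EPE
    candidates = np.where(haversine_m(query_lat, query_lon, db_lat, db_lon) <= radius)[0]
    print(f"{len(candidates)} of 10000 images kept for BoW matching")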

Keywords: localization, retrieval, GPS uncertainty, bag of words

Procedia PDF Downloads 283
36635 Spiritual Health in View of Islamic Mysticism

Authors: Maryam Bakhtyar-Pegah Akrami

Abstract:

The relationship between spiritual health and spirituality is an important topic, and in recent years research on it has been expanding; given the rich heritage of Islam in this field, further study and research in this important area can lead to a more spiritual and healthier life than before. In this research, we present the foundations of Islamic mysticism in the realm of spiritual health. The research is based on a descriptive method, combined with an analytical method applied to the data collected. The findings show that, because Islam addresses this pivotal topic and the mental and physical education through which human beings reach their complete station, presenting these foundations alongside new discussions of spiritual health can help guide human beings towards spiritual education and, together with faith, be extremely helpful in reconstructing the spiritual foundations of spiritual health.

Keywords: spirituality, health, Islam, mysticism, perfect human

Procedia PDF Downloads 161
36634 Sensory Evaluation of Meat from Broiler Birds Fed Detoxified Jatropha Curcas and Those Fed Conventional Feed

Authors: W. S. Lawal, T. A. Akande

Abstract:

Four (4) different methods were employed to detoxify Jatropha curcas: a physical method (which includes soaking and drying), a chemical method (use of methylated spirit, hexane and methene), a biological method (use of Aspergillus niger and Sunday for 7 days and then baccillus lichifarming), and finally a combined method (a combination of all these methods). Phorbol ester analysis was carried out after the detoxification, and the combined method was found to be better (P>0.05). 100 broiler birds were used to further test the effect of Jatropha detoxified by the combined method: 50 birds were fed Jatropha-based feed at 10 birds per treatment, replicated five times, and this was repeated for another 50 birds fed conventional feed. The Jatropha-based feed was compounded at an 8% inclusion level. At the end of the 8th week, 8 birds were sacrificed from each treatment; one bird each from the conventional and Jatropha-fed groups was fried, roasted, boiled and grilled, and panelists were served the meat for evaluation. It was found that feeding Jatropha to poultry birds has no effect on the taste of the meat.

Keywords: phorbol ester, inclusion level, tolerance level, Jatropha curcas

Procedia PDF Downloads 425
36633 Nonlinear Heat Transfer in a Spiral Fin with a Period Base Temperature

Authors: Kuo-Teng Tsai, You-Min Huang

Abstract:

In this study, the problem of a spiral fin with a periodic base temperature is analyzed by using the Adomian decomposition method. The Adomian decomposition method is a useful and practical method for solving the nonlinear energy equation associated with heat radiation. The periodic base temperature oscillates around a mean value. The results, including the temperature distribution and the heat flux from the spiral fin base, can be calculated directly. The effects of the dimensionless variables on the temperature variations and on the total energy transferred from the spiral fin base are also discussed.
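
To make the machinery concrete, the sketch below applies the Adomian decomposition to the textbook problem u' = -u^2, u(0) = 1 (exact solution 1/(1+t)); the fin energy equation treated in the paper is more involved, but the Adomian-polynomial recursion has the same structure.

    import sympy as sp

    t, s, lam = sp.symbols("t s lambda")
    N = lambda u: u**2                     # the nonlinear term in u' = -N(u)

    u = [sp.Integer(1)]                    # u0 from the initial condition u(0) = 1
    for n in range(5):
        # Adomian polynomial A_n = (1/n!) d^n/dlam^n N(sum lam^k u_k) at lam = 0
        series = sum(lam**k * u[k] for k in range(len(u)))
        A_n = sp.diff(N(series), lam, n).subs(lam, 0) / sp.factorial(n)
        # Next component from the integral form of the equation.
        u.append(sp.integrate(-A_n.subs(t, s), (s, 0, t)))

    print(sp.expand(sum(u)))               # 1 - t + t**2 - t**3 + ..., the series of 1/(1+t)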

Keywords: spiral fin, period, Adomian decomposition method, nonlinear

Procedia PDF Downloads 527
36632 Energy Efficiency Factors in Toll Plazas

Authors: S. Balubaid, M. Z. Abd Majid, R. Zakaria

Abstract:

Energy efficiency is one of the most important issues for green buildings and their sustainability. This is not only due to the environmental impacts, but also because of significantly high energy costs. The aim of this study is to identify the potential actions at toll plazas that lead to energy reduction. The data were obtained through a set of questionnaires and interviews with targeted respondents, including employees at toll plazas and architects and engineers who are directly involved in the design of highway projects. The data were analyzed using descriptive statistics. The findings of this study are the critical elements that influence energy usage and the factors that lead to energy wastage. Finally, potential actions are recommended to reduce energy consumption in toll plazas.

Keywords: energy efficiency, toll plaza, energy consumption

Procedia PDF Downloads 547
36631 Designing Price Stability Model of Red Cayenne Pepper Price in Wonogiri District, Centre Java, Using ARCH/GARCH Method

Authors: Fauzia Dianawati, Riska W. Purnomo

Abstract:

The food and agricultural sector has become the biggest sector contributing to inflation in Indonesia. In Wonogiri district in particular, red cayenne pepper was the biggest contributor to inflation in 2016. National statistics show that in the last five years red cayenne pepper has had the highest average level of fluctuation among all commodities. Several factors, such as the supply chain, price disparity, production quantity, crop failure, and oil prices, are possible causes of the high volatility of the red cayenne pepper price. Therefore, this research tries to find the key factor causing fluctuations in the red cayenne pepper price by using the ARCH/GARCH method. The method accommodates the presence of heteroscedasticity in time series data. It is statistically found that the second level of the supply chain is the biggest contributor to inflation, with a coefficient of 3.35 in the fluctuation forecasting model of the red cayenne pepper price. This model can serve as a reference for the government in determining appropriate policies to maintain the price stability of red cayenne pepper.
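
A minimal sketch of fitting a GARCH(1,1) volatility model with the arch package; the simulated returns below stand in for the red cayenne pepper price series, and the supply-chain regressors used in the study are omitted.

    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(0)
    returns = 100 * rng.normal(scale=0.02, size=1000)      # % price changes (synthetic)

    # GARCH(1,1) with a constant mean; heteroscedasticity enters via the variance equation.
    model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
    result = model.fit(disp="off")
    print(result.params)                                   # mu, omega, alpha[1], beta[1]
    forecast = result.forecast(horizon=5)
    print(forecast.variance.iloc[-1])                      # 5-step-ahead variance path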

Keywords: ARCH/GARCH, forecasting, red cayenne pepper, volatility, supply chain

Procedia PDF Downloads 186