Search results for: software vulnerability detection
6820 The Evaluation of Signal Timing Optimization and Implement of Transit Signal Priority in Intersections and Their Effect on Delay Reduction
Authors: Mohammad Reza Ramezani, Shahriyar Afandizadeh
Abstract:
Since intersections play a crucial role in traffic delay, it is important to evaluate them precisely. In this paper, three critical intersections in Tehran (the capital of Iran) were simulated. The main purpose of this paper was to reduce public transit delay. The simulation comprised three different phases for the three intersections: the first phase modelled the current condition of each intersection, the second phase applied optimized signal timing, and the last phase added prioritized public transit access. The Aimsun software was used to simulate all phases, and the Synchro software was used to optimize the signals. The results showed that implementing the optimization and prioritization system would reduce public transit delay by about 50%.
Keywords: transit signal priority, intersection optimization, public transit, simulation
Procedia PDF Downloads 471
6819 A Micro-Scale of Electromechanical System Micro-Sensor Resonator Based on UNO-Microcontroller for Low Magnetic Field Detection
Authors: Waddah Abdelbagi Talha, Mohammed Abdullah Elmaleeh, John Ojur Dennis
Abstract:
This paper focuses on the simulation and implementation of a resonator micro-sensor for low magnetic field sensing based on a U-shaped cantilever and a piezoresistive configuration, which operates on the Lorentz force phenomenon. The resonance frequency is an important parameter because the highest response and sensitivity of any vibrating micro-electromechanical system (MEMS) device occur at that point of the frequency response, and it is also important for determining the direction of the detected magnetic field. The deflection of the cantilever is considered for the vibrating mode at different frequencies in the range of 0 Hz to 7000 Hz in order to observe the frequency response. A simple electronic circuit based on polysilicon piezoresistors in a Wheatstone bridge configuration is used to transduce the response of the cantilever into electrical measurements at various voltages. An Arduino UNO microcontroller program and the PROTEUS electronics software are used to analyze the output signals from the sensor. The highest output voltage amplitude, about 4.7 mV, is observed at about 3 kHz in the frequency domain, indicating the highest sensitivity, which can be called the resonant sensitivity. Based on the resonant frequency value, the mode of vibration is determined (up-down vibration), and from that, the vector of the magnetic field is also determined.
Keywords: resonant frequency, sensitivity, Wheatstone bridge, UNO-microcontroller
Procedia PDF Downloads 125
6818 QR Technology to Automate Health Condition Detection in Payment System: A Case Study in the Kingdom of Saudi Arabia’s Schools
Authors: Amjad Alsulami, Farah Albishri, Kholod Alzubidi, Lama Almehemadi, Salma Elhag
Abstract:
Food allergy is a common and rising problem among children. Many students have their first allergic reaction at school, and one such reaction, anaphylaxis, can be fatal. This study found that several schools' processes lacked safety regulations and information on how to handle allergy issues and chronic diseases such as diabetes, and that students were not supervised or monitored during the cafeteria purchasing process. There is no obvious prevention effort in academic institutions regarding the purchase of food containing allergens or food that negatively impacts the health of students who suffer from chronic diseases. Students must remain in stable health for their educational development to proceed positively. To address this issue, this paper uses business process reengineering to propose automating the whole food-purchasing process, which will aid in detecting and avoiding allergic occurrences and prevent side effects from eating foods that conflict with students' health. This may be achieved by designing a smart card with an embedded QR code that reveals which foods cause an allergic reaction in a student. A survey was distributed to determine how cafeterias handle allergic children and whether any management policy is applied in the school. The survey findings indicate that integrating QR technology into the food-purchasing process would improve health condition detection. Families agreed that the suggested system would benefit all parties, as it would ensure that their children did not eat foods that were bad for their health. Moreover, analysis and simulation of the as-is process and the suggested process demonstrate an improvement in both quality and time.
Keywords: QR code, smart card, food allergies, business process reengineering, health condition detection
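The core of the proposed check can be illustrated with a minimal Python sketch of the point-of-sale logic: the QR payload carries the student's allergen and condition list, and the purchase is screened against the item's ingredients. The payload format, menu entries and function names below are hypothetical illustrations and are not taken from the paper.

```python
import json

# Hypothetical payload stored in the smart card's QR code for one student.
qr_payload = json.dumps({
    "student_id": "S-1042",
    "allergens": ["peanut", "egg"],
    "conditions": ["type-1 diabetes"],
})

# Hypothetical cafeteria item database: allergens and sugar content per item.
menu = {
    "peanut cookie": {"allergens": ["peanut", "wheat"], "sugar_g": 18},
    "cheese sandwich": {"allergens": ["wheat", "milk"], "sugar_g": 3},
}

def check_purchase(scanned_payload, item_name):
    """Run at the point of sale after the QR code is scanned: block the purchase
    if the item contains any of the student's allergens, and warn on high sugar
    for students with diabetes."""
    student = json.loads(scanned_payload)
    item = menu[item_name]
    hits = set(student["allergens"]) & set(item["allergens"])
    if hits:
        return f"BLOCK: {item_name} contains {', '.join(sorted(hits))}"
    if "type-1 diabetes" in student["conditions"] and item["sugar_g"] > 10:
        return f"WARN: {item_name} is high in sugar for this student"
    return "ALLOW"

print(check_purchase(qr_payload, "peanut cookie"))    # BLOCK
print(check_purchase(qr_payload, "cheese sandwich"))  # ALLOW
```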
Procedia PDF Downloads 74
6817 Fluorescence-Based Biosensor for Dopamine Detection Using Quantum Dots
Authors: Sylwia Krawiec, Joanna Cabaj, Karol Malecha
Abstract:
Nowadays, progress in the field of analytical methods is of great interest for reliable biological research and medical diagnostics. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements. Chemical sensors have displaced conventional analytical methods: sensors combine precision, sensitivity, fast response and the possibility of continuous monitoring. A biosensor is a chemical sensor that, in addition to a transducer, also possesses a biologically active material, which is the basis for the detection of specific chemicals in the sample. Each biosensor device mainly consists of two elements: a sensitive element, where receptor-analyte recognition takes place, and a transducer element, which receives the signal and converts it into a measurable signal. Based on these two elements, biosensors can be divided into two categories: by recognition element (e.g., immunosensors) and by transducer (e.g., optical sensors). Optical sensors work by measuring quantitative changes in parameters characterizing light radiation. The most often analyzed parameters include amplitude (intensity), frequency or polarization. Changes in the optical properties of a compound that reacts with the biological material coated on the sensor are analyzed by a direct method; in an indirect method, indicators are used, which change their optical properties as the test species is transformed. The most commonly used dyes in this method are small molecules with an aromatic ring, like rhodamine, fluorescent proteins, for example green fluorescent protein (GFP), or nanoparticles such as quantum dots (QDs). Quantum dots have, in comparison with organic dyes, much better photoluminescent properties, better bioavailability and chemical inertness. They are semiconductor nanocrystals 2-10 nm in size; this very limited number of atoms and the 'nano' size give QDs their highly fluorescent properties. Rapid and sensitive detection of dopamine is extremely important in modern medicine. Dopamine is a very important neurotransmitter, which mainly occurs in the brain and central nervous system of mammals. Dopamine is responsible for transmitting information related to movement through the nervous system and plays an important role in processes of learning and memory. Detection of dopamine is significant for diseases associated with the central nervous system such as Parkinson's disease or schizophrenia. The developed optical biosensor for dopamine detection uses graphene quantum dots (GQDs). In such a sensor, dopamine molecules coat the GQD surface, and fluorescence quenching occurs due to fluorescence resonance energy transfer (FRET). Changes in fluorescence correspond to specific concentrations of the neurotransmitter in the tested sample, so it is possible to accurately determine the concentration of dopamine in the sample.
Keywords: biosensor, dopamine, fluorescence, quantum dots
Procedia PDF Downloads 363
6816 Molecularly Imprinted Nanoparticles (MIP NPs) as Non-Animal Antibodies Substitutes for Detection of Viruses
Authors: Alessandro Poma, Kal Karim, Sergey Piletsky, Giuseppe Battaglia
Abstract:
The recent and increasing threat to public health from infectious influenza diseases has prompted interest in the detection of avian influenza virus (AIV) H5N1 in humans as well as animals. A variety of technologies for diagnosing AIV infection have been developed. However, various disadvantages (costs, lengthy analyses, and the need for high-containment facilities) make these methods less than ideal in their practical application. Molecularly imprinted polymeric nanoparticles (MIP NPs) are suitable to overcome these limitations by offering high affinity, selectivity, scalability and cost-effectiveness, together with the versatility of post-modification (fluorescent, magnetic or optical labeling), opening the way to the potential introduction of improved diagnostic tests capable of providing rapid differential diagnosis. Here we present our first results in the production and testing of MIP NPs for the detection of AIV H5N1. Recent developments in the solid-phase synthesis of MIP NPs mean that for the first time a reliable supply of 'soluble' synthetic antibodies can be made available for testing as potential biological or diagnostic active molecules. The MIP NPs have the potential to detect viruses that are widely circulating in farm animals and indeed humans. Early and accurate identification of the infectious agent will expedite appropriate control measures. Thus, diagnosis at an early stage of infection of a herd, flock or individual maximizes the efficiency with which containment, prevention and possibly treatment strategies can be implemented. More importantly, substantiating the practicability of these novel reagents should lead to an initial reduction, and eventually a potential total replacement, of the animals, both large and small, used to raise such specific serological materials.
Keywords: influenza virus, molecular imprinting, nanoparticles, polymers
Procedia PDF Downloads 361
6815 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data
Authors: S. Jurado, E. Pazmino
Abstract:
Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, oil extraction, etc. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from high-resolution X-ray microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions and determine porosity. Subsequently, the algorithm identifies the layer of void voxels next to the solid boundaries. An iterative process then removes or 'burns' void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and computer memory usage, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis was determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and used to determine the pore-throat size distribution. Graphical user interface software was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats. Preliminary tests of the software achieved medial axis, pore-throat size distribution and porosity determination for 100³, 320³ and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data for the academic community.
Keywords: medial axis, pore-throat distribution, porosity, porous media
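The layer-by-layer 'burn' idea described above can be sketched in a few lines of Python using iterative binary erosion; the following is a minimal illustration on a toy cylindrical pore, not the authors' implementation, and the collision test used for the medial axis is deliberately crude.

```python
import numpy as np
from scipy import ndimage

def burn_numbers(void):
    """Assign each void voxel the iteration ('burn layer') at which it is removed
    when void space is peeled one voxel layer at a time from the solid walls."""
    burn = np.zeros(void.shape, dtype=np.int32)
    remaining = void.copy()
    layer = 0
    while remaining.any():
        layer += 1
        # voxels that survive one more erosion step are interior; the rest burn now
        interior = ndimage.binary_erosion(remaining)
        burn[remaining & ~interior] = layer
        remaining = interior
    return burn

def medial_axis_voxels(burn):
    """Crude medial-axis estimate: void voxels whose burn number is a local maximum,
    i.e. where burn fronts advancing from opposite walls collide."""
    local_max = ndimage.maximum_filter(burn, size=3)
    return (burn > 0) & (burn == local_max)

# Toy domain: a 60^3 block of solid with a cylindrical pore through it.
domain = np.zeros((60, 60, 60), dtype=bool)                 # False = solid
zz, yy, xx = np.indices(domain.shape)
domain[(yy - 30) ** 2 + (xx - 30) ** 2 < 8 ** 2] = True     # True = void

porosity = domain.mean()
burn = burn_numbers(domain)
axis = medial_axis_voxels(burn)
print(f"porosity = {porosity:.3f}, axis voxels = {axis.sum()}, max burn layer = {burn.max()}")
```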
Procedia PDF Downloads 114
6814 Glycan Analyzer: Software to Annotate Glycan Structures from Exoglycosidase Experiments
Authors: Ian Walsh, Terry Nguyen-Khuong, Christopher H. Taron, Pauline M. Rudd
Abstract:
Glycoproteins and their covalently bonded glycans play critical roles in the immune system, cell communication, disease and disease prognosis. Ultra-performance liquid chromatography (UPLC) coupled with mass spectrometry is conventionally used to qualitatively and quantitatively characterise glycan structures in a given sample. Exoglycosidases are enzymes that catalyze the sequential removal of monosaccharides from the non-reducing end of glycans. They naturally have specificity for a particular type of sugar, its stereochemistry (α or β anomer) and its position of attachment to an adjacent sugar on the glycan. Thus, monitoring the peak movements (both in the UPLC and MS1) after application of exoglycosidases provides a unique and effective way to annotate sugars in high detail, i.e., differentiating positional and linkage isomers. Manual annotation of an exoglycosidase experiment is difficult and time-consuming. As sample complexity and the number of exoglycosidases increase, the analysis could require manually interpreting hundreds of peak movements. Recently, we have implemented pattern recognition software for automated interpretation of UPLC-MS1 exoglycosidase digestions. In this work, we explain the software, indicate how much time it will save, and provide example usage showing the annotation of positional and linkage isomers in immunoglobulin G, apolipoprotein J, and simple glycan standards.
Keywords: bioinformatics, automated glycan assignment, liquid chromatography, mass spectrometry
Procedia PDF Downloads 198
6813 A Simple Colorimetric Assay for Paraquat Detection Using Negatively Charged Silver Nanoparticles
Authors: Weena Siangphro, Orawon Chailapakul, Kriangsak Songsrirote
Abstract:
A simple, rapid, sensitive, and economical colorimetric method for the determination of paraquat, a widely used herbicide, was developed. Citrate-coated silver nanoparticles (AgNPs) were synthesized as the colorimetric probe. The mechanism of the assay relies on the aggregation of the negatively charged AgNPs induced by positively charged paraquat through coulombic attraction, which causes a color change from deep greenish yellow to pale yellow depending on the paraquat concentration. Silica gel was used as a paraquat adsorbent for purification and pre-concentration prior to direct determination with the negatively charged AgNPs, with no elution step required. The validity of the proposed approach was evaluated by spiking standard paraquat into water and plant samples. Recoveries of paraquat were 93.6-95.4% in water samples and 86.6-89.5% in plant samples using the optimized extraction procedure. The absorbance of the AgNPs at 400 nm was linearly related to the paraquat concentration over the range of 0.05-50 mg/L, with detection limits of 0.05 ppm for water samples and 0.10 ppm for plant samples.
Keywords: colorimetric assay, paraquat, silica gel, silver nanoparticles
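Since the absorbance at 400 nm is reported to vary linearly with paraquat concentration over 0.05-50 mg/L, a worked calibration step looks as follows; all absorbance values in this Python sketch are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical calibration data: absorbance of the AgNP probe at 400 nm measured
# for paraquat standards across the reported linear range (0.05-50 mg/L).
conc_std = np.array([0.05, 0.5, 5.0, 10.0, 25.0, 50.0])      # mg/L
abs_400  = np.array([0.82, 0.79, 0.65, 0.52, 0.31, 0.10])    # made-up values

# Aggregation lowers the plasmon absorbance, so absorbance falls roughly
# linearly with concentration; fit a straight calibration line.
slope, intercept = np.polyfit(conc_std, abs_400, 1)

def paraquat_conc(absorbance, recovery=1.0):
    """Invert the calibration line; divide by the spike recovery (e.g. 0.936 for water)."""
    return (absorbance - intercept) / slope / recovery

print(f"sample at A400 = 0.45 -> {paraquat_conc(0.45, recovery=0.936):.2f} mg/L")
```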
Procedia PDF Downloads 237
6812 CT Doses Pre and Post SAFIRE: Sinogram Affirmed Iterative Reconstruction
Authors: N. Noroozian, M. Halim, B. Holloway
Abstract:
Computed Tomography (CT) has become the largest source of radiation exposure in modern countries; however, recent technological advances have created new methods to reduce dose without negatively affecting image quality. SAFIRE has emerged as a new software package which utilizes full raw-data projections for iterative reconstruction, thereby allowing a lower CT dose to be used. This audit was performed to compare CT doses in certain examinations before and after the introduction of SAFIRE at our radiology department. It showed that CT doses were significantly lower using SAFIRE compared with pre-SAFIRE software at the SAFIRE 3 setting for the following studies: CSKUH unenhanced brain scans (-20.9%), CABPEC abdomen and pelvis with contrast (-21.5%), CCHAPC chest with contrast (-24.4%), CCHAPC abdomen and pelvis with contrast (-16.1%), and CCHAPC total chest, abdomen and pelvis (-18.7%).
Keywords: dose reduction, iterative reconstruction, low dose CT techniques, SAFIRE
Procedia PDF Downloads 284
6811 Using AI to Advance Factory Planning: A Case Study to Identify Success Factors of Implementing an AI-Based Demand Planning Solution
Authors: Ulrike Dowie, Ralph Grothmann
Abstract:
Rational planning decisions are based upon forecasts. Precise forecasting therefore has a central role in business; the prediction of customer demand is a prime example. This paper introduces recurrent neural networks to model customer demand and combines the forecast with uncertainty measures to derive decision support for the demand planning department. It identifies and describes the keys to the successful implementation of an AI-based solution: bringing together data with business knowledge, AI methods, and user experience, and applying agile software development practices.
Keywords: agile software development, AI project success factors, deep learning, demand forecasting, forecast uncertainty, neural networks, supply chain management
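The paper does not specify its network architecture or uncertainty measure, but a generic sketch of a recurrent demand forecaster with a simple residual-based uncertainty band, assuming TensorFlow/Keras, might look like this; the series and hyperparameters are invented for illustration.

```python
import numpy as np
import tensorflow as tf

# Toy monthly demand series; in practice this would come from the ERP system.
rng = np.random.default_rng(0)
demand = 100 + 20 * np.sin(np.arange(120) * 2 * np.pi / 12) + rng.normal(0, 5, 120)

def make_windows(series, lag=12):
    """Sliding windows of the last `lag` months as inputs, next month as target."""
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    return X[..., None], series[lag:]          # (samples, timesteps, features)

X, y = make_windows(demand)
split = int(0.8 * len(X))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.SimpleRNN(16),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=100, verbose=0)

# Forecast and a crude uncertainty band from the validation residual spread.
pred = model.predict(X[split:], verbose=0).ravel()
sigma = np.std(y[split:] - pred)
print(f"next-period forecast: {pred[-1]:.1f} +/- {1.96 * sigma:.1f} (95% band)")
```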
Procedia PDF Downloads 188
6810 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS
Authors: Eunsu Jang, Kang Park
Abstract:
In developing an armored ground combat vehicle (AGCV), analyzing the vulnerability (or survivability) of the AGCV against enemy attack is a very important step. In vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which would damage internal components or injure the crew. The penetration equations are derived from penetration experiments, which require a long time and great effort; moreover, they usually hold only for the specific target material and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option for calculating penetration depth. However, it is very important to model the targets and select the input parameters carefully in order to obtain an accurate penetration depth. This paper performed a sensitivity analysis of the ANSYS input parameters with respect to the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved when adopting ANSYS in penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters was performed and the RMS error with respect to the experimental data was calculated. The input parameters, including mesh size, boundary condition, material properties and target diameter, were tested and selected to minimize the error between the simulation results and the experimental data reported in papers on the penetration equations. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted to obtain optimized overall performance. As a result of the analysis, the following was found: 1) As the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and the calculation time increase. 2) As the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and the calculation time decrease. 3) As the yield stress of the target material decreases, the penetration depth increases. 4) The boundary condition with only the side surface of the target fixed gives a greater penetration depth than the one with both the side and rear surfaces fixed. Using these findings, the input parameters can be tuned to minimize the error between simulation and experiment. With the simulation tool ANSYS and delicately tuned input parameters, penetration analysis can be done on a computer without actual experiments. Penetration experiment data are usually hard to obtain for security reasons, and published papers provide them only for a limited set of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. The result may not be accurate enough to replace penetration experiments, but such simulations can be used in the early modelling and simulation stage of the AGCV design process.
Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis
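The accuracy side of the parameter tuning reduces to computing an RMS error between simulated and experimental penetration depths for each candidate parameter value; a hypothetical Python sketch of that bookkeeping (all depths and runtimes invented) is shown below.

```python
import numpy as np

# Hypothetical sweep results: for each mesh size, the penetration depths computed by
# the FE model for several test shots, plus the corresponding experimental depths (mm).
experimental = np.array([18.2, 22.5, 27.1, 31.8])
simulated = {
    0.9: (np.array([15.9, 19.8, 24.0, 28.3]),  4.0),   # (depths, runtime in hours)
    0.7: (np.array([17.0, 21.2, 25.7, 30.1]),  9.5),
    0.5: (np.array([18.0, 22.1, 26.8, 31.5]), 21.0),
}

def rms_error(sim, exp):
    return float(np.sqrt(np.mean((sim - exp) ** 2)))

for mesh, (depths, hours) in simulated.items():
    print(f"mesh {mesh} mm: RMS error = {rms_error(depths, experimental):.2f} mm, "
          f"runtime = {hours:.1f} h")
# The tuned parameter set is the one that keeps the RMS error acceptable
# while avoiding the steep runtime growth of the finest mesh.
```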
Procedia PDF Downloads 399
6809 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings
Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir
Abstract:
Acute myocardial infarction is a major cause of death in the world; therefore, its fast and reliable diagnosis is a major clinical need. The ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain together with changes in the ST segment and T wave of the ECG occurs shortly before the start of myocardial infarction. In this study, a technique which detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings was constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs are analyzed for each patient: the pre-inflation ECG, acquired before any catheter insertion, and the occlusion ECG, acquired during balloon inflation. Using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events using ST-T derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters using a grid-search method and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain optimal classification performance. Applying the developed classification technique to real ECG recordings shows that the proposed technique provides highly reliable detection of anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy stage, the detection of acute myocardial ischemia based only on ECG recordings obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to detect outliers that correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of different threshold values. For different discrimination threshold values and numbers of ECG segments, the probability of detection and the probability of false alarm are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for GMM-based classification. Moreover, the comparison between the performance of SVM- and GMM-based classification showed that the SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine
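A compact sketch of the two detection routes, assuming scikit-learn and synthetic ST/T features in place of the real patient data, could look like the following; the grid values, GMM settings and thresholds are illustrative, not the study's tuned configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.mixture import GaussianMixture

# Hypothetical ST/T-derived feature matrix: rows are ECG segments,
# labels are 0 = pre-inflation (non-ischemic) and 1 = occlusion (ischemic).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 6)), rng.normal(1.5, 1, (200, 6))])
y = np.array([0] * 200 + [1] * 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Supervised detector: SVM with linear and RBF kernels tuned by grid search + 10-fold CV.
grid = GridSearchCV(
    SVC(),
    {"kernel": ["linear", "rbf"], "C": [0.1, 1, 10], "gamma": ["scale", 0.1, 1]},
    cv=10,
)
grid.fit(X_tr, y_tr)
print("SVM accuracy:", grid.score(X_te, y_te), grid.best_params_)

# Unsupervised alternative when no healthy baseline recording exists:
# fit a GMM to ischemic-state features and flag outliers by average log-likelihood,
# sweeping the threshold Neyman-Pearson style to trade detection vs. false alarms.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X_tr[y_tr == 1])
scores = gmm.score_samples(X_te)                  # per-segment log-likelihood
for thr in np.percentile(scores, [1, 5, 10]):
    flagged = scores < thr
    print(f"threshold {thr:.2f}: flagged {flagged.mean():.2%} of segments as outliers")
```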
Procedia PDF Downloads 160
6808 Determining the City Development Based on the Modeling of the Pollutant Emission from Power Plant by Using AERMOD Software
Authors: Abbasi Fakhrossadat, Moharreri Mohammadamir, Shadmanmahani Mohammadjavad
Abstract:
The development of cities can be influenced by various factors, including air pollution. In this study, the focus is on the city of Mashhad, which has four large power plants in operation. The emission of pollutants from these power plants can have a significant impact on the quality of life and health of the city's residents. Therefore, modeling and analyzing the emission pattern of the pollutants can provide useful information for urban decision-makers and help in estimating the urban development model. The aim of this research is to determine the direction of city development based on the modeling of pollutant emissions (NOx, CO, and PM10) from the power plants in Mashhad. Using the AERMOD software, the release of these pollutants is modeled and analyzed.
Keywords: emission of air pollution, thermal power plant, urban development, AERMOD
Procedia PDF Downloads 77
6807 Fast and Accurate Model to Detect Ictal Waveforms in Electroencephalogram Signals
Authors: Piyush Swami, Bijaya Ketan Panigrahi, Sneh Anand, Manvir Bhatia, Tapan Gandhi
Abstract:
Visual inspection of electroencephalogram (EEG) signals to detect epileptic activity is a very challenging and time-consuming task, even for an expert neurophysiologist. The problem is most acute in under-developed and developing countries due to a shortage of skilled neurophysiologists. In the past, notable research efforts have gone into trying to automate the seizure detection process; however, high false-alarm rates and the complexity of the models developed so far have greatly limited their practical implementation. In this paper, we present a novel scheme for epileptic seizure detection using the empirical mode decomposition technique. The intrinsic mode functions obtained were used to calculate standard deviations, followed by a probability-density-based classifier to discriminate between non-ictal and ictal patterns in EEG signals. The model presented here demonstrated very high classification rates (>97%) without compromising statistical performance. The computation time for each testing phase was also very low (<0.029 s), which makes this model suitable for practical applications.
Keywords: electroencephalogram (EEG), epilepsy, ictal patterns, empirical mode decomposition
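A minimal sketch of the pipeline, assuming the PyEMD package for empirical mode decomposition and synthetic epochs in place of real EEG, is given below; a logistic-regression classifier stands in for the paper's probability-density-based classifier.

```python
import numpy as np
from PyEMD import EMD                     # pip install EMD-signal (assumed package)
from sklearn.linear_model import LogisticRegression

def emd_std_features(signal, max_imfs=5):
    """Decompose an EEG epoch into intrinsic mode functions and use the
    standard deviation of each IMF as the feature vector."""
    imfs = EMD().emd(signal, max_imf=max_imfs)
    stds = np.std(imfs, axis=1)[:max_imfs]
    return np.pad(stds, (0, max_imfs - len(stds)))

# Toy epochs standing in for non-ictal (label 0) and ictal (label 1) EEG segments.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)
non_ictal = [np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size) for _ in range(30)]
ictal = [3 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 1.0, t.size) for _ in range(30)]

X = np.array([emd_std_features(s) for s in non_ictal + ictal])
y = np.array([0] * 30 + [1] * 30)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```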
Procedia PDF Downloads 404
6806 A Probability Analysis of Construction Project Schedule Using Risk Management Tool
Authors: A. L. Agarwal, D. A. Mahajan
Abstract:
The construction industry tumbled along with other sectors during the recent economic crash. The construction business has not recovered since and is still passing through a slowdown phase, with the result that many real estate and infrastructure projects are not completed on schedule or within budget. There are many theories, tools and techniques, with software packages available in the market, to analyze construction schedules. This study focuses on the construction project schedule and the uncertainties associated with construction activities. An infrastructure construction project is considered for the analysis of uncertainty in the project activities affecting project duration, and the analysis is done using the @RISK software. Simulation results arising from three probability distribution functions are compiled to help construction project managers plan a more realistic schedule for the various construction activities and for project completion, document it in the contract, and avoid compensation or claims arising from missing the planned schedule.
Keywords: construction project, distributions, project schedule, uncertainty
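The kind of simulation performed in @RISK can be sketched with open-source tools as a Monte Carlo sum of sampled activity durations under the three candidate distributions; the activity chain and duration figures below are hypothetical, not the study's project data.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                       # Monte Carlo iterations

# Hypothetical activity chain with (optimistic, most likely, pessimistic) durations in days.
activities = {
    "earthworks":     (20, 25, 40),
    "foundations":    (30, 35, 55),
    "superstructure": (60, 75, 110),
    "finishing":      (40, 50, 80),
}

def simulate(sampler):
    """Total project duration for N runs of a simple serial activity chain."""
    total = np.zeros(N)
    for low, mode, high in activities.values():
        total += sampler(low, mode, high)
    return total

samplers = {
    "triangular": lambda a, m, b: rng.triangular(a, m, b, N),
    "uniform":    lambda a, m, b: rng.uniform(a, b, N),
    "normal":     lambda a, m, b: rng.normal(m, (b - a) / 6, N),   # PERT-like spread
}

for name, sampler in samplers.items():
    total = simulate(sampler)
    print(f"{name:>10}: mean {total.mean():.0f} d, P80 {np.percentile(total, 80):.0f} d")
```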
Procedia PDF Downloads 348
6805 Separating Permanent and Induced Magnetic Signature: A Simple Approach
Authors: O. J. G. Somsen, G. P. M. Wagemakers
Abstract:
Magnetic signature detection provides sensitive detection of metal objects, especially in the natural environment. Our group is developing a tabletop setup for measuring the magnetic signatures of various small and model objects. A particular issue is the separation of permanent and induced magnetization. While the latter depends only on the composition and shape of the object, the former also depends on the magnetization history. With common deperming techniques, a significant permanent signature may still remain, which confuses measurements of the induced component. We investigate a basic technique for separating the two. Measurements were done by moving the object along an aluminum rail while the three field components were recorded by a detector attached near the center. This is done first with the rail parallel to the Earth's magnetic field and then in the anti-parallel orientation. The reversal changes the sign of the induced but not the permanent magnetization, so that the two can be separated. Our preliminary results on a small iron block show excellent reproducibility. A considerable permanent magnetization was indeed present, resulting in a complex asymmetric signature. After separation, a much more symmetric induced signature was obtained that can be studied in detail and compared with theoretical calculations.
Keywords: magnetic signature, data analysis, magnetization, deperming techniques
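Under the stated assumption that reversal flips only the induced component, the separation is just the half-sum and half-difference of the two runs, as in this small Python sketch (field values invented for illustration):

```python
import numpy as np

# Hypothetical field profiles (one component, in nT) recorded while the object moves
# along the rail: once parallel and once anti-parallel to the Earth field.
parallel      = np.array([1.2,  3.4,  6.1,  3.9, 1.5])
anti_parallel = np.array([0.6, -0.2, -1.7, -0.1, 0.7])

# Reversing the orientation flips the induced part but leaves the permanent part unchanged:
#   parallel      = permanent + induced
#   anti_parallel = permanent - induced
permanent = 0.5 * (parallel + anti_parallel)
induced   = 0.5 * (parallel - anti_parallel)

print("permanent signature:", permanent)
print("induced signature:  ", induced)
```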
Procedia PDF Downloads 449
6804 An Ontology Model for Systems Engineering Derived from ISO/IEC/IEEE 15288: 2015: Systems and Software Engineering - System Life Cycle Processes
Authors: Lan Yang, Kathryn Cormican, Ming Yu
Abstract:
ISO/IEC/IEEE 15288:2015, Systems and Software Engineering - System Life Cycle Processes, is an international standard that provides generic top-level process descriptions to support systems engineering (SE). However, the processes defined in the standard need improvement to strengthen their integrity and consistency. The goal of this research is to explore a way of doing so by building an ontology model of the SE standard to manage SE knowledge. The ontology model gives a whole picture of the SE knowledge domain by building connections between SE concepts. Moreover, it creates a hierarchical classification of the concepts to fulfil different requirements for displaying and analysing SE knowledge.
Keywords: knowledge management, model-based systems engineering, ontology modelling, systems engineering ontology
Procedia PDF Downloads 423
6803 Geospatial Analysis of Spatio-Temporal Dynamic and Environmental Impact of Informal Settlement: A Case of Adama City, Ethiopia
Authors: Zenebu Adere Tola
Abstract:
Informal settlements behave dynamically over space and time, and the number of people living in such housing areas is growing worldwide. In the cities of developing countries, especially in sub-Saharan Africa, poverty, unemployment, poor living conditions, lack of transparency and accountability, and lack of good governance are the major factors that lead people to hold land informally and build houses for residential or other purposes. In most Ethiopian cities, informal settlement is concentrated in peripheral areas because people can easily obtain land for housing from local farmers, brokers and speculators without permission from the relevant authorities. In Adama, informal settlement has created risky living conditions and led to environmental problems in natural areas; the main reason for this is the lack of sufficient knowledge about informal settlement development. At the same time, there is a strong need to transform informal settlements into formal ones and to gain more control over their actual spatial development. To tackle the issue, it is essential to understand the scale of the problem, and doing so requires up-to-date technology; for this specific problem, high-resolution imagery is well suited to detecting informal settlement in Adama city. The main objective of this study is to assess the spatio-temporal dynamics and environmental impacts of informal settlement using object-based image analysis (OBIA). Specifically, the objectives are to identify informal settlement in the study area, determine the change in its extent and pattern, and assess its environmental and social impacts. Reliable procedures for detecting the spatial behavior of informal settlements are required in order to react at an early stage to changing housing situations, and obtaining up-to-date spatial information about informal settlement areas is vital for any enhancement actions in urban or regional planning. The data used in this study are aerial photographs of the growth and change of informal settlements in Adama city, and the eCognition software is used to classify built-up and non-built-up areas.
Keywords: informal settlement, change detection, environmental impact, object based analysis
Procedia PDF Downloads 83
6802 Impact of Internal Control on Fraud Detection and Prevention: A Survey of Selected Organisations in Nigeria
Authors: Amos Olusola Akinola
Abstract:
The aim of this study is to evaluate the effect of the internal control system on fraud prevention in Nigerian business organizations. Survey research was undertaken in five organizations from the banking and manufacturing sectors in Nigeria using a simple random sampling technique, and primary data were obtained with the aid of a structured questionnaire based on a five-point Likert scale. Four hypotheses were formulated and tested using t-test statistics, correlation and regression analysis at a 95% confidence interval. It was discovered that internal control has a significant positive relationship with fraud prevention and that a weak internal control system permits fraudulent activities among staff. Based on the findings, it is recommended that organizations continually and methodically review and evaluate the components of their internal control systems to check whether activities are working as planned, that every organization have pre-determined guidelines for conducting its operations and ensure compliance with these guidelines, and that proactive steps be taken to establish the independence of the internal audit by making it report to the governing council of the organization rather than the chief executive officer.
Keywords: internal control, internal system, internal audit, fraud prevention, fraud detection
Procedia PDF Downloads 383
6801 Predicting Daily Patient Hospital Visits Using Machine Learning
Authors: Shreya Goyal
Abstract:
The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the support vector machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources: if a surge in patients is projected, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share data with any third party or upload it to the cloud; the software can read data locally from the machine. The data need to be arranged in a particular format, and the software will then read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers, which helps maintain the confidentiality and integrity of sensitive patient information. Historical patient data are used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum: it stores the value of the local minimum after each iteration and at the end compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better planning of personnel and medical resources.
Keywords: machine learning, SVM, HIPAA, data
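A minimal sketch of such a predictor, assuming scikit-learn and using support vector regression (the regression form of SVM) on synthetic calendar features, is shown below; the feature encoding and numbers are illustrative, not the study's data or tuned model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical historical records: one row per day with the input variables named in the
# abstract (mean patient age, day of week, season, local-event flag).
rng = np.random.default_rng(7)
n = 730
day_of_week = rng.integers(0, 7, n)
season = rng.integers(0, 4, n)
local_event = rng.integers(0, 2, n)
mean_age = rng.normal(45, 5, n)
visits = (120 + 15 * (day_of_week == 0) + 10 * local_event
          + 5 * (season == 3) + rng.normal(0, 8, n))      # synthetic target

X = np.column_stack([mean_age, day_of_week, season, local_event])
X_tr, X_te, y_tr, y_te = train_test_split(X, visits, test_size=0.2, random_state=0)

# Support vector regression with feature scaling.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X_tr, y_tr)
print("R^2 on held-out days:", round(model.score(X_te, y_te), 3))
print("forecast for a winter Monday with a local event:",
      round(float(model.predict([[46.0, 0, 3, 1]])[0])), "patients")
```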
Procedia PDF Downloads 65
6800 Facility Anomaly Detection with Gaussian Mixture Model
Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho
Abstract:
The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach which takes many sensor values into account at the same time. In particular, a Gaussian mixture model is used, trained to maximize the likelihood using the expectation-maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian information criterion. The negative log-likelihood is used as the anomaly score. The actual usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm
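A minimal sketch of this scheme with scikit-learn, using synthetic sensor data in place of the building energy measurements, is shown below; the component range, threshold percentile and readings are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical multivariate sensor readings from normal facility operation
# (e.g. temperature, flow rate, power draw measured together).
rng = np.random.default_rng(0)
normal_data = np.vstack([
    rng.multivariate_normal([20.0, 3.0, 50.0], np.diag([1.0, 0.1, 4.0]), 500),
    rng.multivariate_normal([25.0, 4.0, 70.0], np.diag([1.5, 0.2, 6.0]), 500),
])

# Choose the number of Gaussian components with the Bayesian information criterion.
models = [GaussianMixture(k, random_state=0).fit(normal_data) for k in range(1, 6)]
best = min(models, key=lambda m: m.bic(normal_data))
print("components selected by BIC:", best.n_components)

# Anomaly score = negative log-likelihood; threshold set from the training data.
train_scores = -best.score_samples(normal_data)
threshold = np.percentile(train_scores, 99.5)

new_readings = np.array([[20.5, 3.1, 49.0],     # looks normal
                         [33.0, 1.0, 95.0]])    # jointly unusual combination
for row, score in zip(new_readings, -best.score_samples(new_readings)):
    status = "ALARM" if score > threshold else "ok"
    print(row, f"score={score:.1f}", status)
```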
Procedia PDF Downloads 270
6799 Mitigating the Vulnerability of Subsistence Farmers through Ground Water Optimisation
Authors: Olayemi Bakre
Abstract:
The majority of the South African rural populace are directly or indirectly engaged in agricultural practices for a livelihood. However, impediments such as climate change and inadequate governmental support have undermined the once thriving subsistence farming communities of South Africa. Furthermore, poor leadership in hydrology, coupled with a lack of depth in the skills needed to facilitate the understanding and acceptance of groundwater from national to local government, has made it nearly impossible for subsistence farmers to optimally benefit from the groundwater beneath their feet. The 2012 drought experienced in South Africa paralysed farming activities in several subsistence farming communities across KwaZulu-Natal Province. To revamp subsistence farming, a variety of interventions and strategies, such as the Resource Poor Farmers (RPF) and Water Allocation Reforms (WAR), have been launched by the Department of Water and Sanitation (DWS) with the aim of galvanising the defunct subsistence farming communities of KwaZulu-Natal as well as other subsistence farming communities across South Africa. Despite the enormous resources expended on these subsistence farming communities, who often fall under the category of Historically Disadvantaged Individuals (HDI), indicators such as unsustainable farming practices, poor crop yields, pitiable living conditions and a poor standard of living are evidence that the aforementioned interventions and a host of other similar strategies have not yielded the desired results. Thus, this paper seeks to suggest practicable interventions aimed at reducing the vulnerability of subsistence farmers within the province under study. The study pursued a qualitative approach, soliciting the views of DWS experts on groundwater and related fields to obtain an in-depth perspective on the topic. Some of the core challenges undermining the sustainability and growth of subsistence farming in the study area were an inadequacy of groundwater experts (engineers, scientists, researchers), water shortages, lack of political will and lack of coordination among stakeholders. To optimise groundwater usage for subsistence farming, this paper advocates strengthening geohydrological skills, developing technical training capacity, interactive participation among stakeholders and the initiation of participatory action research to optimise the available groundwater in KwaZulu-Natal, with the intention of fostering sustainable and viable subsistence farming practice within the province.
Keywords: subsistence farming, ground water optimisation, resource poor farmers, and water allocation reforms, hydrology
Procedia PDF Downloads 245
6798 Classification Framework of Production Planning and Scheduling Solutions from Supply Chain Management Perspective
Authors: Kwan Hee Han
Abstract:
In today's business environments, frequent changes in customer requirements are a tough challenge for manufacturing companies. To cope with these challenges, a production planning and scheduling (PP&S) function might be established to provide accountability for both customer service and operational efficiency. Nowadays, many manufacturing firms have utilized PP&S software solutions to generate a realistic production plan and schedule and adapt to external changes efficiently. However, companies considering the introduction of a PP&S software solution still have difficulty selecting an adequate solution to meet their specific needs. Since PP&S is one of the major building blocks of the supply chain management (SCM) architecture and deals with short-term decision making in the production process, its functionalities should be analysed within the whole SCM process. The aim of this paper is to analyse PP&S functionalities and system architecture from the SCM perspective, using the criteria of planning hierarchy level, the four major SCM processes and problem-solving approaches, and finally to propose a classification framework of PP&S solutions to facilitate comparison among various commercial software solutions. Using the proposed framework, several major PP&S solutions are classified and positioned according to their functional characteristics. With this framework, practitioners considering the introduction of computerized PP&S solutions in manufacturing firms can prepare evaluation and benchmarking sheets for selecting the most suitable solution with ease and in less time.
Keywords: production planning, production scheduling, supply chain management, the advanced planning system
Procedia PDF Downloads 196
6797 Applications for Accounting of Inherited Object-Oriented Class Members
Authors: Jehad Al Dallal
Abstract:
A class in an object-oriented (OO) system is the basic unit of design, and it encapsulates a set of attributes and methods. In OO systems, instead of redefining the attributes and methods that are included in other classes, a class can inherit these attributes and methods and implement only its unique attributes and methods, which reduces code redundancy and improves code testability and maintainability. This mechanism is called class inheritance. However, some software engineering applications may require accounting for all the inherited class members (i.e., attributes and methods). This paper explains how to account for inherited class members and discusses the software engineering applications that require such consideration.
Keywords: class flattening, external quality attribute, inheritance, internal quality attribute, object-oriented design
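One simple way to account for all visible members is to 'flatten' the class by walking its method resolution order; the Python sketch below is an illustration of the idea, not the accounting rules defined in the paper, and the class names are hypothetical.

```python
import inspect

class Sensor:
    def __init__(self, unit):
        self.unit = unit
    def read(self):
        raise NotImplementedError
    def describe(self):
        return f"sensor reporting in {self.unit}"

class Thermometer(Sensor):
    def read(self):          # overrides the inherited method
        return 21.5
    def calibrate(self):     # unique to this class
        pass

def flatten(cls):
    """Account for every method visible on a class, noting where each is defined."""
    members = {}
    for klass in reversed(cls.__mro__):          # walk base classes first
        if klass is object:
            continue
        for name, member in vars(klass).items():
            if inspect.isfunction(member):
                members[name] = klass.__name__   # later classes overwrite: overriding
    return members

for name, owner in flatten(Thermometer).items():
    print(f"{name:<10} defined/overridden in {owner}")
# 'read' is attributed to Thermometer (override); 'describe' and '__init__' are
# counted as inherited from Sensor; 'calibrate' is Thermometer's own member.
```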
Procedia PDF Downloads 268
6796 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale harsh environments and in medical emergencies. The loss of data can therefore be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green's-theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs indicating where urgent intervention is required to dynamically self-stabilize the network.
Keywords: Green’s Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
Procedia PDF Downloads 74
6795 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine
Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang
Abstract:
Cocaine is the most frequently used illegal drug globally, with the global annual prevalence of cocaine use ranging from 0.3% to 0.4% of the adult population aged 15-64 years. The growing trend of cocaine abuse and drug crime is a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present in urine at high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low analyte levels in urine makes an extraction and pretreatment step important before determination. In group drug-taking cases, in particular, the pretreatment step becomes even more tedious and time-consuming. Developing a sensitive, rapid and high-throughput method for the detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high-performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinyl benzene and vinyl pyrrolidone, which gives them the ability to specifically adsorb COCs. These magnetic particles facilitated the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 minutes. Optimization of the preparation procedure for the magnetic nanoparticles was explored, and their performance was characterized by scanning electron microscopy, vibrating sample magnetometry and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was validated. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL-1, with recoveries ranging from 75.1% to 105.7%. Compared with traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective way to analyze trace cocaine and cocaine metabolites in human urine.
Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing
Procedia PDF Downloads 203
6794 Biomechanical Study of a Type II Superior Labral Anterior to Posterior Lesion in the Glenohumeral Joint Using Finite Element Analysis
Authors: Javier A. Maldonado E., Duvert A. Puentes T., Diego F. Villegas B.
Abstract:
The SLAP lesion (superior labral anterior to posterior) involves the labrum, causing pain and mobility problems in the glenohumeral joint. This injury is common in athletes practicing sports that require throwing or those who receive traumatic impacts to the shoulder area. This paper determines the biomechanical behavior of the soft tissues of the glenohumeral joint when a type II SLAP lesion is present. This pathology is characterized by a tear in the superior labrum, which is simulated in a 3D model of the shoulder joint. A 3D model of the glenohumeral joint was obtained using the free software Slice. Then, a finite element analysis was performed using general-purpose software to simulate a compression test with external rotation. First, the model was validated by assuming a healthy shoulder joint and comparing it with a previous study. Once the initial model was validated, a lesion of the labrum was built using CAD software and the same test was performed again. The results obtained were the stress and strain distributions of the synovial capsule and the injured labrum. ANOVA was performed for the healthy and injured glenohumeral joints, finding significant differences between them. This study will help orthopedic surgeons understand the biomechanics involved in this type of lesion as well as the other surrounding structures affected by loading the injured joint.
Keywords: biomechanics, computational model, finite elements, glenohumeral joint, superior labral anterior to posterior lesion
Procedia PDF Downloads 206
6793 Realistic Modeling of the Preclinical Small Animal Using Commercial Software
Authors: Su Chul Han, Seungwoo Park
Abstract:
With the increasing incidence of cancer, radiotherapy technologies and modalities have advanced, and the importance of preclinical models in cancer research is increasing. Furthermore, small animal dosimetry is an essential part of evaluating the relationship between the absorbed dose in a preclinical small animal and the biological effect in a preclinical study. In this study, we carried out realistic modeling of a preclinical small animal phantom that makes it possible to verify the irradiated dose using commercial software. The small animal phantom was modeled from the 4D digital mouse whole-body (Moby) phantom. To manipulate the Moby phantom in commercial software (Mimics, Materialise, Leuven, Belgium), we converted it to DICOM CT image files using Matlab; the two-dimensional CT images were then converted to a three-dimensional image, which can be segmented and cropped in sagittal, coronal and axial views. The CT images of the small animal were modeled using the following process. Based on the profile line values, thresholding was carried out to make a mask connecting all regions within the same threshold range. Using the thresholding method, the phantom was segmented into three parts (bone, body tissue and lung); to separate neighboring pixels between lung and body tissue, we used the region-growing function of the Mimics software. A 3D object was acquired by 3D calculation on the segmented images. The generated 3D object was smoothed by a remeshing operation with a smoothing factor of 0.4 and 5 iterations. Edge mode was selected to perform triangle reduction, with parameters of 0.1 mm tolerance, a 15-degree edge angle and 5 iterations. The processed 3D object was converted to an STL file for output on a 3D printer. The 3D small animal file was then modified using 3-matic Research (Materialise, Leuven, Belgium) to make space for radiation dosimetry chips. The resulting realistic small animal phantom had a width of 2.631 cm, a thickness of 2.361 cm, and a length of 10.817 cm. The Mimics software provided efficient 3D object generation and easy conversion to STL files. The development of a small preclinical animal phantom will increase the reliability of absorbed dose verification in small animals for preclinical studies.
Keywords: mimics, preclinical small animal, segmentation, 3D printer
Procedia PDF Downloads 365
6792 Ambivalence, Denial, and Adaptive Responses to Vulnerable Suspects in Police Custody: The New Limits of the Sovereign State
Authors: Faye Cosgrove, Donna Peacock
Abstract:
This paper examines current state strategies for dealing with vulnerable people in police custody and identifies the underpinning discourses and practices which inform these strategies. It has previously been argued that the state has utilised contradictory and conflicting responses to the control of crime, employing opposing strategies of denial and adaptation in order simultaneously to display sovereignty and disclaim responsibility. This paper argues that these contradictory strategies are still being employed in contemporary criminal justice, although the focus and the purpose have now shifted. The focus is upon the ‘vulnerable’ suspect, whose social identity is as incongruous, complex and contradictory as his social environment, and the purpose is to redirect attention away from negative state practices, whilst simultaneously displaying a compassionate and benevolent countenance in order to appeal to the voting public. The findings presented here result from intensive qualitative research with police officers, with health care professionals, and with civilian volunteers who work within police custodial environments. The data have been gathered over a three-year period and include observational and interview data which have been thematically analysed to expose the underpinning mechanisms from which the properties of the system emerge. What is revealed is evidence of contemporary state practices of denial relating to the harms of austerity and the structural relations of vulnerability, alongside adaptation through processes of ‘othering’ of the vulnerable, ‘responsibilisation’ of citizens, defining deviance down through diversionary practices, and managing success through redefining the aims of the system. The ‘vulnerable’ suspect is subject to individual pathologising, and yet the nature of risk is aggregated. ‘Vulnerable’ suspects are supported in police custody by private citizens, by multi-agency partnerships, and by for-profit organisations, while the state seeks to collate and control services and thereby retain a veneer of control. Late modern ambivalence to crime control and the associated contradictory practices of abjuration and adjustment have extended to state responses to vulnerable suspects. The support available in the custody environment operates to control and minimise operational and procedural risk rather than for the welfare of the detained person, and in fact it is found to be detrimental to the very people that it claims to benefit. The ‘vulnerable’ suspect is now subject to the bifurcated logics employed at the new limits of the sovereign state.
Keywords: custody, policing, sovereign state, vulnerability
Procedia PDF Downloads 168
6791 Information Technology in Assessing Risks and Threats in the Transition of the Brand to the Digital Environment
Authors: Spanova Yerkezhan, Amantay Ayan, Alimzhanova Laura
Abstract:
This article discusses the concept of rebranding and its relationship to cybersecurity. Rebranding is the process of changing the appearance and image of a company or organization in order to appeal to new customers or change the perception of the company. It can be a powerful tool for businesses looking to renew their reputation or expand into new markets. However, because companies in today's digital age increasingly rely on technology and the internet to conduct business, rebranding can also present significant cybersecurity risks: a rebranding effort can create new vulnerabilities for a company, particularly in terms of its online presence. This article explores the potential hazards associated with rebranding and provides recommendations for mitigating those risks. It also highlights the importance of considering cybersecurity in the rebranding process and how it can be integrated into the overall strategy for a successful and secure rebranding.
Keywords: rebranding, cybersecurity, cyberattack, logo, vulnerability
Procedia PDF Downloads 165