Search results for: automated guided vehicle

2082 Optical Flow Based System for Cross Traffic Alert

Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna

Abstract:

This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the vehicle's driving path from the left or right side. The camera may be mounted not only on a stationary vehicle, e.g., stopped at a traffic light or an intersection, but also on a slowly moving one, e.g., in a car park. In all of the aforementioned conditions, a driver's short loss of concentration or distraction can easily lead to a serious accident. The proposed system provides valuable support in avoiding these kinds of car crashes. It is an extension of our previous work on a clustering system, which only works with fixed cameras. Only a vanishing-point calculation and simple optical-flow filtering, which eliminates motion vectors caused by the host car's own movement, are performed, letting the system achieve high performance across different scenarios, cameras, and resolutions. The proposed system uses only the optical flow as input, which is implemented in hardware on the proposed platform; since the processing of the whole system is fast and low in power consumption, it is embedded directly in the camera framework, allowing all the processing to execute in real time.
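
For illustration, the ego-motion filtering step the abstract describes can be sketched as follows. This is a minimal sketch, not the authors' implementation: it assumes the optical flow field has already been computed (in hardware, as stated above), and the function name, angle threshold, and toy data are ours.

```python
import numpy as np

def cross_traffic_candidates(points, flows, vanish_pt, angle_tol_deg=30.0):
    """Illustrative filter: keep motion vectors unlikely to be caused by
    ego-motion. Under forward ego-motion, background flow radiates away
    from the vanishing point; vectors that deviate strongly from that
    radial direction are candidate cross-traffic movers."""
    radial = points - vanish_pt                      # expected ego-motion direction
    radial /= np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9
    unit_flow = flows / (np.linalg.norm(flows, axis=1, keepdims=True) + 1e-9)
    cos_sim = np.sum(radial * unit_flow, axis=1)     # alignment with radial field
    angle = np.degrees(np.arccos(np.clip(cos_sim, -1.0, 1.0)))
    return angle > angle_tol_deg                     # True -> likely cross traffic

# Toy usage: one radial (ego-consistent) vector, one lateral (crossing) vector.
pts = np.array([[400.0, 300.0], [100.0, 300.0]])
flo = np.array([[8.0, 6.0], [12.0, 0.5]])
print(cross_traffic_candidates(pts, flo, vanish_pt=np.array([320.0, 240.0])))
```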

Keywords: clustering, cross traffic alert, optical flow, real time, vanishing point

Procedia PDF Downloads 193
2081 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution

Authors: Sanelisiwe Ndlovu

Abstract:

Cities have become testbeds for automation and experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences, from health and education to personal relations. For instance, in healthcare, social robots are being implemented as tools to assist patients. Similarly, in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can now ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There has been a wealth of literature on the topic of personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. However, in today's technologically advanced world, a risk is that machines become more capable of accomplishing tasks that humans would otherwise do. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community. However, in the Smart City, human closeness is gradually diminishing. For instance, humans already engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as the Smart City. Accordingly, the significance of this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and importance of relationality in handling Smart Cities and AI ethically. The proposed approach can thus be seen as the sub-Saharan African contribution to personhood and the growing AI debates, one which takes the reality of the interconnectedness of society seriously. It will also open up new opportunities to tackle old problems and use existing resources to confront new problems in the Fourth Industrial Revolution.

Keywords: smart city, artificial intelligence, personhood, community

Procedia PDF Downloads 199
2080 Obstacle Classification Method Based on 2D LIDAR Database

Authors: Moohyun Lee, Soojung Hur, Yongwan Park

Abstract:

This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type by establishing a LIDAR-based obstacle classification database. Existing LIDAR systems have advantages in accuracy and recognition time when detecting obstacles for autonomous vehicles. However, it was difficult to determine the type of obstacle, so accurate path planning based on obstacle type was not possible. To overcome this problem, a method of classifying obstacle type based on existing LIDAR using the width of the obstacle material was previously proposed. However, width measurement alone was not sufficient to improve accuracy. In this research, the width data are used for a first-stage classification; a database of LIDAR intensity data for the four major obstacle materials found on the road is created; the intensity data of an actual obstacle are compared against this database; and the obstacle type is determined by finding the material with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data quality was lower than that of 3D LIDAR, it was still possible to classify obstacle materials using 2D LIDAR.
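
A minimal sketch of the two-stage idea (width gate, then intensity similarity against a material database) might look as follows; the material names, width ranges, reference profiles, and distance measure are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical database: plausible width ranges (m) and averaged intensity
# profiles per material; all values are invented for illustration.
MATERIAL_DB = {
    "metal":    ((0.2, 3.0), np.array([0.90, 0.85, 0.80, 0.70])),
    "concrete": ((0.5, 4.0), np.array([0.55, 0.50, 0.45, 0.40])),
    "plastic":  ((0.1, 2.0), np.array([0.35, 0.30, 0.28, 0.25])),
    "wood":     ((0.1, 2.5), np.array([0.45, 0.42, 0.38, 0.33])),
}

def classify(width_m, intensity_profile):
    """First stage: keep materials whose width range admits the obstacle.
    Second stage: among those, pick the database profile closest to the
    measured intensity profile (Euclidean distance as similarity)."""
    candidates = {m: ref for m, ((wmin, wmax), ref) in MATERIAL_DB.items()
                  if wmin <= width_m <= wmax}
    return min(candidates,
               key=lambda m: np.linalg.norm(intensity_profile - candidates[m]))

print(classify(1.2, np.array([0.52, 0.49, 0.44, 0.41])))  # -> concrete
```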

Keywords: obstacle, classification, database, LIDAR, segmentation, intensity

Procedia PDF Downloads 340
2079 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores

Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan

Abstract:

Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a specific feature vector of different dimensions from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, an algorithm generally reduces the image details by pooling. This operation overlooks details on which forensic experts rely heavily. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of face images collected in natural conditions with the known frontal ID photos of the same individuals. Downscaling and manual handling were performed on the testing images. The results showed that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled at distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of objective methods of facial comparison and provides a novel method for human-machine collaboration in this field.
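
The score-level fusion described above can be sketched in a few lines: fit genuine and impostor score distributions for each source, convert a pair of scores into log-likelihood ratios, and sum them. The Gaussian score model, the synthetic calibration scores, and the threshold are our assumptions, not the paper's data.

```python
import numpy as np
from scipy.stats import norm

def fit_loglr(genuine, impostor):
    """Model genuine and impostor score distributions as Gaussians and
    return a function mapping a score to its log-likelihood ratio."""
    g_mu, g_sd = np.mean(genuine), np.std(genuine)
    i_mu, i_sd = np.mean(impostor), np.std(impostor)
    return lambda s: norm.logpdf(s, g_mu, g_sd) - norm.logpdf(s, i_mu, i_sd)

# Hypothetical calibration scores for the algorithm and the human examiner.
rng = np.random.default_rng(0)
alg_llr = fit_loglr(rng.normal(0.8, 0.10, 500), rng.normal(0.3, 0.10, 500))
exp_llr = fit_loglr(rng.normal(0.7, 0.15, 500), rng.normal(0.4, 0.15, 500))

def fused_decision(alg_score, examiner_score, threshold=0.0):
    # Score-level fusion: sum the log-LRs (independence assumption);
    # accept when the fused evidence exceeds the operating threshold.
    fused = alg_llr(alg_score) + exp_llr(examiner_score)
    return fused, fused > threshold

print(fused_decision(0.75, 0.65))
```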

Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics

Procedia PDF Downloads 125
2078 Artificial Neural Network Based Approach for Estimation of Individual Vehicle Speed under Mixed Traffic Condition

Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh

Abstract:

Developing a speed model is a challenging task, particularly under mixed traffic conditions, where the traffic composition plays a significant role in determining vehicular speed. The present research models individual vehicular speed in the context of mixed traffic on an urban arterial. Traffic speed and volume data were collected from three midblock arterial road sections in New Delhi. Using the field data, a volume-based speed prediction model was developed adopting the methodology of Artificial Neural Networks (ANN). The model developed in this work is capable of estimating speed for each individual vehicle category. Validation results show a great deal of agreement between the observed speeds and the values predicted by the developed model. It was also observed that the ANN-based model outperforms other existing models in terms of accuracy. Finally, a sensitivity analysis was performed using the model to examine the effects of traffic volume and its composition on individual speeds.
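
As an illustration of a volume-based ANN speed model, the sketch below trains a small multilayer perceptron to map per-category volumes to a speed. The feature set, synthetic data, and network size are placeholders, not the authors' field data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: per-category traffic volumes (veh/h) for
# cars, two-wheelers, and buses, with the observed speed of one class.
rng = np.random.default_rng(1)
X = rng.uniform([200, 100, 10], [2000, 1500, 120], size=(300, 3))
# Synthetic target: speed drops with weighted total volume (illustration only).
y = 55 - 0.01 * X @ np.array([1.0, 0.6, 2.5]) + rng.normal(0, 2, 300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                                   random_state=1))
model.fit(X, y)
print(model.predict([[1200, 800, 60]]))  # predicted speed, km/h
```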

Keywords: speed model, artificial neural network, arterial, mixed traffic

Procedia PDF Downloads 382
2077 Optimization Approach to Integrated Production-Inventory-Routing Problem for Oxygen Supply Chains

Authors: Yena Lee, Vassilis M. Charitopoulos, Karthik Thyagarajan, Ian Morris, Jose M. Pinto, Lazaros G. Papageorgiou

Abstract:

With globalisation, the need for better coordination of production and distribution decisions has become increasingly important for industrial gas companies in order to remain competitive in the marketplace. In this work, we investigate a problem that integrates production, inventory, and routing decisions in a liquid oxygen supply chain. The oxygen supply chain consists of production facilities, external third-party suppliers, and multiple customers, including hospitals and industrial customers. The product produced by the plants or sourced from the competitors, i.e., third-party suppliers, is distributed by a fleet of heterogeneous vehicles to satisfy customer demands. The objective is to minimise the total operating cost, comprising production, third-party, and transportation costs. The key production decisions include production and inventory levels and the product amount from third-party suppliers, while the distribution decisions involve customer allocation, delivery timing, delivery amount, and vehicle routing. The optimisation of the coordinated production, inventory, and routing decisions is a challenging problem, especially for large instances. Thus, we present a two-stage procedure to solve the integrated problem efficiently. First, the problem is formulated as a mixed-integer linear programming (MILP) model by simplifying the routing component. The solution of the first-stage MILP model yields the optimal customer allocation, production and inventory levels, and delivery timing and amount. Then, we fix these decisions and solve the detailed routing problem. In the second stage, we propose a column generation scheme to address the computational complexity of the resulting detailed routing problem. A case study considering a real-life oxygen supply chain in the UK is presented to illustrate the capability of the proposed models and solution method. Furthermore, the solutions from the proposed approach are compared with the corresponding solutions provided by existing metaheuristic techniques (e.g., guided local search and tabu search algorithms) to evaluate its efficiency.
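
The first-stage model can be illustrated with a toy MILP in which routing is collapsed into a per-unit delivery cost, as described above. This sketch uses PuLP; the plants, capacities, costs, and demands are invented, and the real model would add inventory balances over time and delivery-timing variables.

```python
import pulp

# Toy first-stage model: two plants, one third-party source, three customers.
plants, customers = ["P1", "P2"], ["C1", "C2", "C3"]
cap = {"P1": 120, "P2": 90}
prod_cost = {"P1": 2.0, "P2": 2.4}
third_party_cost = 3.5
demand = {"C1": 60, "C2": 50, "C3": 70}
deliver_cost = {("P1", "C1"): .5, ("P1", "C2"): .8, ("P1", "C3"): 1.1,
                ("P2", "C1"): .9, ("P2", "C2"): .4, ("P2", "C3"): .6}

m = pulp.LpProblem("oxygen_first_stage", pulp.LpMinimize)
x = pulp.LpVariable.dicts("ship", (plants, customers), lowBound=0)
t = pulp.LpVariable.dicts("third_party", customers, lowBound=0)

# Objective: production + simplified delivery cost + third-party sourcing.
m += pulp.lpSum((prod_cost[p] + deliver_cost[p, c]) * x[p][c]
                for p in plants for c in customers) \
     + pulp.lpSum(third_party_cost * t[c] for c in customers)
for p in plants:                      # plant capacity
    m += pulp.lpSum(x[p][c] for c in customers) <= cap[p]
for c in customers:                   # demand satisfaction
    m += pulp.lpSum(x[p][c] for p in plants) + t[c] == demand[c]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(m.objective))
```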

Keywords: production planning, inventory routing, column generation, mixed-integer linear programming

Procedia PDF Downloads 108
2076 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods

Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino

Abstract:

In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites. Simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) PM2.5 and PM1 quartz filter collection and analysis by gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the Ecochem PAS 2000 revealed concentrations not significantly different from those determined on the filter on low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, Ecochem PAS 2000 real-time concentrations were steadily higher than those on the filter. In the same way, real-time black carbon values were constantly lower than the EC concentrations obtained by the Sunset EC/OC at the inner site, while outside the plant real-time values were comparable to the Sunset EC values. The results showed that, in a coke plant, real-time analyzers of PAHs and black carbon in the factory configuration provide only qualitative information, lacking accuracy and tending to underestimate concentrations. A site-specific calibration is needed for these instruments before their installation in highly polluted sites.
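
The site-specific calibration the authors call for could, in its simplest form, be a linear regression of real-time readings against filter-based reference values; the paired values below are invented placeholders.

```python
import numpy as np

# Hypothetical paired data: real-time analyzer readings vs. reference
# filter/GC-MS concentrations collected at the same site (ng/m^3).
analyzer = np.array([2.1, 4.0, 6.2, 9.5, 14.8])
reference = np.array([3.0, 6.1, 9.8, 15.2, 24.0])

# Least-squares line mapping analyzer output to reference concentration.
slope, intercept = np.polyfit(analyzer, reference, 1)
print(f"calibrated = {slope:.2f} * reading + {intercept:.2f}")
print("corrected value for a reading of 8.0:", slope * 8.0 + intercept)
```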

Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer

Procedia PDF Downloads 340
2075 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors

Authors: Susana Aragoneses Garrido

Abstract:

Every day, roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that human factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (involving aspects external or internal to the vehicle), which is considered a human factor, is a serious and emerging risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions (including better vehicle design) which might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation between different paths and vehicles is presented, and the causes related to HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the Siemens PLM Jack tool is used with the intention of evaluating the human factor causes and supporting the redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations, and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents. Moreover, the evaluation of the different car models using the RULA method and the Siemens PLM Jack tool confirms the importance of a good adjustment of the driver's seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed to allow the driver to acquire the optimum position and consequently reduce the human factors in road accidents.
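
As a pointer to how the FMEA step can rank human-factor causes, the sketch below computes the standard FMEA risk priority number (RPN = severity × occurrence × detection) for a few distraction-related failure modes; the modes and their 1-10 scores are invented for illustration.

```python
# Illustrative FMEA ranking of distraction-related human factors; the
# severity/occurrence/detection scores (1-10 scales) are invented.
failure_modes = [
    ("mobile phone use",     9, 7, 4),
    ("poor seat adjustment", 6, 8, 5),
    ("infotainment glare",   5, 6, 6),
]

for name, severity, occurrence, detection in failure_modes:
    rpn = severity * occurrence * detection  # standard FMEA risk priority number
    print(f"{name:22s} RPN = {rpn}")
```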

Keywords: vehicle analysis, assessment, ergonomics, car redesign

Procedia PDF Downloads 330
2074 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products/processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at the unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. PI can be achieved at different levels: the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous process alternatives based on phenomena and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. E.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties/drawbacks of the current process or can enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense and, hence, screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e. it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function, and combining these options across the functions in the process leads to the generation of a superstructure of process options. These process options, each formed by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to get the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical/biochemical process because of its generic nature.
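
The combination-and-screening step might be sketched as follows; the phenomena names and the single screening rule (taken from the example in the text: phase change requires energy transfer) are simplified stand-ins for the paper's full rule set.

```python
from itertools import combinations

# Hypothetical phenomena list; names are ours, not the paper's.
phenomena = ["reaction", "vapour_liquid_equilibrium",
             "liquid_liquid_equilibrium", "phase_change", "energy_transfer"]

def feasible(combo):
    """Screening rule from the text: phase change phenomena require the
    co-presence of energy transfer. Further rules would be added here."""
    return "energy_transfer" in combo or "phase_change" not in combo

options = [c for r in range(1, len(phenomena) + 1)
           for c in combinations(phenomena, r) if feasible(c)]
print(len(options), "feasible combinations out of", 2 ** len(phenomena) - 1)

# Binary encoding passed to the model-generation step (1 = phenomenon active).
encode = lambda c: [int(p in c) for p in phenomena]
print(encode(options[0]))
```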

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 226
2073 Formal Asymptotic Stability Guarantees, Analysis, and Evaluation of Nonlinear Controlled Unmanned Aerial Vehicle for Trajectory Tracking

Authors: Soheib Fergani

Abstract:

This paper concerns the formal asymptotic stability guarantees, analysis, and evaluation of a nonlinear controlled unmanned aerial vehicle (UAV) for trajectory tracking purposes. As the system is an under-actuated nonlinear system, the control strategy has been oriented towards hierarchical control. The dynamics of the system and the mission purpose make it mandatory to provide an absolute proof of the vehicle's stability during the maneuvers. To this end, this work establishes the complete theoretical proof, never provided in previous works, for an implementable control-oriented strategy that asymptotically stabilizes the system (GAS and LISS). The considered model is reorganized into two partly decoupled sub-systems, and the control strategy is presented in two stages: the first sub-system is controlled by a nonlinear backstepping controller that generates the desired control inputs to stabilize the second sub-system. The methodology is then applied on a hardware-in-the-loop UAV simulator (SiMoDrones), which reproduces the realistic behaviour of the UAV in an indoor environment, to show the efficiency of the proposed strategy.
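
For a flavour of the backstepping stage, the sketch below stabilizes a single double-integrator channel (e.g., altitude) with the textbook two-step backstepping law; it is not the paper's full under-actuated UAV model, and the gains and reference are arbitrary.

```python
# Minimal backstepping sketch for one double-integrator channel
# (x1' = x2, x2' = u), not the full UAV dynamics of the paper.
k1, k2 = 2.0, 3.0                        # positive design gains
dt, T = 0.01, 6.0
x1, x2 = 0.0, 0.0                        # position, velocity
xd, xd_dot, xd_ddot = 1.0, 0.0, 0.0      # step reference

for _ in range(int(T / dt)):
    z1 = x1 - xd                         # tracking error
    alpha = xd_dot - k1 * z1             # virtual control for x2
    z2 = x2 - alpha                      # error of the second subsystem
    alpha_dot = xd_ddot - k1 * (x2 - xd_dot)
    u = alpha_dot - z1 - k2 * z2         # Lyapunov-based control law:
    x2 += u * dt                         # V = (z1^2 + z2^2)/2 gives
    x1 += x2 * dt                        # V' = -k1*z1^2 - k2*z2^2 < 0

print(f"final position {x1:.3f} (target {xd})")
```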

Keywords: UAV application, trajectory tracking, backstepping, sliding mode control, input to state stability, stability evaluation

Procedia PDF Downloads 59
2072 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 – 100 nm. This is a critical advantage, as growing larger crystals, as required by X-ray crystallography methods, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based fiber-coupled camera brings additional challenges. This is because of the inherent noise introduced during the electron-to-photon conversion in the scintillator and the transfer of light via the fibers to the sensor, which results in a poor signal-to-noise ratio and requires relatively high, often specimen-damaging electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron counting detector optimized for low-dose applications (like structural biology cryo-EM), and Stela is also a counting electron detector but optimized for diffraction applications with high speed and high dynamic range. Lastly, data collection workflows, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, can be another challenge of the 3DED technique. Traditionally, this has all been done manually or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher quality 3DED data enable structure determination with higher confidence, while automated workflows allow these to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 to better than 1 Angstrom) for protein and small molecule structure determination. We will also show how Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, DigitalMicrograph, proteins, small molecules

Procedia PDF Downloads 100
2071 Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process

Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani

Abstract:

Road safety analysis through accident rates, one of the most widely used tools, has traditionally relied on the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, due to some fundamental flaws in its theory, difficulties in gaining access to the required data (such as traffic volume and the distance and duration of trips), and various problems in determining exposure for specific time, place, and individual categories, there is a need for an algorithm for prioritizing road safety that, with a new exposure method, resolves the problems of previous approaches. An efficient implementation may lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and individual categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accidents database was created for these provinces, the validity of the quasi-induced exposure method for Iran's accidents database was explored, and the involvement ratio for different characteristics of drivers and vehicles was measured. Results showed that the quasi-induced exposure method was valid for determining the real exposure in the provinces under study. Results also showed a significant difference between the prioritizations based on the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining exposure, the opinions of experts, and the quantity of accident data. Overall, the results of this research showed that prioritization based on the new approach is more comprehensive and reliable compared to the traditional approach, which depends on various parameters including driver-vehicle characteristics.
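
The AHP step can be illustrated by extracting priority weights from a pairwise comparison matrix via its principal eigenvector and checking consistency; the matrix below is an invented 3-criterion example, not the study's expert judgments.

```python
import numpy as np

# Illustrative 3-criterion pairwise comparison matrix (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # AHP priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n, RI = A.shape[0], 0.58               # Saaty's random index for n = 3
CI = (eigvals[k].real - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(CI / RI, 3))  # CR < 0.1 -> ok
```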

Keywords: road safety, prioritization, quasi-induced exposure, analytical hierarchy process

Procedia PDF Downloads 331
2070 AI-based Radio Resource and Transmission Opportunity Allocation for 5G-V2X HetNets: NR and NR-U Networks

Authors: Farshad Zeinali, Sajedeh Norouzi, Nader Mokari, Eduard Jorswieck

Abstract:

Meeting the capacity demands of fifth-generation (5G) vehicle-to-everything (V2X) networks poses significant challenges. To address this challenge, this paper utilizes New Radio (NR) and New Radio Unlicensed (NR-U) networks to develop a heterogeneous vehicular network (HetNet). We propose a new framework, named joint BS assignment and resource allocation (JBSRA), for mobile V2X users, and also consider coexistence schemes based on a flexible duty cycle (DC) mechanism for unlicensed bands. Our objective is to maximize the average throughput of vehicles while guaranteeing the throughput of WiFi users. In simulations based on deep reinforcement learning (DRL) algorithms, such as deep deterministic policy gradient (DDPG) and deep Q network (DQN), our proposed framework outperforms existing solutions that rely on a fixed DC or schemes that do not consider unlicensed bands.
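
A minimal sketch of the DQN ingredient is shown below: one temporal-difference update for a Q-network whose discrete actions could represent BS/resource choices. The state and action dimensions, network size, and random batch are placeholders; the paper's actual state design and the DDPG branch are not reproduced.

```python
import torch
import torch.nn as nn

# Placeholder dimensions: state = channel/queue observations, actions =
# discrete (BS, resource block) choices.
n_state, n_action = 8, 4
q_net = nn.Sequential(nn.Linear(n_state, 64), nn.ReLU(), nn.Linear(64, n_action))
target_net = nn.Sequential(nn.Linear(n_state, 64), nn.ReLU(), nn.Linear(64, n_action))
target_net.load_state_dict(q_net.state_dict())
opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma = 0.99

def dqn_step(s, a, r, s_next, done):
    """One temporal-difference update on a batch of transitions."""
    q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)        # Q(s, a)
    with torch.no_grad():                                    # bootstrapped target
        target = r + gamma * (1 - done) * target_net(s_next).max(1).values
    loss = nn.functional.mse_loss(q, target)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Random batch just to show the call signature.
B = 32
print(dqn_step(torch.randn(B, n_state), torch.randint(0, n_action, (B,)),
               torch.randn(B), torch.randn(B, n_state), torch.zeros(B)))
```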

Keywords: vehicle-to-everything (V2X), resource allocation, BS assignment, new radio (NR), new radio unlicensed (NR-U), coexistence NR-U and WiFi, deep deterministic policy gradient (DDPG), deep Q-network (DQN), joint BS assignment and resource allocation (JBSRA), duty cycle mechanism

Procedia PDF Downloads 96
2069 Eosinopenia: Marker for Early Diagnosis of Enteric Fever

Authors: Swati Kapoor, Rajeev Upreti, Monica Mahajan, Abhaya Indrayan, Dinesh Srivastava

Abstract:

Enteric fever is caused by the gram-negative bacilli Salmonella typhi and S. paratyphi. It is associated with high morbidity and mortality worldwide. Timely initiation of treatment is a crucial step for the prevention of complications. Cultures of body fluids are diagnostic but not always conclusive or practically feasible in most centers; moreover, culture results delay treatment initiation. Serological tests lack diagnostic value. Blood counts can offer a promising option for diagnosis. A retrospective study to assess the relevance of leucopenia and eosinopenia was conducted on 203 culture-proven enteric fever patients and 159 culture-proven non-enteric fever patients in a tertiary care hospital in New Delhi. Patient details were retrieved from the electronic medical records section of the hospital. Absolute eosinopenia was defined as an absolute eosinophil count (AEC) of less than 40/mm³ (normal range: 40-400/mm³) using an LH-750 Beckman Coulter automated machine. Leucopenia was defined as a total leucocyte count (TLC) of less than 4 × 10⁹/l. Blood cultures were done using the BacT/ALERT FA Plus automated blood culture system before the first antibiotic dose was given. Case and control groups were compared using Pearson's chi-square test. It was observed that an absolute eosinophil count (AEC) of 0-19/mm³ was a significant finding (p < 0.001) in enteric fever patients, whereas leucopenia was not (p = 0.096). Using Receiver Operating Characteristic (ROC) curves, it was observed that patients with both AEC < 14/mm³ and TLC < 8 × 10⁹/l had a 95.6% chance of being diagnosed with enteric fever and only a 4.4% chance of being diagnosed as non-enteric fever. This result was highly significant, with p < 0.001. This very useful association of AEC and TLC found in the enteric fever patients of this study can be used for early initiation of treatment in clinically suspected enteric fever patients.
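
The reported cut-offs translate directly into a simple screening rule, sketched below for illustration only; the thresholds are those reported for this cohort and do not constitute a validated clinical tool.

```python
def enteric_fever_flag(aec_per_mm3, tlc_per_l):
    """Illustrative screening rule from the reported cut-offs: AEC < 14/mm^3
    together with TLC < 8e9/l gave a 95.6% chance of enteric fever in this
    study cohort. Thresholds apply to this population only."""
    return aec_per_mm3 < 14 and tlc_per_l < 8e9

print(enteric_fever_flag(aec_per_mm3=10, tlc_per_l=6.5e9))  # True -> suspect
print(enteric_fever_flag(aec_per_mm3=80, tlc_per_l=9.2e9))  # False
```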

Keywords: absolute eosinopenia, absolute eosinophil count, enteric fever, leucopenia, total leucocyte count

Procedia PDF Downloads 169
2068 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with the global annual prevalence of cocaine use ranging from 0.3% to 0.4% of the adult population aged 15-64 years. The growing trend of cocaine abuse and related drug crime is a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present in urine at high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible, because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination. Especially in cases of group drug taking, the pretreatment step becomes tedious and time-consuming. Developing a sensitive, rapid, and high-throughput method for the detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists, and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which gives them the ability to specifically adsorb COCs. This kind of magnetic particle facilitates the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, handling 32 samples in one batch within 40 minutes. Optimization of the preparation procedure for the magnetic nanoparticles was explored, and the performance of the nanoparticles was characterized by scanning electron microscopy, vibrating sample magnetometry, and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was verified. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective approach for trace analysis of cocaine and its metabolites in human urine.

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 197
2067 Geospatial Network Analysis Using Particle Swarm Optimization

Authors: Varun Singh, Mainak Bandyopadhyay, Maharana Pratap Singh

Abstract:

The shortest path (SP) problem concerns finding the shortest path from a specific origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications. Important applications of the SP problem include vehicle routing in transportation systems, particularly in the field of in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem (in transportation planning). Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization, and Particle Swarm Optimization (PSO) have been applied to complex optimization problems to overcome the shortcomings of existing shortest path analysis methods. Various researchers have reported that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Further, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO for solving the shortest path problem between multiple points of interest (POI), based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the analysis results is carried out in GIS.
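
A generic global-best PSO loop is sketched below. For the shortest-path setting, each particle's position would encode node priorities that are decoded into a path whose cost is evaluated by f; here f is a simple test function, and all hyperparameters are illustrative.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO minimising f over R^dim. For shortest paths,
    a particle's position can encode node priorities decoded into a path."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))    # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()        # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val               # update personal bests
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, f(g)

print(pso(lambda p: np.sum(p ** 2), dim=4))  # converges toward the origin
```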

Keywords: particle swarm optimization, GIS, traffic data, outliers

Procedia PDF Downloads 476
2066 Aspects Regarding the Structural Behaviour of Autonomous Underwater Vehicle for Emergency Response

Authors: Lucian Stefanita Grigore, Damian Gorgoteanu, Cristian Molder, Amado Stefan, Daniel Constantin

Abstract:

The purpose of this article is to present an analytical-numerical study of the structural behavior of a submerged autonomous underwater vehicle (AUV) for emergency intervention. The need for such a study was generated by the key objective of the ERL-Emergency project. The project aims to develop a system of collaborative robots for emergency response. The system consists of two robots: a tracked unmanned ground vehicle (UGV) and an AUV. The system of collaborative robots, AUV and UGV, will be used to perform missions of monitoring, intervention, and rescue. The main mission of the AUV is to dive into the maritime space of an industrial port to detect possible leaks in a pipeline transporting petroleum products. Another mission is to close and open the valves with which the pipes are provided. Finally, the AUV must be able to lift a manikin to the surface so that it can be taken to land. Numerical analysis was performed by the finite element method (FEM). The conditions for immersing the AUV at 100 m depth were simulated, and the calculations were repeated for different fluid flow rates. From a structural point of view, the stiffening areas and the enclosures housing the command-and-control elements and the accumulators were analyzed in particular. The conclusion of this research is that the AUV meets the established requirements very well.
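
The governing external load at the stated test depth follows from hydrostatics, as in this small sketch; the seawater density and safety factor are our assumptions, not values from the paper.

```python
# Hydrostatic design load at the stated 100 m immersion depth.
rho, g, depth = 1025.0, 9.81, 100.0      # seawater kg/m^3 (assumed), m/s^2, m
p_ext = rho * g * depth                  # external gauge pressure, Pa
print(f"external pressure: {p_ext/1e6:.2f} MPa")   # ~1.01 MPa

safety_factor = 1.5                      # assumed margin for FEM load case
print(f"design pressure:   {safety_factor * p_ext/1e6:.2f} MPa")
```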

Keywords: analytical-numerical, emergency, FEM, robotics, underwater

Procedia PDF Downloads 146
2065 Design and Development of an Innovative MR Damper Based on Intelligent Active Suspension Control of a Malaysia's Model Vehicle

Authors: L. Wei Sheng, M. T. Noor Syazwanee, C. J. Carolyna, M. Amiruddin, M. Pauziah

Abstract:

This paper examines alternatives to the classical passive suspension system, in the form of active suspension, to improve comfort and handling performance. An active magnetorheological (MR) suspension system is proposed to explore an active suspension that enhances performance, given its freedom to independently specify the characteristics of load carrying, handling, and ride quality. A Malaysian quarter-car model with two degrees of freedom (2DOF) is designed and constructed to simulate the actions of an active vehicle suspension system. The structure of a conventional twin-tube shock absorber is modified both internally and externally to work with the active suspension system. The shock absorber's peripheral structure is altered to enable assembly and disassembly of the damper through a non-permanent joint, and the stress analysis of the designed joint is simulated using Finite Element Analysis. Simulation of the internal part, where an electrified copper coil of 24 AWG is wound, is done using Finite Element Method Magnetics to measure the magnetic flux density inside the MR damper. The primary purpose of this approach is to reduce the vibration transmitted from road surface irregularities while maintaining solid manoeuvrability. The aim of this research is to develop an intelligent control system for a consecutive damping automotive suspension system. Ride quality is improved by reducing the vertical body acceleration caused when the car body experiences disturbances from speed bumps and random road roughness. Findings from this research are expected to enhance ride quality, which in turn can prevent the deteriorating effect of vibration on the vehicle condition as well as on the passengers' well-being.

Keywords: active suspension, FEA, magneto rheological damper, Malaysian quarter car model, vibration control

Procedia PDF Downloads 207
2064 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and the modelling of meaning plurality are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, together with new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods of grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus, in which the relevant actors and discourse positions are analysed using conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps.
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
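
As a pointer to the annotation step described above, the sketch below runs named entity recognition and dependency parsing with spaCy on a sentence from the healthcare-digitization domain; the model choice and example sentence are ours, and the project's own co-reference, entity-linking, and sentiment tools are not shown.

```python
# Minimal sketch of the kind of NLP annotation step described above.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The clinic in Hamburg adopted a new electronic patient record "
          "despite data protection concerns raised by the state regulator.")

for ent in doc.ents:                   # entity-centered filtering
    print(ent.text, ent.label_)
for tok in doc:
    if tok.dep_ in ("nsubj", "dobj"):  # core grammatical relations
        print(tok.text, tok.dep_, "->", tok.head.text)
```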

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 221
2063 Response of Pavement under Temperature and Vehicle Coupled Loading

Authors: Yang Zhong, Mei-Jie Xu

Abstract:

To study the dynamic mechanical response of asphalt pavement under coupled temperature and vehicle loading, the asphalt pavement was regarded as a multilayered elastic half-space system, and a theoretical analysis was conducted with the dynamic modulus of the asphalt mixture as the key parameter. First, based on dynamic modulus tests of the asphalt mixture, the functional relationship between the dynamic modulus of a representative asphalt mixture and temperature was obtained. In addition, the analytical solution for the thermal stress in a single layer was derived from the thermal equilibrium equations by applying the Laplace and Hankel integral transformations. The analytical solution of the thermal stress model for asphalt pavement was then derived using the transfer matrix of thermal stress in a multilayer elastic system. Finally, the variation of thermal stress in the pavement structure was analyzed. The results show an obvious difference between the thermal stress based on the dynamic modulus and the solution based on the static modulus. Therefore, the dynamic change of this parameter in the asphalt mixture should be taken into consideration when the theoretical analysis is carried out.
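
For reference, the axisymmetric analysis rests on the Hankel transform pair (standard definition; the paper's full derivation is not reproduced here):

```latex
% Hankel transform pair of order \nu, as used in axisymmetric
% layered-media analysis:
\tilde{f}_\nu(\xi) = \int_0^{\infty} f(r)\, J_\nu(\xi r)\, r\, \mathrm{d}r,
\qquad
f(r) = \int_0^{\infty} \tilde{f}_\nu(\xi)\, J_\nu(\xi r)\, \xi\, \mathrm{d}\xi .
```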

Keywords: asphalt pavement, dynamic modulus, integral transformation, transfer matrix, thermal stress

Procedia PDF Downloads 495
2062 Urban Traffic: Understanding the Traffic Flow Factor Through Fluid Dynamics

Authors: Sathish Kumar Jayaraj

Abstract:

The study of urban traffic dynamics, underpinned by the principles of fluid dynamics, offers a distinct perspective to comprehend and enhance the efficiency of traffic flow within bustling cityscapes. Leveraging the concept of the Traffic Flow Factor (TFF) as an analog to the Reynolds number, this research delves into the intricate interplay between traffic density, velocity, and road category, drawing compelling parallels to fluid dynamics phenomena. By introducing the notion of Vehicle Shearing Resistance (VSR) as an analogy to dynamic viscosity, the study sheds light on the multifaceted influence of traffic regulations, lane management, and road infrastructure on the smoothness and resilience of traffic flow. The TFF equation serves as a comprehensive metric for quantifying traffic dynamics, enabling the identification of congestion hotspots, the optimization of traffic signal timings, and the formulation of data-driven traffic management strategies. The study underscores the critical significance of integrating fluid dynamics principles into the domain of urban traffic management, fostering sustainable transportation practices, and paving the way for a more seamless and resilient urban mobility ecosystem.
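
The abstract does not state the TFF formula explicitly. By direct analogy with the Reynolds number, one plausible form, offered purely as our reading of the analogy, is:

```latex
% Reynolds number and, by analogy, one possible form of the TFF
% (the constant k and the length scale are our assumptions):
\mathrm{Re} = \frac{\rho\, v\, L}{\mu}
\qquad\longrightarrow\qquad
\mathrm{TFF} = \frac{k\,\rho_{\mathrm{traffic}}\,\bar{v}\,L_{\mathrm{road}}}{\mathrm{VSR}}
```

Here ρ_traffic would be the vehicle density, v̄ the mean speed, L_road a road-category length scale, and VSR would play the role of dynamic viscosity, as the analogy in the abstract suggests.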

Keywords: traffic flow factor (TFF), urban traffic dynamics, fluid dynamics principles, vehicle shearing resistance (VSR), traffic congestion management, sustainable urban mobility

Procedia PDF Downloads 55
2061 Innovative Screening Tool Based on Physical Properties of Blood

Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan

Abstract:

This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid mechanics community's understanding that such stain formation depends heavily on physical properties. Moreover, biomedical research shows that different blood stain patterns are robust indicators of the donor's health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aide in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust, and low-cost microfluidic devices, a micro-capillary viscometer and a pendant drop tensiometer, are designed and proposed to be fabricated to measure the viscosity, surface tension, and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for the automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion. Discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and nonlinear classification techniques, for the screening of various diseases in humans and cattle. Experiments are carried out to validate the physical property measurement devices. This framework can be further developed into a real-life portable disease screening and diagnostics tool. Small-scale production of screening and diagnostic devices is proposed to carry out independent tests.
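
The linear SVM stage can be sketched as below, with the three measured properties as features; the class means, spreads, and the link between property values and any condition are fabricated for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in data: [viscosity (mPa.s), surface tension (mN/m),
# contact angle (deg)] for two illustrative classes; values are invented.
rng = np.random.default_rng(2)
class_a = rng.normal([4.5, 55.0, 70.0], [0.5, 3.0, 5.0], size=(50, 3))
class_b = rng.normal([3.2, 60.0, 60.0], [0.5, 3.0, 5.0], size=(50, 3))
X = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
print(clf.predict([[3.3, 59.0, 61.0]]))  # 1 -> flagged for further screening
```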

Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability

Procedia PDF Downloads 372
2060 A Qualitative Study of Approaches Used by Physiotherapists to Educate Patients with Chronic Low Back Pain

Authors: Styliani Soulioti, Helen Fiddler

Abstract:

The aim of this study was to investigate the approaches used by physiotherapists in the education of patients with chronic low back pain (cLBP) and the rationale that underpins their choice of approach. Therapeutic patient education (TPE) is considered an important aspect of modern physiotherapy practice, as it helps patients achieve better self-management and a better understanding of their problem. Previous studies have explored this subject, but the reasoning behind the choices physiotherapists make as educators has not been widely explored, making it difficult to identify areas that could be addressed in order to improve the application of TPE. A qualitative study design, guided by a constructivist epistemology, was used in this research project. Semi-structured interviews were used to collect data from 7 physiotherapists. Inductive coding and thematic analysis were used, allowing key themes to emerge. Data analysis revealed two overarching themes: 1) patient-centred versus therapist-centred educational approaches, and 2) behaviourist versus constructivist educational approaches. Physiotherapists appear to use a patient-centred approach when they explore patients' beliefs about cLBP and treatment expectations. However, treatment planning and goal-setting were guided by a therapist-centred approach, as physiotherapists appear to take on the role of instructor/expert, whereas patients were viewed as students. Using a constructivist approach, physiotherapists aimed to provide guidance to patients by combining their professional knowledge with the patients' individual knowledge, to help the patient better understand their problem, reflect upon it, and find a possible solution. However, educating patients about scientific facts concerning cLBP followed a behaviourist approach, as an instructor/student relationship was observed and the learning content was predetermined and transmitted in a one-way manner. The results of this study suggest a lack of consistency in the educational approaches used by physiotherapists. Although patient-centredness and constructivism appear to be the aims set by physiotherapists in order to optimise the education they provide, a student-teacher relationship appears to dominate when it comes to goal-setting and delivering scientific information.

Keywords: chronic low back pain, educational approaches, health education, patient education

Procedia PDF Downloads 202
2059 Effect of Rainflow Cycle Number on Fatigue Lifetime of an Arm of Vehicle Suspension System

Authors: Hatem Mrad, Mohamed Bouazara, Fouad Erchiqui

Abstract:

Fatigue is considered one of the main causes of degradation of the mechanical properties of mechanical parts. Probability and reliability methods are appropriate for fatigue analysis using the uncertainties that exist in fatigue material or process parameters. The current work studies the effect of the number of counted Rainflow cycles on the fatigue lifetime (cumulative damage) of an upper arm of a vehicle suspension system. The major part of the fatigue damage induced in the suspension arm is caused by two main classes of parameters. The first is related to the material properties, and the second is the road excitation or the applied force due to the number of passengers. Therefore, Young's modulus and road excitation are selected as input parameters for repetitive simulations using a Monte Carlo (MC) algorithm. The Latin hypercube sampling method is used to generate these parameters. A response surface is established from the fatigue lifetime of each combination of input parameters, computed according to the strain-life method. A Python script was developed to automate the finite element simulations of the upper arm according to a design of experiments.
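
The Latin hypercube step might look like the following sketch, which draws space-filling samples of the two uncertain inputs named above; the parameter ranges are our assumptions, and each sampled pair would drive one finite element fatigue run.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of the two uncertain inputs named in the text:
# Young's modulus and road excitation amplitude (ranges are assumed).
sampler = qmc.LatinHypercube(d=2, seed=42)
unit = sampler.random(n=20)                      # 20 points in [0, 1)^2
lo, hi = [190e9, 500.0], [215e9, 3000.0]         # E (Pa), excitation (N)
samples = qmc.scale(unit, lo, hi)

for E, F in samples[:3]:
    # each pair would parameterize one finite element fatigue simulation
    print(f"E = {E/1e9:.1f} GPa, F = {F:.0f} N")
```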

Keywords: fatigue, monte carlo, rainflow cycle, response surface, suspension system

Procedia PDF Downloads 251
2058 Evaluation to Assess the Impact of Newcastle Infant Partnership Approach

Authors: Samantha Burns, Melissa Brown, Judith Rankin

Abstract:

Background: As a specialised intervention, NEWPIP provides a service which supports both parents and their babies from conception to two years where they are experiencing issues which may affect the quality of their relationship and the development of the infant. This evaluation of the NEWPIP approach was undertaken in response to the need for rich, in-depth data to understand the lived experiences of the parents who used the service, in order to improve it. NEWPIP is currently one of 34 specialised parent-infant relationship teams across England. This evaluation contributes to increasing understanding of the impact and effectiveness of this specialised service to inform future practice. Aim: The aim of this evaluation was to explore the perspectives and experiences of parents or caregivers (service users) to assess the impact of the NEWPIP service on the parents themselves and on their relationship with their baby. Methods: The exploratory nature of the aim and the focus on service users' experiences and perspectives provided scope for a qualitative approach. This consisted of 10 semi-structured interviews with parents who had received the service within the last two years. Recruitment involved both purposive and convenience sampling. The interviews took place between February 2021 and March 2021, lasted between 30 and 90 minutes, and were guided by open-ended questions from a topic guide. The interviews adopted a narrative approach to enable the parents to share their lived experiences. The researchers transcribed the interviews and analysed the data thematically using a coding method grounded in the data. Results: The analysis and findings illuminated an approach which supports parents to build a better bond with their baby and provides a safe space for parents to heal through their relationships. While the parents shared their experiences, the interviews were also intended to gather feedback, so questions were asked about what could be improved and what recommendations could be offered to Children North East. Guided by the voice of the parents, this evaluation provides recommendations to support the future of the NEWPIP approach. Conclusions: The NEWPIP approach appears to successfully provide early and flexible support for new parents, increasing a parent's confidence in their ability to not only cope but thrive as a new parent.

Keywords: maternal health, mental health, parent infant relationship, therapy

Procedia PDF Downloads 189
2057 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it remains the most frequently used method in our schools. Although such assessments provide acceptable measurement, they are not capable of measuring all aspects and the richness of learning and knowledge. Also, many assessments used in schools decontextualize assessment from learning and focus on learners' standing on a particular topic rather than on how student learning changes over time. For these reasons, many scholars advocate using simulations and games (S&G) as assessment tools, which have significant potential to overcome the problems of traditionally used methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method to test students' learning at a particular point in time. To investigate the potential of using educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA) that can constantly monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate-level engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, this study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs. Finally, the assessment scores were compared with an external measure (a validated test measuring student learning in digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly on the latent factors in the latent factor model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students' performance in the game. The results of the regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2,96) = 27.42, p < .001). It was found that students' posttest scores significantly predicted game performance (β = .60, p < .001). The statistical results of the analyses show that the AEA can distinctly measure three major components of the digital circuit design course. It is hoped that this study will help researchers understand how to design an AEA and showcase an implementation by providing an example methodology for validating this type of assessment.
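
The convergent-validity check (regressing game performance on the external measures) can be sketched as follows with simulated placeholder data; the coefficients and noise are invented, so only the procedure, not the reported statistics, is reproduced.

```python
import numpy as np
import statsmodels.api as sm

# Simulated placeholder data for 99 students: posttest and pretest scores
# as predictors of in-game performance (relationships are invented).
rng = np.random.default_rng(3)
posttest = rng.normal(70, 10, 99)
pretest = rng.normal(60, 10, 99)
game_perf = 0.6 * posttest + 0.1 * pretest + rng.normal(0, 8, 99)

X = sm.add_constant(np.column_stack([posttest, pretest]))
fit = sm.OLS(game_perf, X).fit()
print(fit.rsquared, fit.pvalues)   # compare with the reported R^2 and p-values
```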

Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design

Procedia PDF Downloads 415
2056 PLO-AIM: Potential-Based Lane Organization in Autonomous Intersection Management

Authors: Berk Ecer, Ebru Akcapinar Sezer

Abstract:

Traditional intersection control schemes, such as unsignalized or signalized intersections, are not the most effective way to pass vehicles through an intersection when the vehicles are intelligent. To this end, Dresner and Stone proposed a new intersection control model called Autonomous Intersection Management (AIM). In their AIM simulation, they examined the problem from a multi-agent perspective and demonstrated that intelligent intersection control can be made more efficient than existing control mechanisms. In this study, autonomous intersection management is investigated further: we extend their work with a potential-based lane organization layer. To distribute vehicles evenly across lanes, this layer triggers vehicles to analyze nearby lanes and change lanes when another lane offers an advantage. This behavior can be observed in real life, where drivers change lanes guided by intuition; the basic intuition is to select a less crowded lane in order to reduce delay. We model this behavior without any change to the AIM workflow. Experimental results show that intersection performance is directly connected to how vehicles are distributed across the lanes of the intersection's approach roads. Handling lane management with a potential-based approach improves performance metrics such as average intersection delay and average travel time; lane management and intersection management are therefore problems that need to be handled together. This study shows that the lane through which vehicles enter the intersection is an effective parameter for intersection management, draws attention to this parameter, and suggests a solution for it. We observed that regulating the inputs to AIM, namely the vehicles in each lane, was as effective as the intersection management itself. The PLO-AIM model outperforms AIM in average intersection delay and average travel time for reasonable traffic rates, between 600 and 1,300 vehicles/hour per lane. The proposed model reduced average travel time by 0.2%–17.3% and average intersection delay by 1.6%–17.1% in 4-lane and 6-lane scenarios.
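
As a rough illustration of the lane-organization idea, the sketch below has a vehicle compare a simple crowding potential across adjacent lanes and switch only when the gain exceeds a threshold. The potential function, threshold, and inputs are assumptions made for illustration; the actual PLO-AIM formulation may differ.

```python
# Hedged sketch of potential-based lane selection: a vehicle evaluates a
# crowding potential for its own and adjacent lanes and switches to a lane
# with a clearly lower potential. All quantities here are illustrative.
from typing import List

def lane_potential(queue_length: int, mean_speed: float) -> float:
    # Longer queues and lower speeds raise the potential (less attractive lane).
    return queue_length / (mean_speed + 1e-6)

def choose_lane(current: int, queues: List[int], speeds: List[float],
                threshold: float = 0.2) -> int:
    """Return the lane index the vehicle should occupy.

    Only adjacent lanes are considered, mirroring real lane changes, and a
    switch happens only if the gain exceeds `threshold`, to avoid oscillation.
    """
    best, best_pot = current, lane_potential(queues[current], speeds[current])
    for cand in (current - 1, current + 1):
        if 0 <= cand < len(queues):
            pot = lane_potential(queues[cand], speeds[cand])
            if best_pot - pot > threshold:
                best, best_pot = cand, pot
    return best

# Example: a vehicle in lane 1 of a 3-lane approach with uneven queues
# moves to the emptier, faster lane 0.
print(choose_lane(current=1, queues=[2, 9, 4], speeds=[10.0, 4.0, 8.0]))
```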

Keywords: AIM project, autonomous intersection management, lane organization, potential-based approach

Procedia PDF Downloads 133
2055 Formal Innovations vs. Informal Innovations: The Case of the Mining Sector in Nigeria

Authors: Jegede Oluseye Oladayo

Abstract:

The study mapped innovation activities in the formal and informal mining sectors in Nigeria. Data were collected from primary and secondary sources: primary data through guided questionnaire administration, guided interviews and personal observation, and secondary data from published sources. A purposive sampling method was adopted to select firms that are micro, small and medium enterprises; the study covered 100 purposively selected companies (50 in the formal sector and 50 in the informal sector) in south-western Nigeria. Data were analysed using descriptive and inferential statistics. Of the four types of technological innovation sampled, organisational innovation was the most prevalent in both the formal (100%) and informal (100%) sectors, followed by process innovation: 60% in the formal sector and 28% in the informal sector. Marketing innovation and diffusion-based innovation were implemented by 64% and 4% of formal-sector firms, respectively. There were no R&D activities (intramural or extramural) in either sector; however, innovation activities occurred at moderate levels in the formal sector, characterised by acquisition of machinery, equipment and hardware (100%), software (56%), training (82%) and acquisition of external knowledge (60%). In the informal sector, innovation activities were characterised by acquisition of external knowledge (100%), training/learning by experience (100%) and acquisition of tools (68%). The impact of innovation on firm performance in the formal sector was expressed mainly as increased production capacity (100%), reduced production cost per unit of labour (88%), compliance with government regulatory requirements (72%) and entry into new markets (60%). In the informal sector, the impact of innovation was expressed mainly in improved flexibility of production (70%) and machinery/energy efficiency (70%). The most important technological driver of process innovation in the mining sector was acquisition of machinery, with a prevalence of 100% in both the formal and informal sectors, followed by training and re-training of technical staff (74% in both sectors). Another factor influencing organisational innovation was the skill of the workforce, with a prevalence of 80% in both sectors. In the formal sector, the educational background of the manager/head of the technical department was an important driver of both organisational innovation (54%) and process innovation (50%). The study concluded that the firms' innovation competence consisted mostly of organisational changes.

Keywords: innovation prevalence, innovation activities, innovation performance, innovation drivers

Procedia PDF Downloads 373
2054 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand

Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones

Abstract:

Since the Indian Ocean tsunami of 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As that disaster showed, preparing for and responding to the delivery of essential items from distribution centres to affected locations is critical to relief operations, because the nature of disasters is uncertain, particularly in the number of sufferers, to which the quantity of supplies is normally proportional. This study therefore proposes a spatial decision support system (SDSS) for humanitarian logistics that integrates Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised in the first stage to acquire demands simulated from a tsunami flooding model of the affected area, and in the last stage to visualise the solutions. The CVRP component designs relief routes for a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimisation problem in which both total travelling distance and total transport resources used are minimised, while the demand-cost efficiency of each route is maximised in order to determine route priority. As the model is an NP-hard combinatorial optimisation problem, the Clarke-Wright savings heuristic is applied to obtain near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to demonstrate the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
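
For readers unfamiliar with the savings heuristic named in the abstract, here is a minimal single-objective sketch of Clarke-Wright route merging for a CVRP with one depot. The coordinates, demands, and capacity are invented, and the study's multi-objective extensions (transport resources, demand-cost route priority) are not modelled.

```python
# Hedged sketch of the Clarke-Wright savings heuristic: a relief centre
# (node 0) serving evacuation points with given demands. Instance data are
# illustrative, not the Phuket case data.
import math

def clarke_wright(dist, demand, capacity):
    """Greedy Clarke-Wright savings merging; node 0 is the depot."""
    n = len(dist)
    routes = {i: [i] for i in range(1, n)}      # one initial route per point
    load = {i: demand[i] for i in range(1, n)}  # load carried on each route
    owner = {i: i for i in range(1, n)}         # route id containing node i

    # Saving from serving i and j on one route instead of two round trips.
    savings = sorted(((dist[0][i] + dist[0][j] - dist[i][j], i, j)
                      for i in range(1, n) for j in range(i + 1, n)),
                     reverse=True)

    for s, i, j in savings:
        ri, rj = owner[i], owner[j]
        if ri == rj or s <= 0 or load[ri] + load[rj] > capacity:
            continue
        a, b = routes[ri], routes[rj]
        # Merge only when i and j sit at joinable ends of their routes.
        if a[-1] == i and b[0] == j:
            keep, drop, merged = ri, rj, a + b
        elif b[-1] == j and a[0] == i:
            keep, drop, merged = rj, ri, b + a
        else:
            continue
        routes[keep], load[keep] = merged, load[ri] + load[rj]
        for node in routes[drop]:
            owner[node] = keep
        del routes[drop], load[drop]
    return [[0] + r + [0] for r in routes.values()]

# Tiny illustrative instance: 4 evacuation points, vehicle capacity 10.
pts = [(0, 0), (1, 2), (2, 2), (4, 1), (4, -1)]   # index 0 = relief centre
dist = [[math.dist(p, q) for q in pts] for p in pts]
print(clarke_wright(dist, demand=[0, 4, 3, 5, 4], capacity=10))
```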

Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem

Procedia PDF Downloads 243
2053 Stochastic Modeling for Parameters of Modified Car-Following Model in Area-Based Traffic Flow

Authors: N. C. Sarkar, A. Bhaskar, Z. Zheng

Abstract:

Driving behavior in area-based (i.e., non-lane-based) traffic is induced by the presence of other individuals in the choice space within the driver's visual perception area. The driving behavior of a subject vehicle is constrained by potential leaders, and the leaders change frequently over time. This paper determines stochastic models for the parameters of a modified intelligent driver model (MIDM) in area-based traffic, as found in developing countries. Parametric and non-parametric distributions are fitted to the MIDM parameters, and the goodness of fit for each parameter is measured in two ways: graphically and statistically. The quantile-quantile (Q-Q) plot provides a graphical comparison between a parameter and a candidate theoretical distribution, while the Kolmogorov-Smirnov (K-S) test provides a statistical measure of a parameter's fit to a theoretical distribution. The distributions are fitted to a set of MIDM parameters estimated from real vehicle trajectory data from India, and each parameter is well represented by its fitted stochastic model. The results support the applicability of the proposed modelling of MIDM parameters in area-based traffic flow simulation.
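
A minimal sketch of the statistical part of this workflow, assuming synthetic data in place of the Indian trajectory estimates: fit a candidate distribution to one MIDM parameter and apply the K-S test against the fitted distribution.

```python
# Hedged sketch: fitting a candidate distribution to one estimated car-following
# parameter and checking goodness of fit with the Kolmogorov-Smirnov test.
# The sample and the lognormal choice are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
param_samples = rng.lognormal(mean=0.3, sigma=0.25, size=200)  # stand-in sample

# Fit a lognormal distribution to the parameter values (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(param_samples, floc=0)

# K-S test against the fitted distribution; note that estimating parameters
# from the same sample makes the nominal p-value optimistic. A Q-Q plot
# (e.g., scipy.stats.probplot) would complement this, as in the paper.
ks_stat, p_value = stats.kstest(param_samples, "lognorm", args=(shape, loc, scale))
print(f"K-S statistic = {ks_stat:.3f}, p = {p_value:.3f}")
# A large p-value means the fitted distribution cannot be rejected.
```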

Keywords: area-based traffic, car-following model, micro-simulation, stochastic modeling

Procedia PDF Downloads 145