Search results for: real time pest tracking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21491

17021 MAGNI Dynamics: A Vision-Based Kinematic and Dynamic Upper-Limb Model for Intelligent Robotic Rehabilitation

Authors: Alexandros Lioulemes, Michail Theofanidis, Varun Kanal, Konstantinos Tsiakas, Maher Abujelala, Chris Collander, William B. Townsend, Angie Boisselle, Fillia Makedon

Abstract:

This paper presents a home-based robot-rehabilitation instrument, called "MAGNI Dynamics", that utilizes a vision-based kinematic/dynamic module and an adaptive haptic feedback controller. The system is expected to provide personalized rehabilitation by adjusting its resistive and supportive behavior according to a fuzzy intelligence controller that acts as an inference system, correlating the user's performance to different stiffness factors. The vision module uses the Kinect's skeletal tracking to monitor the user's effort in an unobtrusive and safe way, by estimating the torque that affects the user's arm. The system's torque estimations are validated by capturing electromyographic data from primitive arm motions (shoulder abduction and shoulder forward flexion). Moreover, we present and analyze how the Barrett WAM generates a force field with a haptic controller to support or challenge the users. Experiments show that shifting the proportional value, which corresponds to different stiffness factors of the haptic path, can potentially help users improve their motor skills. Finally, potential areas for future research are discussed, addressing how a rehabilitation robotic framework may include multisensing data to improve the user's recovery process.
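The adaptive resistive/supportive behavior described above can be pictured as a proportional force field whose stiffness is selected by an inference rule. A minimal illustrative sketch, not the authors' fuzzy controller; the stiffness values and rule thresholds below are invented for illustration:

```python
def haptic_assist_force(k_p, x_desired, x_actual):
    """Proportional 'force field' along a haptic path: a higher stiffness k_p
    pulls the arm more strongly toward the desired trajectory point."""
    return k_p * (x_desired - x_actual)

def fuzzy_stiffness(performance):
    """Toy inference rule mapping a 0..1 performance score to a stiffness
    factor: poor performance -> supportive (stiff), good -> challenging (soft).
    The thresholds and values are hypothetical."""
    if performance < 0.3:
        return 100.0   # strong support
    elif performance < 0.7:
        return 50.0    # moderate support
    return 10.0        # let the user do the work
```

In a real impedance controller the stiffness would be updated continuously from tracked performance, but the support-versus-challenge trade-off works as above.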

Keywords: human-robot interaction, kinect, kinematics, dynamics, haptic control, rehabilitation robotics, artificial intelligence

Procedia PDF Downloads 329
17020 Implementation of a Method of Crater Detection Using Principal Component Analysis in FPGA

Authors: Izuru Nomura, Tatsuya Takino, Yuji Kageyama, Shin Nagata, Hiroyuki Kamata

Abstract:

We propose a method for detecting craters in images of the lunar surface captured by a small space probe. We use principal component analysis (PCA) to detect craters. However, considering the severe environment of space, it is impractical to use a generic computer onboard. Accordingly, we have to implement the method in an FPGA. This paper compares an FPGA and a generic computer in terms of the processing time of the PCA-based crater detection method.
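The detection idea (a software sketch of the concept, not the paper's FPGA implementation) can be summarized as: learn the mean and top principal component of flattened crater patches, then score a candidate patch by its distance from that PCA subspace. A pure-Python sketch using power iteration:

```python
def mean_vec(vectors):
    n, d = len(vectors), len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(d)]

def first_principal_component(vectors, iters=200):
    """Power iteration on the sample covariance to get the top eigenvector."""
    mu = mean_vec(vectors)
    centered = [[v[i] - mu[i] for i in range(len(mu))] for v in vectors]
    w = [1.0] * len(mu)
    for _ in range(iters):
        # implicit covariance-vector product: first X w, then X^T (X w)
        proj = [sum(c[i] * w[i] for i in range(len(w))) for c in centered]
        w = [sum(proj[k] * centered[k][i] for k in range(len(centered)))
             for i in range(len(w))]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return mu, w

def reconstruction_error(patch, mu, w):
    """Distance from the 1-D PCA subspace; a low error marks a crater-like patch."""
    c = [patch[i] - mu[i] for i in range(len(mu))]
    a = sum(c[i] * w[i] for i in range(len(w)))
    return sum((c[i] - a * w[i]) ** 2 for i in range(len(w))) ** 2 ** -1
```

In the FPGA setting the same dot products and norms map naturally onto fixed-point multiply-accumulate pipelines, which is where the processing-time advantage comes from.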

Keywords: crater, PCA, eigenvector, strength value, FPGA, processing time

Procedia PDF Downloads 555
17019 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology

Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey

Abstract:

In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This creates a need for software capable of quickly processing and reliably visualizing diffusion data, equipped with tools for analyzing the data in terms of different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The domain-specific part has been moved to separate class libraries and can be used on various platforms. The user interface uses Windows WPF (Windows Presentation Foundation), a technology for building Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is the declarative markup language XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize, and set properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI), allowing images sorted by series to be loaded and viewed. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues unrelated to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has also been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary volume of the brain, the diffusion tensor is geometrically interpreted as an ellipsoid, which is an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in one algorithm for segmentation of white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF).
A tool for calculating mean diffusivity and fractional anisotropy has been created, on the basis of which quantitative maps can be built for solving various clinical problems. Functionality was created that allows clustering and segmenting images to individualize the clinical volume of radiation treatment and further assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one using the Hough transform. The probabilistic algorithm tests candidate curves in each voxel, assigning to each a score computed from the diffusion data, and then selects the curves with the highest scores as potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the irradiated volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. «MRDiffusionImaging» will improve the efficiency and accuracy of diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing the software with integrated, intuitive support for processing and analyzing diffusion data and for including it in radiotherapy planning and the evaluation of its results.
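For reference, mean diffusivity and fractional anisotropy, the two scalar measures from which the quantitative maps are built, are standard functions of the diffusion tensor's eigenvalues. A minimal sketch of those formulas (not the «MRDiffusionImaging» code):

```python
def mean_diffusivity(l1, l2, l3):
    """MD: the average of the three diffusion tensor eigenvalues."""
    return (l1 + l2 + l3) / 3.0

def fractional_anisotropy(l1, l2, l3):
    """FA in [0, 1]: 0 for isotropic diffusion (all eigenvalues equal),
    approaching 1 when diffusion is confined to a single direction."""
    md = mean_diffusivity(l1, l2, l3)
    num = ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2) ** 0.5
    den = (l1 ** 2 + l2 ** 2 + l3 ** 2) ** 0.5
    return (1.5 ** 0.5) * num / den
```

For example, eigenvalues (1, 1, 1) give FA = 0 (a sphere), while (1, 0, 0) gives FA = 1 (a degenerate ellipsoid along one fiber direction).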

Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography

Procedia PDF Downloads 85
17018 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
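The averaging step described, taking the mean of the three top-performing models, can be sketched as follows (the model functions and validation accuracies are placeholders, not the study's trained models):

```python
def ensemble_predict(model_scores, x):
    """Average the predictions of the three best-scoring models.
    model_scores: list of (predict_fn, validation_accuracy) pairs."""
    top3 = sorted(model_scores, key=lambda m: m[1], reverse=True)[:3]
    return sum(fn(x) for fn, _ in top3) / len(top3)
```

With, say, four candidate models whose validation accuracies are 0.90, 0.80, 0.70, and 0.10, only the first three contribute, so a badly biased model is excluded automatically, which is the framework's stated reason for averaging.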

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 127
17017 Identification Strategies for Unknown Victims from Mass Disasters and Unknown Perpetrators from Violent Crime or Terrorist Attacks

Authors: Michael Josef Schwerer

Abstract:

Background: The identification of unknown victims from mass disasters, violent crimes, or terrorist attacks is frequently facilitated through information from missing persons lists, portrait photos, old or recent pictures showing unique characteristics of a person such as scars or tattoos, or simply reference samples from blood relatives for DNA analysis. In contrast, the identification or at least the characterization of an unknown perpetrator from criminal or terrorist actions remains challenging, particularly in the absence of material or data for comparison, such as fingerprints, which had been previously stored in criminal records. In scenarios that result in high levels of destruction of the perpetrator’s corpse, for instance, blast or fire events, the chance for a positive identification using standard techniques is further impaired. Objectives: This study shows the forensic genetic procedures in the Legal Medicine Service of the German Air Force for the identification of unknown individuals, including such cases in which reference samples are not available. Scenarios requiring such efforts predominantly involve aircraft crash investigations, which are routinely carried out by the German Air Force Centre of Aerospace Medicine as one of the Institution’s essential missions. Further, casework by military police or military intelligence is supported based on administrative cooperation. In the talk, data from study projects, as well as examples from real casework, will be demonstrated and discussed with the audience. Methods: Forensic genetic identification in our laboratories involves the analysis of Short Tandem Repeats and Single Nucleotide Polymorphisms in nuclear DNA along with mitochondrial DNA haplotyping. Extended DNA analysis involves phenotypic markers for skin, hair, and eye color together with the investigation of a person’s biogeographic ancestry. 
Assessment of the biological age of an individual employs CpG-island methylation analysis using bisulfite-converted DNA. Forensic Investigative Genealogy allows the detection of an unknown person's blood relatives in reference databases. Technically, end-point PCR, real-time PCR, capillary electrophoresis, and pyrosequencing, as well as next-generation sequencing using flow-cell-based and chip-based systems, are used. Results and Discussion: Optimization of DNA extraction from various sources, including difficult matrices like formalin-fixed, paraffin-embedded tissues and degraded specimens from decomposed bodies or from decedents exposed to blast or fire events, provides the basis for successful PCR amplification and subsequent genetic profiling. For cases with extremely low yields of extracted DNA, whole-genome preamplification protocols are successfully used, particularly for genetic phenotyping. Improved primer design for CpG-methylation analysis, together with validated sampling strategies for the analyzed substrates from, e.g., lymphocyte-rich organs, allows successful biological age estimation even in bodies with highly degraded tissue material. Conclusions: Successful identification of unknown individuals, or at least their phenotypic characterization using pigmentation markers together with age-informative methylation profiles, possibly supplemented by a family-tree search employing Forensic Investigative Genealogy, can be provided in specialized laboratories. However, standard laboratory procedures must be adapted to work with difficult and highly degraded sample materials.

Keywords: identification, forensic genetics, phenotypic markers, CpG methylation, biological age estimation, forensic investigative genealogy

Procedia PDF Downloads 51
17016 Advancing Food System Resilience by Pseudocereals Utilization

Authors: Yevheniia Varyvoda, Douglas Taren

Abstract:

At the aggregate level, climate variability, the rising number of active violent conflicts, globalization and industrialization of agriculture, the loss in diversity of crop species, the increase in demand for agricultural production, and the adoption of healthy and sustainable dietary patterns are exacerbating factors of food system destabilization. The importance of pseudocereals to fuel and sustain resilient food systems is recognized by leading organizations working to end hunger, particularly for their critical capability to diversify livelihood portfolios and provide plant-sourced healthy nutrition in the face of systemic shocks and stresses. Amaranth, buckwheat, and quinoa are the most promising and used pseudocereals for ensuring food system resilience in the reality of climate change due to their high nutritional profile, good digestibility, palatability, medicinal value, abiotic stress tolerance, pest and disease resistance, rapid growth rate, adaptability to marginal and degraded lands, high genetic variability, low input requirements, and income generation capacity. The study provides the rationale and examples of advancing local and regional food systems' resilience by scaling up the utilization of amaranth, buckwheat, and quinoa along all components of food systems to architect indirect nutrition interventions and climate-smart approaches. Thus, this study aims to explore the drivers for ancient pseudocereal utilization, the potential resilience benefits that can be derived from using them, and the challenges and opportunities for pseudocereal utilization within the food system components. The PSALSAR framework regarding the method for conducting systematic review and meta-analysis for environmental science research was used to answer these research questions. 
Nevertheless, the utilization of pseudocereals has been slow for a number of reasons: the increased production of commercial and major staples such as maize, rice, wheat, soybean, and potato; displacement due to pressure from imported crops; lack of knowledge about value-adding practices in the food supply chain; limited technical knowledge and awareness about nutritional and health benefits; and the absence of marketing channels and limited access to extension services and information about resilient crops. The success of climate-resilient pathways based on pseudocereal utilization underlines the importance of co-designed activities that use modern technologies, high-value traditional knowledge of underutilized crops, and a strong acknowledgment of cultural norms to increase community-level economic and food system resilience.

Keywords: resilience, pseudocereals, food system, climate change

Procedia PDF Downloads 79
17015 GBKMeans: A Genetic Based K-Means Applied to the Capacitated Planning of Reading Units

Authors: Anderson S. Fonseca, Italo F. S. Da Silva, Robert D. A. Santos, Mayara G. Da Silva, Pedro H. C. Vieira, Antonio M. S. Sobrinho, Victor H. B. Lemos, Petterson S. Diniz, Anselmo C. Paiva, Eliana M. G. Monteiro

Abstract:

In Brazil, the National Electric Energy Agency (ANEEL) establishes that electrical energy companies are responsible for measuring and billing their customers. Among these regulations, it is defined that a company must bill its customers within 27-33 days. If a relocation or a change of period is required, the consumer must be notified in writing, in advance of a billing period. To make it easier to organize a workday's measurements, these companies create a reading plan. These plans consist of grouping customers into reading groups, which are visited by an employee responsible for measuring consumption and billing. Creating such a plan efficiently and optimally is a capacitated clustering problem with constraints related to homogeneity and compactness, that is, the employee's working load and the geographical position of the consuming units. This process is done manually by several experts with experience in the geographic formation of the region; it takes many days to complete the final planning, and because it is a human activity, there is no guarantee of finding the best plan. In this paper, we present GBKMeans, a technique based on K-Means and genetic algorithms for creating capacitated clusters that respect the established constraints in an efficient and balanced manner, minimizing the cost of relocating consumer units and the time required to create the final plan. The results obtained by the presented method are compared with the current planning of a real city, showing an improvement of 54.71% in the standard deviation of working load and 11.97% in the compactness of the groups.
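One building block of such a capacitated plan, assigning consumer units to reading groups without exceeding an employee's working load, can be sketched greedily (a simplification for illustration; GBKMeans itself couples an assignment like this with K-Means center updates and a genetic algorithm):

```python
def capacitated_assign(points, centers, capacity):
    """Capacity-respecting nearest-center assignment: visit points in order of
    their distance to the closest center, and spill each point to the next
    nearest center when a reading group is already full."""
    def d2(p, c):
        return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2

    order = sorted(range(len(points)),
                   key=lambda i: min(d2(points[i], c) for c in centers))
    load = [0] * len(centers)
    assign = [None] * len(points)
    for i in order:
        # try centers from nearest to farthest until one has spare capacity
        for j in sorted(range(len(centers)), key=lambda j: d2(points[i], centers[j])):
            if load[j] < capacity:
                assign[i] = j
                load[j] += 1
                break
    return assign
```

Greedy assignment keeps groups compact and balanced but can be far from optimal, which is exactly the gap the genetic search in the paper is meant to close.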

Keywords: capacitated clustering, k-means, genetic algorithm, districting problems

Procedia PDF Downloads 198
17014 Environmentally Adaptive Acoustic Echo Suppression for Barge-in Speech Recognition

Authors: Jong Han Joo, Jung Hoon Lee, Young Sun Kim, Jae Young Kang, Seung Ho Choi

Abstract:

In this study, we propose a novel technique for acoustic echo suppression (AES) during speech recognition under barge-in conditions. Conventional AES methods based on spectral subtraction apply fixed weights to the echo path transfer function (EPTF) estimated at the current signal segment and to the EPTF estimated up to the previous time interval. We propose a new approach that adaptively updates the weight parameters in response to abrupt changes in the acoustic environment due to background noise or double-talk. Furthermore, we devised a voice activity detector and an initial time-delay estimator for barge-in speech recognition in communication networks. The initial time delay is estimated using a log-spectral distance measure as well as cross-correlation coefficients. The experimental results show that the developed techniques can be successfully applied in barge-in speech recognition systems.
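The adaptive weighting idea can be sketched as a blend of the running EPTF estimate and the current-segment estimate, where the blend weight follows a detected degree of environment change (the weight schedule below is an invented illustration, not the paper's update rule):

```python
def update_eptf(prev_eptf, current_eptf, env_change):
    """Blend the running EPTF estimate with the current-segment estimate.
    env_change in [0, 1]: 0 = stationary environment (trust the history),
    1 = abrupt change such as background noise or double-talk onset
    (trust the current segment). The 0.1..0.9 range is hypothetical."""
    w = 0.1 + 0.8 * env_change   # adaptive weight instead of a fixed one
    return [(1 - w) * p + w * c for p, c in zip(prev_eptf, current_eptf)]
```

A fixed-weight method is the special case env_change = const; the proposed scheme instead lets a change detector drive w per segment.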

Keywords: acoustic echo suppression, barge-in, speech recognition, echo path transfer function, initial delay estimator, voice activity detector

Procedia PDF Downloads 372
17013 Cyrus Cylinder: A Law for His Future Time

Authors: Hasanzadeh Mehran

Abstract:

The Cyrus Cylinder, a baked clay tablet, was written in 539 BC by order of the Achaemenid king Cyrus. It contains orders and is considered a historical document of the humanitarian behaviour of the victorious army during the conquest of Babylon. Some believe that these laws are the first declaration of human rights in the ancient world. After the conquest of Babylon, Cyrus created laws that had never been seen anywhere in history. For this reason, this article discusses the humanitarian aspects of these laws and the reasons and grounds for their formation at that time. The origin of these progressive and humanitarian laws in the Cyrus Cylinder should be sought in the cultural roots of his civilization and his social and individual teachings.

Keywords: Iran, cyrus, cyrus cylinder, human rights

Procedia PDF Downloads 96
17012 Association of Depression with Physical Inactivity and Time Watching Television: A Cross-Sectional Study with the Brazilian Population PNS, 2013

Authors: Margareth Guimaraes Lima, Marilisa Berti A. Barros, Deborah Carvalho Malta

Abstract:

The relationship between physical activity (PA) and depression has been investigated in both observational and clinical studies: PA can be part of treatment for depression; physical inactivity (PI) may contribute to increased depression symptoms; and, conversely, emotional problems can decrease PA. The aim of this study was to analyze the association of leisure-time and transportation physical inactivity and time watching television (TV) with depression (minor and major), evaluated with the Patient Health Questionnaire (PHQ-9). The association was also analyzed by gender. This is a cross-sectional study. Data were obtained from the National Health Survey 2013 (PNS), performed with a representative sample of the Brazilian adult population in 2013. The PNS collected information from 60,202 individuals aged 18 years or more. The independent variables were: leisure-time physical inactivity (LTPI), classifying as inactive or insufficiently active (categories were combined for the analyses) those who did not perform a minimum of 150 minutes of moderate or 75 minutes of vigorous leisure-time PA per week; transportation physical inactivity (TPI), for individuals who did not reach 150 minutes per week travelling by bicycle or on foot to work or other activities; and daily time watching TV > 5 hours. The principal variable of interest was depression, identified by the PHQ-9. Individuals were classified with major depression if they had > 5 symptoms on more than seven days, one of which was "depressive mood" or "lack of interest or pleasure". The others had minor depression. The variables used for adjustment were gender, age, schooling, and chronic disease. The prevalences of LTPI, TPI, and TV time were estimated according to depression, and differences were tested with the chi-square test. Adjusted prevalence ratios were estimated using multiple Poisson regression models. The analyses were also stratified by gender.
The mean age of the studied population was 42.9 years (95% CI: 42.6-43.2), and 52.9% were women. 77.5% and 68.1% were inactive or insufficiently active in leisure and transportation, respectively, and 13.3% spent > 5 hours watching TV. 6% and 4.1% of the Brazilian population were diagnosed with minor or major depression, respectively. LTPI prevalence was 5% and 9% higher among individuals with minor and major depression, respectively, compared with no depression. The prevalence of TPI was 7% higher in those with major depression. Considering longer time watching TV, the prevalence was 45% and 74% higher among those with minor and major depression, respectively. In analyses by gender, the associations were stronger in men than in women, and TPI was not associated with depression in women. The study detected a higher prevalence of leisure-time physical inactivity and, especially, of time spent watching TV among individuals with major and minor depression, after adjusting for a number of potential confounding factors. TPI was associated only with major disorders and only among men. Given the cross-sectional design of the research, these associations may point to the importance of controlling mental health problems in the population to increase PA and decrease sedentary lifestyles; on the other hand, the study highlights the need for interventions encouraging people with depression to practice PA, including for transportation.

Keywords: depression, physical activity, PHQ-9, sedentary lifestyle

Procedia PDF Downloads 156
17011 A Low-Cost Foot Plantar Shoe for Gait Analysis

Authors: Zulkifli Ahmad, Mohd Razlan Azizan, Nasrul Hadi Johari

Abstract:

This paper presents the development and testing of a wearable sensor system for gait analysis measurement. For validation, plantar surface measurement with a force plate was prepared. In general gait analysis, force plates are typically used for barefoot studies of single steps and do not allow analysis of repeated steps in normal walking and running. The measurements usually performed do not represent the whole of daily plantar pressure in the shoe insole and only obtain the ground reaction force. Force plate measurement is usually limited to a few steps, is done indoors, and does not easily provide coupled information from both feet during walking. Nowadays, in order to measure pressure over a large number of steps and obtain the pressure in each part of the insole, sensors can be placed within the insole. This provides a method to determine plantar pressures while the shoe-wearing subject is standing, walking, or running. Inserting pressure sensors in the insole provides localized information, and the choice of sensor placement determines which critical regions under the insole are captured. In the wearable shoe sensor project, the device consists of left and right shoe insoles, each with ten force-sensitive resistors (FSRs). An Arduino Mega was used as a microcontroller to read the analog inputs from the FSRs. The analog inputs were transmitted via Bluetooth, providing the force data in real time on a smartphone. Blueterm, an Android application, was used as the interface to read the FSR values from the shoe-wearing subject. The subjects were two healthy men of different ages and weights, tested while standing, walking (1.5 m/s), jogging (5 m/s), and running (9 m/s) on a treadmill. The data obtained are saved on the Android device for analysis and comparison graphs.
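The conversion from a raw Arduino ADC reading to an approximate force, assuming the usual FSR voltage-divider wiring and a roughly linear conductance-force relation, might look like the sketch below (the fixed resistor value and calibration constant are hypothetical, not from the paper):

```python
def adc_to_force(adc, vcc=5.0, r_fixed=10000.0, adc_max=1023, k=2e-6):
    """Convert a 10-bit Arduino ADC reading from an FSR voltage divider into
    an approximate force. With the fixed resistor on the low side,
    V_out = Vcc * R_fixed / (R_fixed + R_fsr), and FSR conductance is roughly
    proportional to applied force (k is a device-specific calibration)."""
    v_out = adc * vcc / adc_max
    if v_out <= 0:
        return 0.0                      # no load on the sensor
    r_fsr = r_fixed * (vcc - v_out) / v_out
    return (1.0 / r_fsr) / k            # approximate force in newtons
```

In practice each of the ten FSRs would be calibrated against known weights, since k varies between sensors and with insole placement.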

Keywords: gait analysis, plantar pressure, force plate, wearable sensor

Procedia PDF Downloads 453
17010 Computational Fluid Dynamics (CFD) Simulation Approach for Developing New Powder Dispensing Device

Authors: Revanth Rallapalli

Abstract:

Manually dispensing solids and powders can be difficult, as it requires gradually pouring and checking the amount on the scale. Current systems are manual and non-continuous, are user-dependent, and make powder dispensation difficult to control. Recurrent dosing of powdered medicines in precise amounts, quickly and accurately, has been an all-time challenge. Various new powder dispensing mechanisms are being designed to overcome these challenges. A battery-operated screw conveyor mechanism is being developed to overcome the problems described above. These inventions are evaluated numerically at the concept development stage by employing Computational Fluid Dynamics (CFD) of gas-solids multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. Furthermore, this paper describes a simulation of powder dispensation from the trocar's end: the powder is treated as a secondary flow in air and simulated using the technique called Dense Discrete Phase Model incorporated with the Kinetic Theory of Granular Flow (DDPM-KTGF). Considering a powder volume fraction of 50%, transport of the powder from the inlet side to the trocar's end is driven by rotation of the screw conveyor. The performance is calculated over a 1 s time frame in an unsteady computation. This methodology will help designers develop design concepts that improve dispensation, within a quick turnaround time.
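A back-of-the-envelope check that often accompanies such CFD studies is the ideal screw-conveyor mass flow: bulk density times fill fraction times the annular cross-section times the axial advance per second. A sketch of that idealization (the CFD resolves the actual gas-solids flow; this formula ignores slip and back-leakage):

```python
import math

def screw_mass_flow(rho_bulk, fill_fraction, screw_d, shaft_d, pitch, rpm):
    """Ideal mass flow of a screw conveyor in kg/s: the annular cross-section
    between screw and shaft diameters, advanced by one pitch per revolution,
    filled to fill_fraction with powder of bulk density rho_bulk (SI units)."""
    area = math.pi / 4.0 * (screw_d ** 2 - shaft_d ** 2)
    return rho_bulk * fill_fraction * area * pitch * (rpm / 60.0)
```

The flow scales linearly with rpm and fill fraction, which is why a battery-driven screw gives continuous, controllable dosing compared with manual pouring.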

Keywords: DDPM-KTGF, gas-solids multiphase flow, screw conveyor, unsteady flow

Procedia PDF Downloads 180
17009 Sharp Estimates of Oscillatory Singular Integrals with Rough Kernels

Authors: H. Al-Qassem, L. Cheng, Y. Pan

Abstract:

In this paper, we establish sharp bounds for oscillatory singular integrals with an arbitrary real polynomial phase P. The kernels are allowed to be rough both on the unit sphere and in the radial direction. We show that the bounds grow no faster than log(deg(P)), which is optimal and was first obtained by Parissis and Papadimitrakis for kernels without any radial roughness. Our results substantially improve many previously known results. Among the key ingredients of our method are a sharp L¹→L² estimate and the use of extrapolation.
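A schematic restatement of the kind of operator and bound the abstract describes, in notation that is ours rather than the paper's (Ω denotes the spherical part of the kernel, h the rough radial part):

```latex
T_{P}f(x) \;=\; \mathrm{p.v.}\int_{\mathbb{R}^{n}}
    e^{iP(y)}\,\frac{\Omega(y/|y|)\,h(|y|)}{|y|^{n}}\,f(x-y)\,dy,
\qquad
\|T_{P}\|_{L^{2}\to L^{2}} \;\le\; C\,\log\bigl(2 + \deg(P)\bigr),
```

with C independent of the coefficients of P; the logarithmic growth in deg(P) is the sharp rate stated in the abstract.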

Keywords: oscillatory singular integral, rough kernel, singular integral, Orlicz spaces, block spaces, extrapolation, L^{p} boundedness

Procedia PDF Downloads 456
17008 Effects of Clozapine and Risperidone Antipsychotic Drugs on the Expression of CACNA1C and Behavioral Changes in the Rat Ketamine Model of Schizophrenia

Authors: Mehrnoosh Azimi Sanavi, Hamed Ghazvini, Mehryar Zargari, Hossein Ghalehnoei, Zahra Hosseini-khah

Abstract:

Objectives: Calcium Voltage-Gated Channel Subunit Alpha1 C (CACNA1C) is one of the most important genes associated with schizophrenia. Methods: Forty-five male Wistar rats were divided into five groups: saline, control, ketamine, clozapine, and risperidone. Animals in the ketamine, risperidone, and clozapine groups received ketamine (30 mg/kg, i.p.) for 10 days. After the last injection of ketamine, clozapine (7.5 mg/kg, i.p.) or risperidone (1 mg/kg, i.p.) was injected for up to 28 days. Twenty-four hours after the last injection, open field, social interaction, and elevated plus-maze tests were performed, and gene expression in the hippocampus was measured. Results: The social interaction test revealed a significant decrease in cumulative interaction time with ketamine compared with the saline group and an increase with clozapine and risperidone compared with the ketamine group. The elevated plus-maze test demonstrated a significant decrease in open-arm time and an increase in closed-arm time with ketamine compared with saline, as well as an increase in open-arm time with risperidone compared with ketamine. Further results revealed a significant increase in rearing and grooming with ketamine compared to saline, and a decrease with risperidone and clozapine compared to ketamine. There were no significant differences in CACNA1C gene expression between groups in the rat hippocampus. In brief, the results of this study indicated that clozapine and risperidone could partially improve cognitive impairments in rats; however, this effect was not related to CACNA1C gene expression.

Keywords: schizophrenia, ketamine, clozapine, risperidone

Procedia PDF Downloads 63
17007 The Safety Transfer in Acute Critical Patients by Telemedicine (START) Program at Udonthani General Hospital

Authors: Wisit Wichitkosoom

Abstract:

Objective: The majority of high-risk patients (ST-elevation myocardial infarction (STEMI), acute cerebrovascular accident, sepsis, acute trauma) are admitted to district or local hospitals (on average 1-1.5 hr from Udonthani General Hospital, northeastern Thailand) without proper facilities. The referral system supported early care and early management at the pre-hospital stage and prepared patient data for the higher-level hospital. This study assessed the reduction in treatment delay achieved by pre-hospital diagnosis and direct referral to Udonthani General Hospital. Methods and results: Four district or local hospitals without proper facilities for treating very high-risk patients served the study region. Pre-hospital diagnoses were established with simple technology such as LINE, SMS, telephone, and fax, following a LEAN process concept; telemedicine, with ambulance monitoring (ECG, SpO2, body temperature, blood pressure) in both real-time and snapshot modes, was then administered during the transfer period to support the safety-transfer concept (inter-hospital stage). Standard treatment for patients with STEMI, intracranial injury, and acute cerebrovascular accident was given. From 1 October 2012 to 30 September 2013, 892 high-risk patients transported by ambulance and transferred to Udonthani General Hospital were registered, including patients with STEMI diagnosed pre-hospitally and referred directly under closed telemedicine monitoring (n=248). The mortality rate decreased from 11.69% in 2011 to 6.92% in 2012. Thirty-four patients suffered cardiac arrest during transfer; with telemedicine consultation, CPR was successful in 79.41% of them. Conclusion: Appropriate innovation can be applied to the health care system. Very high-risk patients must have closed monitoring with two-way communication during the “safety transfer period”. The approach could be adapted to other high-risk groups as well.

Keywords: safety transfer, telemedicine, critical patients, medical and health sciences

Procedia PDF Downloads 306
17006 A Metaheuristic Approach for the Pollution-Routing Problem

Authors: P. Parthiban, Sonu Rajak, R. Dhanalakshmi

Abstract:

This paper presents an Ant Colony Optimization (ACO) approach combined with a Speed Optimization Algorithm (SOA) to solve the Vehicle Routing Problem (VRP) with environmental considerations, well known as the Pollution-Routing Problem (PRP). It consists of routing a number of vehicles to serve a set of customers and determining fuel consumption, driver wages, and speed on each route segment, while respecting capacity constraints and time windows. Since the VRP is NP-hard, the PRP is also NP-hard, which motivates the use of metaheuristics. The proposed solution method consists of two stages: in the first stage, a Vehicle Routing Problem with Time Windows (VRPTW) is solved using ACO, and in the second stage, the SOA is run on the resulting VRPTW solution. Given a vehicle route, the SOA finds the optimal speed on each arc of the route to minimize an objective function comprising fuel consumption costs and driver wages. The proposed algorithm was tested on benchmark problems; the preliminary results show that it can provide good solutions within reasonable computational time.
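The second-stage idea can be sketched for a single arc: pick the speed that minimizes fuel cost plus driver wages. The convex fuel model a/v + b·v² per unit distance and all coefficients below are illustrative assumptions, not the paper's PRP formulation.

```python
# Toy second-stage speed optimization on one route arc (assumed cost model).
# Fuel per km is modeled as a/v + b*v^2 (convex in speed v, km/h);
# all coefficients are invented for illustration.

def arc_cost(v, dist, fuel_price=1.4, wage=12.0, a=60.0, b=1e-4):
    fuel_per_km = a / v + b * v * v          # fuel consumed per km at speed v
    return fuel_price * fuel_per_km * dist + wage * dist / v  # fuel + wages

def best_speed(dist, v_min=40.0, v_max=100.0, step=0.5):
    """Grid search for the cost-minimizing speed on the arc."""
    n = int((v_max - v_min) / step) + 1
    speeds = [v_min + i * step for i in range(n)]
    return min(speeds, key=lambda v: arc_cost(v, dist))

v = best_speed(100.0)
print(v, round(arc_cost(v, 100.0), 2))
```

With these coefficients the optimum sits in the interior of the speed window: driving slower raises wage cost, driving faster raises fuel cost, and the SOA balances the two on every arc of the ACO route.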

Keywords: ant colony optimization, CO2 emissions, speed optimization, vehicle routing

Procedia PDF Downloads 360
17005 A New Distribution and Application on the Lifetime Data

Authors: Gamze Ozel, Selen Cakmakyapan

Abstract:

We introduce a new model, called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution via the Marshall-Olkin transformation and admits both increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, several entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a real-life data set.
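The Marshall-Olkin transformation has a standard closed form on the survival function, S_MO(x) = αS(x) / (1 − (1 − α)S(x)). A minimal sketch applying it to the Rayleigh case; the parameter names alpha (tilt) and sigma (Rayleigh scale) are our notation, not necessarily the paper's.

```python
import math

# Marshall-Olkin transformation applied to the Rayleigh distribution
# (standard form of the transformation; notation is ours).

def rayleigh_sf(x, sigma):
    """Rayleigh survival function exp(-x^2 / (2 sigma^2))."""
    return math.exp(-x * x / (2.0 * sigma * sigma))

def mo_rayleigh_sf(x, alpha, sigma):
    """Marshall-Olkin Rayleigh survival: alpha*S / (1 - (1-alpha)*S)."""
    s = rayleigh_sf(x, sigma)
    return alpha * s / (1.0 - (1.0 - alpha) * s)

def mo_rayleigh_pdf(x, alpha, sigma):
    f = (x / sigma**2) * rayleigh_sf(x, sigma)   # Rayleigh density
    s = rayleigh_sf(x, sigma)
    return alpha * f / (1.0 - (1.0 - alpha) * s) ** 2

def mo_rayleigh_hazard(x, alpha, sigma):
    return mo_rayleigh_pdf(x, alpha, sigma) / mo_rayleigh_sf(x, alpha, sigma)

print(mo_rayleigh_sf(0.0, 0.5, 1.0))   # survival at 0 is 1 for any alpha
```

Setting alpha = 1 recovers the plain Rayleigh distribution, and varying alpha reshapes the hazard rate, which is the flexibility the abstract highlights.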

Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood

Procedia PDF Downloads 501
17004 A Goal-Oriented Social Business Process Management Framework

Authors: Mohammad Ehson Rangiha, Bill Karakostas

Abstract:

Social Business Process Management (SBPM) promises to overcome the limitations of traditional BPM by allowing flexible process design and enactment through the involvement of users from a social community. This paper proposes a meta-model and architecture for socially driven business process management systems. It discusses the main facets of the architecture, such as goal-based role assignment that combines social recommendations with user profiles, and process recommendation, through a real example of a charity organization.

Keywords: business process management, goal-based modelling, process recommendation, social collaboration, social BPM

Procedia PDF Downloads 494
17003 Investigation of Electrochemical, Morphological, Rheological and Mechanical Properties of Nano-Layered Graphene/Zinc Nanoparticles Incorporated Cold Galvanizing Compound at Reduced Pigment Volume Concentration

Authors: Muhammad Abid

Abstract:

The ultimate goal of this research was to produce a cold galvanizing compound (CGC) at reduced pigment volume concentration (PVC) to protect metallic structures from corrosion. The influence of partially replacing Zn dust with nano-layered graphene (NGr) and Zn metal nanoparticles on the electrochemical, morphological, rheological, and mechanical properties of the CGC was investigated. EIS was used to explore the electrochemical behaviour of the coatings. The EIS results revealed that the partial replacement of Zn by NGr and Zn nanoparticles enhanced cathodic protection at reduced PVC (4:1) by improving the electrical contact between the Zn particles and the metal substrate. A Tafel scan was conducted to confirm the cathodic behaviour of the coatings. The sample formulated solely with Zn at PVC 4:1 was dominated by physical barrier characteristics rather than cathodic protection. Increasing the concentration of NGr in the formulation shifted the corrosion potential towards more negative values, and the coating with 1.5% NGr showed the highest galvanic action at reduced PVC. FE-SEM confirmed an interconnected network of conducting particles, whereas the coating without NGr and Zn nanoparticles at PVC 4:1 showed significant gaps between the Zn dust particles. The novelty was evidenced by micrographs showing a consistent distribution of NGr and Zn nanoparticles over the surface, which acted as bridges between the spherical Zn particles and provided cathodic protection at reduced PVC. The layered structure of graphene also improved the physical shielding effect of the coatings by limiting the diffusion of electrolytes and corrosion products (oxides/hydroxides) into the coatings, as reflected by the salt spray test. The coatings showed good rheological (liquid/fluid) properties and excellent adhesion, though with different strength values. A real-time scratch resistance assessment showed that all the coatings had good scratch resistance.

Keywords: protective coatings, anti-corrosion, galvanization, graphene, nanomaterials, polymers

Procedia PDF Downloads 97
17002 The Importance of Localization in Large Construction Projects

Authors: Ali Mohammadi

Abstract:

The basis of any construction project is a map: using the coordinates on the map and a total station, the surveyor can set out the corresponding points on the ground. For projects such as dams, roads, tunnels, and pipelines, base points determined using GPS can create challenges for the surveyor during control. We first examine some of the map projections on which such maps are designed, summarizing their differences and the challenges surveyors face in controlling them: to build projects, we need true lengths and angles, so we must work with coordinates whose computed results match measurements on the ground. We then examine several examples to explain the concept of localization, so that the surveyor knows whether a challenge exists and, when it does, how to solve it.
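The grid-versus-ground discrepancy the abstract alludes to can be illustrated numerically with the standard approximate UTM point scale factor, k ≈ k0·(1 + q²/2R²), combined with an elevation factor R/(R + h). The sample easting and height below are invented, and a mean earth radius is used rather than a rigorous ellipsoidal computation.

```python
# Hedged illustration of "localization": converting a UTM grid distance
# to ground distance. Approximate formulas; sample values are made up.

R = 6_371_000.0      # mean earth radius (m)
K0 = 0.9996          # UTM central-meridian scale factor

def utm_point_scale(easting: float) -> float:
    """Approximate UTM point scale factor k = k0 * (1 + q^2 / (2 R^2))."""
    q = easting - 500_000.0          # offset from the central meridian (m)
    return K0 * (1.0 + q * q / (2.0 * R * R))

def grid_to_ground(grid_dist, easting, height):
    """Ground distance from a grid distance via the combined factor."""
    combined = utm_point_scale(easting) * R / (R + height)
    return grid_dist / combined

# 1000 m measured on the grid, ~180 km east of the central meridian,
# at 1200 m elevation:
print(round(grid_to_ground(1000.0, 680_000.0, 1200.0), 3))
```

The few decimeters of difference per kilometer is exactly what makes uncorrected grid coordinates a challenge when setting out dams, roads, tunnels, and pipelines.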

Keywords: UTM, scale factor, Cartesian, traverse

Procedia PDF Downloads 81
17001 Design and Thermal Analysis of Power Harvesting System of a Hexagonal Shaped Small Spacecraft

Authors: Mansa Radhakrishnan, Anwar Ali, Muhammad Rizwan Mughal

Abstract:

Many universities around the world are working on modular, low-budget architectures for small spacecraft to reduce the overall development cost. This paper focuses on the design of a modular solar power harvesting system for a hexagonal-shaped small satellite. The designed system is composed of solar panels and power converter subsystems. Each solar panel consists of solar cells mounted on the external face of a printed circuit board (PCB), while the electronic components for power conversion are mounted on the interior side of the same PCB. The solar panel, with dimensions 16.5 cm × 99 cm, carries 36 solar cells (each 4 cm × 7 cm) divided into four parallel banks of 9 series-connected cells. The output voltage of a single solar cell is 2.14 V, so the combined output of 9 series-connected cells is around 19.3 V. The panel output is boosted to the satellite power distribution bus voltage (28 V) by a boost converter using a constant-voltage maximum power point tracking (MPPT) technique. The solar panel module is an eight-layer PCB with a coil embedded in the four internal layers; this coil consumes power to generate a magnetic field for attitude control of the spacecraft. Since the power converter and distribution subsystem components are mounted on internal PCB layers, thermal analysis is mandatory to ensure that the overall module temperature stays within safety limits. The main focus of the overall design is compactness, miniaturization, and efficiency enhancement.
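The electrical figures above can be checked with two lines of arithmetic: the series string voltage, and the duty cycle an ideal boost converter would need to reach the 28 V bus. The lossless continuous-conduction assumption (Vout = Vin/(1−D)) is ours.

```python
# Quick numeric check of the abstract's figures plus the ideal
# boost-converter duty cycle (lossless, continuous-conduction assumption).

CELL_V = 2.14         # single solar cell output (V), from the abstract
CELLS_PER_BANK = 9    # series-connected cells per bank
BUS_V = 28.0          # power distribution bus voltage (V)

string_v = CELL_V * CELLS_PER_BANK      # series string voltage
duty = 1.0 - string_v / BUS_V           # ideal boost: Vout = Vin / (1 - D)

print(f"string voltage: {string_v:.2f} V")   # ~19.3 V as stated
print(f"boost duty cycle: {duty:.3f}")
```

A real MPPT controller would modulate this duty cycle around the operating point as irradiance and temperature change, rather than holding the ideal value.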

Keywords: small satellites, power subsystem, efficiency, MPPT

Procedia PDF Downloads 74
17000 Factors That Influence Willingness to Pay for Theatre Performances: The Case of Lithuanian National Drama Theatre

Authors: Rusne Kregzdaite

Abstract:

The value of the cultural sector stems from the symbolic exploration that differentiates cultural organisations from other product or service organisations. As a result, the cultural sector has a dual impact on the socio-economic system: the economic value (expressed in market relations) influences the dynamics of the country's financial indicators, while the cultural (non-market) value indirectly contributes to the welfare of the state through changes in societal values, creativity, and the cultural needs of the country. Measuring these indirect (cultural value) impacts is difficult, but in the case of the cultural sector (especially economically inefficient state-funded culture) it helps to reveal the sector's essential characteristics. The study aims to analyze the value of cultural organisations that is invisible in market processes and to ground it in quantified calculations. This was done by analyzing consumer utility, incorporating not only the price paid but also the social and cultural decision-making factors that determine the spectator's choice (time dedicated to the visit, additional costs, content, previous experience, corporate image). These may reflect the consumer's real cost of consumption: all the costs incurred can be considered the financial equivalent of the experience with the cultural establishment. The research methodology was tested on the performing arts sector, applied to the case of the Lithuanian National Drama Theatre. The empirical research consisted of a survey (more than 800 participants) of Lithuanian National Drama Theatre visitors across different performances, using the willingness-to-pay and travel cost methods. The analysis of different performances identifies the factors that increase willingness to pay for a performance and affect theatre attendance. The research stresses the cultural value and social perspective of the cultural sector and relates them to discussions of public funding of culture.

Keywords: cultural economics, performing arts, willingness to pay, travel cost analysis, performing arts management

Procedia PDF Downloads 89
16999 Automatic Checkpoint System Using Face and Card Information

Authors: Kriddikorn Kaewwongsri, Nikom Suvonvorn

Abstract:

In the deep south of Thailand, checkpoints for person verification are necessary for the security management of risk zones, such as official buildings in the conflict area. In this paper, we propose an automatic checkpoint system that verifies persons using information from ID cards and facial features. Methods for extracting and verifying a person's information are introduced, based on useful data such as the ID number and name extracted from official cards, and facial images from videos. The proposed system shows promising results and has a real impact on the local society.

Keywords: face comparison, card recognition, OCR, checkpoint system, authentication

Procedia PDF Downloads 321
16998 Differences in Motivations for the Use of Facebook between Males and Females

Authors: Arti Bakhshi, Remia Mahajan

Abstract:

Social networking sites have evolved at a great pace, and India has been no exception. Facebook is the top-rated social networking site (SNS) in India. Though the site is mostly used by younger generations, its popularity is increasing across all masses and classes. The current paper explores gender differences in motivations for the use of Facebook. Of the sample (N=556), 229 male and 327 female Facebook users from India were asked to rate their motivations for using Facebook from ‘most preferred’ to ‘least preferred’. The five motivations studied were time passing, information, relationship development, relationship maintenance, and trend following. Cross-tab chi-square analyses revealed significant differences in three of the five motivations between male and female users, namely time passing, relationship development, and trend following. Female users rated ‘time passing’ as a more preferred motivation than male users did, while male users rated ‘relationship development’ and ‘trend following’ as more preferred than female users did. Suggestions for future research are discussed.
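The cross-tab chi-square analysis mentioned above can be sketched with the Pearson statistic on a gender × motivation contingency table. The counts below are invented for illustration; only the group sizes (229 male, 327 female) come from the abstract.

```python
# Sketch of a cross-tab chi-square analysis (Pearson statistic).
# The cell counts are hypothetical, not the paper's data.

def chi_square(table):
    """Pearson chi-square statistic for a 2-D contingency table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# hypothetical male/female counts of "most preferred" motivation across
# the five categories (time passing, information, development,
# maintenance, trend following)
table = [[60, 95, 40, 24, 10],     # males,   N = 229
         [130, 90, 55, 37, 15]]    # females, N = 327
print(round(chi_square(table), 2))
```

The statistic is then compared against the chi-square distribution with (rows−1)(cols−1) degrees of freedom to judge significance.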

Keywords: Facebook, gender, motivations, social networking sites

Procedia PDF Downloads 471
16997 Dynamic Two-Way FSI Simulation for a Blade of a Small Wind Turbine

Authors: Alberto Jiménez-Vargas, Manuel de Jesús Palacios-Gallegos, Miguel Ángel Hernández-López, Rafael Campos-Amezcua, Julio Cesar Solís-Sanchez

Abstract:

An optimal wind turbine blade design must be capable of capturing as much energy as possible from the wind resource available at the area of interest. Often, an optimal design implies large quantities of material and complicated processes that make the wind turbine more expensive and therefore less cost-effective. In the construction and installation of a wind turbine, the blades may account for up to 20% of the overall price, and they are especially important because they are part of the rotor system, which transmits the energy from the wind to the power train and where the static and dynamic design loads for the whole wind turbine originate. The aim of this work is the development of a blade fluid-structure interaction (FSI) simulation that allows identification of the major damage zones during normal production, so that better design and optimization decisions can be taken. The simulation is dynamic, since the inlet condition is a time-history wind velocity rather than a constant wind velocity. The process begins with the freely available NuMAD software (NREL), used to model the blade and assign its material properties; the 3D model is then exported to the ANSYS Workbench platform, where, before setting up the FSI system, a modal analysis is performed to identify natural frequencies and mode shapes. The FSI analysis is carried out with the two-way technique: a CFD simulation first obtains the pressure distribution on the blade surface, and these results are used as the boundary condition for the FEA simulation, which yields the deformation for the first time step. For the second time step, the CFD simulation is reconfigured automatically with the next inlet wind velocity and the deformation results from the previous time step. The analysis continues this iterative cycle, solving time step by time step until the entire load case is completed.
This work is part of a set of projects managed by a national consortium called “CEMIE-Eólico” (Mexican Center in Wind Energy Research), created to strengthen technological and scientific capacities, promote the training of specialized human resources, and link academia with the private sector nationally. The analysis belongs to the design of a rotor system for a 5 kW wind turbine intended to be installed at the Isthmus of Tehuantepec, Oaxaca, Mexico.
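The two-way coupling loop described in this abstract can be sketched as a time-stepping exchange between a fluid solve and a structural solve. The toy "solvers" below are placeholders standing in for the ANSYS CFD and FEA calls; their formulas and constants are invented purely to show the data flow.

```python
# Schematic two-way FSI time-stepping loop: each step runs the fluid solve,
# passes pressures to the structural solve, and feeds the deformation back
# into the next fluid configuration. Toy placeholder solvers, not ANSYS.

def cfd_step(wind_speed, deformation):
    # placeholder: dynamic pressure, slightly relieved by blade deformation
    return 0.5 * 1.225 * wind_speed**2 * (1.0 - 0.05 * deformation)

def fea_step(pressure, stiffness=5000.0):
    # placeholder: static deflection proportional to the applied load
    return pressure / stiffness

def two_way_fsi(wind_history):
    deformation = 0.0
    results = []
    for wind_speed in wind_history:        # time-history inlet condition
        pressure = cfd_step(wind_speed, deformation)   # fluid solve
        deformation = fea_step(pressure)               # structural solve
        results.append((pressure, deformation))
    return results

history = two_way_fsi([8.0, 10.0, 12.0, 9.0])   # sample wind time history
for p, d in history:
    print(f"p = {p:7.2f} Pa   d = {d:.4f}")
```

In the real workflow, each placeholder call is a full 3D simulation, which is why the automated reconfiguration between time steps matters.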

Keywords: blade, dynamic, FSI, wind turbine

Procedia PDF Downloads 482
16996 Analyzing Boson Star as a Candidate for Dark Galaxy Using ADM Formulation of General Relativity

Authors: Aria Ratmandanu

Abstract:

Boson stars can be viewed as zero-temperature, ground-state Bose-Einstein condensates characterized by enormous occupation numbers. A time-dependent spherically symmetric spacetime can serve as a model of a boson star. We use the (3+1) split of the Einstein equations (the ADM formulation of general relativity) to solve the Einstein field equations coupled to a complex scalar field (the Einstein-Klein-Gordon equations) on a time-dependent spherically symmetric spacetime. We find that boson stars are pulsating stars whose oscillation frequency equals their density. We search for the interior solution of boson stars and obtain the T.O.V. (Tolman-Oppenheimer-Volkoff) equation for boson stars. Using the T.O.V. equation, we derive the equation of state and the relation between pressure and density, together with the total mass and the gravitational mass. We find that the hypothetical particle known as the axion could form a boson star the size of the Milky Way galaxy, making it a candidate for a dark galaxy (a galaxy consisting almost entirely of dark matter).
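For reference, the standard T.O.V. system for a static spherically symmetric star reads as below; in the boson-star case the effective density ρ and pressure P are built from the scalar-field stress-energy rather than a fluid, a detail this schematic form does not show:

```latex
\frac{dP}{dr} \;=\;
  -\,\frac{G\left(\rho + \dfrac{P}{c^{2}}\right)
           \left(m(r) + \dfrac{4\pi r^{3}P}{c^{2}}\right)}
          {r^{2}\left(1 - \dfrac{2G\,m(r)}{r c^{2}}\right)},
\qquad
\frac{dm}{dr} \;=\; 4\pi r^{2}\rho .
```

Integrating this system outward from the center until P vanishes yields the stellar radius and the gravitational mass m(R) discussed in the abstract.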

Keywords: axion, boson star, dark galaxy, time-dependent spherically symmetric spacetime

Procedia PDF Downloads 244
16995 Design of a Fuzzy Expert System for the Impact of Diabetes Mellitus on Cardiac and Renal Impediments

Authors: E. Rama Devi Jothilingam

Abstract:

Diabetes mellitus is now one of the most common non-communicable diseases globally. India leads the world with the largest number of diabetic subjects, earning the title “diabetes capital of the world”. In order to reduce the mortality rate, a fuzzy expert system is designed to predict the severity of the cardiac and renal problems of diabetic patients using fuzzy logic. Since uncertainty is inherent in medicine, fuzzy logic is used in this research work to handle the inherent fuzziness of linguistic concepts and the uncertain status of diabetes mellitus, which is the prime cause of cardiac arrest and renal failure. In this work, the controllable risk factors (blood sugar, insulin, ketones, lipids, obesity, blood pressure, and protein/creatinine ratio) are the input parameters, and the stage of cardiac disease (SOC) and stage of renal disease (SORD) are the output parameters. Triangular membership functions are used to model the input and output parameters. The rule base for the proposed expert system is constructed from the knowledge of medical experts. A Mamdani inference engine infers information from the rule base to make the major diagnostic decisions. Mean of maximum is used to obtain a non-fuzzy control action that best represents the possibility distribution of an inferred fuzzy control action. The proposed system also classifies patients into high-risk and low-risk groups using fuzzy c-means clustering, so that high-risk patients are treated immediately. The system is validated with MATLAB and serves as a tracking system with good accuracy and robustness.
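The pipeline above (triangular memberships, Mamdani min inference, mean-of-maximum defuzzification) can be sketched with a single input. The membership ranges and the one-input blood-sugar rule base below are invented for illustration; the actual system uses seven inputs and an expert-built rule base.

```python
# One-input sketch of the Mamdani pipeline: triangular memberships,
# min (clipping) inference, mean-of-maximum defuzzification.
# Membership ranges and rules are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# hypothetical rule base: blood sugar (mg/dL) -> stage of cardiac risk (0-4)
rules = [
    (lambda x: tri(x, 60, 90, 130),   lambda y: tri(y, 0, 1, 2)),  # normal -> low
    (lambda x: tri(x, 110, 160, 210), lambda y: tri(y, 1, 2, 3)),  # high -> medium
    (lambda x: tri(x, 180, 260, 340), lambda y: tri(y, 2, 3, 4)),  # very high -> severe
]

def mamdani_mom(x):
    """Mean of maximum over the aggregated (min-clipped) output sets."""
    ys = [i / 100.0 for i in range(401)]          # output axis 0..4
    agg = [max(min(ante(x), cons(y)) for ante, cons in rules) for y in ys]
    peak = max(agg)
    maxima = [y for y, m in zip(ys, agg) if abs(m - peak) < 1e-9]
    return sum(maxima) / len(maxima)

print(round(mamdani_mom(150.0), 2))
```

Each rule clips its output set at the firing strength of its antecedent; mean of maximum then returns the center of the highest plateau of the aggregated set.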

Keywords: diabetes mellitus, fuzzy expert system, Mamdani, MATLAB

Procedia PDF Downloads 290
16994 Optimal Maintenance Clustering for Rail Track Components Subject to Possession Capacity Constraints

Authors: Cuong D. Dao, Rob J.I. Basten, Andreas Hartmann

Abstract:

This paper studies the optimal planning of preventive maintenance and renewal activities for components in a single railway track when the available time for maintenance is limited. The rail-track system consists of several types of components, such as rails, ballast, and switches, with different preventive maintenance and renewal intervals. To perform maintenance or renewal on the track, a train-free period, called a possession, is required. Since a major possession directly affects the regular train schedule, maintenance and renewal activities are clustered as much as possible. In a highly dense and utilized railway network, possession time on the track is critical, since the demand for train operations is very high and a long possession severely impacts the regular train schedule. We present an optimization model and investigate the maintenance schedules with and without the possession capacity constraint. In addition, we integrate the socio-economic cost of maintenance time into the variable possession cost in the optimization model. A numerical example is provided to illustrate the model.
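The clustering idea can be shown with a deliberately simple greedy heuristic: pack activities that are due soonest into a possession until its capacity is used up. This is only a toy stand-in for the paper's optimization model; all component names, durations, due dates, and the 8 h capacity are invented.

```python
# Toy greedy clustering of maintenance activities into one possession.
# Activities are (name, duration_h, due_period); all values invented.

def cluster_possession(activities, capacity_h):
    """Greedily pack activities (earliest due first) into the possession."""
    selected, used = [], 0.0
    for name, duration, due in sorted(activities, key=lambda a: a[2]):
        if used + duration <= capacity_h:
            selected.append(name)
            used += duration
    return selected, used

work = [("rail grinding", 3.0, 2),
        ("ballast tamping", 4.0, 5),
        ("switch inspection", 1.0, 1),
        ("rail renewal", 6.0, 8)]

chosen, hours = cluster_possession(work, capacity_h=8.0)
print(chosen, hours)
```

An exact model, as studied in the paper, would additionally trade off deferral costs, interval violations, and the socio-economic cost of possession length, which a greedy pass cannot capture.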

Keywords: rail-track components, maintenance, optimal clustering, possession capacity

Procedia PDF Downloads 262
16993 Wheeled Robot Stable Braking Process under Asymmetric Traction Coefficients

Authors: Boguslaw Schreyer

Abstract:

During a wheeled robot’s braking process, extra dynamic vertical forces act on all wheels: the forces are directed downward on the front wheels and upward on the rear wheels. In order to maximize the deceleration, and therefore minimize the braking time and braking distance, a correct torque distribution must be calculated: the front braking torque should be increased and the rear torque decreased, while also providing good transversal stability. In the simple case of identical adhesion coefficients under all wheels, this torque distribution can secure optimal (maximal) control of the braking process, achieving minimum braking distance and minimum braking time with relatively good transversal stability. The transversal acceleration is monitored at all times; in case of transversal movement, braking is interrupted and the braking torque is re-applied after a defined period of time. If the torque values are calculated correctly, the traction coefficient under the front and rear wheels can be kept close to its maximum. For optimum braking control, the timing of the braking torque application and release must also be calculated: the braking torque should be released shortly after the wheels pass the maximum traction coefficient (while wheel slip increases) and applied again after the wheels pass the maximum traction coefficient (while the slip decreases). A correct braking torque distribution makes the front and rear wheels pass this maximum at the same time, guaranteeing optimum deceleration control and therefore minimum braking time. To calculate the correct torque distribution, a control unit should receive input signals of the rear torque value (which changes independently), the robot’s deceleration, and the vertical front and rear forces. To calculate the timing of torque application and release, further signals are needed: the robot’s speed, and the angular speed and angular deceleration of the wheels. In the case of different adhesion coefficients under the left and right wheels, but the same under each side’s pair of wheels, the Select-Low (SL) and Select-High (SH) methods are applied. The SL method is suggested if transversal stability is more important than braking efficiency; for a robot, braking efficiency is often more important, so the SH method is applied with some control of the transversal stability. When all adhesion coefficients differ under all wheels, the front-rear torque distribution is maintained as in the previous cases, but the timing of the braking torque application and release is governed by the lowest adhesion coefficient of the rear wheels. The Lagrange equations have been used to describe the robot dynamics, and Matlab has been used to simulate the braking process of the wheeled robot; on this basis, the braking methods have been selected.
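The load-transfer statics behind the front/rear torque split can be sketched directly: under deceleration a, a load m·a·h/L transfers from the rear to the front axle, so braking torque should be apportioned by the instantaneous vertical loads. The robot's mass and geometry values below are assumed for illustration.

```python
# Illustrative axle-load transfer during braking. Torque is split in
# proportion to the instantaneous vertical loads. Mass/geometry assumed.

G = 9.81  # gravitational acceleration (m/s^2)

def axle_loads(m, wheelbase, cg_to_rear, cg_height, decel):
    """Front/rear vertical loads (N) at deceleration decel (m/s^2)."""
    static_front = m * G * cg_to_rear / wheelbase
    static_rear = m * G * (wheelbase - cg_to_rear) / wheelbase
    transfer = m * decel * cg_height / wheelbase   # rear-to-front transfer
    return static_front + transfer, static_rear - transfer

front, rear = axle_loads(m=40.0, wheelbase=0.6, cg_to_rear=0.3,
                         cg_height=0.2, decel=3.0)
split = front / (front + rear)   # share of braking torque on front wheels
print(round(front, 1), round(rear, 1), round(split, 3))
```

Splitting the torque by this ratio keeps both axles near their traction limit simultaneously, which is the condition the abstract identifies for minimum braking time.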

Keywords: wheeled robots, braking, traction coefficient, asymmetric

Procedia PDF Downloads 165
16992 A Comprehensive Evaluation of IGBT Performance under Zero Current Switching

Authors: Ly. Benbahouche

Abstract:

Currently, several soft-switching topologies are being studied to achieve high power switching efficiency, reduced cost, improved reliability, and reduced parasitics. It is well known that progress in power electronics systems always depends on advances in power devices. The IGBT has been used successfully in a variety of switching applications, such as motor drives and appliance control, because of its superior characteristics. This paper focuses on the simulation and explanation of the internal dynamics of IGBT behaviour under the most popular soft-switching scheme, Zero Current Switching (ZCS). Its main purpose is to point out the mechanisms behind the current tail during turn-off and to examine the turn-off response under variation of temperature, inductance L, snubber capacitance Cs, and bus voltage, in order to achieve an improved understanding of the internal carrier dynamics. It is shown that the snubber capacitor, the inductance, and even the temperature control the magnitude and extent of the tail current, and hence the turn-off time (switching speed) of the device. Moreover, it is demonstrated that ZCS switching can be used efficiently to reduce both the power losses and the turn-off time. Furthermore, the turn-off loss under ZCS was found to depend on the switching time of the device.

Keywords: PT-IGBT, ZCS, turn-off losses, dV/dt

Procedia PDF Downloads 316