Search results for: highly accurate
6474 Internet Based Teleoperation of the Quad Rotor with Force Feedback Using Smith Predictor
Authors: K. Senthil Kumar, A. Vasumalaikannan
Abstract:
In this paper, teleoperation of a quadrotor over the Internet with force feedback is addressed. Teleoperation with force feedback is the ability to remotely control a robot, where contact (obstacle) or environment (wind gust, etc.) information is communicated as force feedback from the quadrotor to the master joystick, thus giving the operator a sense of telepresence. The stability and performance of such a teleoperator are highly dependent on the amount of time delay present in the control loop. This problem is further complicated by the fact that, for network-based communication, the time delay is itself time-varying and highly non-deterministic. In this paper, stability is achieved using a novel neural-based Smith predictor at the master side. The performance of the system remains acceptable even in the worst-case scenario.
Keywords: teleoperation, quadrotor, neural smith predictor, time delay
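The abstract does not give the predictor equations; as a rough illustration of the underlying idea, the sketch below implements a classical discrete-time Smith predictor around a first-order plant model, with the plant gain, time constant, and delay chosen arbitrarily for demonstration (the paper's neural network, which would estimate the varying delay, is not reproduced here).

```python
import numpy as np

# Minimal classical Smith predictor sketch (assumed first-order plant).
# The neural delay estimator from the paper is NOT reproduced; the delay
# 'd' is fixed here purely for illustration.
a, b, d = 0.9, 0.1, 20          # assumed plant: y[k+1] = a*y[k] + b*u[k-d]
kp = 2.0                        # assumed proportional controller gain

N = 200
r = np.ones(N)                  # unit step reference
y = np.zeros(N)                 # true (delayed) plant output
ym = np.zeros(N)                # internal model output WITHOUT delay
u = np.zeros(N)

for k in range(N - 1):
    # the delayed model output reconstructs what we already observe in y
    ym_delayed = ym[k - d] if k >= d else 0.0
    # Smith predictor: feed back the undelayed model plus the model mismatch
    feedback = ym[k] + (y[k] - ym_delayed)
    u[k] = kp * (r[k] - feedback)
    ym[k + 1] = a * ym[k] + b * u[k]          # internal model, no delay
    u_del = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_del           # real plant, delayed input

print("final output:", round(y[-1], 3))
```

Because the mismatch term cancels once the delayed model matches the plant, the controller effectively acts on an undelayed prediction, which is what makes the loop tolerant of the round-trip network delay.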
Procedia PDF Downloads 613
6473 Analysis of CO₂ Capture Products from Carbon Capture and Utilization Plant
Authors: Bongjae Lee, Beom Goo Hwang, Hye Mi Park
Abstract:
CO₂ capture products manufactured through a Carbon Capture and Utilization (CCU) plant that collects CO₂ directly from power plants require accurate measurement of the amount of CO₂ captured. For this purpose, two weight loss tests were carried out, and one sample was analyzed using a carbon dioxide quantification device. First, the ignition loss analysis was performed by measuring the weight of the sample after a first conversion at 550°C and then confirming the loss when ignited at 950°C. Second, in the thermogravimetric analysis, the measurement was divided into two sections, 40 to 500°C and 500 to 800°C, to confirm the reduction. The results of the ignition loss analysis and the thermogravimetric analysis were confirmed to be almost similar. However, the temperature of the ignition loss analysis method was 950°C, which was 150°C higher than the 800°C of the thermogravimetric method, so the measured weight loss was 3 to 4% higher by the ignition loss method. In addition, the tendency of the CO₂ content to increase as the reaction time becomes longer was similarly confirmed. Third, the results of the wet titration method using the carbon dioxide quantification device were found to be significantly lower than those of the weight loss methods. Therefore, based on the results obtained through the above three analysis methods, we will establish a method to analyze the accurate amount of CO₂. Acknowledgements: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (No. 20152010201850).
Keywords: carbon capture and utilization, CCU, CO2, CO2 capture products, analysis method
Procedia PDF Downloads 215
6472 Railway Accidents: Using the Global Railway Accident Database and Evaluation for Risk Analysis
Authors: Mathias Linden, André Schneider, Harald F. O. von Korflesch
Abstract:
The risk of train accidents is an ongoing concern for railway organizations, governments, insurance companies and other dependent sectors. Safety technologies are installed to reduce and prevent potential damage from train accidents. Since the safety budget of railway organizations is limited, it is necessary not only to achieve high availability and high safety standards but also to be cost-effective. Therefore, an economic assessment of safety technologies is fundamental to creating an accurate risk analysis. In order to conduct an economic assessment of a railway safety technology and a quantification of the costs of accident causes, the Global Railway Accident Database & Evaluation (GRADE) has been developed. The aim of this paper is to describe the structure of this accident database and to show how it can be used for risk analyses. A number of risk analysis methods, such as the probabilistic safety assessment method (PSA), were used to demonstrate the database's different possibilities for risk analysis. In conclusion, it can be noted that these analyses would not be as accurate without GRADE, as the information gathered in the accident database was not available in this form before. Our findings are relevant for railway operators, safety technology suppliers, insurers, governments and other concerned railway organizations.
Keywords: accident causes, accident costs, accident database, global railway accident database & evaluation, GRADE, probabilistic safety assessment, PSA, railway accidents, risk analysis
Procedia PDF Downloads 357
6471 A Study of Population Growth Models and Future Population of India
Authors: Sheena K. J., Jyoti Badge, Sayed Mohammed Zeeshan
Abstract:
India is the second most populous country in the world, just behind China, and is projected to take first place by next year. The Indian population has grown at a remarkably higher rate than that of other countries over the past 20 years. Many scientists and demographers have formulated various models of population growth in order to study and predict the future population. Some of these models are the Fibonacci population growth model, the exponential growth model, the logistic growth model, the Lotka-Volterra model, etc. These models have been effective in the past, to an extent, in predicting the population. However, a detailed comparative study between population models is essential to arrive at a more accurate one. This research study therefore analyzes and compares the two population models under consideration, the exponential and logistic growth models, thereby identifying the more effective one. Using the census data of 2011, the approximate populations for 2016 to 2031 are calculated for 20 Indian states using both models, compared, and recorded against the actual population. On comparing the results of both models, it is found that the logistic population model is more accurate than the exponential model, and using this model, we can predict the future population in a more effective way. This will give researchers insight into the effective models of population and how effective these population models are in predicting the future population.
Keywords: population growth, population models, exponential model, logistic model, fibonacci model, lotka-volterra model, future population prediction, demographers
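The abstract does not reproduce the model equations; as a minimal sketch, the projections can be computed from the standard closed forms P(t) = P0·e^(rt) (exponential) and P(t) = K / (1 + ((K − P0)/P0)·e^(−rt)) (logistic), with the initial population, growth rate, and carrying capacity below chosen purely for illustration.

```python
import math

# Minimal sketch of the two models compared in the paper.
# P0, r and K below are ILLUSTRATIVE values, not the paper's estimates.
P0 = 60.0   # assumed 2011 state population, millions
r = 0.015   # assumed annual growth rate
K = 120.0   # assumed carrying capacity, millions (logistic only)

def exponential(t):
    """Unbounded growth: P(t) = P0 * e^(r t)."""
    return P0 * math.exp(r * t)

def logistic(t):
    """Growth saturating at carrying capacity K: P(t) = K / (1 + A e^(-r t))."""
    A = (K - P0) / P0
    return K / (1.0 + A * math.exp(-r * t))

for year in (2016, 2021, 2026, 2031):
    t = year - 2011
    print(year, round(exponential(t), 2), round(logistic(t), 2))
```

The logistic curve bends toward K as t grows, which is why it tends to track saturating real populations better than the unbounded exponential.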
Procedia PDF Downloads 122
6470 A Finite Element/Finite Volume Method for Dam-Break Flows over Deformable Beds
Authors: Alia Alghosoun, Ashraf Osman, Mohammed Seaid
Abstract:
A coupled two-layer finite volume/finite element method is proposed for solving the dam-break flow problem over deformable beds. The governing equations consist of the well-balanced two-layer shallow water equations for the water flow and a linear elastic model for the bed deformations. Deformations in the topography can be caused by a sudden localized force or simply by a class of sliding displacements of the bathymetry. This deformation in the bed is a source of perturbations on the water surface, generating water waves which propagate with different amplitudes and frequencies. Coupling conditions at the interface are also investigated in the current study, and a two-mesh procedure is proposed for the transfer of information through the interface. In the present work a new procedure is implemented at the soil-water interface using the finite element and two-layer finite volume meshes with a conservative distribution of the forces at their intersections. The finite element method employs quadratic elements on an unstructured triangular mesh, and the finite volume method uses the Rusanov scheme to reconstruct the numerical fluxes. The numerical coupled method is highly efficient, accurate, well balanced, and it can handle complex geometries as well as rapidly varying flows. Numerical results are presented for several test examples of dam-break flows over deformable beds. A mesh convergence study is performed for both methods; the overall model provides new insight into the problem at minimal computational cost.
Keywords: dam-break flows, deformable beds, finite element method, finite volume method, hybrid techniques, linear elasticity, shallow water equations
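The paper's two-layer, coupled implementation is not reproduced here; as a minimal single-layer sketch, the Rusanov (local Lax-Friedrichs) flux for the 1D shallow water equations can be written as follows, with g the gravitational acceleration and the wave-speed bound taken as |u| + sqrt(g·h).

```python
import numpy as np

G = 9.81  # gravitational acceleration

def swe_flux(U):
    """Physical flux of the 1D shallow water equations, U = (h, h*u)."""
    h, hu = U
    u = hu / h
    return np.array([hu, hu * u + 0.5 * G * h * h])

def rusanov_flux(UL, UR):
    """Rusanov (local Lax-Friedrichs) numerical flux between two states.

    Single-layer sketch only; the paper couples TWO shallow water layers
    with a linear-elastic bed, which is not reproduced here.
    """
    hL, uL = UL[0], UL[1] / UL[0]
    hR, uR = UR[0], UR[1] / UR[0]
    # local bound on the fastest wave speed on either side of the interface
    s = max(abs(uL) + np.sqrt(G * hL), abs(uR) + np.sqrt(G * hR))
    return 0.5 * (swe_flux(UL) + swe_flux(UR)) - 0.5 * s * (UR - UL)

# example: dam-break interface, deep water at rest on the left, shallow right
print(rusanov_flux(np.array([2.0, 0.0]), np.array([1.0, 0.0])))
```

The dissipative term proportional to s is what keeps the scheme stable across the sharp dam-break front at the cost of some smearing.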
Procedia PDF Downloads 178
6469 Childhood Apraxia of Speech and Autism: Interaction Influences and Treatment
Authors: Elad Vashdi
Abstract:
Speech deficits are commonly found among children diagnosed with autism, both in the clinical field and, more recently, in research. One of the DSM-V criteria suggests a speech delay (delay in, or total lack of, the development of spoken language) but doesn't explain its cause. A common perception among professionals and families is that the inability to talk results from the autism. Autism is a name for a syndrome which merely describes a phenomenon and is defined behaviorally. Since it is not yet based on a physiological gold standard, one cannot conclude the nature of a deficit from the name of the syndrome. A broad retrospective study (n=270) which included children with motor speech difficulties was conducted in Israel. The study analyzed entry evaluations in a private clinic during the years 2006-2013. A high percentage of children diagnosed with autism (60%) was found. This result demonstrates the strong association between autism and motor speech problems. It also supports recent research findings on the occurrence of childhood apraxia of speech (CAS) among children with ASD. Only a small percentage of the participants in this study (10%) were diagnosed with CAS, even though their verbal deficits fitted well the guidelines for CAS diagnosis set by ASHA in 2007. This fact raises questions regarding the diagnostic procedure in Israel. The understanding that CAS may frequently co-occur with autism and can have a remarkable influence on the course of early development should be a guiding tool within the diagnostic procedure. CAS can explain the nature of the speech problem among some autistic children and guide the treatment in a more accurate way. Calculating the prevalence of CAS to include the comorbidity with ASD reveals new numbers and suggests treating the CAS population differently.
Keywords: childhood apraxia of speech, Autism, treatment, speech
Procedia PDF Downloads 274
6468 Application of Grey Theory in the Forecast of Facility Maintenance Hours for Office Building Tenants and Public Areas
Authors: Yen Chia-Ju, Cheng Ding-Ruei
Abstract:
This study took an office building as its case subject and used grey theory to explore the responsive work order repair requests for facilities and equipment in offices and public areas, with the purpose of providing future office building owners, executive managers, property management companies, and mechanical and electrical companies a reference for selecting and assessing forecast models. The important conclusions of this study are summarized as follows: 1. Grey relational analysis ranks the importance of repair numbers for six categories, namely power systems, building systems, water systems, air conditioning systems, fire systems and manpower dispatch, in that order. In terms of facilities maintenance importance, the order is power systems, building systems, water systems, air conditioning systems, manpower dispatch and fire systems. 2. The GM(1,N) and regression methods took maintenance hours as the dependent variable and repair number, leased area and tenant number as independent variables, and conducted single-month forecasts based on 12 monthly data points from January to December 2011. From the verification results, the mean absolute error and average accuracy of GM(1,N) were 6.41% and 93.59%; the mean absolute error and average accuracy of the regression model were 4.66% and 95.34%, indicating that both have highly accurate forecast capability.
Keywords: grey theory, forecast model, Taipei 101, office buildings, property management, facilities, equipment
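The multivariate GM(1,N) used in the paper is not reproduced here; as a minimal sketch of the grey-modelling idea it builds on, the univariate GM(1,1) below accumulates the series, fits the development coefficient by least squares, and forecasts the next values (the input series is invented for illustration).

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Minimal univariate GM(1,1) sketch (the paper uses GM(1,N)).

    x0 : 1-D sequence of positive observations.
    Returns forecasts for the next `steps` periods.
    """
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                        # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])             # mean sequence of x1
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]   # grey parameters by least squares
    def x1_hat(k):                            # fitted accumulated series at index k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    # de-accumulate to recover forecasts of the original series
    return [x1_hat(n + s - 1) - x1_hat(n + s - 2) for s in range(1, steps + 1)]

# invented monthly maintenance-hours series, for illustration only
hours = [120, 132, 128, 141, 150, 147, 158, 163, 171, 169, 180, 188]
print(gm11_forecast(hours, steps=2))
```

GM(1,N) extends the same accumulate-fit-deaccumulate scheme with N−1 driving series (repair number, leased area, tenant number in the paper).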
Procedia PDF Downloads 443
6467 Effects of the Food Colour Erythrosine on Thyroid Gland Function in Experimental Rats
Authors: Maha M.Saber, Eitedal Daoud, Moetazza M. Alshafei, Lobna M. Abd El-Latif
Abstract:
Children in the third world consume many red-colored food products, such as sweets and soft drinks, without knowing their effect on health or the type of color used in these products. Erythrosine (ER, FD&C Red No. 3) is one of the most common coloring agents used in these products and in coloring cherries in compotes. The possible adverse effect of erythrosine on thyroid gland function was investigated in albino rats. Forty-five adult male albino rats were divided into three groups: two groups received ER orally in doses of 68 and 136 mg/kg, respectively, and the third group received distilled water for three months. Sections of thyroid glands were examined by histopathological and morphometric analysis and for MIB-1 Ki67 (a proliferative marker). Serum concentrations of triiodothyronine (T3), thyroxine (T4) and thyrotrophin (TSH) were determined. Results showed histological changes in the two treatment groups versus the control group: the 68 mg/kg dose group showed vacuolation of the cytoplasm of follicular cells and pleomorphism of their nuclei, while the 136 mg/kg group showed congestion of blood vessels, hyperplasia of the interstitial cells and increased multilayering of the follicular cells. There was a highly significant increase in the mean area of the thyroid follicles in both treated groups compared to the control group. Erythrosine-treated groups showed a very highly significant decrease (P < 0.001) in serum concentrations of T3 and T4, while TSH showed a very highly significant increase versus control.
Keywords: erythrosine, thyroid, morphometrics, proliferative marker
Procedia PDF Downloads 405
6466 Video Text Information Detection and Localization in Lecture Videos Using Moments
Authors: Belkacem Soundes, Guezouli Larbi
Abstract:
This paper presents a robust and accurate method for text detection and localization in lecture videos. Frame regions are classified into text or background based on visual feature analysis. However, lecture video shows significant degradation, mainly related to acquisition conditions, camera motion and environmental changes, resulting in low-quality video and hence affecting the efficiency of feature extraction and description. Moreover, traditional text detection methods cannot be directly applied to lecture videos. Therefore, robust feature extraction methods dedicated to this specific video genre are required for robust and accurate text detection and extraction. The method consists of a three-step process: slide region detection and segmentation, feature extraction, and non-text filtering. For robust and effective feature extraction, moment functions are used. Two distinct types of moments are used, orthogonal and non-orthogonal: for the orthogonal case, both Zernike and pseudo-Zernike moments are used, whereas Hu moments are used for the non-orthogonal case. Their expressivity and description efficiency are given and discussed. The proposed approach shows that, in general, orthogonal moments show higher accuracy in comparison to non-orthogonal ones, and pseudo-Zernike moments are more effective than Zernike moments, with better computation time.
Keywords: text detection, text localization, lecture videos, pseudo zernike moments
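The paper's full pipeline is not shown; as a minimal sketch of the non-orthogonal descriptor it mentions, OpenCV can compute the seven Hu moments of a candidate region directly (the synthetic region below is an assumed stand-in for the paper's segmented slide regions).

```python
import cv2
import numpy as np

def hu_descriptor(region):
    """Seven Hu moment invariants of a grayscale region (non-orthogonal).

    The log transform is a common normalization to compress the dynamic
    range of the raw invariants; the paper may normalize differently.
    """
    m = cv2.moments(region)
    hu = cv2.HuMoments(m).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# illustrative synthetic "frame region": white text-like glyph on black
region = np.zeros((64, 64), dtype=np.uint8)
cv2.putText(region, "A", (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 255, 3)
print(hu_descriptor(region))
```

Because Hu moments are invariant to translation, scale, and rotation, the same glyph shifted or resized within a slide yields a nearly identical descriptor, which is what makes them usable for text/non-text classification.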
Procedia PDF Downloads 149
6465 Combining Patients Pain Scores Reports with Functionality Scales in Chronic Low Back Pain Patients
Authors: Ivana Knezevic, Kenneth D. Candido, N. Nick Knezevic
Abstract:
Background: While pain intensity scales remain a generally accepted assessment tool, and the numeric pain rating score is highly subjective, we nevertheless rely on them to make judgments about treatment effects. Misinterpretation of pain can lead practitioners to underestimate or overestimate the patient's medical condition. The purpose of this study was to analyze how the numeric rating pain scores given by patients with low back pain correlate with their functional activity levels. Methods: We included 100 consecutive patients with radicular low back pain (LBP) after Institutional Review Board (IRB) approval. Pain scores, numeric rating scale (NRS) responses at rest and in movement, and Oswestry Disability Index (ODI) questionnaire answers were collected 10 times over 12 months. The ODI questionnaire targets a patient's activities and physical limitations as well as a patient's ability to manage stationary everyday duties. Statistical analysis was performed using SPSS software version 20. Results: The average duration of LBP was 14±22 months at the beginning of the study. All patients included in the study were between 24 and 78 years old (average 48.85±14); 56% were women and 44% men. Differences between ODI and pain scores in the range from -10% to +10% were considered "normal". Discrepancies in pain scores were graded as mild between -30% and -11% or +11% and +30%; moderate between -50% and -31% or +31% and +50%; and severe if differences were more than -50% or +50%. Our data showed that pain scores at rest correlated well with ODI in 65% of patients. In 30% of patients mild discrepancies were present (negative in 21% and positive in 9%), 4% of patients had moderate and 1% severe discrepancies. "Negative discrepancy" means that patients graded their pain scores much higher than their functional ability, and most likely exaggerated their pain. "Positive discrepancy" means that patients graded their pain scores much lower than their functional ability, and most likely underrated their pain. Comparisons between ODI and pain scores during movement showed normal correlation in only 39% of patients. Mild discrepancies were present in 42% (negative in 39% and positive in 3%); moderate in 14% (all negative), and severe in 5% (all negative) of patients. Thus 58% unknowingly exaggerated their pain during movement. Inconsistencies were equal in male and female patients (p=0.606 and p=0.928). Our results showed a negative correlation between patients' satisfaction and the degree of reported pain inconsistency. Furthermore, patients taking opioids showed more discrepancies in reporting pain intensity scores than did patients taking non-opioid analgesics or not taking medications for LBP (p=0.038). There was a highly statistically significant correlation between morphine equivalent doses and the level of discrepancy (p<0.0001). Conclusion: We put emphasis on patient education in pain evaluation as a vital step toward accurate pain level reporting, and we showed a direct correlation with patients' satisfaction. Furthermore, we must identify other parameters to define our patients' chronic pain conditions, such as functionality scales, quality of life questionnaires, etc., and should move away from an overly simplistic subjective rating scale.
Keywords: pain score, functionality scales, low back pain, lumbar
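As a minimal sketch of the grading rule described above, the difference between a normalized pain score and the ODI percentage can be bucketed as follows (scaling the 0-10 NRS to a percentage by multiplying by 10 is an assumption; the paper does not spell out its exact computation).

```python
def grade_discrepancy(nrs, odi_percent):
    """Grade the pain-score/function discrepancy per the ranges in the abstract.

    nrs         : 0-10 numeric rating scale score
    odi_percent : Oswestry Disability Index, 0-100
    Scaling NRS to a percentage by *10 is an illustrative assumption.
    """
    diff = odi_percent - nrs * 10.0   # negative: pain rated above function loss
    mag = abs(diff)
    if mag <= 10:
        grade = "normal"
    elif mag <= 30:
        grade = "mild"
    elif mag <= 50:
        grade = "moderate"
    else:
        grade = "severe"
    sign = "positive" if diff > 10 else ("negative" if diff < -10 else "")
    return (sign + " " + grade).strip()

# pain rated well above functional limitation -> likely exaggeration
print(grade_discrepancy(nrs=8, odi_percent=30))   # -> "negative moderate"
```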
Procedia PDF Downloads 233
6464 Potentials for Learning History through Role-Playing in Virtual Reality: An Exploratory Study on Role-Playing on a Virtual Heritage Site
Authors: Danzhao Cheng, Eugene Ch'ng
Abstract:
Virtual reality technologies can reconstruct cultural heritage objects and sites to a high level of realism. Concentrating mostly on documenting authentic data and accurate representations of tangible content, current virtual heritage is limited to accumulating visually presented objects. Such constructions, however, are fragmentary and may not convey the inherent significance of heritage in a meaningful way. In order to contextualise fragmentary historical content so that history can be told, one strategy is to create a guided narrative via role-playing. Such an approach can strengthen the logical connections between cultural elements and facilitate creative synthesis within the virtual world. This project successfully reconstructed the Ningbo Sanjiangkou site of the Yuan Dynasty in VR, combining VR technology with a role-playing game approach. The results with 80 pairs of participants suggest that VR role-playing can be beneficial in a number of ways. Firstly, it creates thematic interactivity which encourages users to explore the virtual heritage in a more entertaining way with task-oriented goals. Secondly, the experience becomes highly engaging since users can interpret a historical context through the perspective of specific roles that existed in past societies. Thirdly, personalisation allows open-ended sequences of the expedition, reinforcing users' acquisition of procedural knowledge relative to the cultural domain. To sum up, role-playing in VR poses great potential for experiential learning as it allows users to interpret a historical context in a more entertaining way.
Keywords: experiential learning, maritime silk road, role-playing, virtual heritage, virtual reality
Procedia PDF Downloads 163
6463 Modeling of Tool Flank Wear in Finish Hard Turning of AISI D2 Using Genetic Programming
Authors: V. Pourmostaghimi, M. Zadshakoyan
Abstract:
The efficiency and productivity of finish hard turning can be enhanced impressively by utilizing accurate predictive models for cutting tool wear. The ability of genetic programming to produce an accurate analytical model is a notable characteristic which makes it more applicable than other predictive modeling methods. In this paper, a genetic equation for modeling tool flank wear is developed using experimentally measured flank wear values and genetic programming during finish turning of hardened AISI D2. A series of tests was conducted over a range of cutting parameters, and the values of tool flank wear were measured. On the basis of the obtained results, a genetic model relating the cutting parameters to tool flank wear was extracted. The accuracy of the genetically obtained model was assessed using two statistical measures, the root mean square error (RMSE) and the coefficient of determination (R²). Evaluation results revealed that the genetic model predicted flank wear over the study area accurately (R² = 0.9902 and RMSE = 0.0102). These results allow concluding that the proposed genetic equation corresponds well with experimental data and can be implemented in real industrial applications.
Keywords: cutting parameters, flank wear, genetic programming, hard turning
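The evolved equation itself is not given in the abstract; as a minimal sketch of the two reported assessment measures, RMSE and R² can be computed from measured and predicted wear values as follows (the arrays here are invented for illustration).

```python
import numpy as np

def rmse(measured, predicted):
    """Root mean square error between measured and predicted values."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    return np.sqrt(np.mean((measured - predicted) ** 2))

def r_squared(measured, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# invented flank wear values (mm), for illustration only
vb_measured = [0.08, 0.11, 0.15, 0.18, 0.22, 0.27]
vb_predicted = [0.09, 0.10, 0.16, 0.18, 0.21, 0.28]
print(rmse(vb_measured, vb_predicted), r_squared(vb_measured, vb_predicted))
```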
Procedia PDF Downloads 176
6462 Trading off Accuracy for Speed in Powerdrill
Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica
Abstract:
In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration of logs data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization, we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 when compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of sampling on accuracy and propose a simple heuristic for annotating individual result values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries this effectively brings down the 95th latency percentile from 30 to 4 seconds.
Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries
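The paper's actual heuristic is not specified in the abstract; the sketch below illustrates the general idea of annotating a sampled group-by count as accurate or not, using a relative-standard-error cutoff that is entirely an assumption.

```python
import math
import random

def sampled_counts(rows, rate, rse_cutoff=0.05):
    """Estimate per-group counts from a uniform sample and flag accuracy.

    A group's scaled count n/rate is flagged 'accurate' when its relative
    standard error, roughly 1/sqrt(n) for uniform sampling, is below the
    cutoff. Both the estimator and the cutoff are illustrative assumptions,
    not PowerDrill's production heuristic.
    """
    sample = [r for r in rows if random.random() < rate]
    counts = {}
    for group in sample:
        counts[group] = counts.get(group, 0) + 1
    return {
        g: (n / rate, 1.0 / math.sqrt(n) <= rse_cutoff)  # (estimate, accurate?)
        for g, n in counts.items()
    }

# illustrative skewed "log records": one hot group, one rare group
rows = ["search"] * 100_000 + ["upload"] * 50
print(sampled_counts(rows, rate=0.01))
```

On skewed data like this, the hot group's estimate is flagged accurate while the rare group's is not, which is exactly the signal a user needs when reading intermediate results.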
Procedia PDF Downloads 259
6461 An Electrically Small Silver Ink Printed FR4 Antenna for RF Transceiver Chip CC1101
Authors: F. Majeed, D. V. Thiel, M. Shahpari
Abstract:
An electrically small meander line antenna is designed for impedance matching with the RF transceiver chip CC1101. The design provides the flexibility of tuning the reactance of the antenna over a wide range of values, from highly capacitive to highly inductive. The antenna was printed with silver ink on an FR4 substrate using a screen printing process. The antenna impedance was perfectly matched to the CC1101 at 433 MHz. The measured radiation efficiency of the antenna was 81.3% at resonance. The 3 dB and 10 dB fractional bandwidths of the antenna were 14.5% and 4.78%, respectively. The read range of the antenna was compared with that of a copper wire monopole antenna over a distance of five meters. The antenna, with its perfect impedance match to the CC1101, shows an improved read range compared to the monopole antenna over the specified distance.
Keywords: meander line antenna, RFID, silver ink printing, impedance matching
Procedia PDF Downloads 273
6460 [Keynote Talk]: Mathematical and Numerical Modelling of the Cardiovascular System: Macroscale, Mesoscale and Microscale Applications
Authors: Aymen Laadhari
Abstract:
The cardiovascular system is centered on the heart and is characterized by a very complex structure with different physical scales in space (e.g. micrometers for erythrocytes and centimeters for organs) and time (e.g. milliseconds for human brain activity and several years for the development of some pathologies). The development and numerical implementation of mathematical models of the cardiovascular system is a tremendously challenging topic at the theoretical and computational levels, and has consequently attracted growing interest over the past decade. Accurate computational investigation, in both healthy and pathological cases, of processes related to the functioning of the human cardiovascular system holds great potential for tackling several problems of clinical relevance and for improving the diagnosis of specific diseases. In this talk, we focus on the specific task of simulating three particular phenomena related to the cardiovascular system on the macroscopic, mesoscopic and microscopic scales, respectively. Namely, we develop numerical methodologies tailored to the simulation of (i) the haemodynamics (i.e., fluid mechanics of blood) in the aorta and sinus of Valsalva interacting with highly deformable thin leaflets, (ii) the hyperelastic anisotropic behaviour of cardiomyocytes and the influence of calcium concentrations on the contraction of single cells, and (iii) the dynamics of red blood cells in the microvasculature. For each problem, we present an appropriate fully Eulerian finite element methodology. We report several numerical examples to address in detail the relevance of the mathematical models in terms of physiological meaning and to illustrate the accuracy and efficiency of the numerical methods.
Keywords: finite element method, cardiovascular system, Eulerian framework, haemodynamics, heart valve, cardiomyocyte, red blood cell
Procedia PDF Downloads 251
6459 Signature Verification System for a Banking Business Process Management
Authors: A. Rahaf, S. Liyakathunsia
Abstract:
In today's world, banks face unprecedented operational pressure that tests the efficiency, effectiveness, and agility of their business processes. In a typical banking process, a person's authorization is usually based on his or her signature on almost all transactions. Signature verification is considered one of the most significant pieces of information needed for any bank document processing, and banks usually use it to authenticate the identity of individuals. In this paper, a business process model is proposed in order to increase the quality of the verification process and to reduce the time and resources needed. In order to understand the current process, a survey was conducted and distributed among bank employees. After analyzing the survey, a process model was created using the Bizagi modeler, which helps in simulating the process after assigning time and cost to it. The outcomes show that automation of the signature verification process is highly recommended for a banking business process.
Keywords: business process management, process modeling, quality, Signature Verification
Procedia PDF Downloads 421
6458 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would help in taking the necessary actions to prevent dengue outbreaks. An accurate prediction of dengue epidemic seasons gives local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003-2011 and validated the models using data collected between January-September 2012. The results of this study revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasons of dengue incidence and confirmed the existence of dengue fever cases in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach for predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
Keywords: SARIMA, time series model, dengue cases, Thailand
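The paper's fitted coefficients are not given; as a minimal sketch, the reported SARIMA(1,1,0)(1,2,1)12 specification can be fitted and used for one-step-ahead forecasting with statsmodels (the synthetic series below stands in for the actual monthly case counts).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# synthetic stand-in for monthly dengue counts, 2003-2011 (108 months)
rng = np.random.default_rng(0)
t = np.arange(108)
cases = 200 + 80 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 20, 108)
y = pd.Series(cases, index=pd.date_range("2003-01", periods=108, freq="MS"))

# the SARIMA(1,1,0)(1,2,1)12 specification reported in the abstract
model = SARIMAX(y, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12))
fit = model.fit(disp=False)

# one-step-ahead forecast, the approach the paper found most accurate
print(fit.forecast(steps=1))
```

Re-fitting and forecasting one month at a time, rather than issuing twelve steps at once, is what the abstract refers to as the one-step approach.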
Procedia PDF Downloads 356
6457 Driver Take-Over Time When Resuming Control from Highly Automated Driving in Truck Platooning Scenarios
Authors: Bo Zhang, Ellen S. Wilschut, Dehlia M. C. Willemsen, Marieke H. Martens
Abstract:
With the rapid development of intelligent transportation systems, automated platooning of trucks is drawing increasing interest for its beneficial effects on safety, energy consumption and traffic flow efficiency. Nevertheless, one major challenge lies in the safe transition of control from the automated system back to the human driver, especially after a long period of inattentiveness during highly automated driving. In this study, we investigated driver take-over time after a system-initiated request to leave the platooning system Virtual Tow Bar in a non-critical scenario. 22 professional truck drivers participated in the truck driving simulator experiment, and each was instructed to drive under three experimental conditions before the presentation of the take-over request (TOR): driver ready (drivers were instructed to monitor the road constantly), driver not ready (drivers were provided with a tablet) and eyes shut. The results showed significantly longer take-over times in both the driver-not-ready and eyes-shut conditions compared with the driver-ready condition. Further analysis revealed hand movement time as the main factor causing the long response time in the driver-not-ready condition, while in the eyes-shut condition gaze reaction time also largely influenced the total take-over time. In addition to the differences in means, large individual differences were found, especially in the two non-attentive driver conditions. The study concludes with the importance of a personalized driver readiness predictor for a safe transition.
Keywords: driving simulation, highly automated driving, take-over time, transition of control, truck platooning
Procedia PDF Downloads 251
6456 Value Chain Analysis of the Seabass Industry in Doumen
Authors: Tiantian Ma
Abstract:
The district of Doumen, Zhuhai, has a sophisticated seabass value chain. However, unlike typical Global Value Chain (GVC) industries, the seabass value chain in Doumen is highly domestic in terms of both production and consumption. Still, since this highly industrialized and capital-intensive industry involves many off-farm segments both upstream and downstream, this paper utilizes the method of value chain analysis. Specifically, the paper concentrates on two research goals: 1) mapping the seabass value chain, i.e., identifying the actors in hatchery, fish feed, fishponds, processing, logistics, and distribution; and 2) a SWOT analysis of the seabass industry in Doumen, covering the inadequacy of waste disposal, the marketing strategy, and the supportive role of the government, among other factors. In general, the seabass industry in Doumen is a sophisticated but not yet comprehensive value chain. It has achieved a lot in industrializing aqua-food products and fostering development, but improvements could still be carried out, such as upholding environmental sustainability and promoting the brand better.
Keywords: agricultural value chain, fish farming, regional development, SWOT analysis, value chain mapping
Procedia PDF Downloads 150
6455 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar
Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo
Abstract:
The future of solar energy in Qatar is evolving steadily. Hence, high-quality spatial solar radiation data is of the utmost importance for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data is derived from physical and statistical models, while ground data is collected by solar radiation measurement stations. Ground data is of high quality; however, it is limited to distributed point locations, with high installation and maintenance costs for the ground stations. On the other hand, satellite solar radiation data is continuous and available across geographical locations, but it is relatively less accurate than ground data. To exploit the advantages of both, a product has been developed here which provides spatial continuity and higher accuracy than either dataset alone. The popular satellite database NSRDB (National Solar Radiation Database; PSM V3 model, spatial resolution 4 km) is chosen here for merging with ground-measured solar radiation measurements in Qatar. The spatial distribution of ground solar radiation measurement stations is comprehensive in Qatar, with a network of 13 ground stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data is used for error analysis. Normalized root mean square error (NRMSE) values of 3.31%, 6.53%, and 6.63% were observed for October, November, and December 2019, respectively, when comparing in-situ and NSRDB data. The method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS, ESRI. The workflow of the algorithm is based on the combination of regression and kriging methods. A regression model (ordinary least squares, OLS) is fitted between the ground and NSRDB data points. A semi-variogram model is fitted to the experimental semi-variogram obtained from the residuals. The kriging residuals obtained after fitting the semi-variogram model are added to the predicted values from the regression model to obtain the final predicted values. The NRMSE values obtained after merging are 1.84%, 1.28%, and 1.81% for October, November, and December 2019, respectively. One more explanatory variable, the ground elevation, has been incorporated into the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps created after merging show NRMSE values of 1.24%, 1.28%, and 1.28% for October, November, and December 2019, respectively. The proposed merging method has proven to be highly accurate. An additional method is also proposed to generate calibrated maps using the regression and kriging model, and further to use the calibrated model to generate solar radiation maps from the explanatory variables alone when not enough historical ground data is available for long-term analysis. The NRMSE values obtained after comparison of the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019, respectively.
Keywords: global horizontal irradiation, GIS, empirical bayesian kriging regression prediction, NSRDB
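The paper uses ArcGIS's Empirical Bayesian Kriging Regression Prediction tool, which is not reproduced here; as a rough open-source sketch of the same regression-plus-residual-kriging idea, one could combine scikit-learn and PyKrige as below (the station coordinates and values are invented, and the variogram settings are assumptions).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

# invented ground stations: lon, lat, satellite GHI, elevation, ground GHI
rng = np.random.default_rng(1)
lon, lat = rng.uniform(50.8, 51.6, 13), rng.uniform(24.5, 26.2, 13)
sat_ghi = rng.uniform(5.5, 6.5, 13)           # kWh/m^2/day, illustrative
elev = rng.uniform(0, 100, 13)                # m, illustrative
ground_ghi = sat_ghi * 0.97 + 0.001 * elev + rng.normal(0, 0.05, 13)

# Step 1: OLS regression of ground GHI on satellite GHI and elevation
X = np.column_stack([sat_ghi, elev])
ols = LinearRegression().fit(X, ground_ghi)
residuals = ground_ghi - ols.predict(X)

# Step 2: ordinary kriging of the regression residuals (assumed variogram)
ok = OrdinaryKriging(lon, lat, residuals, variogram_model="spherical")

# Step 3: merged prediction at a new point = regression + kriged residual
new_lon, new_lat = np.array([51.2]), np.array([25.3])
new_X = np.array([[6.0, 30.0]])               # assumed satellite GHI, elevation
res_pred, _ = ok.execute("points", new_lon, new_lat)
print(ols.predict(new_X)[0] + res_pred[0])
```

The regression carries the large-scale satellite signal while the kriged residual corrects the local bias near stations, which is what closes the gap between the 3-7% and 1-2% NRMSE figures reported above.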
Procedia PDF Downloads 88
6454 An Implementation of Fuzzy Logic Technique for Prediction of the Power Transformer Faults
Authors: Omar M. Elmabrouk., Roaa Y. Taha., Najat M. Ebrahim, Sabbreen A. Mohammed
Abstract:
Power transformers are the most crucial part of the electrical power system and the distribution and transmission grid. They are maintained using a predictive, or condition-based, maintenance approach in which the transformer's condition is diagnosed based on Dissolved Gas Analysis (DGA). There are five main methods utilized for analyzing these gases: the International Electrotechnical Commission (IEC) gas ratio, Key Gas, Rogers gas ratio, Doernenburg, and Duval Triangle methods. Moreover, given the importance of transformers, an accurate technique is needed to diagnose and hence predict the transformer's condition, with the main objective of avoiding transformer faults and thus safeguarding the electrical power system and the distribution and transmission grid. In this paper, DGA was applied to data collected from the transformer records available at the General Electricity Company of Libya (GECOL), located in Benghazi, Libya. The fuzzy logic (FL) technique was implemented as a diagnostic approach based on the IEC gas ratio method. The FL technique gave better results and proved to be an accurate prediction technique for power transformer faults. This technique should also be of considerable interest to readers and researchers in the areas of FL mathematics and power transformers.
Keywords: dissolved gas-in-oil analysis, fuzzy logic, power transformer, prediction
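The paper's fuzzy membership functions and rule base are not given in the abstract; as a crisp, simplified sketch of the IEC gas-ratio inputs that such a fuzzy system would fuzzify, the three standard ratios can be computed from gas concentrations as follows (the thresholds in the toy rule are illustrative, not the full IEC 60599 table).

```python
def iec_ratios(gases):
    """Three IEC gas ratios from dissolved gas concentrations (ppm).

    A fuzzy implementation would replace the crisp comparisons below with
    membership functions over each ratio; this sketch is crisp, and the
    single rule shown is only one illustrative row of the IEC code table.
    """
    r1 = gases["C2H2"] / gases["C2H4"]   # acetylene / ethylene
    r2 = gases["CH4"] / gases["H2"]      # methane / hydrogen
    r3 = gases["C2H4"] / gases["C2H6"]   # ethylene / ethane
    return r1, r2, r3

sample = {"H2": 100.0, "CH4": 120.0, "C2H2": 1.0, "C2H4": 50.0, "C2H6": 60.0}
r1, r2, r3 = iec_ratios(sample)
# illustrative rule: low r1, r2 > 1, r3 < 1 suggests a low-temperature thermal fault
if r1 < 0.1 and r2 > 1 and r3 < 1:
    print("suggests low-temperature thermal fault", (r1, r2, r3))
else:
    print("ratios:", (r1, r2, r3))
```

The advantage of the fuzzy version is precisely at the boundaries: a ratio sitting near a threshold contributes partial membership to two fault classes instead of flipping the diagnosis abruptly.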
Procedia PDF Downloads 142
6453 Development of Bioactive Medical Textiles by Immobilizing Nanoparticles at Cotton Fabric
Authors: Munir Ashraf, Shagufta Riaz
Abstract:
Personal protective equipment (PPE) and bioactive textiles are highly important for protecting front-line hospital workers, patients, and the general population from highly infectious diseases. This became even more critical in the wake of the COVID-19 outbreak. Most medical textiles are inactive against various viruses and bacteria; hence they need frequent washing to avoid spreading microorganisms. According to a survey conducted by the World Health Organization, more than 500 million people have been infected in hospitals, and more than 13 million have died from these hospital-acquired diseases. Commercially available PPE, though effective against the penetration of pathogens and at killing bacteria, is not breathable and not active against different viruses. Therefore, there was a great need to develop textiles that are not only effective against bacteria, fungi, and viruses but also comfortable for medical personnel and patients. In the present study, waterproof, breathable, and biologically active textiles were developed using antiviral and antibacterial nanomaterials. Nanomaterials such as TiO₂, ZnO, Cu, and Ag were immobilized at the surface of cotton fabric by using different silane coupling agents and electroless deposition, such that they retained their functionality even after 30 industrial laundering cycles. Afterwards, the treated fabrics were coated with a waterproof breathable film to prevent the permeation of liquid droplets and of any particle or microorganism larger than 80 nm. The developed cotton fabric was highly active against bacteria and viruses. The good durability of the nanomaterials at the cotton surface after several industrial washing cycles makes this fabric an ideal candidate for bioactive textiles used in the medical field.
Keywords: antibacterial, antiviral, cotton, durable
Procedia PDF Downloads 177
6452 Multi-Agent System Based Solution for Operating Agile and Customizable Micro Manufacturing Systems
Authors: Dylan Santos De Pinho, Arnaud Gay De Combes, Matthieu Steuhlet, Claude Jeannerat, Nabil Ouerhani
Abstract:
The Industry 4.0 initiative has been launched to address the huge challenges related to ever-smaller batch sizes. The end-user need for highly customized products requires highly adaptive production systems in order to maintain the efficiency of shop floors. Most classical software solutions that operate manufacturing processes on a shop floor are based on rigid Manufacturing Execution Systems (MES), which are not capable of adapting the production order on the fly to changing demands and/or conditions. In this paper, we present a highly modular and flexible solution to orchestrate a set of production systems composed of a micro-milling machine tool, a polishing station, a cleaning station, a part inspection station, and a raw material store. The stations are installed according to a novel matrix configuration on a 3x3 vertical shelf. The cells of the shelf are connected through horizontal and vertical rails on which a set of shuttles circulates to transport the machined parts from one station to another. Our software solution for orchestrating the tasks of each station is based on a multi-agent system. Each station and each shuttle is operated by an autonomous agent. All agents communicate with a central agent that holds all the information about the manufacturing order. The core innovation of this paper lies in the path planning of the shuttles, with two major objectives: 1) reduce the waiting time of stations and thus the cycle time of the entire part, and 2) reduce disturbances such as the vibration generated by the shuttles, which strongly impacts the manufacturing process and thus the quality of the final part. Simulation results show that the cycle time of the parts is reduced by up to 50% compared with MES-operated linear production lines, while the disturbance is systematically avoided for critical stations such as the milling machine tool.
Keywords: multi-agent systems, micro-manufacturing, flexible manufacturing, transfer systems
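The paper's agent framework and message protocol are not described in the abstract; the sketch below only illustrates the communication pattern it names, with one autonomous agent per station reporting to a central agent that routes a part through the process steps (the station names, message format, and threading model are all assumptions).

```python
import queue
import threading

# Assumed process order; the paper's stations also include shuttles and a store.
ORDER = ["milling", "polishing", "cleaning", "inspection"]

def station_agent(name, inbox, central):
    """Autonomous station agent: process parts, report completion centrally."""
    while True:
        part = inbox.get()
        if part is None:          # shutdown signal
            break
        central.put((name, part))  # report step completion to the central agent

central_inbox = queue.Queue()
inboxes = {s: queue.Queue() for s in ORDER}
threads = [threading.Thread(target=station_agent, args=(s, inboxes[s], central_inbox))
           for s in ORDER]
for t in threads:
    t.start()

# central agent: holds the order and routes the part through the steps
inboxes[ORDER[0]].put("part-001")
for _ in ORDER:
    station, part = central_inbox.get()
    print(f"{station} finished {part}")
    nxt = ORDER.index(station) + 1
    if nxt < len(ORDER):
        inboxes[ORDER[nxt]].put(part)

for s in ORDER:
    inboxes[s].put(None)           # shut down the station agents
for t in threads:
    t.join()
```

Because routing decisions live in messages rather than a fixed MES sequence, the central agent can reroute a part mid-run, which is the flexibility argument the abstract makes.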
Procedia PDF Downloads 128
6451 Synthesis of Highly Valuable Fuel Fractions from Waste Date Seeds Oil
Authors: Farrukh Jamil, Ala'A H. Al-Muhtaseb, Lamya Al-Haj, Mohab A. Al-Hinai
Abstract:
Environmental problems and the security of energy supply have motivated attention to the development of alternatives to fossil-based fuels. Biomass has been recognized as a promising resource because it is abundantly available and, in principle, carbon dioxide neutral. The present study focuses on utilizing waste date seeds oil for synthesizing high-value fuel formulations such as green diesel and jet fuel. The hydrodeoxygenation of date seeds oil proved highly efficient at the following operating conditions: temperature 300°C, pressure 10 bar, with continuous stirring at 500 rpm. Product characterization revealed the efficiency of the hydrodeoxygenation through the formation of linear hydrocarbons (paraffins) as the larger fraction. Based on the types of components in the product oil, it was calculated that the maximum fraction lies within the green diesel range, at 72.78%, followed by jet fuel at 28.25%, using the Pt/C catalyst. It can be concluded that waste date seeds oil has the potential to be used for obtaining high-value products.
Keywords: date seeds, hydrodeoxygenation, paraffin, deoxygenation
Procedia PDF Downloads 263
6450 A Deep Learning Approach to Calculate Cardiothoracic Ratio From Chest Radiographs
Authors: Pranav Ajmera, Amit Kharat, Tanveer Gupte, Richa Pant, Viraj Kulkarni, Vinay Duddalwar, Purnachandra Lamghare
Abstract:
The cardiothoracic ratio (CTR) is the ratio of the diameter of the heart to the diameter of the thorax. An abnormal CTR, that is, a value greater than 0.55, is often an indicator of an underlying pathological condition. The accurate prediction of an abnormal CTR from chest X-rays (CXRs) aids in the early diagnosis of clinical conditions. We propose a deep learning-based model for automatic CTR calculation that can assist the radiologist with the diagnosis of cardiomegaly and optimize the radiology workflow. The study population included 1012 posteroanterior (PA) CXRs from a single institution. The Attention U-Net deep learning (DL) architecture was used for the automatic calculation of the CTR. A CTR of 0.55 was used as the cut-off to categorize cardiomegaly as present or absent. An observer performance test was conducted to assess the radiologist's performance in diagnosing cardiomegaly with and without artificial intelligence (AI) assistance. The Attention U-Net model was highly specific in calculating the CTR, exhibiting a sensitivity of 0.80 [95% CI: 0.75, 0.85], a precision of 0.99 [95% CI: 0.98, 1], and an F1 score of 0.88 [95% CI: 0.85, 0.91]. During the analysis, we observed that 51 out of 1012 samples were misclassified by the model when compared to annotations made by the expert radiologist. We further observed that the sensitivity of the reviewing radiologist in identifying cardiomegaly increased from 40.50% to 88.4% when aided by the AI-generated CTR. Our segmentation-based AI model demonstrated high specificity and sensitivity for CTR calculation, and the performance of the radiologist on the observer performance test improved significantly with AI assistance. A DL-based segmentation model for rapid quantification of the CTR can therefore have significant potential for use in clinical workflows.
Keywords: cardiomegaly, deep learning, chest radiograph, artificial intelligence, cardiothoracic ratio
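The segmentation network itself is not sketched here; assuming the Attention U-Net outputs binary heart and thorax (lung-field) masks, the CTR can then be derived from the maximal horizontal extents of those masks, as in the minimal sketch below.

```python
import numpy as np

def cardiothoracic_ratio(heart_mask, thorax_mask):
    """CTR = widest horizontal extent of the heart / that of the thorax.

    Both inputs are binary masks of shape (H, W), assumed to come from a
    segmentation model such as the paper's Attention U-Net.
    """
    def max_width(mask):
        cols = np.where(mask.any(axis=0))[0]   # columns containing the organ
        return cols.max() - cols.min() + 1
    return max_width(heart_mask) / max_width(thorax_mask)

# toy masks standing in for real segmentation output
thorax = np.zeros((256, 256), dtype=bool)
thorax[40:220, 30:226] = True                  # thorax spans 196 columns
heart = np.zeros((256, 256), dtype=bool)
heart[120:200, 90:208] = True                  # heart spans 118 columns

ctr = cardiothoracic_ratio(heart, thorax)
print(f"CTR = {ctr:.2f} ->", "cardiomegaly" if ctr > 0.55 else "normal")
```

With the 0.55 cut-off from the abstract, this toy case (CTR ≈ 0.60) would be flagged as cardiomegaly.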
Procedia PDF Downloads 96
6449 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter
Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, , MohammadHossein Sedaaghi
Abstract:
In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to numerical representations of DNA to find the locations of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented which is based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages when compared to other conventional methods. Firstly, it identifies protein coding regions more accurately because the Goertzel algorithm is tuned to the desired frequency. Secondly, faster detection is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to other methods using nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm
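The full ensemble with the anti-notch filter is not reproduced here; as a minimal sketch of the Goertzel half, the algorithm below evaluates the spectral power of a binary indicator sequence at the period-3 angular frequency 2π/3 (the indicator mapping of bases to 0/1 is one standard choice, not necessarily the paper's).

```python
import math

def goertzel_power(x, w0):
    """Power of signal x at angular frequency w0 via the Goertzel recursion."""
    coeff = 2.0 * math.cos(w0)
    s1 = s2 = 0.0
    for sample in x:
        s0 = sample + coeff * s1 - s2   # s[n] = x[n] + 2cos(w0)s[n-1] - s[n-2]
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def period3_power(dna, base="G"):
    """Period-3 power of the binary indicator sequence for one base."""
    indicator = [1.0 if ch == base else 0.0 for ch in dna.upper()]
    return goertzel_power(indicator, 2.0 * math.pi / 3.0)

# a repeating codon pattern shows strong period-3 content
print(period3_power("GAT" * 30))            # coding-like: large value
print(period3_power("GGGG" * 22 + "AA"))    # no period-3 structure: small value
```

Because Goertzel evaluates a single frequency bin, it is cheaper than a full DFT over each sliding window, which is the speed advantage the abstract claims.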
Procedia PDF Downloads 385
6448 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table
Authors: David A. Swanson, Lucky M. Tedrow
Abstract:
Taylor's Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to that research by showing that Taylor's Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to the mean age at death in a life table) and the variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of the variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of the variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor's Law relative to e0 and concludes by arguing that reasonably accurate estimates of the variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population
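The paper's fitted coefficients are not quoted in the abstract; as a minimal sketch, the Taylor's Law power fit (log variance against log mean) and the derived simple linear predictor can both be estimated by least squares from (e0, variance) pairs, which are invented below for illustration.

```python
import numpy as np

# invented (e0, variance in age at death) pairs, for illustration only
e0 = np.array([60.9, 68.0, 74.5, 79.2, 82.0, 85.59])
var = np.array([420.0, 330.0, 250.0, 200.0, 170.0, 140.0])

# Taylor's Law: variance ~ a * mean^b, fitted in log-log space
b, log_a = np.polyfit(np.log(e0), np.log(var), 1)
print(f"Taylor's Law fit: variance = {np.exp(log_a):.3g} * e0^{b:.3f}")

# the simple linear model the study uses: variance = c0 + c1 * e0
c1, c0 = np.polyfit(e0, var, 1)
print(f"linear fit: variance = {c0:.1f} + {c1:.2f} * e0")

# predict variance in age at death for a life table with e0 = 77 years
print("predicted variance at e0 = 77:", round(c0 + c1 * 77, 1))
```

The practical point made in the abstract is the last line: once the coefficients are fitted, only e0 is needed, so the prediction works even where no full life table has been constructed.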
Procedia PDF Downloads 328
6447 Designing of Tooling Solution for Material Handling in Highly Automated Manufacturing System
Authors: Muhammad Umair, Yuri Nikolaev, Denis Artemov, Ighor Uzhinsky
Abstract:
A flexible manufacturing system is an integral part of a smart factory in Industry 4.0, in which every machine is interconnected and works autonomously. Robots are in the process of replacing humans in every industrial sector. As cyber-physical systems (CPS) and artificial intelligence (AI) advance, the manufacturing industry is becoming more dependent on computers than on human brains. This modernization has boosted production quality and accuracy, shifting from classic production to smart manufacturing systems. However, material handling for such automated production is a challenge and needs to be addressed with the best possible solution. Conventional clamping systems are designed for manual work and are not suitable for highly automated production systems. Researchers and engineers are trying to find the most economical solution for loading/unloading and transporting workpieces from a warehouse to a machine shop for machining operations and back to the warehouse without human involvement. This work proposes an advanced multi-shape tooling solution for highly automated manufacturing systems. The results obtained so far show that it can function well with automated guided vehicles (AGVs) and modern conveyor belts. The proposed solution meets the requirements of being automation-friendly and universal across different part geometries and production operations. We used a bottom-up approach in this work, starting with studying different case scenarios and their limitations and finishing with the general solution.
Keywords: artificial intelligence, cyber physics system, Industry 4.0, material handling, smart factory, flexible manufacturing system
Procedia PDF Downloads 128
6446 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building
Authors: Abdul Hakim Chikho
Abstract:
Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested to approximate the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental period, and repeating the calculation with a more accurate formula is usually required. In this paper, a new formula for calculating a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building; it is therefore more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using the proposed formula and the conventional empirical equations. Comparing the obtained results with those from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is still simple to use and requires only minimal computing effort, it is believed to be ideally suited for design purposes.
Keywords: earthquake, fundamental mode period, design, building
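The paper's new formula is not reproduced in the abstract; as a sketch of the general mass-and-stiffness approach it advocates, the classical Rayleigh method estimates the fundamental period of a framed building from story masses, assumed lateral forces, and the resulting deflections via T = 2π·sqrt(Σ mᵢuᵢ² / Σ fᵢuᵢ), in consistent SI units.

```python
import math

def rayleigh_period(masses, forces, deflections):
    """Classical Rayleigh estimate of the fundamental period.

    masses      : story masses (kg)
    forces      : assumed lateral forces applied at each story (N)
    deflections : lateral deflections under those forces (m)

    This is the generic mass/stiffness approach, NOT the paper's new
    formula, which the abstract does not reproduce.
    """
    num = sum(m * u * u for m, u in zip(masses, deflections))
    den = sum(f * u for f, u in zip(forces, deflections))
    return 2.0 * math.pi * math.sqrt(num / den)

# illustrative 3-story frame: all values are invented for demonstration
masses = [200e3, 200e3, 150e3]          # kg per story
forces = [50e3, 100e3, 150e3]           # N, inverted-triangle load pattern
deflections = [0.010, 0.022, 0.031]     # m, from a static lateral analysis

print(f"T1 ~ {rayleigh_period(masses, forces, deflections):.2f} s")
```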
Procedia PDF Downloads 281
6445 Medical Image Watermark and Tamper Detection Using Constant Correlation Spread Spectrum Watermarking
Authors: Peter U. Eze, P. Udaya, Robin J. Evans
Abstract:
Data hiding can be achieved by steganography or invisible digital watermarking. For digital watermarking, both accurate retrieval of the embedded watermark and the integrity of the cover image are important. Medical image security in teleradiology is one application where the embedded patient record needs to be extracted accurately and the integrity of the medical image verified. In this research paper, constant correlation spread spectrum digital watermarking for medical image tamper detection and accurate embedded watermark retrieval is introduced. In the proposed method, a watermark bit from a patient record is spread in a medical image sub-block such that the correlation of every watermarked sub-block with a spreading code, W, has a constant value, p. The constant correlation p, the spreading code W and the size of the sub-blocks constitute the secret key. Tamper detection is achieved by flagging any sub-block whose correlation value deviates from p by more than a small value, ε. The major features of our new scheme include: (1) improving watermark detection accuracy for high pixel depth medical images by reducing the Bit Error Rate (BER) to zero, and (2) block-level tamper detection in a single computational process with simultaneous watermark detection, thereby increasing utility at the same computational cost.
Keywords: Constant Correlation, Medical Image, Spread Spectrum, Tamper Detection, Watermarking
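The exact embedding rule is not given in the abstract; the sketch below illustrates one way to force a sub-block's correlation with a ±1 spreading code to a constant p, and to flag tampering when the detected correlation drifts outside p ± ε (encoding the bit in the sign of the correlation, and all parameter values, are assumptions).

```python
import numpy as np

rng = np.random.default_rng(7)
N = 8                                       # assumed sub-block side length
W = rng.choice([-1.0, 1.0], size=(N, N))    # spreading code (part of the key)
P, EPS = 4.0, 0.5                           # constant correlation p, tolerance ε

def correlation(block):
    """Normalized correlation of a sub-block with the spreading code W."""
    return float(np.sum(block * W)) / W.size

def embed(block, bit):
    """Shift the block so its correlation with W equals +p (bit 1) or -p (bit 0).

    Since W*W == 1 everywhere, adding k*W shifts the correlation by exactly k.
    Encoding the bit in the SIGN of the constant correlation is an assumption;
    the paper only states that the correlation is constant.
    """
    target = P if bit else -P
    return block + (target - correlation(block)) * W

def detect(block):
    """Return (recovered bit, tampered flag)."""
    c = correlation(block)
    return int(c > 0), abs(abs(c) - P) > EPS

block = rng.uniform(0, 255, (N, N))
wm = embed(block, bit=1)
print(detect(wm))                # (1, False): bit recovered, block intact
wm[0, :4] = 0                    # simulate localized tampering
print(detect(wm))                # correlation drifts, tampering likely flagged
```

One pass over the block computes the correlation, which yields both the watermark bit and the tamper flag, matching the single-computational-process feature claimed above.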
Procedia PDF Downloads 192