Search results for: real time pest tracking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21495


17835 Process Safety Management Digitalization via SHEQTool based on Occupational Safety and Health Administration and Center for Chemical Process Safety, a Case Study in Petrochemical Companies

Authors: Saeed Nazari, Masoom Nazari, Ali Hejazi, Siamak Sanoobari Ghazi Jahani, Mohammad Dehghani, Javad Vakili

Abstract:

More than ever, digitalization is an imperative for businesses seeking to keep their competitive advantages, foster innovation, and reduce paperwork. To design and successfully implement digital transformation initiatives within a process safety management system, employees need to be equipped with the right tools, frameworks, and best practices. We developed a unique full-stack application, SHEQTool, which is entirely dynamic and based on our extensive expertise, experience, and client feedback, to support business processes and particularly operational safety management. We applied established scientific methodologies published in CCPS and OSHA guidelines to streamline operations and integrated them into task management within petrochemical companies. We digitalized the main process safety management elements and their sub-elements, such as hazard identification and risk management, training and communication, inspection and audit, management of critical changes, contractor management, permit to work, pre-start-up safety review, incident reporting and investigation, emergency response planning, personal protective equipment, occupational health, and action management, in a fully customizable manner with no programming needed by users. Feedback from the main actors within petrochemical plants highlights improved business performance and productivity, as well as better tracking of their functions' key performance indicators (KPIs), because the tool: 1) saves the time, resources, and costs of paperwork (Digitalization); 2) reduces errors and improves performance within the management system by covering most of the organization's daily software needs, reducing the complexity and costs of numerous tools and their required training (One Tool Approach); 3) focuses on management systems, integrates functions, and puts them into traceable task management (RASCI and Flowcharting); 4) helps the entire enterprise remain resilient to changes in processes, technologies, and assets at minimum cost (Organizational Resilience); 5) significantly reduces incidents and errors via world-class safety management programs and elements (Simplification); 6) gives companies a systematic, traceable, risk-based, process-based, and science-based integrated management system (proper Methodologies); and 7) helps business processes comply with ISO 9001, ISO 14001, ISO 45001, ISO 31000, best practices, and legal regulations through a PDCA approach (Compliance).

Keywords: process, safety, digitalization, management, risk, incident, SHEQTool, OSHA, CCPS

Procedia PDF Downloads 66
17834 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring

Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie

Abstract:

Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8 and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
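
The abstract above reconstructs cross-sectional images from inter-electrode capacitance data. As an illustration only (not the authors' reconstruction code), the sketch below shows the common linear back-projection baseline, assuming precomputed sensitivity maps and calibration capacitances; all arrays are placeholders.

```python
import numpy as np

def lbp_reconstruct(c_meas, c_low, c_high, sensitivity):
    """Linear back-projection for ECT.

    c_meas      : measured inter-electrode capacitances, shape (M,)
    c_low/c_high: calibration capacitances for the low/high permittivity
                  materials (e.g. oil and water), shape (M,)
    sensitivity : precomputed sensitivity maps, shape (M, n_pixels)
    Returns a normalized permittivity image of shape (n_pixels,).
    """
    # Normalize each capacitance between its calibration limits (0..1).
    c_norm = (c_meas - c_low) / (c_high - c_low)
    # Back-project: weight each pixel by the sensitivity of every measurement.
    img = sensitivity.T @ c_norm
    # Normalize by the summed sensitivity so pixel values stay in [0, 1].
    img /= sensitivity.sum(axis=0)
    return np.clip(img, 0.0, 1.0)

# An N-electrode sensor yields M = N*(N-1)/2 independent capacitance pairs,
# e.g. 28, 66 and 120 measurements for 8, 12 and 16 electrodes respectively.
for n_elec in (8, 12, 16):
    print(n_elec, "electrodes ->", n_elec * (n_elec - 1) // 2, "measurements")
```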

Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement

Procedia PDF Downloads 11
17833 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development times, and increased product diversity. Short life cycles are normal in the retail industry, the fashion business, entertainment media, and the telecom and semiconductor industries. Accurate demand forecasting for short life cycle products is of special interest for many researchers and organizations. Due to the short life cycle of these products, the amount of historical data available for forecasting is minimal or even absent when new or modified products are launched in the market. Companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while not oversupplying. This presents the challenge of developing a forecasting model that can forecast accurately while handling large variations in data and considering the complex relationships between various parameters of the data. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data. Artificial neural network (ANN) models are also very time consuming for forecasting. We have studied the existing forecasting models and their limitations. This work proposes an effective and powerful approach for short life cycle time series forecasting. The proposed approach takes into consideration different scenarios related to data availability for short life cycle products. We then suggest a methodology that combines statistical analysis with structured judgement and can be applied across domains. We also describe the method of creating a profile from analogous products; this profile can then be used for forecasting new products from the historical data of analogous products. We have designed an application that combines data, analytics, and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE, and RMSE error scores. Conclusion: based on the results, it is observed that no single approach is sufficient for short life cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
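
The forecasts in the abstract above are compared using MAPE, MSE, and RMSE. A minimal sketch of these three error scores is shown below, with purely illustrative demand numbers standing in for a short life cycle product.

```python
import numpy as np

def forecast_errors(actual, forecast):
    """Return MAPE (%), MSE and RMSE for a forecast against actual demand."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    err = actual - forecast
    mape = np.mean(np.abs(err / actual)) * 100.0   # assumes no zero actuals
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    return mape, mse, rmse

# Hypothetical short-life-cycle demand and a forecast built from an
# analogous-product profile (illustrative numbers only).
actual   = [120, 180, 260, 310, 280, 190, 90]
forecast = [110, 190, 240, 330, 260, 200, 100]
print("MAPE=%.1f%%  MSE=%.1f  RMSE=%.1f" % forecast_errors(actual, forecast))
```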

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 358
17832 A Method to Estimate Wheat Yield Using Landsat Data

Authors: Zama Mahmood

Abstract:

With the increasing demand for food, monitoring crop growth and forecasting yield well before harvest is very important. These days, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation have reported a high correlation between vegetation indices and yield. With the development of remote sensing techniques, the detection of crops and their mechanisms using remote sensing data on regional or global scales has become a popular topic in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements. Remote sensing, however, can provide the most real-time information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the yield of wheat has been estimated for the 2013-2014 season for the agricultural area around Bahawalpur. The average yield of wheat was found to be 35 kg/acre by analysing field survey data. The field survey data are in fair agreement with the NDVI values extracted from Landsat images. A correlation between wheat production (tons) and the number of wheat pixels has also been calculated, and the two follow a proportional pattern. A strong correlation between NDVI and wheat area was also found (R² = 0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.
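
As a companion to the abstract above, the sketch below shows the standard NDVI computation, NDVI = (NIR − Red) / (NIR + Red). It assumes Landsat 8 band numbering (band 4 = red, band 5 = NIR), the rasterio library for reading the bands, and an illustrative NDVI threshold for counting wheat pixels; the file names and threshold are placeholders, not values from the study.

```python
import numpy as np
import rasterio  # assumed available for reading Landsat GeoTIFF bands

def compute_ndvi(red_path, nir_path):
    """NDVI = (NIR - Red) / (NIR + Red); for Landsat 8, Red = band 4, NIR = band 5."""
    with rasterio.open(red_path) as r, rasterio.open(nir_path) as n:
        red = r.read(1).astype("float64")
        nir = n.read(1).astype("float64")
    denom = nir + red
    ndvi = np.where(denom == 0, 0.0, (nir - red) / denom)
    return ndvi

# Count pixels above an (illustrative) NDVI threshold taken to represent wheat;
# production can then be estimated from the wheat-pixel count and a per-pixel
# yield fitted against field survey data.
ndvi = compute_ndvi("LC08_B4.TIF", "LC08_B5.TIF")   # hypothetical file names
wheat_pixels = int((ndvi > 0.4).sum())              # 0.4 is a placeholder threshold
print("wheat pixels:", wheat_pixels)
```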

Keywords: landsat, NDVI, remote sensing, satellite images, yield

Procedia PDF Downloads 335
17831 Deposition of Size Segregated Particulate Matter in Human Respiratory Tract and Their Health Effects in Glass City Residents

Authors: Kalpana Rajouriya, Ajay Taneja

Abstract:

Particulates are ubiquitous in the air environment and pose serious threats to human beings, such as lung cancer, COPD, and asthma. Particulates mainly arise from industrial effluent, vehicular emissions, and other anthropogenic activities. In the glass industrial city of Firozabad, real-time monitoring of size-segregated particulate matter (PM) and black carbon was done with an Aerosol Black Carbon Detector (ABCD) and a GRIMM portable aerosol spectrometer at two different sites, one urban and one rural. The average mass concentrations of size-segregated PM during the study period (March and April 2022) were recorded as PM10 (223.73 µg/m³), PM5.0 (44.955 µg/m³), PM2.5 (59.275 µg/m³), PM1.0 (33.02 µg/m³), PM0.5 (2.05 µg/m³), and PM0.25 (2.99 µg/m³). The highest concentration of BC was found at the urban site due to emissions from diesel engines and wood burning, while NO2 was highest at the rural site. The average concentrations of PM10 and PM2.5 exceeded the NAAQS and WHO guidelines by factors of 6.08 and 2.73, respectively. Particulate matter deposition and health risk assessment were carried out with the MPPD and USEPA models to characterize particulate matter toxicity for industrial-city residents. The health risk assessment results showed that children are the most likely to be affected by exposure to PM10 and PM2.5 and may develop various non-carcinogenic and carcinogenic diseases. The deposition results indicate that the sensitive exposed population, especially 9-year-old children, has high PM deposition (as shown by the deposition visualizations) and may be at risk of developing health-related problems from exposure to size-segregated PM. These results will be discussed during the presentation.

Keywords: particulate matter, black carbon, NO2, deposition of PM, health risk

Procedia PDF Downloads 66
17830 Online Authenticity Verification of a Biometric Signature Using Dynamic Time Warping Method and Neural Networks

Authors: Gałka Aleksandra, Jelińska Justyna, Masiak Albert, Walentukiewicz Krzysztof

Abstract:

An offline signature is a well-known, however not the safest, way to verify identity. Nowadays, to ensure proper authentication, e.g. in banking systems, multimodal verification is more widely used. In this paper, online signature analysis based on dynamic time warping (DTW) coupled with machine learning approaches is presented. In our research, signatures made with biometric pens were gathered. Signature features, as well as their forgeries, are described. For verification of authenticity, various methods were used, including convolutional neural networks using the DTW matrix and a multilayer perceptron using sums of DTW matrix paths. System efficiency has been evaluated on signatures and signature forgeries collected on the same day. Results are presented and discussed in this paper.
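
The verification pipeline in the abstract above is built on dynamic time warping. The sketch below is a textbook DTW accumulated-cost computation on one-dimensional feature sequences (e.g., pen pressure over time); the traces are synthetic and the code is not the authors' implementation.

```python
import numpy as np

def dtw_matrix(seq_a, seq_b):
    """Accumulated-cost DTW matrix between two 1-D feature sequences."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            # step pattern: match, insertion, deletion
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[1:, 1:]

# Hypothetical pen-pressure traces of a reference and a test signature.
ref  = np.sin(np.linspace(0, 3 * np.pi, 80))
test = np.sin(np.linspace(0, 3 * np.pi, 95) + 0.2)
D = dtw_matrix(ref, test)
print("DTW distance:", D[-1, -1])   # the matrix itself could feed a CNN,
                                    # path sums could feed an MLP, as described above
```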

Keywords: dynamic time warping, handwritten signature verification, feature-based recognition, online signature

Procedia PDF Downloads 175
17829 Determining Factors Influencing the Total Funding in Islamic Banking of Indonesia

Authors: Euphrasia Susy Suhendra, Lies Handrijaningsih

Abstract:

The banking sector, as an intermediary, occupies a very important position in bridging the needs of working-capital investment in the real sector with the owners of funds. This allows money to be used more effectively to improve economic value added. As intermediaries, Islamic banks raise funds from the public and then distribute them in the form of financing. In practice, the distribution of financing by Islamic banks is not as easy as it is in theory because, in fact, there are many financing problems, some of which are caused by a lack of assessment and supervision of customers by the banks. This study aims to analyze the influence of Third Party Funds, Return on Assets (ROA), Non-Performing Financing (NPF), and the Financing to Deposit Ratio (FDR) on the total financing provided to the community by Islamic banks in Indonesia. The data used are monthly data released by the Bank of Indonesia in its Islamic Banking Statistics for the period January 2009 - December 2013. This study uses a cointegration test to examine the long-term relationship and error correction models to examine the short-term relationship. The results indicate that Third Party Funds have a short-term effect on total financing, Return on Assets has a long-term effect on total financing, Non-Performing Financing has a long-term effect on total financing, and the Financing to Deposit Ratio has both short-term and long-term effects on the total financing provided by Islamic banks in Indonesia.
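
A hedged sketch of the two-step analysis described above (an Engle-Granger cointegration test for the long-run relationship and a simple error correction regression for the short-run one) is given below using statsmodels; the file name and column names are illustrative assumptions, not the actual Bank of Indonesia data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

# Hypothetical monthly series (Jan 2009 - Dec 2013) loaded from a CSV;
# columns are illustrative: total financing (FIN) and third-party funds (TPF).
df = pd.read_csv("islamic_banking_monthly.csv", parse_dates=["date"], index_col="date")

# 1) Engle-Granger cointegration test for a long-run relationship.
t_stat, p_value, _ = coint(df["FIN"], df["TPF"])
print("cointegration p-value:", p_value)

# 2) Error correction model for the short-run relationship:
#    dFIN_t = a + b*dTPF_t + g*ECT_{t-1} + e_t
long_run = sm.OLS(df["FIN"], sm.add_constant(df["TPF"])).fit()
ect = long_run.resid.shift(1)                       # lagged equilibrium error
d = df.diff()
ecm_data = pd.concat([d["FIN"], d["TPF"], ect], axis=1).dropna()
ecm_data.columns = ["dFIN", "dTPF", "ECT_lag1"]
ecm = sm.OLS(ecm_data["dFIN"], sm.add_constant(ecm_data[["dTPF", "ECT_lag1"]])).fit()
print(ecm.summary())
```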

Keywords: Islamic banking, third party fund, return on asset, non-performing financing, financing deposit ratio

Procedia PDF Downloads 466
17828 Arduino Pressure Sensor Cushion for Tracking and Improving Sitting Posture

Authors: Andrew Hwang

Abstract:

The average American worker sits for thirteen hours a day, often with poor posture and infrequent breaks, which can lead to health issues and back problems. The Smart Cushion was created to alert individuals of their poor postures, and may potentially alleviate back problems and correct poor posture. The Smart Cushion is a portable, rectangular, foam cushion, with five strategically placed pressure sensors, that utilizes an Arduino Uno circuit board and specifically designed software, allowing it to collect data from the five pressure sensors and store the data on an SD card. The data is then compiled into graphs and compared to controlled postures. Before volunteers sat on the cushion, their levels of back pain were recorded on a scale from 1-10. Data was recorded for an hour during sitting, and then a new, corrected posture was suggested. After using the suggested posture for an hour, the volunteers described their level of discomfort on a scale from 1-10. Different patterns of sitting postures were generated that were able to serve as early warnings of potential back problems. By using the Smart Cushion, the areas where different volunteers were applying the most pressure while sitting could be identified, and the sitting postures could be corrected. Further studies regarding the relationships between posture and specific regions of the body are necessary to better understand the origins of back pain; however, the Smart Cushion is sufficient for correcting sitting posture and preventing the development of additional back pain.

Keywords: Arduino Sketch Algorithm, biomedical technology, pressure sensors, Smart Cushion

Procedia PDF Downloads 134
17827 Realizing Teleportation Using Black-White Hole Capsule Constructed by Space-Time Microstrip Circuit Control

Authors: Mapatsakon Sarapat, Mongkol Ketwongsa, Somchat Sonasang, Preecha Yupapin

Abstract:

We have designed and performed preliminary tests on a space-time control circuit using a two-level system circuit with a 4-5 cm diameter microstrip for realistic teleportation. The work begins by calculating the parameters that allow the circuit to use alternating current (AC) at a specified frequency as the input signal. A method that causes electrons to move along the circuit perimeter starting at the speed of light was found to be satisfactory based on wave-particle duality. It is able to establish a faster-than-light speed for the electron cloud in the middle of the circuit, creating a timeline and a propulsive force as well. The timeline is formed by the cancellation of time stretching and shrinking in the relativistic regime, in which absolute time has vanished. In fact, both black holes and white holes are created from time signals at the beginning, where the speed of the electrons approaches the speed of light. They entangle together like a capsule until they reach the point where they collapse and cancel each other out, which is controlled by the frequency of the circuit. Therefore, we can apply this method to large-scale circuits, such as potassium, from which the same method can be applied to form a system to teleport living things. In fact, the black hole is a hibernation-system environment that allows living things to live and travel to the teleportation destination, which can be controlled in position and time relative to the speed of light. When the capsule reaches its destination, the frequency is increased so that the black holes and white holes cancel each other out into a balanced environment; therefore, life can safely teleport to the destination. The same system must therefore exist at both the origin and the destination, which could form a network. Moreover, the approach can also be applied to space travel. The designed system will be tested at small scale using a microstrip circuit that we can build in the laboratory on a limited budget and that can be used in both wired and wireless systems.

Keywords: quantum teleportation, black-white hole, time, timeline, relativistic electronics

Procedia PDF Downloads 75
17826 Lexical Knowledge of Verb Particle Constructions with the Particle on by Mexican English Learners

Authors: Sarai Alvarado Pineda, Ricardo Maldonado Soto

Abstract:

The acquisition of verb particle constructions is a challenge for Spanish speakers learning English. The acquisition is particularly difficult for speakers of languages with no verb particle constructions. The purpose of the current study is to define the procedural steps in the acquisition of constructions with the particle on. There are three outstanding meanings of the particle on: Surface: "The movie is based on a true story"; Activation: "John turned on the light"; Continuity: "The band played on all night". The central aim of this study is to measure how Mexican Spanish participants respond to the three meanings mentioned above and to the degree of meaning transparency/opacity of on verb particle constructions. Forty Mexican Spanish learners of English (20 basic and 20 advanced) are compared against a control group of 20 American native English speakers through a reaction time test (PsychoPy2 2015). The participants were asked to discriminate 90 items based on their knowledge of these constructions. There are 30 items per meaning, divided into two groups of transparent and opaque meaning. Results revealed three major findings. Advanced students have a reaction time similar to that of native speakers (advanced 4.5 s versus native 3.7 s), while students with a lower level of English proficiency show a high reaction time (7 s). Likewise, there is a shorter reaction time for constructions with lower opacity in all three groups of participants, with differences between each level (basic 6.7 s, advanced 4.3 s, and native 3.4 s). Finally, a difference in reaction time can be identified according to the meaning conveyed by the construction: the reaction time for the activation category (5.27 s) is greater than that for continuity (5.04 s), which in turn is slower than that for surface (4.94 s). The study shows that the level of sensitivity of English learners increases significantly towards native-speaker patterns, as determined by the level of transparency of meaning of each construction as well as the degree of entrenchment of each constructional meaning.

Keywords: meaning of the particle, opacity, reaction time, verb particle constructions

Procedia PDF Downloads 265
17825 Evaluation of Developmental Toxicity and Teratogenicity of Perfluoroalkyl Compounds Using FETAX

Authors: Hyun-Kyung Lee, Jehyung Oh, Young Eun Jeong, Hyun-Shik Lee

Abstract:

Perfluoroalkyl compounds (PFCs) are environmental toxicants that persistently accumulate in human blood. Their widespread detection and accumulation in the environment raise concerns about whether these chemicals might be developmental toxicants and teratogens in the ecosystem. We evaluated and compared the toxicity of PFCs containing various numbers of carbon atoms (C8-C11) on vertebrate embryogenesis. We assessed the developmental toxicity and teratogenicity of various PFCs, and the toxic effects on Xenopus embryos were evaluated using different methods. We measured teratogenic indices (TIs) and investigated the mechanisms underlying developmental toxicity and teratogenicity by measuring the expression of organ-specific biomarkers such as xPTB (liver), Nkx2.5 (heart), and Cyl18 (intestine). All PFCs that we tested were found to be developmental toxicants and teratogens, and their toxic effects were strengthened with increasing length of the fluorinated carbon chain. Furthermore, we produced evidence showing that perfluorodecanoic acid (PFDA) and perfluoroundecanoic acid (PFuDA) are more potent developmental toxicants and teratogens in an animal model compared to the other PFCs we evaluated [perfluorooctanoic acid (PFOA) and perfluorononanoic acid (PFNA)]. In particular, severe defects resulting from PFDA and PFuDA exposure were observed in the liver and heart, respectively, using whole-mount in situ hybridization, real-time PCR, pathological analysis of the heart, and dissection of the liver. Our studies suggest that most PFCs are developmental toxicants and teratogens; however, compounds with higher numbers of carbons (i.e., PFDA and PFuDA) exert more potent effects.

Keywords: PFC, xenopus, fetax, development

Procedia PDF Downloads 352
17824 Machine Learning Approach for Stress Detection Using Wireless Physical Activity Tracker

Authors: B. Padmaja, V. V. Rama Prasad, K. V. N. Sunitha, E. Krishna Rao Patro

Abstract:

Stress is a psychological condition that reduces the quality of sleep and affects every facet of life. Constant exposure to stress is detrimental not only to the mind but also to the body. Nevertheless, to cope with stress, one should first identify it. This paper provides an effective method for cognitive stress level detection using data provided by a physical activity tracker device, Fitbit. This device gathers data on people's daily food intake, weight, sleep, heart rate, and physical activities. In this paper, four major stressors (physical activities, sleep patterns, working hours, and change in heart rate) are used to assess the stress levels of individuals. The main motive of this system is to use a machine learning approach for stress detection with the help of smartphone sensor technology. Individually, the effect of each stressor is evaluated using logistic regression; a combined model is then built and assessed using variants of ordinal logistic regression, namely logit, probit, and complementary log-log. The quality of each model is evaluated using the Akaike Information Criterion (AIC), and probit is assessed as the most suitable model for our dataset. The system is experimented with and evaluated in a real-time environment using data from adults working in IT and other sectors in India. The novelty of this work lies in the fact that the stress detection system should be as non-invasive as possible for the users.
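
The abstract above fits ordinal logistic regression variants and compares them by AIC. The sketch below shows how such a comparison could look with statsmodels' OrderedModel, which supports logit and probit links (a complementary log-log link would need a custom distribution and is omitted here); the CSV name, feature names, and coding of the stress level are assumptions, not the authors' data.

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical tracker export: one row per person-day, with an ordinal
# stress level coded 0 (low), 1 (medium), 2 (high) and the four stressors.
df = pd.read_csv("fitbit_stress.csv")
features = ["steps", "sleep_minutes", "working_hours", "heart_rate_change"]
y = df["stress_level"]          # integer-coded ordinal outcome

fits = {}
for link in ("logit", "probit"):
    model = OrderedModel(y, df[features], distr=link)
    fits[link] = model.fit(method="bfgs", disp=False)
    print(link, "AIC =", round(fits[link].aic, 1))

# The link with the lowest AIC is preferred (probit on the paper's data).
best = min(fits, key=lambda k: fits[k].aic)
print("selected link:", best)
```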

Keywords: physical activity tracker, sleep pattern, working hours, heart rate, smartphone sensor

Procedia PDF Downloads 256
17823 How Children Synchronize with Their Teacher: Evidence from a Real-World Elementary School Classroom

Authors: Reiko Yamamoto

Abstract:

This paper reports on how synchrony occurs between children and their teacher, and what prevents or facilitates synchrony. The aim of the experiment conducted in this study was to precisely analyze their movements and synchrony and reveal the process of synchrony in a real-world classroom. Specifically, the experiment was conducted for around 20 minutes during an English as a foreign language (EFL) lesson. The participants were 11 fourth-grade school children and their classroom teacher in a public elementary school in Japan. Previous researchers assert that synchrony causes the state of flow in a class. For checking the level of flow, Short Flow State Scale (SFSS) was adopted. The experimental procedure had four steps: 1) The teacher read aloud the first half of an English storybook to the children. Both the teacher and the children were at their own desks. 2) The children were subjected to an SFSS check. 3) The teacher read aloud the remaining half of the storybook to the children. She made the children remove their desks before reading. 4) The children were again subjected to an SFSS check. The movements of all participants were recorded with a video camera. From the movement analysis, it was found that the children synchronized better with the teacher in Step 3 than in Step 1, and that the teacher’s movement became free and outstanding without a desk. This implies that the desk acted as a barrier between the children and the teacher. Removal of this barrier resulted in the children’s reactions becoming synchronized with those of the teacher. The SFSS results proved that the children experienced more flow without a barrier than with a barrier. Apparently, synchrony is what caused flow or social emotions in the classroom. The main conclusion is that synchrony leads to cognitive outcomes such as children’s academic performance in EFL learning.

Keywords: engagement in a class, English as a foreign language (EFL) learning, interactional synchrony, social emotions

Procedia PDF Downloads 143
17822 A Sustainable Pt/BaCe₁₋ₓ₋ᵧZrₓGdᵧO₃ Catalyst for Dry Reforming of Methane Derived from Recycled Primary Pt

Authors: Alessio Varotto, Lorenzo Freschi, Umberto Pasqual Laverdura, Anastasia Moschovi, Davide Pumiglia, Iakovos Yakoumis, Marta Feroci, Maria Luisa Grilli

Abstract:

Dry reforming of methane (DRM) is considered one of the most valuable technologies for greenhouse gas valorization, since this reaction yields syngas, a mixture of H₂ and CO, in an H₂/CO ratio suitable for use in the Fischer-Tropsch synthesis of high value-added chemicals and fuels. The challenges of the DRM process are reducing the costs arising from the high process temperature and the expensive precious metals in the catalyst, the sintering of metal particles, and carbon deposition on the catalyst surface. The aim of this study is to demonstrate the feasibility of synthesizing catalysts using a leachate solution containing Pt that comes directly from the recovery of spent diesel oxidation catalysts (DOCs), without further purification. An unusual perovskite support for DRM, BaCe₁₋ₓ₋ᵧZrₓGdᵧO₃ (BCZG), has been chosen as the catalyst support because of its high thermal stability and its capability to produce oxygen vacancies, which suppress carbon deposition and enhance the catalytic activity. The BCZG perovskite was synthesized by a sol-gel modified Pechini process and calcined in air at 1100 °C. BCZG supports were impregnated with a Pt-containing leachate solution of DOC, obtained by a mild hydrometallurgical recovery process, as reported elsewhere by some of the authors of this manuscript. For comparison, a synthetic solution obtained by digesting commercial Pt-black powder in aqua regia was used for BCZG support impregnation. The nominal Pt content was 2% in both BCZG-based catalysts formed from the real and synthetic solutions. The structure and morphology of the catalysts were characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM). Thermogravimetric analysis (TGA) was used to study the thermal stability of the catalyst samples. Brunauer-Emmett-Teller (BET) analysis showed a high surface area for the catalysts. H₂-TPR (temperature programmed reduction) analysis was used to study hydrogen consumption during reduction, and it was combined with H₂-TPD characterization to study the dispersion of Pt on the surface of the support and to calculate the number of active sites of the precious metal. The dry reforming of methane reaction, carried out in a fixed bed reactor, showed high conversion efficiencies of CO₂ and CH₄. At 850 °C, CO₂ and CH₄ conversions were close to 100% for the catalyst obtained with the aqua regia-based solution of commercial Pt-black, and ~70% (CH₄) and ~80% (CO₂) in the case of the real HCl-based leachate solution. The H₂/CO ratios were ~0.9 and ~0.70 in the former and latter cases, respectively. As far as we know, this is the first pioneering work in which a BCZG catalyst and a real Pt-containing leachate solution were successfully employed for the DRM reaction.

Keywords: dry reforming of methane, perovskite, PGM, recycled Pt, syngas

Procedia PDF Downloads 38
17821 Optimizing Emergency Rescue Center Layouts: A Backpropagation Neural Networks-Genetic Algorithms Method

Authors: Xiyang Li, Qi Yu, Lun Zhang

Abstract:

In the face of natural disasters and other emergency situations, determining the optimal location of rescue centers is crucial for improving rescue efficiency and minimizing impact on affected populations. This paper proposes a method that integrates genetic algorithms (GA) and backpropagation neural networks (BPNN) to address the site selection optimization problem for emergency rescue centers. We utilize BPNN to accurately estimate the cost of delivering supplies from rescue centers to each temporary camp. Moreover, a genetic algorithm with a special partially matched crossover (PMX) strategy is employed to ensure that the number of temporary camps assigned to each rescue center adheres to predetermined limits. Using the population distribution data during the 2022 epidemic in Jiading District, Shanghai, as an experimental case, this paper verifies the effectiveness of the proposed method. The experimental results demonstrate that the BPNN-GA method proposed in this study outperforms existing algorithms in terms of computational efficiency and optimization performance. Especially considering the requirements for computational resources and response time in emergency situations, the proposed method shows its ability to achieve rapid convergence and optimal performance in the early and mid-stages. Future research could explore incorporating more real-world conditions and variables into the model to further improve its accuracy and applicability.
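
The genetic algorithm above relies on a partially matched crossover (PMX) over permutation chromosomes. The sketch below is a generic PMX implementation for illustration; the additional constraint on the number of camps assigned to each rescue center described in the abstract is not reproduced here.

```python
import random

def pmx(parent1, parent2):
    """Partially matched crossover (PMX) for permutation-encoded chromosomes."""
    size = len(parent1)
    a, b = sorted(random.sample(range(size), 2))
    child = [None] * size
    child[a:b + 1] = parent1[a:b + 1]          # copy the matching section from parent 1
    for i in list(range(0, a)) + list(range(b + 1, size)):
        gene = parent2[i]
        # Resolve conflicts through the mapping defined by the matching section.
        while gene in child[a:b + 1]:
            gene = parent2[parent1.index(gene)]
        child[i] = gene
    return child

random.seed(0)
p1 = [0, 1, 2, 3, 4, 5, 6, 7]   # e.g. an assignment order of temporary camps
p2 = [3, 7, 5, 1, 6, 0, 2, 4]
print(pmx(p1, p2))
```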

Keywords: emergency rescue centers, genetic algorithms, back-propagation neural networks, site selection optimization

Procedia PDF Downloads 85
17820 Study of Serum Tumor Necrosis Factor Alpha in Pediatric Patients with Hemophilia A

Authors: Sara Mohammad Atef Sabaika

Abstract:

Background: The development of factor VIII (FVIII) inhibitors and hemophilic arthropathy in patients with hemophilia A (PWHA) is a great challenge for hemophilia care. Both genetic and environmental factors lead to complications in PWHA. The development of inhibitory antibodies is usually induced by the immune response. Tumor necrosis factor α (TNF-α), one of the cytokines, might contribute through its polymorphism. Aim: To study the association between tumor necrosis factor alpha level and genotypes in pediatric patients with hemophilia A and their relation to inhibitor development and joint status. Methods: A cross-sectional study was conducted among a sufficient number of PWHA attending the Pediatric Hematology and Oncology Unit of the Pediatric Department at Menoufia University Hospital. The clinical parameters, FVIII, FVIII inhibitor, and serum TNF-α level were assessed. Genotyping of the −380G > A TNF-α gene polymorphism was performed using real-time polymerase chain reaction. Results: Among the 50 PWHA, 28 (56%) were identified as severe PWHA. The FVIII inhibitor was identified in 6/28 (21.5%) of severe PWHA. There was a significant correlation between serum TNF-α level and the development of the inhibitor (p = 0.043). There was no significant correlation between the −380G > A TNF-α gene polymorphism and the development of hemophilic arthropathy (p = 0.645). Conclusion: The prevalence of FVIII inhibitor in severe PWHA in Menoufia was 21.5%. The frequency of replacement therapy is a risk factor for inhibitor development. Serum TNF-α level and its gene polymorphism might be used to predict inhibitor development and joint status in pediatric patients with hemophilia A.

Keywords: hemophilic arthropathy, TNF alpha, patients with hemophilia A (PWHA), inhibitor

Procedia PDF Downloads 94
17817 Testing of Protective Coatings on Automotive Steel: A Correlation Between Salt Spray, Electrochemical Impedance Spectroscopy, and Linear Polarization Resistance Tests

Authors: Dhanashree Aole, V. Hariharan, Swati Surushe

Abstract:

Corrosion can cause serious and expensive damage to automobile components. Various proven techniques for controlling and preventing corrosion depend on the specific material to be protected. Electrochemical impedance spectroscopy (EIS) and salt spray tests are commonly used to assess the corrosion degradation mechanism of coatings on metallic surfaces, while the only test that monitors the corrosion rate in real time is linear polarization resistance (LPR). In this study, electrochemical tests (EIS and LPR) and the salt spray test are reviewed to assess the corrosion resistance and durability of different coatings. The main objective of this study is to correlate the test results obtained using linear polarization resistance (LPR) and electrochemical impedance spectroscopy (EIS) with the results obtained using the standard salt spray test. Another objective of this work is to evaluate the performance of various coating systems (CED, epoxy, powder coating, autophoretic, and Zn-trivalent coating) for vehicle underbody applications and to assess their corrosion resistance. From this study, a promising correlation between the different corrosion testing techniques is noted. The most profound observation is that electrochemical tests give a quick estimation of corrosion resistance and can detect the degradation of coatings well before visible signs of damage appear. Furthermore, the corrosion resistance and salt spray life of the coatings investigated were found to follow the order: CED > powder coating > autophoretic > epoxy coating > Zn-trivalent plating.
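
For context on the LPR measurements discussed above, the sketch below shows how a polarization resistance is commonly converted to a corrosion current density via the Stern-Geary relation and then to a corrosion rate using the ASTM G102-style constant; the Tafel slopes, equivalent weight, and density are illustrative values for steel, not data from this study.

```python
def corrosion_rate_from_lpr(rp_ohm_cm2, beta_a=0.12, beta_c=0.12,
                            eq_weight=27.92, density=7.87):
    """Estimate corrosion rate (mm/year) from polarization resistance.

    rp_ohm_cm2 : polarization resistance from the LPR sweep (ohm*cm^2)
    beta_a/c   : anodic/cathodic Tafel slopes (V/decade), illustrative values
    eq_weight  : equivalent weight of the metal (g/eq), ~27.92 for iron
    density    : metal density (g/cm^3), ~7.87 for steel
    """
    # Stern-Geary constant B (V) and corrosion current density (A/cm^2)
    b = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))
    i_corr = b / rp_ohm_cm2
    # ASTM G102: CR(mm/yr) = 3.27e-3 * i_corr(uA/cm^2) * EW / density
    return 3.27e-3 * (i_corr * 1e6) * eq_weight / density

# Example: a well-coated panel showing Rp = 5e5 ohm*cm^2 corrodes far more
# slowly than a poorly protected one at 5e3 ohm*cm^2.
for rp in (5e3, 5e5):
    print(f"Rp={rp:.0e} ohm*cm2 -> {corrosion_rate_from_lpr(rp):.4f} mm/yr")
```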

Keywords: Linear Polarization Resistance (LPR), Electrochemical Impedance Spectroscopy (EIS), salt spray test, sacrificial and barrier coatings

Procedia PDF Downloads 526
17818 Analysis of the Interventions Performed in Pediatric Cardiology Unit Based on Nursing Interventions Classification (NIC-6th): A Pilot Study

Authors: Ji Wen Sun, Nan Ping Shen, Yi Bei Wu

Abstract:

This study used the Nursing Interventions Classification (NIC-6th) to identify the interventions performed in a pediatric cardiology unit and then analyzed their frequency, time, and difficulty, so as to give a brief review of what our nurses have done. The research team selected a 35-bed pediatric cardiology unit and extracted all the nursing interventions in the nursing records from our hospital information system (HIS) from 1 October 2015 to 30 November 2015, using NIC-6th to match them and count their frequencies. Each intervention was then given its own time and difficulty code according to NIC-6th. The results showed that nurses in the pediatric cardiology unit performed a total of 43 interventions from 5394 statements, and most of them required an RN (basic) education level and less than 15 minutes. There were still some interventions needed only from a nursing assistant but done by nurses, which should prompt nurse managers to think about suitable staffing. Thus, summing the product of frequency, time, and difficulty for each intervention performed by each nurse can indicate that nurse's performance. Acknowledgement: Clinical Management Optimization Project of Shanghai Shen Kang Hospital Development Center (SHDC2014615); Hundred-Talent Program of Construction of Nursing Plateau Discipline (hlgy16073qnhb).

Keywords: nursing interventions, nursing interventions classification, nursing record, pediatric cardiology

Procedia PDF Downloads 364
17817 Enhanced CNN for Rice Leaf Disease Classification in Mobile Applications

Authors: Kayne Uriel K. Rodrigo, Jerriane Hillary Heart S. Marcial, Samuel C. Brillo

Abstract:

Rice leaf diseases significantly impact yield production in rice-dependent countries, affecting their agricultural sectors. As part of precision agriculture, early and accurate detection of these diseases is crucial for effective mitigation practices and minimizing crop losses. Hence, this study proposes an enhancement to the Convolutional Neural Network (CNN), a widely-used method for Rice Leaf Disease Image Classification, by incorporating MobileViTV2—a recently advanced architecture that combines CNN and Vision Transformer models while maintaining fewer parameters, making it suitable for broader deployment on edge devices. Our methodology utilizes a publicly available rice disease image dataset from Kaggle, which was validated by a university structural biologist following the guidelines provided by the Philippine Rice Institute (PhilRice). Modifications to the dataset include renaming certain disease categories and augmenting the rice leaf image data through rotation, scaling, and flipping. The enhanced dataset was then used to train the MobileViTV2 model using the Timm library. The results of our approach are as follows: the model achieved notable performance, with 98% accuracy in both training and validation, 6% training and validation loss, and a Receiver Operating Characteristic (ROC) curve ranging from 95% to 100% for each label. Additionally, the F1 score was 97%. These metrics demonstrate a significant improvement compared to a conventional CNN-based approach, which, in a previous 2022 study, achieved only 78% accuracy after using 5 convolutional layers and 2 dense layers. Thus, it can be concluded that MobileViTV2, with its fewer parameters, outperforms traditional CNN models, particularly when applied to Rice Leaf Disease Image Identification. For future work, we recommend extending this model to include datasets validated by international rice experts and broadening the scope to accommodate biotic factors such as rice pest classification, as well as abiotic stressors such as climate, soil quality, and geographic information, which could improve the accuracy of disease prediction.
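
A minimal fine-tuning sketch in the spirit of the abstract above is shown below, using a MobileViTV2 backbone from the timm library and the rotation, scaling, and flipping augmentations mentioned in the text. The model variant name, folder layout, and hyperparameters are assumptions, not the study's actual configuration.

```python
import timm
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Augmentation mirrors the abstract: rotation, scaling (via random resized crop)
# and flipping. The folder layout "data/train/<disease_name>/*.jpg" is assumed.
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(256, scale=(0.8, 1.0)),
    transforms.RandomRotation(20),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=train_tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# "mobilevitv2_100" is one of the MobileViTV2 variants registered in timm;
# the exact variant used in the study is an assumption here.
model = timm.create_model("mobilevitv2_100", pretrained=True,
                          num_classes=len(train_ds.classes))
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = optim.AdamW(model.parameters(), lr=1e-4)

for epoch in range(10):                      # illustrative epoch count
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```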

Keywords: convolutional neural network, MobileViTV2, rice leaf disease, precision agriculture, image classification, vision transformer

Procedia PDF Downloads 25
17816 Simulation of Solar Assisted Absorption Cooling and Electricity Generation along with Thermal Storage

Authors: Faezeh Mosallat, Eric L. Bibeau, Tarek El Mekkawy

Abstract:

The availability of a wide variety of renewable resources in Canada, such as large reserves of hydro, biomass, solar, and wind, provides significant potential to improve the sustainability of energy use. As buildings represent a considerable portion of energy use in Canada, the application of distributed solar energy systems for heating and cooling may increase the amount of renewable energy used. Parabolic solar trough systems have seen limited deployment in cold northern climates, as they are more suitable for electricity production at southern latitudes. Heat production by concentrating solar rays using parabolic troughs can overcome the poor efficiencies of flat panels and evacuated tubes in cold climates. A numerical dynamic model is developed to simulate an installed parabolic solar trough facility in Winnipeg. The results of the numerical model are validated using the experimental data obtained from this system. The model is developed in Simulink and will be utilized to simulate a tri-generation system for heating, cooling, and electricity generation in remote northern communities. The main objective of this simulation is to obtain operational data for solar troughs in cold climates, as such data are lacking in the literature. In this paper, the validated Simulink model is applied to simulate a solar-assisted absorption cooling system along with electricity generation using an organic Rankine cycle (ORC) and thermal storage. A control strategy is employed to distribute the heated oil from the solar collectors among the above three systems, considering their temperature requirements. The modeling provides dynamic performance results using real-time, minute-by-minute meteorological data collected at the same location where the solar system is installed. This is a significant step beyond current models, as the available solar energy is accurately calculated at each time step, accounting for solar radiation fluctuations due to passing clouds. The solar absorption cooling is modeled to use the heat generated by the solar trough system and provide cooling in summer for a greenhouse located next to the solar field. A natural gas water heater provides the required excess heat for the absorption cooling during periods of low or no solar radiation. The results of the simulation are presented for a summer month in Winnipeg, including the amount of electric power generated by the ORC and the contribution of solar energy to meeting the cooling load.

Keywords: absorption cooling, parabolic solar trough, remote community, validated model

Procedia PDF Downloads 216
17815 How Virtualization, Decentralization, and Network-Building Change the Manufacturing Landscape: An Industry 4.0 Perspective

Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg

Abstract:

The German manufacturing industry has to withstand increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely from the relocation of production facilities to aspiring countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies in the German manufacturing industry are adjusting their production to focus on customized products and fast time to market. Leveraging the advantages of novel production strategies such as agile manufacturing and mass customization, manufacturing companies are transforming into integrated networks in which companies unite their core competencies. In these networks, virtualization of the process and supply chain ensures smooth inter-company operations, providing real-time access to relevant product and production information for all participating entities. The boundaries of companies dissolve as autonomous systems exchange data gained by embedded systems throughout the entire value chain. By including cyber-physical systems, advanced communication between machines becomes tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the developments of Industry 4.0 within the literature and reviews the associated research streams. We analyze eight scientific journals with regard to the following research fields: individualized production, end-to-end engineering in a virtual process chain, and production networks. We employ cluster analysis to assign sub-topics to the respective research fields. To assess the practical implications, we conducted face-to-face interviews with managers from industry as well as from the consulting business using a structured interview guideline. The results reveal reasons for the adoption or refusal of Industry 4.0 practices from a managerial point of view. Our findings contribute to the emerging research stream of Industry 4.0 and support decision-makers in assessing their need to transform towards Industry 4.0 practices.

Keywords: Industry 4.0, mass customization, production networks, virtual process chain

Procedia PDF Downloads 277
17814 Color Image Compression/Encryption/Contour Extraction using 3L-DWT and SSPCE Method

Authors: Ali A. Ukasha, Majdi F. Elbireki, Mohammad F. Abdullah

Abstract:

Data security is needed in data transmission, storage, and communication to ensure confidentiality. This paper is divided into two parts. The work deals with color images, which are decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red-channel pixels as an image scrambling process. All of these channels are then encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. XOR and modulo operations are performed between the encrypted channel images in order to change the pixel values. Contours can be extracted from the recovered color images with an acceptable level of distortion using the single step parallel contour extraction (SSPCE) method. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D color images and completely reconstruct them without any distortion. It is also shown that the analyzed algorithm offers very strong security against attacks such as salt-and-pepper noise and JPEG compression, which proves that color images can be protected with a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
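
The red-channel scrambling step described above uses the Arnold transform. A minimal sketch of the classical Arnold cat map on a square channel follows; the iteration count acts as part of the key, and the example data are random.

```python
import numpy as np

def arnold_scramble(channel, iterations=1):
    """Arnold cat-map scrambling of a square image channel.

    Each pixel (x, y) is moved to ((x + y) mod N, (x + 2y) mod N).
    Repeated application eventually returns the original image, so the
    iteration count acts as part of the scrambling key.
    """
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold map needs a square channel"
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

# Illustrative use on a random 8-bit "red channel".
red = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
red_scrambled = arnold_scramble(red, iterations=5)
```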

Keywords: SSPCE method, image compression, salt and pepper attacks, bit-plane decomposition, Arnold transform, color image, wavelet transform, lossless image encryption

Procedia PDF Downloads 518
17813 Analytical Study of Holographic Polymer Dispersed Liquid Crystals Using Finite Difference Time Domain Method

Authors: N. R. Mohamad, H. Ono, H. Haroon, A. Salleh, N. M. Z. Hashim

Abstract:

In this research, we have studied and analyzed the modulation of light and liquid crystal in HPDLCs using the finite-difference time-domain (FDTD) method. HPDLCs are modeled as a mixture of polymer and liquid crystals (LCs), which is categorized as an anisotropic medium. The FDTD method directly solves Maxwell's equations with few approximations, so it offers a more flexible and general approach for arbitrary anisotropic media. According to the results of the FDTD simulation, the highest diffraction efficiency occurred at ±19 degrees (the Bragg angle) for a p-polarized beam incident on the Bragg grating (Q > 10) when the pitch is 1 µm. Therefore, the liquid crystal is assumed to be aligned parallel to the grating constant vector under these parameters.

Keywords: birefringence, diffraction efficiency, finite-difference time-domain, nematic liquid crystals

Procedia PDF Downloads 460
17812 Pavement Quality Evaluation Using Intelligent Compaction Technology: Overview of Some Case Studies in Oklahoma

Authors: Sagar Ghos, Andrew E. Elaryan, Syed Ashik Ali, Musharraf Zaman, Mohammed Ashiqur Rahman

Abstract:

Achieving desired density during construction is an important indicator of pavement quality. Insufficient compaction often compromises pavement performance and service life. Intelligent compaction (IC) is an emerging technology for monitoring compaction quality during the construction of asphalt pavements. This paper aims to provide an overview of findings from four case studies in Oklahoma involving the compaction quality of asphalt pavements, namely SE 44th St project (Project 1) and EOC Turnpike project (Project 2), Highway 92 project (Project 3), and 108th Avenue project (Project 4). For this purpose, an IC technology, the intelligent compaction analyzer (ICA), developed at the University of Oklahoma, was used to evaluate compaction quality. Collected data include GPS locations, roller vibrations, roller speed, the direction of movement, and temperature of the asphalt mat. The collected data were analyzed using a widely used software, VETA. The average densities for Projects 1, 2, 3 and 4, were found as 89.8%, 91.50%, 90.7% and 87.5%, respectively. The maximum densities were found as 94.6%, 95.8%, 95.9%, and 89.7% for Projects 1, 2, 3, and 4, respectively. It was observed that the ICA estimated densities correlated well with the field core densities. The ICA results indicated that at least 90% of the asphalt mats were subjected to at least two roller passes. However, the number of passes required to achieve the desired density (94% to 97%) differed from project to project depending on the underlying layer. The results of these case studies show both opportunities and challenges in using IC for monitoring compaction quality during construction in real-time.

Keywords: asphalt pavement construction, density, intelligent compaction, intelligent compaction analyzer, intelligent compaction measure value

Procedia PDF Downloads 158
17811 An Improved Genetic Algorithm for Traveling Salesman Problem with Precedence Constraint

Authors: M. F. F. Ab Rashid, A. N. Mohd Rose, N. M. Z. Nik Mohamed, W. S. Wan Harun, S. A. Che Ghani

Abstract:

The traveling salesman problem with precedence constraint (TSPPC) is one of the most complex problems in combinatorial optimization. Existing algorithms for solving the TSPPC require large computational times to find the optimal solution. The purpose of this paper is to present an efficient genetic algorithm that guarantees the optimal solution with a smaller number of generations and less iteration time. Unlike existing algorithms that generate a priority factor as the chromosome, the proposed algorithm directly generates the solution sequence as the chromosome. As a result, the proposed algorithm is capable of generating the optimal solution with a smaller number of generations and less iteration time compared to existing algorithms.

Keywords: traveling salesman problem, sequencing, genetic algorithm, precedence constraint

Procedia PDF Downloads 560
17810 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition

Authors: Aisultan Shoiynbek, Darkhan Kuanyshbay, Paulo Menezes, Akbayan Bekarystankyzy, Assylbek Mukhametzhanov, Temirlan Shoiynbek

Abstract:

Speech emotion recognition (SER) has received increasing research interest in recent years. It is a common practice to utilize emotional speech collected under controlled conditions recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues related to that approach: emotions are not natural, meaning that machines are learning to recognize fake emotions; emotions are very limited in quantity and poor in variety of speaking; there is some language dependency in SER; consequently, each time researchers want to start work with SER, they need to find a good emotional database in their language. This paper proposes an approach to create an automatic tool for speech emotion extraction based on facial emotion recognition and describes the sequence of actions involved in the proposed approach. One of the first objectives in the sequence of actions is the speech detection issue. The paper provides a detailed description of the speech detection model based on a fully connected deep neural network for Kazakh and Russian. Despite the high results in speech detection for Kazakh and Russian, the described process is suitable for any language. To investigate the working capacity of the developed model, an analysis of speech detection and extraction from real tasks has been performed.
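
As an illustration of the speech detection stage described above, the sketch below pairs MFCC features (via librosa) with a small fully connected classifier in PyTorch. The layer sizes, sampling rate, and file name are assumptions; the actual Kazakh and Russian model and training data are not reproduced here.

```python
import librosa
import numpy as np
import torch
from torch import nn

def mfcc_frames(wav_path, n_mfcc=13):
    """Return per-frame MFCC feature vectors for a recording."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, n_frames)
    return mfcc.T.astype(np.float32)                          # (n_frames, n_mfcc)

# A small fully connected classifier: speech (1) vs. non-speech (0) per frame.
class SpeechDetector(nn.Module):
    def __init__(self, n_features=13):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, x):
        return self.net(x)

model = SpeechDetector()
frames = torch.from_numpy(mfcc_frames("sample.wav"))   # hypothetical file
with torch.no_grad():
    speech_mask = model(frames).argmax(dim=1)           # 1 where speech is detected
# Contiguous speech frames can then be cut out and passed to the SER stage.
```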

Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset

Procedia PDF Downloads 26
17809 Use of Electrokinetic Technology to Enhance Chemical and Biological Remediation of Contaminated Sands and Soils

Authors: Brian Wartell, Michel Boufadel

Abstract:

Contaminants such as polycyclic aromatic hydrocarbons (PAHs) are compounds present in crude and petroleum oils and are known to be toxic and often carcinogenic. Therefore, a major effort is placed on tracking their subsurface soil concentrations following an oil spill. PAHs can persist for years in the subsurface, especially if there is a lack of oxygen. Both aerobic and anaerobic biodegradation of PAHs encounter the difficulties of nutrient transport and of the bioavailability (proximal access) of the contaminants to the organisms. A technology known as electrokinetics (EK, or EK-BIO for electrokinetic bioremediation) has been found to transport nutrients and other chemicals efficiently in the subsurface. Experiments were conducted to demonstrate migration patterns in both sands and clay for both ionic and nonionic compounds, and aerobic biodegradation studies were conducted with soil spiked with polycyclic aromatic hydrocarbons, yielding interesting results. In one set of experiments, self-designed electrokinetic setups were constructed to examine the differences in electromigration and electroosmotic rates; anionic and non-ionic dyes were used to visualize these phenomena, respectively. In another experiment, a silt-clay soil was spiked with three low-molecular-weight compounds (fluorene, phenanthrene, and fluoranthene), placed within self-designed electrokinetic setups, and monitored for aerobic degradation. Additional studies are planned, including the transport of peroxide through anaerobic sands.

Keywords: bioavailability, bioremediation, electrokinetics, subsurface transport

Procedia PDF Downloads 155
17808 Analysis of Public Space Usage Characteristics Based on Computer Vision Technology - Taking Shaping Park as an Example

Authors: Guantao Bai

Abstract:

Public space is an indispensable and important component of the urban built environment, and more accurate evaluation of its usage characteristics can help improve its spatial quality. Compared to traditional survey methods, computer vision technology based on deep learning has advantages such as dynamic observation and low cost. This study takes the public space of Shaping Park as an example and, based on deep learning computer vision technology, processes and analyzes image data of the public space to obtain its usage and spatiotemporal characteristics. The research found that spontaneous activities in the public space occur at relatively random times and have a relatively short average duration, while social activities occur at relatively stable times and have a longer average duration. Computer vision technology based on deep learning can effectively describe the usage characteristics of the research area, making up for the shortcomings of traditional research methods and providing support for creating good public spaces.

Keywords: computer vision, deep learning, public spaces, usage features

Procedia PDF Downloads 70
17807 Entropy-Based Multichannel Stationary Measure for Characterization of Non-Stationary Patterns

Authors: J. D. Martínez-Vargas, C. Castro-Hoyos, G. Castellanos-Dominguez

Abstract:

In this work, we propose a novel approach for measuring the stationarity level of a multichannel time series. The measure is based on a definition of stationarity over the time-varying spectrum, and it aims to quantify the relation between local stationarity (single channel) and global dynamic behavior (multichannel dynamics). To assess the validity of the proposed approach, we use a well-known EEG-BCI database that was constructed to separate motor imagery tasks. Based on the premise that imagining movements implies an increase in EEG dynamics, we use as discriminant features the proposed measure computed over an estimate of the non-stationary components of the input time series. As a measure of separability, we use a Student's t-test, and the obtained results show that the proposed measure is able to accurately detect the brain areas, projected on the scalp, where motor tasks are performed.
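
The measure proposed above is defined over the time-varying spectrum. The sketch below is a simplified stand-in (not the authors' exact multichannel formulation): it computes the Shannon entropy of the normalized spectrum in each time window of each channel and uses its variability over time as a crude per-channel stationarity index.

```python
import numpy as np
from scipy.signal import spectrogram

def spectral_entropy_series(x, fs, nperseg=256):
    """Shannon entropy of the normalized spectrum in each time window."""
    f, t, Sxx = spectrogram(x, fs=fs, nperseg=nperseg)
    p = Sxx / (Sxx.sum(axis=0, keepdims=True) + 1e-12)   # per-window spectral pmf
    return -(p * np.log(p + 1e-12)).sum(axis=0)           # entropy per window

def stationarity_index(eeg, fs):
    """Lower variability of spectral entropy over time ~ more stationary channel.

    eeg: array of shape (n_channels, n_samples). Returns one index per channel;
    averaging across channels gives a crude global (multichannel) figure.
    """
    ent = np.array([spectral_entropy_series(ch, fs) for ch in eeg])
    return ent.std(axis=1)

# Illustrative check on synthetic data: a chirp-like signal and white noise.
fs = 250
t = np.arange(0, 10, 1 / fs)
chirp = np.sin(2 * np.pi * (5 + 2 * t) * t)
noise = np.random.randn(t.size)
print(stationarity_index(np.vstack([chirp, noise]), fs))
```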

Keywords: stationary measure, entropy, sub-space projection, multichannel dynamics

Procedia PDF Downloads 412
17806 An Improved Method to Compute Sparse Graphs for Traveling Salesman Problem

Authors: Y. Wang

Abstract:

The traveling salesman problem (TSP) is NP-hard in combinatorial optimization. Research shows that algorithms for the TSP on sparse graphs have shorter computation times than those working on complete graphs. We present an improved iterative algorithm to compute sparse graphs for the TSP using frequency graphs computed with frequency quadrilaterals. The iterative algorithm is enhanced by adjusting two of its parameters. The computation time of the algorithm is O(C·Nmax·n²), where C is the number of iterations, Nmax is the maximum number of frequency quadrilaterals containing each edge, and n is the size of the TSP instance. The experimental results show that the computed sparse graphs generally have fewer than 5n edges for most of these Euclidean instances. Moreover, the maximum and minimum vertex degrees in the sparse graphs do not differ much. Thus, the computation time of methods for resolving the TSP on these sparse graphs will be greatly reduced.

Keywords: frequency quadrilateral, iterative algorithm, sparse graph, traveling salesman problem

Procedia PDF Downloads 233