Search results for: asset monitoring
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3536

2726 Determination of Pesticide Residues in Tissue of Two Freshwater Fish Species by a Modified QuEChERS Method

Authors: Iwona Cieślik, Władysław Migdał, Kinga Topolska, Ewa Cieślik

Abstract:

The consumption of fish is recommended as a means of preventing serious diseases, especially cardiovascular problems. Fish is known to be a valuable source of protein (rich in essential amino acids), unsaturated fatty acids, fat-soluble vitamins, and macro- and microelements. However, it can also contain several contaminants (e.g. pesticides, heavy metals) that may pose considerable risks for humans. Among others, pesticides are of special concern. Their widespread use has resulted in the contamination of environmental compartments, including water. The occurrence of pesticides in the environment is a serious problem due to their potential toxicity; therefore, systematic monitoring is needed. The aim of the study was to determine the organochlorine and organophosphate pesticide residues in fish muscle tissues of the pike (Esox lucius, L.) and the rainbow trout (Oncorhynchus mykiss, Walbaum) by a modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method, using Gas Chromatography Quadrupole Mass Spectrometry (GC/Q-MS) working in selected-ion monitoring (SIM) mode. The analysis of α-HCH, β-HCH, lindane, diazinon, disulfoton, δ-HCH, methyl parathion, heptachlor, malathion, aldrin, parathion, heptachlor epoxide, γ-chlordane, endosulfan, α-chlordane, o,p'-DDE, dieldrin, endrin, 4,4'-DDD, ethion, endrin aldehyde, endosulfan sulfate, 4,4'-DDT, and methoxychlor was performed in samples collected in the Carp Valley (Malopolska region, Poland). The age of the pike (n=6) was 3 years and its weight was 2-3 kg, while the age of the rainbow trout (n=6) was 0.5 year and its weight was 0.5-1.0 kg. Detectable pesticide residues (HCH isomers, endosulfan isomers, DDT and its metabolites, as well as methoxychlor) were present in the fish samples. However, all these compounds were below the limit of quantification (LOQ). The other examined pesticide residues were below the limit of detection (LOD).
Therefore, the levels of contamination were, in all cases, below the default Maximum Residue Levels (MRLs) established by Regulation (EC) No 396/2005 of the European Parliament and of the Council. Monitoring of pesticide residue content in fish is required to minimize potential adverse effects on the environment and human exposure to these contaminants.
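The LOQ/LOD thresholds cited above are conventionally derived from the calibration curve (ICH-style LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope). A minimal sketch with invented calibration numbers, not the study's data:

```python
import numpy as np

# Hypothetical calibration data: spiked concentrations (ng/g) vs. GC/Q-MS peak areas
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([105.0, 210.0, 530.0, 1010.0, 2050.0])

# Least-squares calibration line: area = S * conc + b
S, b = np.polyfit(conc, area, 1)

# Residual standard deviation of the calibration (n - 2 degrees of freedom)
resid = area - (S * conc + b)
sigma = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))

# ICH-style detection and quantification limits
lod = 3.3 * sigma / S
loq = 10.0 * sigma / S
print(f"slope={S:.2f}, LOD={lod:.2f} ng/g, LOQ={loq:.2f} ng/g")
```

A residue reported as "detectable but below LOQ", as in the abstract, is one whose signal exceeds the LOD threshold but not the LOQ threshold.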

Keywords: contaminants, fish, pesticide residues, QuEChERS method

Procedia PDF Downloads 217
2725 Trace Analysis of Genotoxic Impurity Pyridine in Sitagliptin Drug Material Using UHPLC-MS

Authors: Bashar Al-Sabti, Jehad Harbali

Abstract:

Background: Pyridine is a reactive base that might be used in preparing sitagliptin. The International Agency for Research on Cancer classifies pyridine in group 2B, meaning that pyridine is possibly carcinogenic to humans. Therefore, pyridine should be monitored at the allowed limit in sitagliptin pharmaceutical ingredients. Objective: The aim of this study was to develop a novel ultra-high-performance liquid chromatography mass spectrometry (UHPLC-MS) method to estimate the quantity of pyridine impurity in sitagliptin pharmaceutical ingredients. Methods: The separation was performed on a Shim-pack C8 column (150 mm × 4.6 mm, 5 µm) in reversed-phase mode, using a water-methanol-acetonitrile mobile phase containing 4 mM ammonium acetate in gradient mode. Pyridine was detected by mass spectrometry using selected-ion monitoring mode at m/z = 80. The flow rate of the method was 0.75 mL/min. Results: The method showed excellent sensitivity, with a quantitation limit of 1.5 ppm of pyridine relative to sitagliptin. The linearity of the method was excellent over the range of 1.5-22.5 ppm, with a correlation coefficient of 0.9996. Recovery values were between 93.59% and 103.55%. Conclusions: The results showed good linearity, precision, accuracy, sensitivity, selectivity, and robustness. The method was applied to test three batches of sitagliptin raw material. Highlights: This method is useful for monitoring pyridine in sitagliptin during its synthesis and for testing sitagliptin raw materials before using them in the production of pharmaceutical products.
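The reported linearity (r = 0.9996 over 1.5-22.5 ppm) and recovery window (93.59-103.55%) are the standard validation checks; a minimal sketch of both computations, using invented detector responses rather than the paper's raw data:

```python
import numpy as np

# Hypothetical pyridine calibration: levels (ppm relative to sitagliptin) vs. responses
levels = np.array([1.5, 4.5, 9.0, 15.0, 22.5])
response = np.array([310.0, 905.0, 1820.0, 3010.0, 4540.0])

# Pearson correlation coefficient -- the abstract reports r = 0.9996 over this range
r = np.corrcoef(levels, response)[0, 1]

# Recovery check on a spiked sample: measured vs. nominal concentration (invented pair)
nominal, measured = 9.0, 8.75
recovery = 100.0 * measured / nominal
print(f"r={r:.4f}, recovery={recovery:.2f}%")
```

An acceptance rule would then require r above a preset threshold (e.g. 0.999) and recovery inside the validated window before releasing a batch result.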

Keywords: genotoxic impurity, pyridine, sitagliptin, UHPLC-MS

Procedia PDF Downloads 93
2724 Towards Conservation and Recovery of Species at Risk in Ontario: Progress on Recovery Planning and Implementation and an Overview of Key Research Needs

Authors: Rachel deCatanzaro, Madeline Austen, Ken Tuininga, Kathy St. Laurent, Christina Rohe

Abstract:

In Canada, the federal Species at Risk Act (SARA) provides protection for wildlife species at risk and a national legislative framework for the conservation or recovery of species that are listed as endangered, threatened, or special concern under Schedule 1 of SARA. Key aspects of the federal species at risk program include the development of recovery documents (recovery strategies, action plans, and management plans) outlining threats, objectives, and broad strategies or measures for conservation or recovery of the species; the identification and protection of critical habitat for threatened and endangered species; and working with groups and organizations to implement on-the-ground recovery actions. Environment Canada’s progress on the development of recovery documents and on the identification and protection of critical habitat in Ontario will be presented, along with successes and challenges associated with on-the-ground implementation of recovery actions. In Ontario, Environment Canada is currently involved in several recovery and monitoring programs for at-risk bird species such as the Loggerhead Shrike, Piping Plover, Golden-winged Warbler, and Cerulean Warbler, and has provided funding each year for a wide variety of recovery actions targeting priority species at risk and geographic areas through stewardship programs including the Habitat Stewardship Program, the Aboriginal Fund for Species at Risk, and the Interdepartmental Recovery Fund. Key research needs relevant to the recovery of species at risk have been identified, and include surveys and monitoring of population sizes and threats, population viability analyses, and addressing knowledge gaps identified for individual species (e.g., species biology and habitat needs).
The engagement of all levels of government, the local and international conservation communities, and the scientific research community plays an important role in the conservation and recovery of species at risk in Ontario – through surveying and monitoring, filling knowledge gaps, conducting public outreach, and restoring, protecting, or managing habitat – and will be critical to the continued success of the federal species at risk program.

Keywords: conservation biology, habitat protection, species at risk, wildlife recovery

Procedia PDF Downloads 450
2723 Comprehensive Multilevel Practical Condition Monitoring Guidelines for Power Cables in Industries: Case Study of Mobarakeh Steel Company in Iran

Authors: S. Mani, M. Kafil, E. Asadi

Abstract:

Condition Monitoring (CM) of electrical equipment has gained remarkable importance in recent years, due to huge production losses, substantial imposed costs, and increased levels of vulnerability, risk, and uncertainty. Power cables feed numerous pieces of electrical equipment such as transformers, motors, and electric furnaces; thus their condition assessment is of great importance. This paper investigates electrical, structural, and environmental failure sources, all of which influence cables' performance and limit their uptime, and provides a comprehensive framework of practical CM guidelines for the maintenance of cables in industry. The multilevel CM framework presented in this study covers performance-indicative features of power cables, with a focus on both online and offline diagnosis and test scenarios, and addresses short-term and long-term threats to the operation and longevity of power cables. The study, after concisely reviewing the concept of CM, thoroughly investigates five major areas: power quality; the insulation-quality features of partial discharges, tan delta, and voltage withstand capability; sheath faults; shield currents; and the environmental features of temperature and humidity. It elaborates the interconnections and mutual impacts between these areas using mathematical formulation and practical guidelines. Detection, location, and severity-identification methods for every threat or fault source are also elaborated. Finally, the comprehensive, practical guidelines developed in the study are presented for the specific case of Electric Arc Furnace (EAF) feeder MV power cables in Mobarakeh Steel Company (MSC), the largest steel company in the MENA region, in Iran. The specific technical and industrial characteristics and limitations of a harsh industrial environment like the MSC EAF feeder cable tunnels are imposed on the presented framework, making the suggested package more practical and tangible.

Keywords: condition monitoring, diagnostics, insulation, maintenance, partial discharge, power cables, power quality

Procedia PDF Downloads 227
2722 A Soft Computing Approach for Monitoring of Heavy Metals in Soil and Vegetables in the Republic of Macedonia

Authors: Vesna Karapetkovska Hristova, M. Ayaz Ahmad, Julijana Tomovska, Biljana Bogdanova Popov, Blagojce Najdovski

Abstract:

The average total concentrations of heavy metals (cadmium [Cd], copper [Cu], nickel [Ni], lead [Pb], and zinc [Zn]) were analyzed in soil and vegetable samples collected from different regions of Macedonia during the years 2010-2012. Basic soil properties such as pH, organic matter, and clay content were also included in the study. The average concentrations of Cd, Cu, Ni, Pb, and Zn in the A horizon (0-30 cm) of agricultural soils were, respectively: 0.25, 5.3, 6.9, 15.2, and 26.3 mg kg-1 of soil. We found that a neural network model can be considered a tool for prediction and spatial analysis of the processes controlling metal transfer within the soil and vegetables. The predictive ability of such models is well over 80%, compared to 20% for typical regression models. A radial basis function network showed good prediction accuracy, capturing the correlations between soil properties and metal content in vegetables much better than the back-propagation method. Neural networks and soft computing can support decision-making processes at different levels, including agro-ecology, to improve crop management based on monitoring data and risk assessment of metal transfer from soils to vegetables.
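As an illustration of the radial basis function approach the abstract favours, here is a minimal numpy sketch: Gaussian activations around data-drawn centres, with the output layer solved directly by least squares (in contrast to iterative back-propagation). The feature ranges and the soil-to-vegetable transfer rule are invented for the example, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: soil features [pH, organic matter %, clay %, soil Cd mg/kg].
X = rng.uniform([5.5, 1.0, 10.0, 0.05], [8.0, 6.0, 40.0, 0.45], size=(60, 4))
# Invented target rule: vegetable Cd uptake rises with soil Cd and falls with pH.
y = 0.5 * X[:, 3] * (8.5 - X[:, 0]) / 3.0 + rng.normal(0, 0.005, 60)

# Standardize features so a single Gaussian width suits all of them
Xs = (X - X.mean(0)) / X.std(0)

# Radial basis function layer: Gaussian activations around centres picked from the data
centres = Xs[rng.choice(60, 10, replace=False)]

def rbf_features(Z, width=2.0):
    d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

# Output layer solved by linear least squares, a common RBF training shortcut
Phi = rbf_features(Xs)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = Phi @ w
r = np.corrcoef(y, pred)[0, 1]
print(f"training correlation r = {r:.3f}")
```

The correlation coefficient between predicted and observed uptake plays the role of the soil-to-vegetable correlation the abstract evaluates.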

Keywords: soft computing approach, total concentrations, heavy metals, agricultural soils

Procedia PDF Downloads 366
2721 The Effect of Market Orientation on Business Performance in the Auto Parts Industry

Authors: Vithaya Intraphimol

Abstract:

The purpose of this study is to investigate the relationship between market orientation and business performance through innovations, including product innovation and process innovation. Auto parts and accessories companies in Thailand were used as the sample for this investigation. Survey research with a structured questionnaire was used as the key instrument for collecting the data. Structural equation modeling (SEM), which requires a minimum sample size of 200, was used to test the hypotheses. The results show that competitor orientation and interfunctional coordination have an effect on product innovation. Moreover, interfunctional coordination has an effect on process innovation and return on assets. This indicates that within-firm coordination is crucial to firm performance. The implication for practice is that firms should support interfunctional coordination: when members of different functional areas of an organization communicate and work together to create value for target buyers, they may achieve better profitability.

Keywords: auto parts industry, business performance, innovations, market orientation

Procedia PDF Downloads 307
2720 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region

Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R.M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari

Abstract:

Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions, and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, hampering their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use, and interpretation of InSAR-based results. It provides a semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide consistent, regular, and reliable information regarding natural and anthropogenic ground motion phenomena all over Europe.

Keywords: ground displacements, InSAR, natural hazards, satellite imagery

Procedia PDF Downloads 217
2719 Revolutionizing Geographic Data: CADmapper's Automated Precision in CAD Drawing Transformation

Authors: Toleen Alaqqad, Kadi Alshabramiy, Suad Zaafarany, Basma Musallam

Abstract:

CADmapper is a software tool for transforming geographic data into realistic CAD drawings. It speeds up and simplifies the conversion process by automating it, allowing architects, urban planners, engineers, and geographic information system (GIS) experts to concentrate solely on the imaginative and scientific parts of their projects. While the future incorporation of AI has the potential for further improvements, CADmapper's current capabilities make it an indispensable asset in the business. It covers a combination of 2D and 3D city and urban-area models. The user can select a specific square section of the map to view, and the fee is based on the dimensions of the area being viewed. The procedure is straightforward: choose the area you want, pick whether or not to include topography and 3D architectural data (if available), then select the design program or CAD format in which to publish the document. The service includes more than 200 free broad town plans in DXF format, and if you wish to specify a custom area, it is free up to 1 km2.

Keywords: CADmapper, geodata, 2D and 3D data conversion, automated CAD drawing, urban planning software

Procedia PDF Downloads 67
2718 Application of the Hit or Miss Transform to Detect Dams Monitored for Water Quality Using Remote Sensing in South Africa

Authors: Brighton Chamunorwa

Abstract:

Current remote sensing of water quality procedures do not provide a step representing physical visualisation of the monitored dam. The application of remote sensing of water quality techniques may benefit from the use of mathematical morphology operators for shape identification. Given a dam outline as input, morphological operators such as the hit-or-miss transform identify whether the water body is present on input remotely sensed images. This study seeks to determine the accuracy of the hit-or-miss transform in identifying dams monitored by the water resources authorities in South Africa on satellite images. To achieve this objective, the study downloaded a Landsat image acquired in winter and tested the capability of the hit-or-miss transform using shapefile boundaries of dams in the Crocodile-Marico catchment. The results of the experiment show that it is possible to detect most dams on the Landsat image after adjusting the erosion operator to detect pixels matching a percentage similarity of 80% and above. Successful implementation of the current study contributes towards the optimisation of mathematical morphology image operators. Additionally, the effort helps develop remote sensing of water quality monitoring with improved simulation of the conventional procedures.
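The 80%-similarity relaxation of the erosion operator described above can be sketched as a sliding-window template match on a binary water mask: a placement counts as a hit when at least 80% of the template's pixels are water. The scene, template footprint, and missing-pixel pattern below are invented for illustration:

```python
import numpy as np

# Binary water mask as it might come from a classified Landsat scene (1 = water)
scene = np.zeros((12, 12), dtype=int)
scene[3:8, 4:10] = 1          # a dam-shaped water body
scene[4, 6] = 0               # a few missing pixels (e.g. winter drawdown)
scene[6, 8] = 0

# Dam template derived from its shapefile outline (hypothetical 5x6 footprint)
template = np.ones((5, 6), dtype=int)

def relaxed_hit(scene, template, threshold=0.8):
    """Erosion-like match: does any placement cover >= threshold of template pixels?"""
    th, tw = template.shape
    best = 0.0
    for i in range(scene.shape[0] - th + 1):
        for j in range(scene.shape[1] - tw + 1):
            window = scene[i:i + th, j:j + tw]
            best = max(best, (window * template).sum() / template.sum())
    return best >= threshold, best

detected, score = relaxed_hit(scene, template)
print(detected, round(score, 2))
```

A strict hit-or-miss transform would require an exact match (threshold 1.0); lowering the threshold to 0.8 is what lets partially mis-classified dams still be detected.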

Keywords: hit or miss transform, mathematical morphology, remote sensing, water quality monitoring

Procedia PDF Downloads 149
2717 The Management of Urban Facilities in the City of Chlef

Authors: Belakhdar Salah Brahim

Abstract:

Urban management is a major element of the social control of public space and thus of the functioning of society. As such, it is a key element of a social conception of sustainable development. It is also a cross-cutting sector that relies on land management, infrastructure management, habitat management, management of social services, management of economic development, etc. This study examines urban management, focusing on problems related to urban waste management in developing countries. It appears from the study that city management aims to improve infrastructure and urban services in order to increase the city's development and improve living conditions in cities. It covers various aspects, including management of urban space, economic management, administrative management, asset or infrastructure management, and finally waste management. Environmental management is important because it addresses pollution problems and preserves resources for future generations. Changing perceptions of waste have led to the definition of new policies for integrated waste management appropriate to the urban site.

Keywords: urbanization, urban management, environmental management, waste management

Procedia PDF Downloads 431
2716 Troubleshooting Petroleum Equipment Using Wireless Sensors and a Bayesian Algorithm

Authors: Vahid Bayrami Rad

Abstract:

In this research, common methods and techniques have been investigated, with a focus on intelligent fault-finding and monitoring systems in the oil industry. Remote and intelligent control methods are considered a necessity for implementing various operations in the oil industry, and benefiting from the knowledge extracted, with the help of data mining algorithms, from the countless data generated is a way to speed up monitoring and troubleshooting operations in today's big oil companies. Therefore, by comparing data mining algorithms and examining their efficiency, structure, and responses under different conditions, the proposed Bayesian algorithm, using data clustering, analysis, and evaluation with a coloured Petri net, has provided a model that is applicable and dynamic from the point of view of reliability and response time. By using this method, it is possible to achieve a dynamic and consistent model of the remote control system, prevent the occurrence of leakage in oil pipelines and refineries, and reduce costs and human and financial errors. The statistical data obtained from the evaluation process show an increase in reliability, availability, and speed compared to previous methods.
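The abstract does not detail its classifier; as one plausible reading of "Bayesian algorithm", here is a minimal Gaussian naive Bayes sketch for leak detection from wireless-sensor readings. All feature distributions and class definitions are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sensor readings: [pressure drop, vibration, temperature rise]
# Class 0 = normal operation, class 1 = leak; both distributions are invented.
normal = rng.normal([1.0, 0.5, 2.0], 0.2, size=(200, 3))
leak = rng.normal([2.0, 1.5, 2.2], 0.2, size=(200, 3))
X = np.vstack([normal, leak])
y = np.array([0] * 200 + [1] * 200)

# Gaussian naive Bayes: per-class feature means and variances, equal priors
mu = np.array([X[y == c].mean(0) for c in (0, 1)])
var = np.array([X[y == c].var(0) for c in (0, 1)])

def predict(x):
    # Log-likelihood of x under each class's independent Gaussian model
    log_lik = -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var).sum(axis=1)
    return int(np.argmax(log_lik))

acc = np.mean([predict(x) == c for x, c in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

In a deployment such as the one described, the class posterior for each incoming reading would drive the alarm logic, while the coloured Petri net models the process states.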

Keywords: wireless sensors, petroleum equipment troubleshooting, Bayesian algorithm, colored Petri net, rapid miner, data mining-reliability

Procedia PDF Downloads 64
2715 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form and source because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues which need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata must be available to be referred to and read when an application needs to access, manipulate, and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.
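The duplicate-data and metadata issues listed above can be illustrated with a minimal field-level fusion sketch: when two sector systems describe the same entity, the value from the freshest source wins, and per-field provenance metadata records when each value was sourced. The record contents are invented:

```python
from datetime import date

# Hypothetical records for the same citizen arriving from two sector systems
housing = {"person_id": "P001", "name": "Li Wei", "address": "12 Chaoyang Rd",
           "updated": date(2023, 5, 1)}
business = {"person_id": "P001", "name": "Li Wei", "employer": "Acme Ltd",
            "updated": date(2023, 8, 15)}

def fuse(records):
    """Field-level fusion: on conflict, keep the value from the freshest source."""
    merged, provenance = {}, {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for key, value in rec.items():
            if key == "updated":
                continue
            merged[key] = value               # later (fresher) records overwrite
            provenance[key] = rec["updated"]  # metadata: when each field was sourced
    return merged, provenance

person, meta = fuse([housing, business])
print(person)
```

The `provenance` dictionary is the per-field metadata an application would consult before trusting, displaying, or re-synchronizing a value.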

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 392
2714 Hygro-Thermal Modelling of Timber Decks

Authors: Stefania Fortino, Petr Hradil, Timo Avikainen

Abstract:

Timber bridges have an excellent environmental performance, are economical, are relatively easy to build, and can have a long service life. However, the durability of these bridges is the main problem because of their exposure to outdoor climate conditions. The moisture content accumulated in wood over long periods, in combination with certain temperatures, may create conditions suitable for timber decay. In addition, moisture content variations affect the structural integrity, serviceability, and loading capacity of timber bridges. Therefore, monitoring of the moisture content in wood is important for the durability of the material but also for the whole superstructure. The measurements obtained by the usual sensor-based techniques provide hygro-thermal data only at specific locations of the wood components. In this context, the monitoring can be assisted by numerical modelling to obtain more information on the hygro-thermal response of the bridges. This work presents a hygro-thermal model based on a multi-phase moisture transport theory to predict the distribution of moisture content, relative humidity, and temperature in wood. Below the fibre saturation point, the multi-phase theory simulates three phenomena in cellular wood during moisture transfer, i.e., the diffusion of water vapour in the pores, the sorption of bound water, and the diffusion of bound water in the cell walls. In the multi-phase model, the two water phases are separated, and the coupling between them is defined through a sorption rate. Furthermore, an average between the temperature-dependent adsorption and desorption isotherms is used. In previous works by some of the authors, this approach was found to be very suitable for studying the moisture transport in uncoated and coated stress-laminated timber decks.
Compared to previous works, the hygro-thermal fluxes on the external surfaces include the influence of the absorbed solar radiation over time; consequently, the temperatures on the surfaces exposed to the sun are higher. This affects the whole hygro-thermal response of the timber component. The multi-phase model, implemented in a user subroutine of the Abaqus FEM code, provides the distribution of the moisture content, temperature, and relative humidity in a volume of the timber deck. As a case study, hygro-thermal data are collected from the ongoing monitoring of the stress-laminated timber deck of Tapiola Bridge in Finland, based on integrated humidity-temperature sensors, and the numerical results are found to be in good agreement with the measurements. The proposed model, used to assist the monitoring, can contribute to reducing the maintenance costs of bridges as well as the cost of instrumentation, and can increase safety.
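Below the fibre saturation point, the sorption-rate coupling described above can be reduced to a minimal 0-D sketch: the bound-water pool relaxes toward a vapour-driven equilibrium given by an averaged isotherm. All coefficients and the isotherm fit are illustrative, not the calibrated bridge model:

```python
import numpy as np

hours = np.arange(0, 24 * 30)                      # one month, hourly steps
rh = 0.7 + 0.1 * np.sin(2 * np.pi * hours / 24)    # daily relative-humidity cycle

def mc_eq(rh):
    """Equilibrium moisture content: stand-in average of the adsorption and
    desorption isotherms (toy hyperbolic fit, not a measured curve)."""
    return 0.30 * rh / (1.0 + 0.8 * rh)

k = 0.01                        # sorption rate constant [1/h], illustrative
mc = np.empty_like(rh)
mc[0] = mc_eq(rh[0])
for t in range(1, len(hours)):
    # bound-water pool relaxes toward the vapour-driven equilibrium at rate k
    mc[t] = mc[t - 1] + k * (mc_eq(rh[t]) - mc[t - 1])

print(f"moisture content range: {mc.min():.3f} - {mc.max():.3f}")
```

Because the sorption rate is slow relative to the daily humidity cycle, the wood's moisture content swings far less than the equilibrium value would, which is the damping behaviour a deck-scale FEM model resolves in space as well as time.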

Keywords: moisture content, multi-phase models, solar radiation, timber decks, FEM

Procedia PDF Downloads 174
2713 Design of an mHealth Therapy Management System to Maintain Therapy Outcomes after Bariatric Surgery

Authors: A. Dudek, P. Tylec, G. Torbicz, P. Duda, K. Proniewska, P. Major, M. Pedziwiatr

Abstract:

Background: Conservative treatment of obesity, based only on a proper diet and physical activity without the support of an interdisciplinary team of specialists, does not bring satisfactory bariatric results. Long-term maintenance of proper metabolic results after rapid weight loss due to bariatric surgery requires engagement from patients. Mobile health tools may offer an alternative model that enhances patient engagement in maintaining therapy. Objective: We aimed to assess the influence of constant monitoring and subsequent motivational alerts during the perioperative period on post-operative effects in bariatric patients. The study was also designed to identify factors conducive to lifestyle change after surgery. Methods: This prospective clinical control study was based on the use of a designed prototype of a bariatric mHealth system. The prepared application comprises central data management with a comprehensible interface dedicated to patients, and a data transfer module serving as a physician’s platform. The platform’s motivation system consists of motivational alerts, graphic presentation of outcomes, and a patient communication center. A generated list of patients requiring urgent consultation and the possibility of constant contact with a specialist provide a safety zone. 31 patients were enrolled in the continuous monitoring program over a 6-month period, along with typical follow-up visits. After one year of follow-up, all patients were examined. Results: There were 20 active users of the proposed monitoring system during the entire duration of the study. After six months, 24 patients took part in a control by telephone questionnaire. Among them, 75% confirmed that the application concept was an important element of the treatment. Active users of the application indicated as the most valuable features: motivation to continue treatment (11 users), graphical presentation of weight loss and other parameters (7 users), and the ability to contact a doctor (3 users).
The three main drawbacks were technical errors (9 users), tedious questionnaires inside the application (5 users), and time-consuming tasks inside the system (2 users). Conclusions: Constant monitoring and successive motivational alerts to continue treatment are an appropriate tool in treatment after bariatric surgery, mainly in the early post-operative period. Graphic presentation of data and a continuous connection with clinical staff seemed to be elements of motivation to continue treatment and provided a sense of security.

Keywords: bariatric surgery, mHealth, mobile health tool, obesity

Procedia PDF Downloads 112
2712 Smart Sensor Data to Predict Machine Performance with IoT-Based Machine Learning and Artificial Intelligence

Authors: C. J. Rossouw, T. I. van Niekerk

Abstract:

The global manufacturing industry is utilizing the internet and cloud-based services to further explore the anatomy of manufacturing processes and to optimize them, in support of the movement into the Fourth Industrial Revolution (4IR). From a third-world and African perspective, the 4IR is hindered by the fact that many manufacturing systems developed in the third industrial revolution are not inherently equipped to utilize the internet and services of the 4IR, hindering the progression of third-world manufacturing industries into the 4IR. This research focuses on the development of a non-invasive and cost-effective cyber-physical IoT system that exploits a machine’s vibration to expose semantic characteristics of the manufacturing process and utilizes these results through a real-time cloud-based machine condition monitoring system, with the intention of optimizing the system. A microcontroller-based IoT sensor was designed to acquire a machine’s mechanical vibration data, process it in real time, and transmit it to a cloud-based platform via Wi-Fi and the internet. Time-frequency Fourier analysis was applied to the vibration data to form an image representation of the machine’s behaviour. This data was used to train a Convolutional Neural Network (CNN) to learn semantic characteristics in the machine’s behaviour and relate them to a state of operation. The same data was also used to train a Convolutional Autoencoder (CAE) to detect anomalies in the data. Real-time edge-based artificial intelligence was achieved by deploying the CNN and CAE on the sensor to analyse the vibration. A cloud platform was deployed to visualize the vibration data and the results of the CNN and CAE in real time. The cyber-physical IoT system was deployed on a semi-automated metal granulation machine with a set of trained machine learning models. Using a single sensor, the system was able to accurately visualize three states of the machine’s operation in real time.
The system was also able to detect a variance in the material being granulated. The research demonstrates how non-IoT manufacturing systems can be equipped with edge-based artificial intelligence to establish a remote machine condition monitoring system.
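The time-frequency image that feeds the CNN can be sketched with a plain numpy short-time Fourier transform. The simulated two-state vibration signal below is invented; it merely shows how a change of machine state appears as a shift of the dominant frequency bin:

```python
import numpy as np

# Simulated accelerometer trace: a 50 Hz tone that jumps to 120 Hz mid-record,
# mimicking a change of machine state (values are illustrative, not the granulator's).
fs = 1000                                  # sample rate [Hz]
t = np.arange(0, 2.0, 1 / fs)
vib = np.where(t < 1.0,
               np.sin(2 * np.pi * 50 * t),
               np.sin(2 * np.pi * 120 * t))
vib = vib + 0.05 * np.random.default_rng(1).normal(size=t.size)   # sensor noise

def spectrogram(x, win=256, hop=128):
    """Magnitude STFT: the image-like representation a CNN can classify."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))   # shape: (time, frequency)

S = spectrogram(vib)
f_bins = np.fft.rfftfreq(256, 1 / fs)
# Dominant frequency in the first and last frame differs between machine states
print(f_bins[S[0].argmax()], f_bins[S[-1].argmax()])
```

Stacking such magnitude frames row by row yields the 2-D "image" of machine behaviour; a CNN learns state labels from it, while an autoencoder's reconstruction error on the same input can flag anomalies.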

Keywords: IoT, cyber-physical systems, artificial intelligence, manufacturing, vibration analytics, continuous machine condition monitoring

Procedia PDF Downloads 87
2711 Comparison of On-Site Stormwater Detention Real Performance and Theoretical Simulations

Authors: Pedro P. Drumond, Priscilla M. Moura, Marcia M. L. P. Coelho

Abstract:

The purpose of an On-site Stormwater Detention (OSD) system is to detain the additional stormwater runoff caused by impervious areas, in order to maintain the peak flow at its pre-urbanization level. In recent decades, these systems have been built in many cities around the world. However, their real efficiency remains unknown due to the lack of research, especially with regard to monitoring their real performance. Thus, this study aims to compare water level monitoring data from an OSD built in Belo Horizonte, Brazil, with the results of the theoretical-method simulations usually adopted in OSD design. Two theoretical simulations were made: one using the Rational Method and the Modified Puls method, and another using the Soil Conservation Service (SCS) method and the Modified Puls method. The monitoring data were obtained with a water level sensor installed inside the reservoir and connected to a data logger. The comparison of OSD performance was made for 48 rainfall events recorded from April 2015 to March 2017. The comparison of maximum water levels in the OSD showed that the results of the simulations with the Rational/Puls and SCS/Puls methods were, on average, 33% and 73% lower, respectively, than those monitored. The Rational/Puls results were significantly higher than the SCS/Puls results only in the more frequent events. In the events with average recurrence intervals of 5, 10, and 200 years, the maximum water heights were similar in both simulations. The results also showed that the duration of the rainfall events was close to the duration of the monitored hydrographs. The rising and recession times of the hydrographs calculated with the Rational Method represented the monitored hydrographs better than those calculated with the SCS method. The comparison indicates that the real discharge coefficient value could be higher than 0.61, the value adopted in the Puls simulations. New research evaluating real OSD performance should be developed.
In order to verify the peak flow damping efficiency and the value of the discharge coefficient is necessary to monitor the inflow and outflow of an OSD, in addition to monitor the water level inside it.
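The core of a Modified Puls (level-pool) routing simulation like the ones compared above can be sketched in a few lines. The reservoir plan area, orifice size, time step, and inflow hydrograph below are illustrative; only the discharge coefficient of 0.61 comes from the abstract.

```python
import math

def outflow(h, cd=0.61, a_orifice=0.05, g=9.81):
    """Orifice outflow (m^3/s) at depth h (m); Cd = 0.61 as quoted in the abstract."""
    return cd * a_orifice * math.sqrt(2.0 * g * max(h, 0.0))

def route_puls(inflow, dt=60.0, area=100.0):
    """Level-pool (Modified Puls) routing through a prismatic OSD reservoir.

    inflow : inflow hydrograph (m^3/s), one value per time step
    dt     : step length (s); area : reservoir plan area (m^2)
    Solves S2/dt + O2/2 = (I1 + I2)/2 + S1/dt - O1/2 for the new depth
    by bisection, since the left side is monotone in h.
    """
    h = 0.0
    levels, outflows = [], []
    for i in range(len(inflow) - 1):
        rhs = (inflow[i] + inflow[i + 1]) / 2.0 + area * h / dt - outflow(h) / 2.0
        lo, hi = 0.0, 10.0
        for _ in range(60):  # bisection on the new depth
            mid = (lo + hi) / 2.0
            if area * mid / dt + outflow(mid) / 2.0 > rhs:
                hi = mid
            else:
                lo = mid
        h = (lo + hi) / 2.0
        levels.append(h)
        outflows.append(outflow(h))
    return levels, outflows

# Illustrative triangular inflow hydrograph (m^3/s)
peak_in = [0.0, 0.2, 0.4, 0.6, 0.4, 0.2, 0.0, 0.0, 0.0, 0.0]
levels, outflows = route_puls(peak_in)
print(f"peak inflow {max(peak_in)} m^3/s -> peak outflow {max(outflows):.3f} m^3/s")
```

The routed peak outflow is lower than the peak inflow, which is the damping effect the monitoring campaign set out to verify.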

Keywords: best management practices, on-site stormwater detention, source control, urban drainage

Procedia PDF Downloads 185
2710 Detection of Hepatitis B by the Use of Artificial Intelligence

Authors: Shizra Waris, Bilal Shoaib, Munib Ahmad

Abstract:

Background: The use of clinical decision support systems (CDSSs) may improve chronic disease management, which requires regular visits to multiple health professionals, treatment monitoring, disease control, and patient behavior modification. The objective of this survey is to determine whether these CDSSs improve the processes of chronic care, including diagnosis, treatment, and monitoring of diseases. Though artificial intelligence is not a new idea, it has been widely adopted as a new technology in computer science, and numerous areas such as education, business, medicine, and manufacturing have made use of it. Methods: The survey covers articles extracted from relevant databases, using search terms related to information technology and viral hepatitis, published between 2000 and 2016. Results: Overall, 80% of the studies asserted the benefit provided by information technology (IT); 75% of the studies asserted benefits within the medical domain; 25% of the studies did not clearly define the added benefits due to IT. The current state of CDSSs requires many improvements to support the management of liver diseases such as HCV, liver fibrosis, and cirrhosis. Conclusion: We concluded that the proposed model gives an earlier and more accurate prediction of hepatitis B and works as a promising tool for predicting hepatitis B from clinical laboratory data.

Keywords: detection, hepatitis, observation, disease

Procedia PDF Downloads 154
2709 Monitoring Large-Coverage Forest Canopy Height by Integrating LiDAR and Sentinel-2 Images

Authors: Xiaobo Liu, Rakesh Mishra, Yun Zhang

Abstract:

Continuous monitoring of forest canopy height with large coverage is essential for obtaining forest carbon stocks and emissions, quantifying biomass estimation, analyzing vegetation coverage, and determining biodiversity. LiDAR can be used to collect accurate woody vegetation structure information, such as canopy height. However, LiDAR’s coverage is usually limited because of its high cost and limited maneuverability, which constrains its use for dynamic and large area forest canopy monitoring. On the other hand, optical satellite images, like Sentinel-2, have the ability to cover large forest areas with a high repeat rate, but they do not have height information. Hence, exploring the solution of integrating LiDAR data and Sentinel-2 images to enlarge the coverage of forest canopy height prediction and increase the prediction repeat rate has been an active research topic in the environmental remote sensing community. In this study, we explore the potential of training a Random Forest Regression (RFR) model and a Convolutional Neural Network (CNN) model, respectively, to develop two predictive models for predicting and validating the forest canopy height of the Acadia Forest in New Brunswick, Canada, with a 10m ground sampling distance (GSD), for the years 2018 and 2021. Two 10m airborne LiDAR-derived canopy height models, one for 2018 and one for 2021, are used as ground truth to train and validate the RFR and CNN predictive models. To evaluate the prediction performance of the trained RFR and CNN models, two new predicted canopy height maps (CHMs), one for 2018 and one for 2021, are generated using the trained RFR and CNN models and 10m Sentinel-2 images of 2018 and 2021, respectively. The two 10m predicted CHMs from Sentinel-2 images are then compared with the two 10m airborne LiDAR-derived canopy height models for accuracy assessment.
The validation results show that the mean absolute error (MAE) for the year 2018 is 2.93m for the RFR model and 1.71m for the CNN model, while the MAE for the year 2021 is 3.35m for the RFR model and 3.78m for the CNN model. These results demonstrate the feasibility of using the RFR and CNN models developed in this research to predict large-coverage forest canopy height at 10m spatial resolution and a high revisit rate.
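The RFR side of this workflow can be sketched on synthetic data, with random reflectances standing in for Sentinel-2 pixels and a synthetic height field standing in for the LiDAR-derived CHM. The band count, effect sizes, and noise levels are assumptions, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for 10 m pixels: four band reflectances per pixel
# (e.g. red, NIR, and two red-edge bands; purely illustrative values).
bands = rng.uniform(0.0, 0.5, size=(2000, 4))
# Assume canopy height follows an NDVI-like index plus noise; in the real
# workflow the target would be the LiDAR-derived CHM at the same pixels.
ndvi = (bands[:, 1] - bands[:, 0]) / (bands[:, 1] + bands[:, 0] + 1e-6)
height = 15.0 * (ndvi + 1.0) / 2.0 + rng.normal(0.0, 1.0, 2000)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(bands[:1500], height[:1500])          # train on "LiDAR-covered" pixels
pred = model.predict(bands[1500:])              # predict where only imagery exists
mae = np.mean(np.abs(pred - height[1500:]))
print(f"MAE: {mae:.2f} m")
```

The held-out MAE plays the same role as the 2.93 m / 3.35 m figures reported for the RFR model.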

Keywords: remote sensing, forest canopy height, LiDAR, Sentinel-2, artificial intelligence, random forest regression, convolutional neural network

Procedia PDF Downloads 90
2708 Analysis of Resource Consumption Accounting as a New Approach to Management Accounting

Authors: Yousef Rostami Gharainy

Abstract:

This paper presents resource consumption accounting as an innovative approach to management accounting, which focuses on managers as the primary users of the information and provides better information than conventional management accounting. This approach emphasizes that an organization's resources cause costs; accordingly, costing systems should focus on resources and their consumption. Resource consumption accounting combines two costing methodologies: activity-based costing and the German cost accounting method known as GPK. Besides giving managers a better basis for decision-making, this approach makes management accounting operational. The purpose of this article is to explain the concept of resource consumption accounting, its components and features, and the use of this method in organizations. First, we present an introduction to resource consumption accounting, its background, the reasons for its development, and the problems that previous costing systems faced. We then give the principles and assumptions of this method; finally, we describe its implementation in organizations and its advantages over other costing methods.

Keywords: resource consumption accounting, management accounting, activity-based costing, German cost accounting method

Procedia PDF Downloads 308
2707 Open Source Cloud Managed Enterprise WiFi

Authors: James Skon, Irina Beshentseva, Michelle Polak

Abstract:

WiFi solutions come in two major classes. Small Office/Home Office (SOHO) WiFi is characterized by inexpensive WiFi routers, with one or two service set identifiers (SSIDs) and a single shared passphrase. These access points provide no significant user management or monitoring, and no aggregation of monitoring and control for multiple routers. The other class is managed enterprise WiFi solutions, which involve expensive access points (APs), along with (also costly) local or cloud-based management components. These solutions typically provide portal-based login, per-user virtual local area networks (VLANs), and sophisticated monitoring and control across a large group of APs. The cost of deploying and managing such managed enterprise solutions is typically about tenfold that of inexpensive consumer APs. Low-revenue organizations, such as schools, non-profits, non-government organizations (NGOs), small businesses, and even homes, cannot easily afford quality enterprise WiFi solutions, though they may need to provide quality WiFi access to their population, and using the available lower-cost WiFi solutions can significantly reduce their ability to provide reliable, secure network access. This project explored and created a new approach to providing secure managed enterprise WiFi based on low-cost hardware combined with both new and existing (but modified) open source software. The solution provides a cloud-based management interface which allows organizations to aggregate the configuration and management of small, medium and large WiFi deployments. It utilizes a novel approach to user management, giving each user a unique passphrase. It provides unlimited SSIDs across an unlimited number of WiFi zones, and the ability to place each user (and all their devices) on their own VLAN. With proper configuration it can even provide user-local services.
It also allows users' usage and quality of service to be monitored, and users to be added, enabled, and disabled at will. As noted above, the ultimate goal is to free organizations with limited resources from the expense of a commercial enterprise WiFi solution, while providing them with most of the qualities of such a more expensive managed solution at a fraction of the cost.
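The per-user passphrase and per-user VLAN scheme described above resembles what hostapd's multi-PSK feature offers on commodity hardware. The excerpt below follows hostapd's documented configuration format, but the interface name, SSID, VLAN IDs, file paths, and passphrases are all illustrative, and the project's actual software stack may differ.

```
# hostapd.conf (excerpt) -- a single SSID whose clients are told apart
# by which passphrase they present, not by MAC address
interface=wlan0
ssid=OrgWiFi
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_psk_file=/etc/hostapd/wpa_psk
dynamic_vlan=1
vlan_file=/etc/hostapd/vlan.conf

# /etc/hostapd/wpa_psk -- one line per user; the all-zero MAC is a
# wildcard, so each user's devices are identified by their passphrase
# and dropped onto that user's VLAN
vlanid=101 00:00:00:00:00:00 alice-unique-passphrase
vlanid=102 00:00:00:00:00:00 bob-unique-passphrase
```

Disabling a user then amounts to removing their line and reloading the PSK file, which matches the add/enable/disable workflow the abstract describes.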

Keywords: wifi, enterprise, cloud, managed

Procedia PDF Downloads 97
2706 Ethnic Identity as an Asset: Linking Ethnic Identity, Perceived Social Support, and Mental Health among Indigenous Adults in Taiwan

Authors: A.H.Y. Lai, C. Teyra

Abstract:

In Taiwan, there are 16 official indigenous groups, accounting for 2.3% of the total population. Like other indigenous populations worldwide, indigenous peoples in Taiwan have poorer mental health because of their history of oppression and colonisation. Amid the negative narratives, the ethnic identity of cultural minorities is their unique psychological and cultural asset. Moreover, positive socialisation is found to be related to strong ethnic identity. Based on Phinney's theory of ethnic identity development and social support theory, this study adopted a strength-based approach, conceptualising ethnic identity as the central organising principle linking perceived social support and mental health among indigenous adults in Taiwan. Aims. The overall aim is to examine the effect of ethnic identity and social support on mental health. The specific aims were to examine: (1) the association between ethnic identity and mental health; (2) the association between perceived social support and mental health; (3) the indirect effect of ethnic identity linking perceived social support and mental health. Methods. Participants were indigenous adults in Taiwan (n=200; mean age=29.51; female=31%, male=61%, others=8%). A cross-sectional quantitative design was implemented using data collected in 2020. Respondent-driven sampling was used. Standardised measurements were: the Ethnic Identity Scale (6-item); the Social Support Questionnaire-SF (6-item); the Patient Health Questionnaire (9-item); and the Generalised Anxiety Disorder scale (7-item). Covariates were age, gender and economic satisfaction. A four-stage structural equation modelling (SEM) approach with robust maximum likelihood estimation was employed using Mplus 8.0. Step 1: A measurement model was built and tested using confirmatory factor analysis (CFA). Step 2: Factor covariances were re-specified as direct effects in the SEM, and covariates were added.
The direct effects of (1) ethnic identity and social support on depression and anxiety and (2) social support on ethnic identity were tested. The indirect effect of ethnic identity was examined with the bootstrapping technique. Results. The CFA model showed satisfactory fit statistics: χ²(608)=869.69, p<.05; comparative fit index (CFI)/Tucker-Lewis index (TLI)=0.95/0.94; root mean square error of approximation (RMSEA)=0.05; standardised root mean square residual (SRMR)=0.05. Ethnic identity is represented by two latent factors: ethnic identity-commitment and ethnic identity-exploration. Depression, anxiety and social support are single-factor latent variables. For the SEM, the fit statistics were: χ²(527)=779.26, p<.05; CFI/TLI=0.94/0.93; RMSEA=0.05; SRMR=0.05. Ethnic identity-commitment (b=-0.30) and social support (b=-0.33) had direct negative effects on depression, but ethnic identity-exploration did not. Ethnic identity-commitment (b=-0.43) and social support (b=-0.31) had direct negative effects on anxiety, while ethnic identity-exploration (b=0.24) demonstrated a positive effect. Social support had direct positive effects on ethnic identity-exploration (b=0.26) and ethnic identity-commitment (b=0.31). Mediation analysis demonstrated the indirect effect of ethnic identity-commitment linking social support and depression (b=0.22). Implications: The results underscore the role of social support in preventing depression via ethnic identity commitment among indigenous adults in Taiwan. Adopting the strength-based approach, mental health practitioners can mobilise indigenous peoples' commitment to their group to promote their well-being.
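The bootstrapped indirect effect used in the mediation analysis can be illustrated on toy data. The path coefficients below are arbitrary, and the two-regression estimator is a simplification of the latent-variable SEM actually fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200  # same sample size as the study

# Toy data with a built-in indirect path: support -> commitment -> depression.
# Effect sizes are illustrative, not the paper's estimates.
support = rng.normal(size=n)
commitment = 0.3 * support + rng.normal(size=n)        # path a
depression = -0.3 * commitment + rng.normal(size=n)    # path b

def indirect_effect(x, m, y):
    """a*b: slope of m on x, times slope of y on m controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([m, x, np.ones_like(x)])
    b = np.linalg.lstsq(X, y, rcond=None)[0][0]
    return a * b

# Percentile bootstrap: resample cases, re-estimate a*b each time.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(support[idx], commitment[idx], depression[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")
```

A confidence interval that excludes zero is the usual evidence for mediation, mirroring the b=0.22 indirect effect reported above.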

Keywords: ethnic identity, indigenous population, mental health, perceived social support

Procedia PDF Downloads 103
2705 Contactless Heart Rate Measurement System based on FMCW Radar and LSTM for Automotive Applications

Authors: Asma Omri, Iheb Sifaoui, Sofiane Sayahi, Hichem Besbes

Abstract:

Future vehicle systems demand advanced capabilities, notably in-cabin life detection and driver monitoring systems, with a particular emphasis on drowsiness detection. To meet these requirements, several techniques employ artificial intelligence methods based on real-time vital sign measurements. In parallel, Frequency-Modulated Continuous-Wave (FMCW) radar technology has garnered considerable attention in the domains of healthcare and biomedical engineering for non-invasive vital sign monitoring. FMCW radar offers a multitude of advantages, including its non-intrusive nature, continuous monitoring capacity, and its ability to penetrate through clothing. In this paper, we propose a system utilizing the AWR6843AOP radar from Texas Instruments (TI) to extract precise vital sign information. The radar allows us to estimate Ballistocardiogram (BCG) signals, which capture the mechanical movements of the body, particularly the ballistic forces generated by heartbeats and respiration. These signals are rich sources of information about the cardiac cycle, rendering them suitable for heart rate estimation. The process begins with real-time subject positioning, followed by clutter removal, computation of Doppler phase differences, and the use of various filtering methods to accurately capture subtle physiological movements. To address the challenges associated with FMCW radar-based vital sign monitoring, including motion artifacts due to subjects' movement or radar micro-vibrations, Long Short-Term Memory (LSTM) networks are implemented. LSTM's adaptability to different heart rate patterns and ability to handle real-time data make it suitable for continuous monitoring applications. 
Several crucial steps were taken, including feature extraction (involving amplitude, time intervals, and signal morphology), sequence modeling, heart rate estimation through the analysis of detected cardiac cycles and their temporal relationships, and performance evaluation using metrics such as the Root Mean Square Error (RMSE) and the correlation with reference heart rate measurements. For dataset construction and LSTM training, a comprehensive data collection system was established, integrating the AWR6843AOP radar, a heart rate belt, and a smart watch for ground truth measurements. Rigorous synchronization of these devices ensured data accuracy. Twenty participants engaged in various scenarios, encompassing indoor conditions and real-world conditions within a moving vehicle equipped with the radar system. Both static and dynamic subject conditions were considered. Heart rate estimation through the LSTM outperforms traditional signal processing techniques that rely on filtering, the Fast Fourier Transform (FFT), and thresholding, delivering an average accuracy of approximately 91% with an RMSE of 1.01 beats per minute (bpm). In conclusion, this paper underscores the promising potential of FMCW radar technology integrated with artificial intelligence algorithms in the context of automotive applications. This innovation not only enhances road safety but also paves the way for integration into the automotive ecosystem to improve driver well-being and overall vehicular safety.
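The Doppler phase-difference stage described above can be sketched with a simulated slow-time signal from a single range bin. The chirp rate, displacement amplitudes, and 60 GHz carrier are illustrative stand-ins for the AWR6843AOP's actual parameters, and this sketch uses a plain FFT peak pick in place of the LSTM.

```python
import numpy as np

fs = 20.0                      # slow-time (chirp) rate in Hz, illustrative
t = np.arange(0, 30, 1 / fs)   # 30 s observation window
wavelength = 3e8 / 60.5e9      # roughly 5 mm at a 60 GHz carrier

# Simulated chest displacement: respiration (0.25 Hz) + heartbeat (1.2 Hz)
disp = 1e-3 * np.sin(2 * np.pi * 0.25 * t) + 1e-4 * np.sin(2 * np.pi * 1.2 * t)

# Slow-time signal of the subject's range bin; phase encodes displacement
signal = np.exp(1j * 4 * np.pi * disp / wavelength)

phase = np.unwrap(np.angle(signal))            # BCG-like phase signal
diff = np.diff(phase)                          # Doppler phase differences

# Heart-rate band search: spectrum of the phase differences, 0.8-3 Hz
spec = np.abs(np.fft.rfft(diff * np.hanning(len(diff))))
freqs = np.fft.rfftfreq(len(diff), 1 / fs)
band = (freqs > 0.8) & (freqs < 3.0)
hr_bpm = 60.0 * freqs[band][np.argmax(spec[band])]
print(f"estimated heart rate: {hr_bpm:.0f} bpm")
```

Restricting the search to 0.8-3 Hz separates the heartbeat from the much larger respiration component, which is the same reason the real pipeline filters before estimation.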

Keywords: ballistocardiogram, FMCW Radar, vital sign monitoring, LSTM

Procedia PDF Downloads 72
2704 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network; they generate an enormous volume of real-time, high-speed data to help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge quantity of devices, along with the huge volume of data, is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they sleep and wake up periodically or a-periodically depending on the traffic load, in order to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of a device from the sink node becomes greater than required, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces bad-quality data; because of this dynamic nature, we do not know the actual reason for abnormal data. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before using it in IoT applications. In the past, many researchers tried to estimate data quality and provided several Machine Learning (ML), stochastic, and statistical methods to analyse stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how it impacts data quality.
This research provides a comprehensive review of the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model is built for sensors monitoring water quality, using DBSCAN clustering and weather sensors. An extensive study has been carried out on the relationship between the data of the weather sensors and of the sensors monitoring the water quality of lakes and beaches. A detailed theoretical analysis is presented, describing the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness, identifies patterns of missing values, checks the accuracy of the data with the help of cluster positions, and, finally, evaluates consistency through the Coefficient of Variation (CoV) in a statistical analysis of the clusters formed by DBSCAN.
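The outlier-removal and consistency dimensions can be sketched on synthetic readings standing in for the water quality streams; the sensor values, fault magnitudes, and DBSCAN parameters below are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
# Toy joint stream: (water temperature, turbidity) readings with a few
# injected faulty readings far from the normal operating region.
normal = rng.normal(loc=[20.0, 5.0], scale=[1.0, 0.5], size=(300, 2))
faulty = rng.uniform(low=[35.0, 20.0], high=[45.0, 30.0], size=(10, 2))
readings = np.vstack([normal, faulty])

# DBSCAN labels sparse points -1 (noise); dense points form clusters.
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(readings)
clean = readings[labels != -1]

# Consistency dimension: coefficient of variation of the cleaned stream.
cov = clean.std(axis=0) / clean.mean(axis=0)
print(f"kept {len(clean)} of {len(readings)} readings, CoV = {cov.round(3)}")
```

Points labelled -1 play the role of the outlier dimension, and the CoV of the surviving clusters gives the consistency score.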

Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)

Procedia PDF Downloads 137
2703 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the volume of collectable manufacturing data has been increasing rapidly. At the same time, large-scale recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent large-scale recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method to classify strings correctly in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, it can be expected that the requirement of quick defect analysis can be fulfilled.
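The idea of classifying field strings by their length distribution can be sketched as follows; the field names and ID formats are hypothetical, not taken from the paper's data.

```python
from collections import Counter

def learn_length_profile(samples):
    """Relative frequency of each string length observed in a labelled field."""
    counts = Counter(len(s) for s in samples)
    total = sum(counts.values())
    return {length: c / total for length, c in counts.items()}

def classify(s, profiles, floor=1e-6):
    """Assign s to the field whose length distribution makes it most likely."""
    return max(profiles, key=lambda name: profiles[name].get(len(s), floor))

# Hypothetical training strings for two BOM fields
profiles = {
    "product_id": learn_length_profile(["P-10023", "P-10456", "P-20988"]),
    "machine_id": learn_length_profile(["M01", "M17", "M09", "M22"]),
}
print(classify("P-31415", profiles))   # length 7 matches the product_id profile
print(classify("M33", profiles))       # length 3 matches the machine_id profile
```

Because only the length histogram is consulted, classification is a dictionary lookup per string, which is the kind of speedup the 80% reduction points to.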

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 316
2702 Geographic Information Systems and a Breath of Opportunities for Supply Chain Management: Results from a Systematic Literature Review

Authors: Anastasia Tsakiridi

Abstract:

Geographic information systems (GIS) have been utilized in numerous spatial problems, such as site research, land suitability, and demographic analysis. In addition, GIS has been applied in scientific fields like geography, health, and economics. In business studies, GIS has been used to provide insights and spatial perspectives on demographic trends, spending indicators, and network analysis. To date, the information regarding the available usages of GIS in supply chain management (SCM), and how these analyses can benefit businesses, is limited. A systematic literature review (SLR) of the peer-reviewed academic literature of the last five years was conducted, aiming to explore the existing usages of GIS in SCM. The searches were performed in three databases (Web of Science, ProQuest, and Business Source Premier) and reported using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. The analysis resulted in 79 papers. The results indicate that the existing GIS applications used in SCM were in the following domains: a) network/transportation analysis (in 53 of the papers), b) location-allocation site search/selection (multiple-criteria decision analysis) (in 45 papers), c) spatial analysis (demographic or physical) (in 34 papers), d) combination of GIS and supply chain/network optimization tools (in 32 papers), and e) visualization/monitoring or building information modeling applications (in 8 papers). An additional categorization of the literature was conducted by examining the usage of GIS in the supply chain (SC) by business sector, as indicated by the volume of the papers. The results showed that GIS is mainly being applied in the SC of the biomass biofuel/wood industry (33 papers).
Other industries that are currently utilizing GIS in their SC were the logistics industry (22 papers), the humanitarian/emergency/health care sector (10 papers), the food/agro-industry sector (5 papers), the petroleum/ coal/ shale gas sector (3 papers), the faecal sludge sector (2 papers), the recycle and product footprint industry (2 papers), and the construction sector (2 papers). The results were also presented by the geography of the included studies and the GIS software used to provide critical business insights and suggestions for future research. The results showed that research case studies of GIS in SCM were conducted in 26 countries (mainly in the USA) and that the most prominent GIS software provider was the Environmental Systems Research Institute’s ArcGIS (in 51 of the papers). This study is a systematic literature review of the usage of GIS in SCM. The results showed that the GIS capabilities could offer substantial benefits in SCM decision-making by providing key insights to cost minimization, supplier selection, facility location, SC network configuration, and asset management. However, as presented in the results, only eight industries/sectors are currently using GIS in their SCM activities. These findings may offer essential tools to SC managers who seek to optimize the SC activities and/or minimize logistic costs and to consultants and business owners that want to make strategic SC decisions. Furthermore, the findings may be of interest to researchers aiming to investigate unexplored research areas where GIS may improve SCM.

Keywords: supply chain management, logistics, systematic literature review, GIS

Procedia PDF Downloads 141
2701 Econophysics: The Use of Entropy Measures in Finance

Authors: Muhammad Sheraz, Vasile Preda, Silvia Dedu

Abstract:

Concepts of econophysics are usually used to solve problems related to uncertainty and nonlinear dynamics. In the theory of option pricing, risk-neutral probabilities play a very important role. The application of entropy in finance can be regarded as an extension of both information entropy and probability entropy. It can be an important tool in various financial methods such as risk measurement, portfolio selection, option pricing and asset pricing. Gulko applied Entropy Pricing Theory (EPT) to pricing stock options and introduced an alternative framework to the Black-Scholes model for pricing European stock options. In this article, we present solutions to maximum entropy problems based on the Tsallis, weighted-Tsallis, Kaniadakis and weighted-Kaniadakis entropies, to obtain risk-neutral densities. We have also obtained the values of European call and put options in this framework.
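The Tsallis entropy underlying the first of these maximum entropy problems is straightforward to compute for a discrete distribution, and a quick check on a uniform distribution (which maximizes it) makes the definition concrete.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1); q -> 1 recovers Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                            # 0 * log(0) convention
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))       # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.full(4, 0.25)                        # uniform distribution on 4 states
print(tsallis_entropy(p, 2.0))              # (1 - 4 * 0.25**2) / 1 = 0.75
print(tsallis_entropy(p, 1.0))              # ln 4
```

In the risk-neutral density problem, this functional is maximized subject to the martingale and normalization constraints in place of the Shannon entropy used by Gulko's EPT.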

Keywords: option pricing, Black-Scholes model, Tsallis entropy, Kaniadakis entropy, weighted entropy, risk-neutral density

Procedia PDF Downloads 302
2700 SEAWIZARD-Multiplex AI-Enabled Graphene Based Lab-On-Chip Sensing Platform for Heavy Metal Ions Monitoring on Marine Water

Authors: M. Moreno, M. Alique, D. Otero, C. Delgado, P. Lacharmoise, L. Gracia, L. Pires, A. Moya

Abstract:

Marine environments are increasingly threatened by heavy metal contamination, including mercury (Hg), lead (Pb), and cadmium (Cd), posing significant risks to ecosystems and human health. Traditional monitoring techniques often fail to provide the spatial and temporal resolution needed for real-time detection of these contaminants, especially in remote or harsh environments. SEAWIZARD addresses these challenges by leveraging the flexibility, adaptability, and cost-effectiveness of printed electronics, with the integration of microfluidics, to develop a compact, portable, and reusable sensor platform designed specifically for real-time monitoring of heavy metal ions in seawater. The SEAWIZARD sensor is a multiparametric Lab-on-Chip (LoC) device, a miniaturized system that integrates several laboratory functions into a single chip, drastically reducing sample volumes and improving adaptability. The platform integrates three printed graphene electrodes for the simultaneous detection of Hg, Cd and Pb via square wave voltammetry; these electrodes share the reference and counter electrodes to improve space efficiency. Additionally, it integrates printed pH and temperature sensors to correct environmental interferences that may impact the accuracy of metal detection. The pH sensor is based on a carbon electrode with electrodeposited iridium oxide, while the temperature sensor is graphene-based. A protective dielectric layer is printed on top of the sensor to safeguard it in harsh marine conditions. The use of flexible polyethylene terephthalate (PET) as the substrate enables the sensor to conform to various surfaces and operate in challenging environments. One of the key innovations of SEAWIZARD is its integrated microfluidic layer, fabricated from cyclic olefin copolymer (COC). This microfluidic component provides a controlled flow of seawater over the sensing area, significantly improving detection limits compared to direct water sampling.
The system’s dual-channel design separates the detection of heavy metals from the measurement of pH and temperature, ensuring that each parameter is measured under optimal conditions. In addition, the temperature sensor is finely tuned with a serpentine-shaped microfluidic channel to ensure precise thermal measurements. SEAWIZARD also incorporates custom electronics that allow for wireless data transmission via Bluetooth, facilitating rapid data collection and user interface integration. Embedded artificial intelligence further enhances the platform by providing an automated alarm system, capable of detecting predefined metal concentration thresholds and issuing warnings when limits are exceeded. This predictive feature enables early warnings of potential environmental disasters, such as industrial spills or toxic levels of heavy metal pollutants, making SEAWIZARD not just a detection tool, but a comprehensive monitoring and early intervention system. In conclusion, SEAWIZARD represents a significant advancement in printed electronics applied to environmental sensing. By combining flexible, low-cost materials with advanced microfluidics, custom electronics, and AI-driven intelligence, SEAWIZARD offers a highly adaptable and scalable solution for real-time, high-resolution monitoring of heavy metals in marine environments. Its compact and portable design makes it an accessible, user-friendly tool with the potential to transform water quality monitoring practices and provide critical data to protect marine ecosystems from contamination-related risks.
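The threshold-based alarm logic described above can be sketched very simply; the numeric limits below are illustrative placeholders, not regulatory values or values from the project.

```python
# Illustrative seawater limits in ug/L; real thresholds depend on the
# applicable regulation and are NOT taken from the abstract.
LIMITS_UG_L = {"Hg": 0.07, "Pb": 1.3, "Cd": 0.2}

def check_sample(measured, limits=LIMITS_UG_L):
    """Return, sorted, the ions whose measured concentration exceeds its limit."""
    return sorted(ion for ion, value in measured.items()
                  if value > limits.get(ion, float("inf")))

# One voltammetry-derived reading per ion (illustrative values)
alarms = check_sample({"Hg": 0.02, "Pb": 2.1, "Cd": 0.25})
print(alarms)
```

In the deployed system this check would run on the temperature- and pH-corrected concentrations and trigger the Bluetooth warning described above.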

Keywords: lab-on-chip, printed electronics, real-time monitoring, microfluidics, heavy metal contamination

Procedia PDF Downloads 27
2699 Employee Engagement: Tool for Success of Higher Education in Thailand

Authors: Pooree Sakot, Marndarath Suksanga

Abstract:

Organizations are under increasing pressure to improve performance and maximize the contribution of every employee. Employee engagement has become an attractive business proposition. The triple bottom line consists of three Ps: profit, people and planet. It aims to measure the financial, social and environmental performance of a corporation over a period of time. People are the most important asset of every organization. Most studies suggest that employee engagement improves the bottom line in almost every instance, and it is well worth all organizational efforts to actively engage employees. Engaged employees have an impact on productivity and financial performance. Efficient leadership and effective management can take place if an emerging paradigm like employee engagement is appropriately understood and put into practice. Employee engagement spans an employee's whole tenure, from the first step, recruitment, to the last, retirement. The HR practices of an organization play the major role in helping employees walk the extra mile. Effective employee engagement is the key component of improved organizational performance.

Keywords: employee engagement, higher education, tool, success

Procedia PDF Downloads 334
2698 The Comparison of Limits of Detection of Lateral Flow Immunochromatographic Strips for Different Types of Mycotoxins

Authors: Xinyi Zhao, Furong Tian

Abstract:

Mycotoxins are secondary metabolic products of fungi. They are poisonous, carcinogenic and mutagenic in nature and pose a serious health threat to both humans and animals, causing severe illnesses and even deaths. Rapid, simple and cheap methods of detecting mycotoxins are of immense importance and in great demand in the food and beverage industry, as well as in agriculture and environmental monitoring. Lateral flow immunochromatographic strips (ICSTs) have been widely used in food safety and environmental monitoring. Forty-six papers dated 2001-2021 were identified on Google Scholar and Scopus and reviewed for the limits of detection and the nanomaterials used in ICSTs for different types of mycotoxins. Twenty-five papers were compared to identify the lowest limits of detection among different mycotoxins (aflatoxin B1: 10, zearalenone: 5, fumonisin B1: 5, trichothecene-A: 5). Most of these highly sensitive strips are competitive; sandwich structures are usually used in large-scale detection. In conclusion, the mycotoxin that receives the most research is aflatoxin B1, and its limit of detection is the lowest. Gold-nanoparticle-based immunochromatographic test strips have the lowest limit of detection. Five papers involve smartphone detection, and they all detect aflatoxin B1 with gold nanoparticles. In these papers, quantitative concentration results can be obtained when the user uploads a photograph of the test lines using the smartphone application.

Keywords: aflatoxin B1, limit of detection, gold nanoparticle, lateral flow immunochromatographic strips, mycotoxins

Procedia PDF Downloads 193
2697 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. It is obtained by simulation based on the 2019 period life table for Social Security, the IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning algorithms are built: the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network. Model validation and selection are based on test errors, using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
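The simulated ruin probabilities that form the training target can be sketched as a Monte Carlo over annual portfolio returns. The return, volatility, inflation, and horizon parameters below are illustrative, not the paper's calibration, and mortality from the period life table is ignored here.

```python
import numpy as np

rng = np.random.default_rng(7)

def ruin_probability(withdrawal_rate, horizon_years=30, n_paths=10_000,
                     mean_return=0.05, vol=0.12, inflation=0.025):
    """Monte Carlo ruin probability for a fixed, inflation-adjusted withdrawal.

    Withdraws `withdrawal_rate` of the initial balance each year, grown by
    inflation; annual returns are i.i.d. normal. Ruin means the balance is
    exhausted before the horizon ends.
    """
    balance = np.ones(n_paths)              # initial wealth normalized to 1
    withdrawal = withdrawal_rate
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(horizon_years):
        r = rng.normal(mean_return, vol, n_paths)
        balance = balance * (1 + r) - withdrawal
        ruined |= balance <= 0
        balance = np.maximum(balance, 0.0)  # ruined paths stay at zero
        withdrawal *= 1 + inflation
    return ruined.mean()

print(f"4% rule: {ruin_probability(0.04):.3f}")
print(f"6% rule: {ruin_probability(0.06):.3f}")
```

Tabulating this probability over ages, allocations, and withdrawal rates yields the kind of dataset on which the paper's predictive models are trained.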

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 72