Search results for: real time data processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40567

39967 Intelligent Technology for Real-Time Monitor and Data Analysis of the Aquaculture Toxic Water Concentration

Authors: Chin-Yuan Hsieh, Wei-Chun Lu, Yu-Hong Zeng

Abstract:

Mass fish die-offs are frequently caused by disease following the deterioration of aquaculture water quality. Toxic ammonia is produced by the animals as a byproduct of protein metabolism. The proposed system combines smart sensor technology with a mathematical model to monitor water parameters 24 hours a day and to predict the relationships among twelve water quality parameters in aquaculture. All measured data are stored on a cloud server. In productive ponds, the daytime pH may rise high enough to be lethal to the fish: sudden changes in aquaculture conditions often lead to an increase in pH, a lack of dissolved oxygen, water quality deterioration, and yield reduction. In field measurements, the system successfully sent messages to the user’s smartphone when water quality degraded. Comparisons between measurements and model simulations at a fish aquaculture site show parameter differences below 2% and correlation coefficients of at least 98.34%. The solubility of oxygen decreases exponentially with rising water temperature, with a correlation coefficient of 98.98%.
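
As a rough illustration of the exponential temperature-solubility relationship reported above, the following Python sketch fits such a model to sample data; the measurements, coefficients, and model form here are illustrative assumptions, not the authors' data.

```python
# A minimal sketch (not the authors' code): fitting an exponential model of
# dissolved-oxygen solubility vs. water temperature, as the abstract describes.
# The sample data below is illustrative, not the paper's measurements.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import pearsonr

def do_model(T, a, b):
    """DO solubility decaying exponentially with temperature T (deg C)."""
    return a * np.exp(-b * T)

T = np.array([10, 15, 20, 25, 30, 35], dtype=float)      # water temperature
do_meas = np.array([11.3, 10.1, 9.1, 8.2, 7.5, 6.9])      # mg/L, illustrative

params, _ = curve_fit(do_model, T, do_meas, p0=(12.0, 0.02))
r, _ = pearsonr(do_meas, do_model(T, *params))
print(f"a={params[0]:.2f}, b={params[1]:.4f}, correlation={r:.4f}")
```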

Keywords: aquaculture, sensor, ammonia, dissolved oxygen

Procedia PDF Downloads 283
39966 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service

Authors: Lai Wenfang

Abstract:

This study describes how artificial intelligence (AI) technology can be used to build a user-oriented platform for integrated archival service. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information and communication technology (ICT), the NAA has built many systems to provide archival services. To cope with new challenges, such as emerging ICT, artificial intelligence, and blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) to build a training model and propose suggestions based on the data sent to the platform. The NAA expects the platform not only to automatically inform sending agencies’ staff which records catalogues violate the transfer or destruction rules, but also to use the model to find details hidden in the catalogues and advise NAA staff on whether records should be retained or destroyed, shortening the auditing time. The platform keeps all users’ browsing trails so that it can predict which kinds of archives a user may be interested in, recommend search terms through visualization, and notify users of newly arrived archives. In addition, under the Archives Act, NAA staff must spend considerable time marking or removing personal data, classified data, and similar content before archives are released. To upgrade the archives access service process, the platform will use text recognition patterns to black out such content automatically; staff only need to correct errors and upload the revised version, and as the platform learns, its accuracy will keep improving. In short, the purpose of the platform is to drive the government’s digital transformation and implement the vision of a service-oriented smart government.
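
A minimal sketch of the kind of NLP/ML screening the abstract describes, flagging catalogue entries that violate transfer or destruction rules; the pipeline, training texts, and labels below are hypothetical stand-ins, not the NAA's system.

```python
# Hedged sketch: a TF-IDF + logistic-regression classifier that flags records
# catalogue entries as violating (1) or complying with (0) transfer/destroy
# rules. Training data is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["personnel file, retention 5 years, destroy 2018",
         "land registry deed, permanent retention",
         "meeting minutes, retention 3 years, destroy 2020",
         "treaty original, permanent retention"]
labels = [1, 0, 1, 0]   # 1 = violates a rule, 0 = compliant (hypothetical)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["court judgment, permanent retention"]))   # likely [0]
```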

Keywords: artificial intelligence, natural language processing, machine learning, visualization

Procedia PDF Downloads 174
39965 A Study on the New Weapon Requirements Analytics Using Simulations and Big Data

Authors: Won Il Jung, Gene Lee, Luis Rabelo

Abstract:

Since many weapon systems are becoming more complex and diverse, various problems arise in terms of acquisition cost, time, and performance limitations. Real-world experimentation to obtain the Required Operational Characteristics (ROC) for a new weapon acquisition is costly, dangerous, and time-consuming, even though it enhances the fidelity of the results. Moreover, most research to date has relied on a large number of assumptions, introducing bias into the experiment results. This paper proposes a new methodology that addresses these problems without such assumptions. The ROC of a new weapon system is developed by analyzing the big data generated by simulating various scenarios based on virtual and constructive models involving six degrees of freedom (6DoF). The new methodology makes it possible to identify unbiased ROC for new weapons by reducing assumptions and provides support for optimal weapon system acquisition.

Keywords: big data, required operational characteristics (ROC), virtual and constructive models, weapon acquisition

Procedia PDF Downloads 289
39964 Smart Surveillance with 5G: A Performance Study in Adama City

Authors: Shenko Chura Aredo, Hailu Belay, Kevin T. Kornegay

Abstract:

In light of Adama City’s smart city development vision, this study investigates the performance of smart security systems with fifth-generation (5G) network capabilities. Installing extensive cabling can be logistically difficult, particularly in large or dynamic settings, and latency issues can prevent linked systems from monitoring in real time. Through a focused analysis employing Adama City as a case study, performance is evaluated in terms of spectral and energy efficiency using empirical data and basic signal processing formulations at different frequency resources. The findings demonstrate that cameras operating at higher 5G frequencies have more capacity than those operating at sub-6 GHz, notwithstanding frequency-related issues. It is also shown that when camera beams are adaptively focused based on the distance of the last cell-edge user, rather than the maximum cell radius, less energy is required than with conventional fixed power ramping.
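
The "basic signal processing formulations" mentioned can be illustrated with Shannon-capacity-based spectral and energy efficiency estimates; the bandwidths, SNRs, and power figures below are assumptions for illustration, not measurements from Adama City.

```python
# Illustrative sketch of spectral and energy efficiency calculations.
# Parameter values are assumed, not the study's empirical data.
import numpy as np

def spectral_efficiency(snr_db):
    """Shannon spectral efficiency in bit/s/Hz for a given SNR in dB."""
    return np.log2(1 + 10 ** (snr_db / 10))

def energy_efficiency(bandwidth_hz, snr_db, tx_power_w):
    """Bits delivered per joule: channel capacity divided by transmit power."""
    capacity = bandwidth_hz * spectral_efficiency(snr_db)
    return capacity / tx_power_w

# Sub-6 GHz vs. mmWave camera links (assumed bandwidths and link budgets)
print(energy_efficiency(100e6, 15, 1.0))   # 100 MHz at 15 dB SNR, bit/J
print(energy_efficiency(400e6, 10, 1.0))   # 400 MHz at 10 dB SNR, bit/J
```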

Keywords: 5G, energy efficiency, safety, smart security, spectral efficiency

Procedia PDF Downloads 18
39963 FPGA Implementation of Adaptive Clock Recovery for TDMoIP Systems

Authors: Semih Demir, Anil Celebi

Abstract:

Circuit-switched networks, widely used until the end of the 20th century, have been transformed into packet-switched networks. Time Division Multiplexing over Internet Protocol (TDMoIP) is a system that enables Time Division Multiplexing (TDM) traffic to be carried over packet-switched networks (PSN). In TDMoIP systems, the devices that send TDM data to the PSN and those that receive it from the network must operate at the same clock frequency. This study implements the clock synchronization process on Field Programmable Gate Array (FPGA) chips using the time information attached to packets received from the PSN. The designed hardware is verified using datasets obtained for different carrier types and by comparing the results with a software model. Field tests are also performed using a real-time TDMoIP system.
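
A conceptual software sketch of adaptive clock recovery, in the spirit of the paper's reference model (which, per the keywords, was written in MATLAB): a proportional-integral loop steers the local period using packet arrival timestamps. The nominal period, loop gains, and carrier are assumptions.

```python
# Conceptual sketch, not the authors' reference model: a PI loop adjusts the
# local clock period so packet inter-arrival times match the nominal TDM period.
NOMINAL_PERIOD = 125e-6   # 125 us, one E1/T1 frame period (assumed carrier)
KP, KI = 0.1, 0.01        # illustrative loop gains

def recover_clock(arrival_times):
    period = NOMINAL_PERIOD
    integral = 0.0
    for prev, curr in zip(arrival_times, arrival_times[1:]):
        error = (curr - prev) - period          # timing error vs. local period
        integral += error
        period += KP * error + KI * integral    # steer the recovered period
    return 1.0 / period                         # recovered frequency (Hz)

# Packets arriving slightly fast (124.9 us spacing) pull the clock upward.
ts = [i * 124.9e-6 for i in range(1000)]
print(f"recovered frequency: {recover_clock(ts):.1f} Hz")
```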

Keywords: clock recovery on TDMoIP, FPGA, MATLAB reference model, clock synchronization

Procedia PDF Downloads 278
39962 Optimised Path Recommendation for a Real Time Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

A traditional execution process follows the path drawn by the process analyst without observing resource behaviour and other real-time constraints. Identifying the process model, predicting resource behaviour, and recommending the optimal path of execution for a real-time process are challenging. The proposed αyMiner adds a new dimension to process execution with two novel components, the Process Model Analyser (PMAMiner) and the Resource Behaviour Analyser (RBAMiner), for recommending the probable path of execution. PMAMiner discovers the next probable activity for the currently executing activity in an online process: a variant-matching technique identifies the set of candidate next activities, from which the next probable activity is selected using a decision tree model. RBAMiner identifies the resource best suited to performing the discovered activity and observes its behaviour: load and performance are modelled with polynomial regression, and waiting time with queueing theory. Based on this observed behaviour, αyMiner recommends the probable path of execution, comprising the next probable activity and the most suitable resource to perform it. Experiments were conducted on process logs from the CoSeLoG project; 72% accuracy was obtained in identifying and recommending the next probable activity, and resource performance was optimised by 59% by decreasing resource load.
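
A toy sketch of the next-probable-activity step: a decision tree trained on (current activity, next activity) pairs, standing in for PMAMiner's variant matching plus decision tree model. The event pairs are invented for illustration.

```python
# Minimal sketch (not the αyMiner code): a decision tree predicts the next
# probable activity from the currently executing activity.
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier

events = [("register", "check"), ("check", "decide"),
          ("register", "check"), ("check", "reject"),
          ("decide", "notify"), ("register", "check")]

enc_curr, enc_next = LabelEncoder(), LabelEncoder()
X = enc_curr.fit_transform([c for c, _ in events]).reshape(-1, 1)
y = enc_next.fit_transform([n for _, n in events])

tree = DecisionTreeClassifier().fit(X, y)
pred = tree.predict(enc_curr.transform(["check"]).reshape(-1, 1))
print("next probable activity:", enc_next.inverse_transform(pred)[0])
```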

Keywords: cross-organization process mining, process behaviour, path of execution, polynomial regression model

Procedia PDF Downloads 334
39961 Smart Textiles Integration for Monitoring Real-time Air Pollution

Authors: Akshay Dirisala

Abstract:

Humans have developed a highly organized and efficient civilization by improving basic needs such as housing, transportation, and utilities. These developments have had a huge impact on major environmental factors. Air pollution is one prominent environmental factor that needs to be addressed to maintain a sustainable and healthier lifestyle. Textiles have always been at the forefront of shielding humans from environmental conditions. With the growth of electronic textiles, we now have the capability to monitor the atmosphere in real time to understand and analyze the environment where a particular person spends most of their time. Integrating textiles with particulate matter sensors, which measure air quality and pollutants with a direct impact on human health, will help determine what type of air we are breathing. This research aims to develop a textile product and a process for collecting pollutant data through particulate matter sensors equipped inside a smart textile product, storing the data, and developing a machine learning model to analyze the health conditions of the person wearing the garment. Periodic notifications will not only help the wearer stay cautious about airborne diseases but could also help regulate those diseases and manage skin conditions.

Keywords: air pollution, e-textiles, particulate matter sensors, environment, machine learning models

Procedia PDF Downloads 114
39960 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption

Authors: Jerlin George, R. Chitra

Abstract:

Devices with sensors that can monitor temperature, heart rate, and other vital signs and link to the internet, known as the Internet of Things (IoT), have transformed the way we manage health. By providing real-time health data, these sensors improve diagnostics and treatment outcomes. However, security and privacy matter when IoT is applied in healthcare, and cyberattacks on centralized database systems are a further problem. To address these challenges, this study combines blockchain technology with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Blockchain ensures transparency and data integrity, while proxy re-encryption enables secure data sharing among authorized users. The result is a health monitoring system that preserves the accuracy and confidentiality of data while reducing the safety risks of IoT-driven healthcare applications.
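
For intuition on proxy re-encryption, here is a heavily simplified ElGamal-style (BBS98-like) sketch in pure Python; it shows only the concept, with toy parameters that are nowhere near secure, and is not the construction used in the paper.

```python
# Concept-only sketch of proxy re-encryption; not the paper's scheme and far
# from production parameters. Messages live in a prime-order subgroup mod p.
p, q, g = 467, 233, 4            # p = 2q + 1; g generates the order-q subgroup

def keygen(sk):                  # public key for secret exponent sk
    return pow(g, sk, p)

def encrypt(pk, m, r):           # ciphertext under pk = g^a: (pk^r, m * g^r)
    return (pow(pk, r, p), m * pow(g, r, p) % p)

def rekey(sk_a, sk_b):           # re-encryption key A -> B: b * a^-1 mod q
    return sk_b * pow(sk_a, -1, q) % q

def reencrypt(ct, rk):           # proxy turns g^(a*r) into g^(b*r); m untouched
    c1, c2 = ct
    return (pow(c1, rk, p), c2)

def decrypt(ct, sk):             # strip the blinding factor g^r using sk^-1
    c1, c2 = ct
    gr = pow(c1, pow(sk, -1, q), p)
    return c2 * pow(gr, -1, p) % p

a, b, r = 11, 29, 17             # toy secrets and encryption randomness
m = pow(g, 5, p)                 # demo message encoded in the subgroup
ct_a = encrypt(keygen(a), m, r)
ct_b = reencrypt(ct_a, rekey(a, b))
assert decrypt(ct_a, a) == m and decrypt(ct_b, b) == m
print("recovered:", decrypt(ct_b, b), "original:", m)
```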

Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security

Procedia PDF Downloads 16
39959 HTML5 Online Learning Application with Offline Web, Location Based, Animated Web, Multithread, and Real-Time Features

Authors: Sheetal R. Jadhwani, Daisy Sang, Chang-Shyh Peng

Abstract:

Web applications are an integral part of modern life. They are mostly based on the HyperText Markup Language (HTML). While HTML meets basic needs, it has shortcomings: applications can cease to work once the user goes offline, real-time updates may lag, and the user interface can freeze during computationally intensive tasks. The latest language specification, HTML5, attempts to rectify the situation with new tools and protocols. This paper studies the new Web Storage, Geolocation, Web Worker, Canvas, and Web Socket APIs and presents applications that test their features and efficiency.

Keywords: HTML5, web worker, canvas, web socket

Procedia PDF Downloads 300
39958 The Optimal Irrigation in the Mitidja Plain

Authors: Gherbi Khadidja

Abstract:

In the Mediterranean region, water resources are limited and very unevenly distributed in space and time. The main objective of this project is the development of a wireless network for the management of water resources in northern Algeria’s Mitidja plain, helping farmers irrigate in the most optimized way and addressing the region’s water shortage. We will therefore develop a decision-support tool that can modernize and replace some traditional techniques, according to the real needs of the crops, the soil conditions, and the climatic conditions (soil moisture, precipitation, characteristics of the unsaturated zone). These data are collected in real time by sensors, analyzed by an algorithm, and displayed on a mobile application and a website. The results are essential information and alerts, with recommended actions, for farmers to ensure the sustainability of the agricultural sector under water shortage conditions. In the first part, we set up a wireless sensor network for precise management of water resources, using equipment that measures the water content of the soil: a Watermark probe connected via an acquisition card to an Arduino Uno, which collects the captured data and transmits it through a GSM module to a website, where it is stored in a database for later study. In the second part, we display the results on a website and a mobile application that use the database to remotely manage our smart irrigation system. This allows growers to access the field conditions and the irrigation operation remotely via wireless communication, from home or the office. The tool will also draw on satellite imagery for land use and soil moisture. These tools will make it possible to follow the evolution of crop needs over time and to predict the impact on water resources. According to the references consulted, such a tool can reduce irrigation volumes by up to 40%, representing more than 100 million m³ of savings per year for the Mitidja, a volume equivalent to a medium-size dam.
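
A minimal server-side sketch of the alerting logic described, assuming Watermark-style soil-tension readings arriving over the GSM link; the table layout, threshold value, and field names are illustrative assumptions.

```python
# Hedged sketch: store an incoming probe reading and recommend irrigation when
# soil tension exceeds a crop-specific threshold (higher kPa = drier soil).
import sqlite3

THRESHOLD_KPA = 60          # assumed crop-specific dryness threshold

db = sqlite3.connect("mitidja.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, field TEXT, kpa REAL)")

def ingest(ts, field, kpa):
    db.execute("INSERT INTO readings VALUES (?, ?, ?)", (ts, field, kpa))
    db.commit()
    if kpa > THRESHOLD_KPA:
        return f"ALERT {field}: soil tension {kpa} kPa, irrigation recommended"
    return f"{field}: ok"

print(ingest("2024-05-01T06:00", "parcel-12", 74.0))
```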

Keywords: optimal irrigation, soil moisture, smart irrigation, water management

Procedia PDF Downloads 109
39957 Development of a Low-Cost Smart Insole for Gait Analysis

Authors: S. M. Khairul Halim, Mojtaba Ghodsi, Morteza Mohammadzaheri

Abstract:

Gait analysis is essential for diagnosing musculoskeletal and neurological conditions. However, current methods are often complex and expensive. This paper introduces a methodology for analysing gait parameters using a smart insole with a built-in accelerometer. The system measures stance time, swing time, step count, and cadence and wirelessly transmits the data to a user-friendly IoT dashboard for centralized processing. This setup enables remote monitoring and advanced data analytics, making it a versatile tool for medical diagnostics and everyday use. Integration with IoT enhances the portability and connectivity of the device, allowing for secure, encrypted data access over the internet. This feature supports telemedicine and enables personalized treatment plans tailored to individual needs. Overall, the approach provides a cost-effective (approximately 25 GBP), accurate, and user-friendly solution for gait analysis, facilitating remote tracking and customized therapy.
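
A rough sketch of how step parameters can be derived from the insole's accelerometer stream by peak detection; the sampling rate, detection thresholds, and synthetic signal are assumptions, not the authors' firmware.

```python
# Illustrative sketch: step count, cadence, and stride timing from a vertical
# acceleration signal via simple peak detection. Signal is synthetic.
import numpy as np
from scipy.signal import find_peaks

FS = 100  # sampling rate in Hz, an assumption

t = np.arange(0, 10, 1 / FS)
accel_z = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.randn(t.size)  # ~1.8 steps/s

peaks, _ = find_peaks(accel_z, height=0.5, distance=FS // 3)  # one peak per step
step_count = len(peaks)
cadence = step_count / (t[-1] / 60)            # steps per minute
stride_times = np.diff(t[peaks])               # per-step durations (s)
print(f"steps={step_count}, cadence={cadence:.0f} steps/min, "
      f"mean stride={stride_times.mean():.2f} s")
```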

Keywords: gait analysis, IoT, smart insole, accelerometer sensor

Procedia PDF Downloads 17
39956 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from built-in digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. These devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze human real-time use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable way to analyze building interiors without installing external data collection systems such as dedicated sensors. The methodology is applied to a real coliving case study: a residential building of 3000 m², with 7 floors and 80 users, in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data within the analysis framework. The information is collected remotely through the devices’ different platforms. The first step is to curate the data and understand which insights each device can provide, given the objectives of the study; this generates an analysis framework that can be scaled to future building assessments, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in each building’s IoT. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
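
A small pandas sketch of the kind of occupancy analysis described, applied to hypothetical smart-lock event logs; the column names and events are invented, since the actual device platforms and fields differ.

```python
# Hedged sketch: occupancy insights from smart-lock logs. Data is illustrative.
import pandas as pd

events = pd.DataFrame({
    "room": ["101", "101", "102", "101", "102"],
    "event": ["unlock", "lock", "unlock", "unlock", "lock"],
    "ts": pd.to_datetime(["2021-03-01 08:02", "2021-03-01 09:15",
                          "2021-03-01 10:30", "2021-03-01 18:45",
                          "2021-03-01 20:05"]),
})

unlocks = events[events["event"] == "unlock"]
per_room = unlocks.groupby("room").size()                # entries per room
by_hour = unlocks.groupby(unlocks["ts"].dt.hour).size()  # daily usage profile
print(per_room, by_hour, sep="\n")
```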

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
39955 Time Travel Testing: A Mechanism for Improving Renewal Experience

Authors: Aritra Majumdar

Abstract:

While organizations strive to expand their new customer base, retaining existing relationships is key to improving overall profitability and also showcases how successful an organization is at holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence, seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach for both business and technology teams to enhance the customer experience when customers look to extend their partnership with the organization for a defined period of time. This whitepaper focuses on the key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. It also calls out some best practices and common accelerator implementation ideas that are generic across verticals like healthcare, insurance, etc. In this abstract, a high-level snapshot of these pillars is provided. Time Travel Planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done, and, most importantly, planning for test automation during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequences based on defined parameters are key to successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section covers the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked to ensure a smooth customer experience. This section covers the focus areas of enterprise automation and how automation testing can be leveraged to improve overall quality without compromising the project schedule. Along with the above-mentioned items, the whitepaper elaborates on the best practices to be followed during time travel testing and some ideas for accelerator implementation. To sum up, this paper is based on the author’s real-world experience with time travel testing. While actual customer names and program-related details are not disclosed, the paper highlights key learnings that will help other teams implement time travel testing successfully.
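
A minimal sketch of the core time-travel mechanic from a test-automation standpoint, assuming the system under test reads the clock through Python's datetime: the test patches "today" forward or backward instead of waiting for renewal dates. The renewal rule shown is hypothetical.

```python
# Hedged sketch: time travel a renewal rule by mocking the system clock.
import datetime
from unittest.mock import patch

def renewal_offer_due(policy_end: datetime.date) -> bool:
    """Offer a renewal within 30 days of policy end (illustrative rule)."""
    return (policy_end - datetime.date.today()).days <= 30

policy_end = datetime.date(2024, 6, 30)

with patch("datetime.date", wraps=datetime.date) as mock_date:
    mock_date.today.return_value = datetime.date(2024, 6, 15)  # travel forward
    assert renewal_offer_due(policy_end)       # 15 days out: offer expected

    mock_date.today.return_value = datetime.date(2024, 1, 2)   # travel back
    assert not renewal_offer_due(policy_end)   # too early: no offer
```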

Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas

Procedia PDF Downloads 159
39954 Wavelet Based Signal Processing for Fault Location in Airplane Cable

Authors: Reza Rezaeipour Honarmandzad

Abstract:

Wavelet analysis is an exciting method for solving difficult problems in mathematics, physics, and engineering, with modern applications as diverse as wave propagation, data compression, signal processing, image processing, and pattern recognition. Wavelets allow complex information such as signals, images, and patterns to be decomposed into elementary forms at different positions and scales and subsequently reconstructed with high precision. In this paper, a wavelet-based signal processing algorithm for airplane cable fault location is proposed. An orthogonal discrete wavelet decomposition and reconstruction algorithm is used to eliminate the noise in the aircraft cable fault signal. The experimental results show that the characteristics of the emission and reflected pulses used to locate the aircraft cable fault point are preserved, while the high-frequency noise is eliminated by the proposed algorithm.
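
An illustrative version of the described pipeline using PyWavelets: orthogonal discrete wavelet decomposition, soft thresholding, and reconstruction, applied to a synthetic two-pulse signal; this is a generic denoising recipe, not the authors' implementation.

```python
# Illustrative sketch: denoise an emission + reflected pulse pair with an
# orthogonal discrete wavelet transform. The signal is synthetic.
import numpy as np
import pywt

t = np.linspace(0, 1, 1024)
pulse = np.exp(-((t - 0.3) ** 2) / 1e-4) + 0.6 * np.exp(-((t - 0.7) ** 2) / 1e-4)
noisy = pulse + 0.1 * np.random.randn(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)         # orthogonal decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise level estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))        # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")               # reconstruction
print("residual noise std:", np.std(denoised - pulse))
```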

Keywords: wavelet analysis, signal processing, orthogonal discrete wavelet, noise, aircraft cable fault signal

Procedia PDF Downloads 524
39953 Object Recognition System Operating from Different Type Vehicles Using Raspberry and OpenCV

Authors: Maria Pavlova

Abstract:

Nowadays, cameras can be mounted on different vehicles such as quadcopters, trains, airplanes, etc., and the camera can serve as the input sensor in many different systems. Object recognition, as an inseparable part of monitoring and control, can therefore be a key part of the most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. While the vehicle moves, the camera takes pictures of the environment without storing them in a database. If the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to the workstation in real time. This functionality is very useful in emergency or security situations where a specific object must be found. In another application, the camera can be mounted at a crossroad with little foot traffic: when one or more persons approach the road, the traffic lights turn green so they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system includes the camera, a Raspberry platform, a GPS system, a neural network, software, and a database. The camera in the system takes the pictures, and object recognition is performed in real time on the Raspberry board using the OpenCV library. An additional feature of this setup is the ability to attach the GPS coordinates of the captured object’s position; the results of this processing are sent to a remote station, so the location of the specific object is known. Through the neural network, the module can learn to solve problems using incoming data and become part of a bigger intelligent system. The present paper focuses on the design and integration of image recognition as part of smart systems.
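
A minimal sketch of the detect-and-report loop using OpenCV's stock HOG people detector; the camera index, file path, and the upload hook are assumptions, and the paper's neural network component is not reproduced here.

```python
# Hedged sketch: analyse frames in memory, save and forward only on detection.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)           # camera on the vehicle (assumed index)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(rects) > 0:          # special object (person) found
            cv2.imwrite("/tmp/detection.jpg", frame)
            # send_to_workstation("/tmp/detection.jpg", gps_coords())  # hypothetical hook
finally:
    cap.release()
```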

Keywords: camera, object recognition, OpenCV, Raspberry

Procedia PDF Downloads 218
39952 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects

Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa

Abstract:

This work presents a numerical implementation of a method, named the fractioning estimator, for estimating the box-counting dimension of self-avoiding curves on a planar space, i.e., fractal objects captured in digital images. Classical digital image processing methods, such as noise filtering, contrast manipulation, and thresholding, among others, are used to obtain binary images suitable for performing the computations of the fractioning estimator. A user interface is developed for performing the image processing operations and testing the fractioning estimator on captured images of real-life fractal objects. To analyze the results, the estimates obtained through the fractioning estimator are compared to those obtained through other methods already implemented in available software for computing and estimating the box-counting dimension.
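
For comparison purposes, a classical box-counting estimate (representative of the methods the fractioning estimator is benchmarked against) can be computed as follows; this generic implementation is not the paper's fractioning estimator.

```python
# Illustrative sketch: count occupied boxes at dyadic scales on a binary image
# and fit log N(s) against log(1/s) to estimate the box-counting dimension.
import numpy as np

def box_counting_dimension(binary_img: np.ndarray) -> float:
    n = min(binary_img.shape)
    sizes = 2 ** np.arange(1, int(np.log2(n)))          # box edge lengths
    counts = []
    for s in sizes:
        h, w = binary_img.shape[0] // s, binary_img.shape[1] // s
        boxes = binary_img[: h * s, : w * s].reshape(h, s, w, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1 / sizes), np.log(counts), 1)
    return slope

img = np.zeros((256, 256), dtype=bool)
img[np.arange(256), np.arange(256)] = True              # a straight line
print(box_counting_dimension(img))                      # close to 1.0
```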

Keywords: box-counting, digital image processing, fractal dimension, numerical method

Procedia PDF Downloads 83
39951 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability attained by DNA tests since the 1980s, this kind of test has allowed the identification of a growing number of criminal cases, including old, unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles of a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies capable of organizing and minimizing the time spent on both biological sample processing and the analysis of genetic profiles, using software tools. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows sample, criminal case, and local database management, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements of the system have been considered. The system uses the following web technologies: HTML, CSS, and JavaScript, with the Node.js platform as the server, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, favouring user acceptance. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increasing the resolution of crimes. The next step of this research is validation, so that the system operates in accordance with current Brazilian legislation.

Keywords: database, forensic genetics, genetic analysis, sample management, software solution

Procedia PDF Downloads 370
39950 Design and Implementation of Neural Network Based Controller for Self-Driven Vehicle

Authors: Hassam Muazzam

Abstract:

This paper devises an autonomous self-driven vehicle capable of taking a disabled person to his or her desired location using three different power sources (gasoline, solar, electric) without any control from the user, while avoiding obstacles along the way. The GPS coordinates of the desired location are sent to the main processing board via a GSM module. Once received, the path to be followed by the vehicle is computed using the Pythagorean theorem: the distance and angle between the present location and the desired location are calculated, and the vehicle starts moving in the desired direction. Meanwhile, real-time data from ultrasonic sensors are fed to the board for the obstacle avoidance mechanism. The ultrasonic sensors quantify the distance of the vehicle from an object; the distance and position of the object are then used to decide the vehicle’s direction in order to avoid obstacles, using an artificial neural network implemented on an ATmega1280. The vehicle also provides feedback on its location to a remote station.
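
A small sketch of the path computation described: straight-line distance via the Pythagorean relation on locally flattened GPS coordinates, plus the heading angle; the coordinate values are illustrative.

```python
# Illustrative sketch of distance and heading between two GPS fixes.
import math

def path_to_target(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate over short ranges."""
    r = 6371000.0                                   # Earth radius, metres
    dx = math.radians(lon2 - lon1) * r * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1) * r
    distance = math.hypot(dx, dy)                   # Pythagorean relation
    heading = math.degrees(math.atan2(dx, dy)) % 360  # clockwise from north
    return distance, heading

print(path_to_target(33.6844, 73.0479, 33.6900, 73.0600))  # (metres, degrees)
```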

Keywords: autonomous self-driven vehicle, obstacle avoidance, desired location, pythagoras theorem, neural network, remote location

Procedia PDF Downloads 409
39949 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit

Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi

Abstract:

Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices, such as mobility, are still manually assessed, which can be subjective, prone to human error, and lacking in granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle and the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, sound pressure level (SPL), and indoor air quality measured via volatile organic compounds and the equivalent CO₂ concentration. For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, each including a local computer and sensors. A video recording tool with a graphical user interface (GUI), developed in Python, was used to capture depth image frames for analyzing patient mobility. All sensor data is encrypted and then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines automate the data transfer, curation, and preparation for annotation and model training; data curation and post-processing are performed on the server. A custom secure annotation tool with a GUI was developed to annotate depth activity data. The annotation tool is linked to a MongoDB database to record the annotations and provide summarization. Docker containers are also utilized to manage services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system in a clinical implementation trial, as well as to refine and validate it using other data sources, including neurological data obtained through continuous electroencephalography (EEG).

Keywords: deep learning, delirium, healthcare, pervasive sensing

Procedia PDF Downloads 93
39948 Autism Disease Detection Using Transfer Learning Techniques: Performance Comparison between Central Processing Unit vs. Graphics Processing Unit Functions for Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

Neural network approaches are machine learning methods used in many domains, such as healthcare and cyber security, and are best known for dealing with image datasets. While training on images, several fundamental mathematical operations are carried out in the neural network. These include a number of algebraic and mathematical functions, such as derivatives, convolutions, and matrix inversion and transposition. Such operations require higher processing power than typical computer usage. A Central Processing Unit (CPU) is not appropriate for datasets of large images, as it is built for serial processing, whereas a Graphics Processing Unit (GPU) has parallel processing capabilities and therefore offers higher speed. This paper uses advanced neural network techniques such as VGG16, ResNet50, DenseNet, InceptionV3, Xception, MobileNet, XGBoost-VGG16, and our proposed models to compare CPU and GPU resources. A system for classifying autism using face images of autistic and non-autistic children was used to compare performance during testing, with evaluation metrics including accuracy, F1 score, precision, recall, and execution time. The GPU ran faster than the CPU in all tests performed; moreover, the accuracy of the neural network models increased on the GPU compared to the CPU.
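
An illustrative CPU-versus-GPU timing harness in PyTorch for one of the architectures tested (VGG16); the batch size, learning rate, and timing protocol are assumptions, not the paper's benchmark setup.

```python
# Hedged sketch: time one forward/backward training step on CPU and, when
# available, on GPU. Not the paper's benchmark.
import time
import torch
import torchvision.models as models

def time_training_step(device):
    model = models.vgg16(weights=None).to(device)
    x = torch.randn(8, 3, 224, 224, device=device)
    y = torch.randint(0, 1000, (8,), device=device)
    loss_fn = torch.nn.CrossEntropyLoss()
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    if device.type == "cuda":
        torch.cuda.synchronize()        # flush pending GPU work before timing
    start = time.perf_counter()
    loss_fn(model(x), y).backward()
    opt.step()
    if device.type == "cuda":
        torch.cuda.synchronize()
    return time.perf_counter() - start

print("CPU:", time_training_step(torch.device("cpu")))
if torch.cuda.is_available():
    print("GPU:", time_training_step(torch.device("cuda")))
```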

Keywords: autism disease, neural network, CPU, GPU, transfer learning

Procedia PDF Downloads 118
39947 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics

Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich

Abstract:

Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuous two-fluid models are applied to describe large-scale fluidization. To allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high, in the order of several hours of compute time for a few seconds of real time, thus preventing the representation of long-term phenomena such as heating or particle conversion processes. To overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. This data is stored in a database and then used by rCFD to efficiently time-extrapolate the flow behavior in high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), ultimately allowing for real-time simulations of fluidized beds. In the next step, we apply the checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles by observing the mixing and transport of the gas tracers, gaining insights into their convective and diffusive patterns, and work towards heat and mass transfer methods. Finally, we run rCFD simulations and calibrate them with numerical and physical parameters against conventional two-fluid model (full CFD) simulations. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.
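
A conceptual sketch of the recurrence idea: a database of flow snapshots from a short full simulation is replayed, jumping to the most similar stored state whenever the end of the database is reached; this toy code illustrates the principle only and is far from the authors' rCFD solver.

```python
# Concept-only sketch of recurrence-based time extrapolation. Snapshots are
# random stand-ins for stored flow fields.
import numpy as np

rng = np.random.default_rng(1)
database = rng.random((200, 50))        # 200 stored snapshots, 50-cell field

def most_similar(state, db, exclude):
    dists = np.linalg.norm(db - state, axis=1)
    dists[exclude] = np.inf             # avoid trivially reselecting itself
    return int(np.argmin(dists))

idx, horizon = 0, 1000
trajectory = [idx]
for _ in range(horizon):
    nxt = idx + 1
    if nxt >= len(database):            # database exhausted: recur
        nxt = most_similar(database[idx], database, exclude=idx)
    idx = nxt
    trajectory.append(idx)
print(trajectory[:20])                  # replayed snapshot indices
```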

Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes

Procedia PDF Downloads 75
39946 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series data are often necessary in several contexts, and ARIMA models are quite popular among practitioners in this regard. Hence, choosing correct parameter values for ARIMA is a challenging yet imperative task. This paper introduces a stepwise algorithm that provides automatic and robust estimates for the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. The process focuses on improving the overall quality of the estimates and alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast and automated search of the parameter space also ensures reliable estimates of the parameters with several desirable qualities, consequently resulting in higher test accuracy, especially on noisy data. After vigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods, it also completely obviates the need for human intervention due to its automated nature.
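
A simplified stand-in for the parameter search (an exhaustive AIC-based sweep with statsmodels rather than the paper's stepwise rules), to show what automating the (p, d, q)(P, D, Q) choice looks like; the search ranges and data are illustrative.

```python
# Hedged sketch: pick SARIMA orders by minimum AIC. Not the paper's algorithm.
import itertools
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def sweep_sarima(y, d=1, D=1, m=12, max_order=1):
    """Exhaustive AIC sweep over (p,d,q)(P,D,Q,m)."""
    best, best_aic = None, np.inf
    for p, q, P, Q in itertools.product(range(max_order + 1), repeat=4):
        try:
            fit = SARIMAX(y, order=(p, d, q),
                          seasonal_order=(P, D, Q, m)).fit(disp=False)
        except Exception:
            continue            # skip non-converging candidates
        if fit.aic < best_aic:
            best, best_aic = (p, d, q, P, D, Q), fit.aic
    return best, best_aic

y = np.sin(np.arange(120) * 2 * np.pi / 12) + 0.2 * np.random.randn(120)
print(sweep_sarima(y))
```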

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 165
39945 Comparison of Real-Time PCR and FTIR with Chemometrics Technique in Analysing Halal Supplement Capsules

Authors: Mohd Sukri Hassan, Ahlam Inayatullah Badrul Munir, M. Husaini A. Rahman

Abstract:

Halal authentication and verification of supplement capsules are highly required, as the gelatine available in the market can come from halal or non-halal sources, and Muslims are obliged to consume and use halal consumer goods. At present, real-time polymerase chain reaction (RT-PCR) is the most common technique for the detection of porcine and bovine DNA in gelatine, due to the high sensitivity of the technique and the higher stability of DNA compared to protein. In this study, twenty samples of supplement capsules from different products carrying different halal logos were analyzed for porcine and bovine DNA using RT-PCR. Standard bovine and porcine gelatine from Eurofins, at concentrations ranging from 10⁻¹ to 10⁻⁵ ng/µl, were used to determine the linearity range, limit of detection, and specificity of RT-PCR (SYBR Green method). RT-PCR detected porcine DNA (two samples), bovine DNA (four samples), and a mixture of porcine and bovine DNA (six samples). The samples were also tested using the FT-IR technique, where the normalized IR spectra were pre-processed using the Savitzky-Golay method before Principal Component Analysis (PCA) was performed on the database. The PCA scores plot shows three clusters of samples: bovine, porcine, and mixture (bovine and porcine). RT-PCR and FT-IR with the chemometrics technique were found to give the same results for porcine gelatine samples, and both can be used for halal authentication.
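
An illustrative version of the chemometrics side of the pipeline: Savitzky-Golay pre-processing followed by PCA, run on synthetic stand-in spectra; the band positions, scalings, and noise levels are assumptions, not the study's FT-IR data.

```python
# Hedged sketch: smooth spectra, then project onto principal components so
# that sample classes separate into clusters. Spectra are synthetic.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(4000, 650, 800)
base = np.exp(-((wavenumbers - 1650) ** 2) / 2e4)      # amide-I-like band
spectra = np.vstack([base * s + rng.normal(0, 0.02, 800)
                     for s in (1.0, 1.0, 0.6, 0.6, 0.8, 0.8)])

smoothed = savgol_filter(spectra, window_length=11, polyorder=3, axis=1)
scores = PCA(n_components=2).fit_transform(smoothed)
print(scores)   # rows cluster by the scaling factor (class proxy)
```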

Keywords: halal, real-time PCR, gelatine, chemometrics

Procedia PDF Downloads 241
39944 Elevated Temperature Shot Peening for M50 Steel

Authors: Xinxin Ma, Guangze Tang, Shuxin Yang, Jinguang He, Fan Zhang, Peiling Sun, Ming Liu, Minyu Sun, Liqin Wang

Abstract:

As a traditional surface hardening technique, shot peening is widely used in industry. Shot peening forms a residual compressive stress in the surface, which is beneficial for improving the fatigue life of metal materials; at the same time, very fine grains and high-density defects are generated in the surface layer, which also enhances the surface hardness. However, most such processes are carried out at room temperature, and for high-strength steels such as M50, the thickness of the strengthened layer is limited. In order to obtain a thicker strengthened surface layer, elevated-temperature shot peening was carried out in this work using Φ1 mm cast iron balls at a speed of 80 m/s. Considering that the tempering temperature of M50 steel is about 550 °C, the processing temperature ranged from 300 to 500 °C. The effect of processing temperature and processing time on the distribution of residual stress and surface hardness was investigated. The working temperature of M50 steel can be as high as 315 °C, and because the defects formed by shot peening become unstable at higher working temperatures, it is worth understanding what happens during the shot peening process and what happens when the strengthened samples are held at a given temperature. In our work, the shot peening time ranged from 2 to 10 min, and after the strengthening process, the samples were annealed at various temperatures from 200 to 500 °C for up to 60 h. The results show that the maximum residual compressive stress is nearly 900 MPa. Compared with room-temperature shot peening, the strengthening depth of the 500 °C sample is about twice as deep. The surface hardness increases with the processing temperature, and the saturation peening time decreases. After annealing, the residual compressive stress decreases; however, for the 500 °C sample, even after annealing at 500 °C for 20 h, the residual compressive stress remains over 600 MPa. SEM observation also shows that the grain size of the surface layers remains very small.

Keywords: shot peening, M50 steel, residual compressive stress, elevated temperature

Procedia PDF Downloads 456
39943 A New Study on Mathematical Modelling of COVID-19 with Caputo Fractional Derivative

Authors: Sadia Arshad

Abstract:

The new coronavirus disease, COVID-19, still poses an alarming situation around the world. Modeling based on derivatives of fractional order is particularly suited to capturing real-world problems and analyzing the realistic behavior of the proposed model. We propose a mathematical model for the investigation of COVID-19 dynamics in a generalized fractional framework. The new model is formulated in the Caputo sense and employs a nonlinear time-varying transmission rate. The existence and uniqueness of solutions of the fractional-order derivative are studied using fixed-point theory. The associated dynamical behaviors are discussed in terms of equilibrium, stability, and the basic reproduction number. For numerical implementation, an efficient approximation scheme is employed to solve the fractional COVID-19 model. Numerical simulations are reported for various fractional orders, and the simulation results are compared with a real case of the COVID-19 pandemic. Based on the comparison with real data, we find the best value of the fractional order and justify the use of the fractional concept in the mathematical modelling, as the new fractional model simulates reality more accurately than classical frameworks.
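
A hedged numerical sketch of solving a Caputo-type fractional system with an explicit Grünwald-Letnikov scheme, applied to a toy SIR model; the scheme is a generic textbook one and the parameters are illustrative, not the paper's fitted COVID-19 model.

```python
# Concept-only sketch: explicit Grunwald-Letnikov stepping for D^alpha y = f(t, y),
# demonstrated on a toy SIR system. Not the paper's model or scheme.
import numpy as np

def gl_fractional_ode(f, y0, alpha, h, n_steps):
    w = np.zeros(n_steps + 1)
    w[0] = 1.0
    for j in range(1, n_steps + 1):                 # GL binomial weights
        w[j] = w[j - 1] * (1 - (alpha + 1) / j)
    y = np.zeros((n_steps + 1, len(y0)))
    y[0] = y0
    for n in range(1, n_steps + 1):
        memory = sum(w[j] * (y[n - j] - y0) for j in range(1, n + 1))
        y[n] = y0 - memory + h ** alpha * np.asarray(f((n - 1) * h, y[n - 1]))
    return y

def sir(t, y, beta=0.3, gamma=0.1):                 # toy parameters
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

traj = gl_fractional_ode(sir, np.array([0.99, 0.01, 0.0]), alpha=0.9,
                         h=0.1, n_steps=500)
print(traj[-1])   # long-run compartment sizes
```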

Keywords: fractional calculus, modeling, stability, numerical solution

Procedia PDF Downloads 111
39942 Networked Radar System to Increase Safety of Urban Railroad Crossing

Authors: Sergio Saponara, Luca Fanucci, Riccardo Cassettari, Ruggero Piernicola, Marco Righetto

Abstract:

The paper presents an innovative networked radar system for the detection of obstacles in a railway level crossing scenario. The Monitoring System (MS) is able to automatically detect moving or still obstacles within the railway level crossing area, avoiding the need for human surveillance. The MS is also connected to the National Railway Information and Signaling System to communicate the level crossing status in real time. The architecture is compliant with the highest Safety Integrity Level (SIL4) of the CENELEC standard. The number of radar sensors used is configurable at set-up time and depends on the size of the level crossing area: at least two sensors are expected, and up to four can be used for larger areas. The whole processing chain that elaborates the output sensor signals, as well as the communication interface, is fully digital; it was designed in VHDL and implemented on a Xilinx Virtex 6.

Keywords: radar for safe mobility, railroad crossing, railway, transport safety

Procedia PDF Downloads 480
39941 Hydrodynamic Analysis of Fish Fin Kinematics of Oreochromis Niloticus Using Machine Learning and Image Processing

Authors: Paramvir Singh

Abstract:

The locomotion of aquatic organisms has long fascinated biologists and engineers alike, with fish fins serving as a prime example of nature's remarkable adaptations for efficient underwater propulsion. This paper presents a comprehensive study focused on the hydrodynamic analysis of fish fin kinematics, employing an innovative approach that combines machine learning and image processing techniques. Through high-speed videography and advanced computational tools, we gain insights into the complex and dynamic motion of the fins of a tilapia (Oreochromis niloticus). The study began by experimentally capturing videos of the various motions of a tilapia in a custom-made setup. Using deep learning and image processing on the videos, the motion of the caudal and pectoral fins was extracted; this motion included the fin configuration (i.e., the angle of deviation from the mean position) with respect to time. Numerical investigations of the flapping fins were then performed using a Computational Fluid Dynamics (CFD) solver, with 3D models of the fins created to mimic their real-life geometry. Thrust characteristics of each fin separately (caudal and pectoral) and of both fins acting together were studied, and the relationship and phase between caudal and pectoral fin motion were discussed. The key objectives include mathematical modeling of the motion of a flapping fin at different naturally occurring frequencies and amplitudes, with the interaction between the two fins an area of keen interest. This work aims to improve on past research on similar topics; the results can also support the better and more efficient design of propulsion systems for biomimetic underwater vehicles used to study aquatic ecosystems, explore uncharted or challenging underwater regions, perform ocean bed modeling, etc.

Keywords: biomimetics, fish fin kinematics, image processing, fish tracking, underwater vehicles

Procedia PDF Downloads 89
39940 A Value-Oriented Metamodel for Small and Medium Enterprises’ Decision Making

Authors: Romain Ben Taleb, Aurélie Montarnal, Matthieu Lauras, Mathieu Dahan, Romain Miclo

Abstract:

To be competitive and sustainable, any company has to maximize its value. However, unlike listed companies that can assess their value based on market shares, most Small and Medium Enterprises (SMEs), which are non-listed, do not have direct, live access to this critical information. Traditional accounting reports give SME decision-makers only limited insight into the real impact of their day-to-day decisions on the company’s performance and value. Most of the time, an SME’s financial valuation is performed once a year, as the associated process is time- and resource-consuming, requiring several months and external expertise to complete. To solve this issue, we propose in this paper a value-oriented metamodel that enables real-time, dynamic assessment of an SME’s value based on a broad definition of its assets. These assets cover a wider scope of the company’s resources and better account for immaterial assets. The proposal, which is illustrated in a case study, discusses the benefits of incorporating assets into SME valuation.

Keywords: SME, metamodel, decision support system, financial valuation, assets

Procedia PDF Downloads 92
39939 Impact of Digitized Monitoring & Evaluation System in Technical Vocational Education and Training

Authors: Abdul Ghani Rajput

Abstract:

Monitoring and evaluation (M&E) has been adopted by Technical and Vocational Education and Training (TVET) organizations to track progress over continuous intervals of time based on planned interventions and, subsequently, to evaluate impact, quality assurance, and sustainability. In the digital world, TVET providers prefer real-time information for monitoring training activities. However, the benefits and challenges of digitized, real-time M&E systems have not been sufficiently examined to date. This research paper looks at the impact of digitized M&E in the TVET sector by analyzing two case studies and describing the benefits and challenges of using a digitized M&E system. Digitized M&E is identified as a carrier of high potential for the TVET sector.

Keywords: digitized M&E, innovation, quality assurance, TVET

Procedia PDF Downloads 230
39938 The Role of Logistics Services in Influencing Customer Satisfaction and Reviews in an Online Marketplace

Authors: nafees mahbub, blake tindol, utkarsh shrivastava, kuanchin chen

Abstract:

Online shopping has become an integral part of business today. Big players such as Amazon are setting the bar for delivery services, and many businesses are working to meet that bar. However, what happens if a seller underestimates or overestimates the delivery time? Does it translate into consumer comments, ratings, or lost sales? Although several prior studies have investigated the impact of poor logistics on customer satisfaction, the impact of underestimated delivery times has rarely been considered. This study uses real-time customer online purchase data to examine the impact of missed delivery times on satisfaction.

Keywords: lost sales, delivery time, customer satisfaction, customer reviews

Procedia PDF Downloads 214