Search results for: real time data processing
39301 Filtering and Reconstruction System for Grey-Level Forensic Images
Authors: Ahd Aljarf, Saad Amin
Abstract:
Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and to having noise added to them, either after alteration or when the image was initially taken; therefore, having a high-performance image processing system, and implementing it, is very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, harmed, or even lost because of noise. For example, sending an image through a wireless channel can cause loss of bits. These types of errors may degrade the visual display quality of forensic images. Two image problems are covered: noise and lost blocks. Information transmitted through any means of communication may suffer alteration from its original state, or even lose important data, due to channel noise. Therefore, a developed system is introduced to improve the quality and clarity of forensic images.
Keywords: image filtering, image reconstruction, image processing, forensic images
Procedia PDF Downloads 366
39300 Indian Road Traffic Flow Analysis Using Blob Tracking from Video Sequences
Authors: Balaji Ganesh Rajagopal, Subramanian Appavu alias Balamurugan, Ayyalraj Midhun Kumar, Krishnan Nallaperumal
Abstract:
Intelligent Transportation Systems (ITS) are an emerging area addressing multiple transportation problems. Several forms of input are needed in order to solve ITS problems. The Advanced Traveler Information System (ATIS) is a core and important ITS area of the modern era. It involves travel time forecasting, efficient road map analysis, cost-based path selection, detection of vehicles under dynamic conditions, and traffic congestion state forecasting. This article designs and provides an algorithm for traffic data generation that can be used for the above ATIS applications. Taking the real-world traffic situation as input in the form of video sequences, the algorithm determines the traffic density in terms of congestion and the number of vehicles on a given path, which can be fed to various ATIS applications. The algorithm deduces the key frame from the video sequences and performs blob detection, identification, and tracking using a connected-components algorithm to determine the correlation between the vehicles moving in the real road scene.
Keywords: traffic transportation, traffic density estimation, blob identification and tracking, relative velocity of vehicles, correlation between vehicles
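A minimal sketch of how such a blob-counting stage can be assembled from standard OpenCV primitives follows. It is a hypothetical reconstruction, not the authors' code: the background-subtractor settings, threshold, minimum blob area, and file name are all assumptions.

```python
import cv2

# Background subtractor separates moving vehicles from the static road.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

def count_vehicle_blobs(frame, min_area=400):
    """Return bounding boxes of moving blobs large enough to be vehicles."""
    mask = subtractor.apply(frame)
    # Drop shadow pixels (value 127 in MOG2 masks) and noise specks.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    # Connected-components labelling: each label is one candidate blob.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    return [stats[i, :4] for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

cap = cv2.VideoCapture("traffic.mp4")  # placeholder input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    boxes = count_vehicle_blobs(frame)
    print("vehicles in frame:", len(boxes))  # crude density indicator
cap.release()
```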
Procedia PDF Downloads 510
39299 A Multi Sensor Monochrome Video Fusion Using Image Quality Assessment
Authors: M. Prema Kumar, P. Rajesh Kumar
Abstract:
The increasing interest in image fusion (combining images of two or more modalities, such as infrared and visible light radiation) has led to a need for accurate and reliable image assessment methods. This paper gives a novel approach for merging the information content from several videos taken of the same scene in order to build a combined video that contains the finest information coming from the different source videos. This process is known as video fusion, and it helps to provide an image of superior quality (where "quality" connotes a measurement particular to the application) compared to the source images. In this technique, different sensors, whose redundant information can be reduced, are used with the various cameras that capture the required images. An image fusion technique based on multi-resolution singular value decomposition (MSVD) has been used. Image fusion by MSVD is very similar to wavelet-based fusion: the idea behind MSVD is to replace the FIR filters in the wavelet transform with the singular value decomposition (SVD). It is computationally very simple and is well suited for real-time applications such as remote sensing and astronomy.
Keywords: multi sensor image fusion, MSVD, image processing, monochrome video
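For readers who want to see the MSVD mechanics, below is a one-level sketch following the commonly cited formulation (2x2 blocks rearranged into columns, eigen-decomposition of the block matrix, averaged approximations, max-absolute details). It is an illustrative sketch under those assumptions rather than the authors' exact implementation.

```python
import numpy as np

def msvd_decompose(x):
    """One-level multi-resolution SVD of a 2-D array with even dimensions."""
    m, n = x.shape
    # Rearrange each non-overlapping 2x2 block into a 4-element column.
    a = x.reshape(m // 2, 2, n // 2, 2).transpose(1, 3, 0, 2).reshape(4, -1)
    u, _, _ = np.linalg.svd(a @ a.T)   # eigenvectors of the block covariance
    t = u.T @ a                        # row 0: approximation, rows 1-3: details
    return u, t

def msvd_fuse(x1, x2):
    u1, t1 = msvd_decompose(x1)
    u2, t2 = msvd_decompose(x2)
    tf = np.where(np.abs(t1) >= np.abs(t2), t1, t2)  # max-abs detail rule
    tf[0] = 0.5 * (t1[0] + t2[0])                    # average approximations
    uf = 0.5 * (u1 + u2)                             # averaged eigenvector matrices
    af = uf @ tf
    m, n = x1.shape
    return af.reshape(2, 2, m // 2, n // 2).transpose(2, 0, 3, 1).reshape(m, n)

# Usage: fuse two co-registered monochrome frames of the same scene.
ir = np.random.rand(256, 256)    # stand-ins for IR and visible frames
vis = np.random.rand(256, 256)
fused = msvd_fuse(ir, vis)
```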
Procedia PDF Downloads 572
39298 Study of Inhibition of the End Effect Based on AR Model Prediction Combining Data Extension and Window Function
Authors: Pan Hongxia, Wang Zhenhua
Abstract:
In this paper, the end effect arising in the EMD decomposition process is addressed by combining AR-model-based data extension with a window function method; together, the two achieve effective inhibition of the end effect. Simulations on a synthetic signal achieved the desired result, and the method was then applied to gearbox test data, where it also performed well and improved the calculation accuracy of the subsequent data processing. This lays a good foundation for gearbox fault diagnosis under various working conditions.
Keywords: gearbox, fault diagnosis, AR model, end effect
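As a concrete illustration of the data-extension half of the method, the sketch below fits an autoregressive model to each end of a signal and extrapolates a short margin before EMD, so that the envelope splines near the boundaries rest on predicted rather than missing data. It uses statsmodels' AutoReg; the model order and extension length are assumptions, not the paper's values.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def ar_extend(x, n_ext=50, order=20):
    """Extend both ends of a 1-D signal by AR-model prediction."""
    # Forward extension: fit on the signal, predict past the right end.
    fwd = AutoReg(x, lags=order).fit()
    right = fwd.predict(start=len(x), end=len(x) + n_ext - 1)
    # Backward extension: fit on the time-reversed signal, then flip back.
    bwd = AutoReg(x[::-1], lags=order).fit()
    left = bwd.predict(start=len(x), end=len(x) + n_ext - 1)[::-1]
    return np.concatenate([left, x, right])

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 13 * t) + 0.5 * np.sin(2 * np.pi * 48 * t)
x_ext = ar_extend(x)
# Run EMD on x_ext, then discard the first and last 50 samples of each IMF,
# so the end-effect distortion falls inside the artificial margins.
```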
Procedia PDF Downloads 366
39297 Conceptualizing IoT Based Framework for Enhancing Environmental Accounting by ERP Systems
Authors: Amin Ebrahimi Ghadi, Morteza Moalagh
Abstract:
This research was carried out to find how a perfect combination of IoT (Internet of Things) architecture and an ERP system can strengthen environmental accounting to incorporate both economic and environmental information. IoT (e.g., sensors, software, and other technologies) can be used in the company’s value chain from raw material extraction through materials processing, manufacturing, distribution, use, repair, maintenance, and disposal or recycling of products (the cradle-to-grave model). The desired ERP software will then have the capability to track both midpoint and endpoint environmental impacts on a green supply chain system for the whole life cycle of a product. All these enable environmental accounting to calculate and analyze in real time the operational environmental impacts, control costs, prepare for environmental legislation, and enhance the decision-making process. In this study, we have developed a model of how to use IoT devices in life cycle assessment (LCA) to gather emissions, energy consumption, hazard, and waste information to be processed in different modules of ERP systems in an integrated way for use in environmental accounting to achieve sustainability.
Keywords: ERP, environmental accounting, green supply chain, IoT, life cycle assessment, sustainability
Procedia PDF Downloads 172
39296 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing approaches, lifetime data are obtained under various conditions considered more severe than the usual condition. Classical techniques are based on precise measurements and are used to model variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo results. This study was aimed at examining the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes as compared to the input data.
Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
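To make the reported effect concrete: if a lifetime observed under stress is represented as a triangular fuzzy number and the acceleration function is a simple scale factor, the transformation stretches the support of the fuzzy number, which is exactly an increase in fuzziness. The sketch below illustrates this under those two assumptions (triangular representation, linear acceleration); the paper's actual acceleration functions may differ.

```python
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    """Triangular fuzzy number (lower, modal, upper)."""
    lo: float
    mid: float
    up: float

    def spread(self) -> float:
        return self.up - self.lo  # width of the support = fuzziness measure

def accelerate(t: TriFuzzy, af: float) -> TriFuzzy:
    """Map a fuzzy lifetime from stress to use condition, t_use = af * t_stress."""
    return TriFuzzy(af * t.lo, af * t.mid, af * t.up)

t_stress = TriFuzzy(95.0, 100.0, 108.0)  # hours, observed as a non-precise value
t_use = accelerate(t_stress, af=4.0)      # acceleration factor of 4 (assumed)
print(t_use.spread(), ">", t_stress.spread())  # 52.0 > 13.0: fuzziness grows
```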
Procedia PDF Downloads 298
39295 Construction Innovation: Support for 3D Printing House
Authors: Andrea Palazzo, Daniel Macek, Veronika Malinova
Abstract:
Contour crafting is the new technological challenge for architects and construction companies. The many advantages it promises make it one of the most interesting solutions for construction in terms of automation of building processes. The technology for 3D printing houses offers many application possibilities, from low-cost construction to being considered by NASA, for visionary projects, as a good solution for building settlements on other planets. Another very important point is that clients, like architects, will no longer face many limits in design concerning ideas and creativity. Prices for real estate are constantly increasing, and the lack of availability of construction materials, as well as the speculation that built up around it in 2021, is bringing prices to such a level that in the future real estate developers risk not being able to find customers for these ultra-expensive homes. Hence, this paper starts with an introduction to 3D printing, which now has the potential to gain an important position in the market, becoming a valid alternative to the classic construction process. This technology is not only beneficial from an economic point of view but is also a great opportunity to reduce the impact on the environment by cutting CO2 emissions. Further on, the article considers whether, after COP 26 (the 2021 United Nations Climate Change Conference), world governments might also push, with the contribution of governmental funds, towards building technologies that reduce the waste materials needing disposal and at the same time reduce emissions. This paper gives insight into the multiple benefits of 3D printing and emphasises the importance of finding new solutions for materials that can be used by the printer. Based on the type of material, it will be possible to understand the compatibility with current regulations and how inclined the authorities will be to support this technology. This will help enable the rise and development of this technology in Europe and in the rest of the world on actual housing projects, not only on prototypes.
Keywords: additive manufacturing, contour crafting, development, new regulation, printing material
Procedia PDF Downloads 198
39294 Spatial Integrity of Seismic Data for Oil and Gas Exploration
Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof
Abstract:
Seismic data is the fundamental tool utilized by exploration companies to determine potential hydrocarbons. However, the importance of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity will jeopardize the business decision and the millions of dollars of investment that every oil and gas company would like to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation is improved reliability and integrity of the subsurface geological models produced by geoscientists, and it provides important input to potential hazard assessment, where positional accuracy is crucial. This workflow development initiative is part of a bigger geospatial integrity management effort, as nearly eighty percent of oil and gas data are location-dependent.
Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow
Procedia PDF Downloads 223
39293 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods
Authors: Devatha Kalyan Kumar, R. Poovarasan
Abstract:
In this paper, we have taken certain important factors and health parameters of diabetes patients, especially among children diabetic by birth (pediatric congenital), where, using the above three metric methods, we assess the importance of each attribute in the dataset and thereby determine the most highly responsible and correlated attribute causing diabetes among young patients. We use cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is the correlation methodology used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because if the complexity is higher, then there is a higher chance of occurrence of risk in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric
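Two of the three metrics reduce to a few lines in standard scientific Python. The sketch below ranks attributes by the absolute Spearman correlation against an outcome and computes control-chart statistics; the synthetic columns and the 3-sigma limit convention are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical attribute columns (rows = patients) and outcome vector.
rng = np.random.default_rng(0)
attrs = {"glucose": rng.normal(120, 30, 200),
         "bmi": rng.normal(22, 4, 200),
         "hba1c": rng.normal(6.5, 1.2, 200)}
outcome = attrs["glucose"] * 0.02 + rng.normal(0, 1, 200)

# Rank attributes by |Spearman rho| against the outcome.
ranked = sorted(((abs(spearmanr(v, outcome)[0]), k)
                 for k, v in attrs.items()), reverse=True)
for rho, name in ranked:
    print(f"{name}: |rho| = {rho:.3f}")

# Control-chart statistics for one attribute (3-sigma limits assumed).
x = attrs["glucose"]
mean, sd = x.mean(), x.std(ddof=1)
print("mean:", mean, "variance:", sd**2,
      "UCL:", mean + 3 * sd, "LCL:", mean - 3 * sd)
```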
Procedia PDF Downloads 256
39292 Scheduling in Cloud Networks Using Chakoos Algorithm
Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi
Abstract:
Nowadays, cloud processing is one of the important issues in information technology. Since scheduling of a task graph is an NP-hard problem, approaches based on nondeterministic methods such as evolutionary processing, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm has been proposed for scheduling task graphs to obtain an appropriate schedule with minimum time. In this algorithm, the new approach is based on making the length of the critical path shorter and reducing the cost of communication. Finally, the results obtained from the implementation of the presented method show that this algorithm performs the same as other algorithms when it faces graphs without communication cost, and it performs quicker and better than algorithms like the DSC and MCP algorithms when it faces graphs involving communication cost.
Keywords: cloud computing, scheduling, tasks graph, Chakoos algorithm
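The quantity such a scheduler tries to shrink, the critical path of a task graph with communication costs, is easy to compute and makes a natural fitness function for any evolutionary search. The sketch below evaluates it for a small DAG under a candidate processor assignment; the graph, weights, and the convention that communication cost vanishes when both tasks share a processor are the usual task-scheduling assumptions, not the paper's exact formulation.

```python
from functools import lru_cache

# Task graph: node -> computation time; edge (u, v) -> communication cost.
comp = {"A": 3, "B": 4, "C": 2, "D": 5}
succ = {"A": {"B": 2, "C": 6}, "B": {"D": 1}, "C": {"D": 4}, "D": {}}
assign = {"A": 0, "B": 0, "C": 1, "D": 0}  # task -> processor (candidate schedule)

@lru_cache(maxsize=None)
def longest_path_from(u: str) -> int:
    """Longest path starting at u; comm cost is dropped whenever an edge
    stays on a single processor."""
    best = 0
    for v, c in succ[u].items():
        comm = 0 if assign[u] == assign[v] else c
        best = max(best, comm + longest_path_from(v))
    return comp[u] + best

critical_path = max(longest_path_from(u) for u in comp)
print("schedule length lower bound:", critical_path)  # 20 for this assignment
```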
Procedia PDF Downloads 65
39291 Brief Guide to Cloud-Based AI Prototyping: Key Insights from Selected Case Studies Using Google Cloud Platform
Authors: Kamellia Reshadi, Pranav Ragji, Theodoros Soldatos
Abstract:
Recent advancements in cloud computing and storage, along with rapid progress in artificial intelligence (AI), have transformed approaches to developing efficient, scalable applications. However, integrating AI with cloud computing poses challenges, as these fields are often disjointed, and many advancements remain difficult to access, obscured in complex documentation or scattered across research reports. For this reason, we share experiences from prototype projects combining these technologies. Specifically, we focus on Google Cloud Platform (GCP) functionalities and describe vision and speech activities applied to labeling, subtitling, and urban traffic flow tasks. We describe challenges, pricing, architecture, and other key features, considering the goal of real-time performance. We hope our demonstrations not only provide essential guidelines for using these functionalities but also enable more similar approaches.
Keywords: artificial intelligence, cloud computing, real-time applications, case studies, knowledge management, research and development, text labeling, video annotation, urban traffic analysis, public safety, prototyping, Google Cloud Platform
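As one concrete example of the vision labeling functionality mentioned, the snippet below sends an image to the Cloud Vision API and prints label annotations. It is a minimal sketch assuming default credentials are configured; the file name is a placeholder, and this is the standard client-library call rather than the authors' project code.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

with open("street_scene.jpg", "rb") as f:   # placeholder image file
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Each label carries a description and a confidence score in [0, 1].
    print(f"{label.description}: {label.score:.2f}")
```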
Procedia PDF Downloads 12
39290 A Process for Prevention of Browning in Fresh Cut Tender Jackfruit
Authors: Ramachandra Pradhan, Sandeep Singh Rama, Sabyasachi Mishra
Abstract:
Jackfruit (Artocarpus heterophyllus L.) in its tender form is consumed as a vegetable and is popular for its flavour, colour and meat-like texture. In South Asian countries like Bangladesh, India, Pakistan and Indonesia, the market value of tender jackfruit is very high. However, due to a lack of technology, marketing and transportation of the fruit is a challenge. Processing activities like washing, sorting, peeling and cutting enhance oxidative stress in fresh-cut jackfruit. They also degrade the quality of fresh-cut tender jackfruit through increased microbial contamination, excessive tissue softening, depletion of phytochemicals and browning. Hence, this study was conducted as a solution to the above problem. Fresh-cut tender jackfruit slices were processed using the independent parameters concentration of CaCl2 (2-5%), concentration of citric acid (1-2.5%) and treatment time (4-10 min.), and the dependent variables after treatment were browning index (BI), colour change (ΔE), firmness (F) and overall acceptability (OAA). From the response variables, the best combination of independent variables was found to be 3% CaCl2 and 2% citric acid for 6 minutes. At these optimised processing treatments, browning of fresh-cut tender jackfruit can be prevented. This technology can be used by researchers, scientists, industries, etc. for further processing of tender jackfruit.
Keywords: tender jackfruit, browning index, firmness, texture
Procedia PDF Downloads 258
39289 Learn through AR (Augmented Reality)
Authors: Prajakta Musale, Bhargav Parlikar, Sakshi Parkhi, Anshu Parihar, Aryan Parikh, Diksha Parasharam, Parth Jadhav
Abstract:
AR technology is basically a development of VR technology that harnesses the power of computers to read the surroundings and create projections of digital models in the real world for the purposes of visualization, demonstration, and education. It has been applied to education, prototyping in product design, development of medical models, battle strategy in the military, and many other fields. Our Engineering Design and Innovation (EDAI) project focuses on the use of augmented reality, visual mapping, and 3D visualization, along with animation and text boxes, to help students get a rough idea of concepts, such as flow and mechanical movements, that may be hard to visualize at first glance.
Keywords: spatial mapping, ARKit, depth sensing, real-time rendering
Procedia PDF Downloads 63
39288 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially regarding the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, slow down the growth of pathogens, especially deteriorating, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model, whose components are: bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH level of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, the model is adapted to an idealized situation of cross-contamination in animal source food processing, the agents studied being both the animal product and the contact surface. Thirdly, stochastic simulations and a parametric sensitivity analysis are compared with reference data. The main results obtained from the analysis and simulations of the mathematical model were that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it; this, however, can be not only expensive but also counterproductive in terms of the quality of the raw materials, while higher temperatures, on the other hand, accelerate bacterial growth. In other respects, the use of bacteriocins is an effective and efficient alternative in the short and medium terms. Moreover, a low pH level is an indicator of bacterial growth, since many deteriorating bacteria are lactic acid bacteria. Lastly, processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that acclimating a mathematical model to the context of the industrial process can generate new tools to predict bacterial contamination, the impact of bacterial inhibition, and processing times. In addition, the proposed mathematical modelling is of broad application and can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact with allergenic foods.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
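To show the shape such a three-dimensional model can take, here is a minimal sketch: logistic bacterial growth whose rate depends on temperature, first-order inhibition by a bacteriocin that is both produced and dosed, and pH falling with bacterial density. All functional forms and parameter values are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, T):
    """y = [N bacteria (CFU/mL), B bacteriocin (AU/mL), pH]."""
    N, B, pH = y
    mu = 0.5 * np.exp(0.08 * (T - 10.0))        # growth rate rises with temperature
    dN = mu * N * (1 - N / 1e9) - 0.002 * B * N  # logistic growth minus inhibition
    dB = 1e-9 * N + 0.05 - 0.01 * B              # produced + dosed - decay
    dpH = -1e-10 * N * (pH - 4.0)                # acidification toward pH 4
    return [dN, dB, dpH]

for T in (4.0, 15.0, 25.0):  # storage/processing temperatures in Celsius
    sol = solve_ivp(model, (0, 72), [1e3, 0.0, 6.5], args=(T,), max_step=0.1)
    print(f"T={T:4.1f} C -> N(72 h) = {sol.y[0, -1]:.3g} CFU/mL")
```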
Procedia PDF Downloads 144
39287 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated by simulation, and noise is further added to the data distribution to create differently disordered time series, in order to evaluate the algorithm's locality prediction of nonlinearity. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, when the spread of the data is high or the number of data points is small, as well as the influence of the algorithm's important local-validity parameter under different data distributions, is explained.
Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
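The core idea that LWPR builds on, predictions formed from local linear models weighted by Gaussian receptive fields, can be shown in a few lines. The sketch below implements plain locally weighted linear regression; full LWPR adds incremental updates and partial-least-squares projections, which are omitted here, and the bandwidth value is an assumption.

```python
import numpy as np

def lwr_predict(xq, X, y, bandwidth=0.05):
    """Locally weighted linear regression prediction at query point xq."""
    w = np.exp(-0.5 * ((X - xq) / bandwidth) ** 2)   # Gaussian receptive field
    A = np.column_stack([np.ones_like(X), X])        # local linear model [1, x]
    WA = A * w[:, None]
    beta = np.linalg.solve(A.T @ WA + 1e-8 * np.eye(2), WA.T @ y)
    return beta[0] + beta[1] * xq

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 1, 300))
y = np.sin(6 * np.pi * X) + 0.1 * rng.normal(size=300)   # local nonlinearity

for xq in np.linspace(0, 1, 9):
    print(f"f({xq:.2f}) ~ {lwr_predict(xq, X, y):.3f}")
```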
Procedia PDF Downloads 502
39286 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ
Authors: M. Khaled Abduesslam, Mohammed Ali, Basher H. Alsdai, Muhammad Nizam Inayati
Abstract:
This paper presents voltage problem location classification using the performance of the Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England system. The data was collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Outputs from the simulation, such as voltage, phase angle, real power and reactive power, were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system by considering increased bus loads on the system. To verify the proposed LS-SVM, its performance was compared to Learning Vector Quantization (LVQ). The results showed that LS-SVM is faster and better compared to LVQ: LS-SVM achieved 0% misclassification, whereas LVQ had 7.69% misclassification.
Keywords: IEEE 39 bus, least squares support vector machine, learning vector quantization, voltage collapse
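Unlike the standard SVM's quadratic program, LS-SVM training reduces to solving a single linear system in the dual variables, which is what makes it fast. A minimal binary LS-SVM classifier with an RBF kernel is sketched below; the kernel width, regularization constant, and the toy features standing in for the PTSI-derived inputs are assumptions.

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma2=0.5):
    """Solve [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]."""
    n = len(y)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma2))                  # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = (y[:, None] * y[None, :]) * K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], np.ones(n)]))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        sq = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        Kq = np.exp(-sq / (2 * sigma2))
        return np.sign(Kq @ (alpha * y) + b)
    return predict

# Toy stand-in for bus features (voltage, angle, P, Q) and labels +-1.
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 4))
y = np.sign(X[:, 0] + 0.3 * X[:, 2])
predict = lssvm_train(X, y)
print("training accuracy:", (predict(X) == y).mean())
```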
Procedia PDF Downloads 442
39285 Hands-off Parking: Deep Learning Gesture-based System for Individuals with Mobility Needs
Authors: Javier Romera, Alberto Justo, Ignacio Fidalgo, Joshue Perez, Javier Araluce
Abstract:
Nowadays, individuals with mobility needs face a significant challenge when docking vehicles. In many cases, after parking, they encounter insufficient space to exit, leading to two undesired outcomes: either avoiding parking in that spot or settling for an improperly placed vehicle. To address this issue, the following paper presents a parking control system employing gestural teleoperation. The system comprises three main phases: capturing body markers, interpreting gestures, and transmitting orders to the vehicle. The initial phase is centered around the MediaPipe framework, a versatile tool optimized for real-time gesture recognition. MediaPipe excels at detecting and tracing body markers, with a special emphasis on hand gestures. Hand detection is done by generating 21 reference points for each hand. Subsequently, after data capture, the project employs a MultiPerceptron Layer (MPL) for in-depth gesture classification. This tandem of MediaPipe's extraction prowess and the MPL's analytical capability ensures that human gestures are translated into actionable commands with high precision. Furthermore, the system has been trained and validated on a built-in dataset. To prove domain adaptation, a framework based on the Robot Operating System (ROS) as a communication backbone, alongside the CARLA simulator, is used. Following successful simulations, the system was transitioned to a real-world platform, marking a significant milestone in the project. This real-vehicle implementation verifies the practicality and efficiency of the system beyond theoretical constructs.
Keywords: gesture detection, mediapipe, multiperceptron layer, robot operating system
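The first phase, extracting 21 hand landmarks per hand, looks roughly like the sketch below when written against the MediaPipe Python solution API. Flattening the landmarks into a 63-value feature vector for a downstream classifier is our illustrative choice, not necessarily the authors' exact encoding.

```python
import cv2
import mediapipe as mp
import numpy as np

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)  # webcam facing the operator
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark  # 21 reference points
        features = np.array([[p.x, p.y, p.z] for p in lm]).flatten()  # 63 values
        # features -> gesture classifier -> parking command (forward/stop/...)
        print("feature vector shape:", features.shape)
cap.release()
```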
Procedia PDF Downloads 100
39284 Optical Properties of TlInSe₂ Single Crystals
Authors: Gulshan Mammadova
Abstract:
This paper presents the results of studying the surface microrelief in 2D and 3D models and analyzing the spectroscopy of a three-junction TlInSe₂
Keywords: optical properties, dielectric permittivity, real and imaginary dielectric permittivity, optical electrical conductivity
Procedia PDF Downloads 63
39283 The Predictive Value of MicroRNA 451 on the Outcome of Imatinib Treatment in Chronic Myeloid Leukemia Patients
Authors: Nehal Adel Khalil, Amel Foad Ketat, Fairouz Elsayed Mohamed Ali, Nahla Abdelmoneim Hamid, Hazem Farag Manaa
Abstract:
Background: Chronic myeloid leukemia (CML) represents 15% of adult leukemias. Imatinib mesylate (IM) is the gold-standard treatment for new cases of CML. Treatment with IM results in improvement in the majority of cases; however, about 25% of cases may develop resistance. Sensitive and specific early predictors of IM resistance in CML patients have not been established to date. Aim: To investigate the value of miR-451 in CML as an early predictor of IM resistance in Egyptian CML patients. Methods: The study employed the real-time polymerase chain reaction (qPCR) technique to investigate the leucocytic expression of miR-451 in fifteen newly diagnosed CML patients (group I), fifteen IM-responder CML patients (group II), fifteen IM-resistant CML patients (group III) and fifteen healthy subjects of matched age and sex as a control group (group IV). Response to IM was defined as < 10% BCR-ABL transcript level after 3 months of therapy. The following parameters were assessed in subjects of all the studied groups: 1- complete blood count (CBC); 2- measurement of the plasma level of miR-451 using qPCR; 3- detection of BCR-ABL gene mutation in CML using qPCR. Results: The present study revealed that miR-451 was significantly down-regulated in leucocytes of newly diagnosed CML patients as compared to healthy subjects. IM-responder CML patients showed an up-regulation of miR-451 compared with IM-resistant CML patients. Conclusion: According to the data from the present study, it can be concluded that leucocytic miR-451 expression is a useful additional follow-up marker for the response to IM and a promising prognostic biomarker for CML.
Keywords: chronic myeloid leukemia, imatinib resistance, microRNA 451, polymerase chain reaction
Procedia PDF Downloads 294
39282 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes its state. In order to fit the practical capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that simulates the behaviour of people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect; it also facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW; to the best of our knowledge, few unsupervised techniques have been employed with low-sampling data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of a selected appliance falls, along with a time vector of the values delimiting the appliance's state transitions. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, the formed signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both simulated data from LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
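Since DTW is the named workhorse for matching appliance signatures, here is a compact, self-contained implementation of the classic dynamic-programming recurrence; the two toy power traces are stand-ins for extracted appliance signatures, not data from the paper.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion and deletion alignments.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# Two toy power signatures (W): same appliance cycle, shifted and stretched.
sig_a = np.array([0, 0, 1200, 1180, 1210, 5, 0, 0], dtype=float)
sig_b = np.array([0, 1190, 1205, 1195, 1185, 0, 0], dtype=float)
print("DTW distance:", dtw_distance(sig_a, sig_b))  # small despite misalignment
```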
Procedia PDF Downloads 78
39281 Study of the Vertical Handoff in Heterogeneous Networks and Implementation Based on OPNET
Authors: Wafa Benaatou, Adnane Latif
Abstract:
In this paper, we study in detail the performance of vertical handover in WLAN, WiMAX and UMTS networks, then the vertical handoff procedure, concluding with simulations that highlight the performance of handover in heterogeneous networks. The goal of vertical handover is to support several accesses in real time in heterogeneous networks. This makes it possible for a user to use several networks (such as WLAN, UMTS and WiMAX) in parallel, and for the system to switch automatically to another base station without disconnecting, as if there were no interruption, and with as little data loss as possible.
Keywords: vertical handoff, WLAN, UMTS, WIMAX, heterogeneous
Procedia PDF Downloads 394
39280 Traffic Congestion Analysis and Modeling for Urban Roads of Srinagar City
Authors: Adinarayana Badveeti, Mohammad Shafi Mir
Abstract:
In Srinagar City, India, traffic congestion is a condition on transport networks that occurs as use increases, characterized by slower speeds, longer trip times and increased vehicular queuing. Traffic congestion is conventionally measured using indicators such as roadway level of service, the Travel Time Index and their variants. Several measures have been taken to counteract congestion, such as road pricing, car pooling and improved traffic management. While new road construction can temporarily relieve congestion, in the longer term it simply encourages further growth in car traffic through increased travel and a switch away from public transport. The full paper report, on which this abstract is based, aims to provide policymakers and technical staff with real-time data, a conceptual framework, and guidance on some of the engineering tools necessary to manage congestion in such a way as to reduce its overall impact on individuals, families, communities and societies. Dynamic, affordable, liveable and attractive urban regions will never be free of congestion; road transport policies, however, should seek to manage congestion on a cost-effective basis, with the aim of reducing the burden that excessive congestion imposes upon travellers and urban dwellers throughout the urban road network.
Keywords: traffic congestion, modeling, traffic management, travel time index
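Of the indicators named, the Travel Time Index is the simplest to compute: the ratio of observed travel time to free-flow travel time on a corridor. A small sketch with invented corridor numbers follows; the 1.3 congestion threshold is an illustrative convention, not a value from the paper.

```python
def travel_time_index(observed_min: float, free_flow_min: float) -> float:
    """TTI = observed travel time / free-flow travel time (1.0 = no delay)."""
    return observed_min / free_flow_min

# Hypothetical peak-hour measurements for three corridors (minutes).
corridors = {"Corridor A": (34.0, 20.0),
             "Corridor B": (18.0, 15.0),
             "Corridor C": (52.0, 25.0)}

for name, (obs, ff) in corridors.items():
    tti = travel_time_index(obs, ff)
    status = "congested" if tti > 1.3 else "acceptable"
    print(f"{name}: TTI = {tti:.2f} ({status})")
```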
Procedia PDF Downloads 319
39279 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning
Authors: Pei Yi Lin
Abstract:
Objective: The objective of this study is to use machine learning methods to build an early-prediction classifier model for acute delirium, in order to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. After its occurrence, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the mortality rate was about 2.22% over the past three years. Therefore, this study aims to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study using an artificial intelligence big data database to extract the characteristic factors related to delirium in intensive care unit patients for machine learning. The study included patients aged over 20 years old who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS assessment < 4 points, admission to the ICU for less than 24 hours, or no CAM-ICU evaluation. The CAM-ICU delirium assessment results every 8 hours within 30 days of hospitalization were regarded as events, and the cumulative data from ICU admission to the prediction time point were extracted to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, average ICU stay in hours, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, restraint, and sedative and hypnotic drugs. Through feature data cleaning and processing with KNN imputation, a total of 54,595 events were extracted for machine learning model analysis. The events from May 1 to November 30, 2022, served as the model training data, of which 80% formed the training set for model training and 20% the validation set for internal verification; the ICU events from December 1 to December 31, 2022, formed the external verification set. Finally, model inference and performance evaluation were carried out, after which the model was retrained with adjusted parameters. Results: In this study, XGBoost, Random Forest, Logistic Regression and Decision Tree models were analyzed and compared. The average accuracy in internal verification was highest for Random Forest (AUC = 0.86); the average accuracy in external verification was highest for Random Forest and XGBoost (both AUC = 0.86); and the average cross-validation accuracy was highest for Random Forest (ACC = 0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but there is a lack of machine learning classification methods to assist in real-time assessment of ICU patients, so clinical staff cannot be provided with more objective and continuous monitoring data to help them identify and predict the occurrence of delirium more accurately. It is hoped that the development of predictive models through machine learning can predict delirium early and immediately, support clinical decisions at the best time, and, in combination with PADIS delirium care measures, provide individualized non-pharmacological interventions to maintain patient safety and improve the quality of care.
Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model
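The modelling pipeline described (12 tabular features, KNN imputation, an 80/20 split, Random Forest, AUC evaluation) corresponds to a few lines of scikit-learn. The sketch below substitutes synthetic data for the protected patient records; feature semantics and hyperparameters are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import KNNImputer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 12))             # 12 features (age, RASS, APACHE-II, ...)
y = (X[:, 0] + 0.8 * X[:, 4] + rng.normal(size=n) > 1.0).astype(int)
X[rng.random(X.shape) < 0.05] = np.nan   # simulate missing values

X = KNNImputer(n_neighbors=5).fit_transform(X)  # KNN imputation, as in the study
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_val, clf.predict_proba(X_val)[:, 1])
print(f"internal validation AUC: {auc:.2f}")
```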
Procedia PDF Downloads 76
39278 Applying a Noise Reduction Method to Reveal Chaos in the River Flow Time Series
Authors: Mohammad H. Fattahi
Abstract:
Chaotic analysis was performed on river flow time series before and after applying wavelet-based de-noising techniques in order to investigate the effect of noise content on the chaotic nature of the flow series. In this study, 38 years of monthly runoff data from three gauging stations were used. The gauging stations were located in the Ghar-e-Aghaj river basin, Fars province, Iran. The noise level of the time series was estimated with the aid of the Gaussian kernel algorithm. This step was found to be crucial in preventing the removal of vital information such as memory, correlation and trend from the time series, in addition to the noise, during the de-noising process.
Keywords: chaotic behavior, wavelet, noise reduction, river flow
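A typical wavelet-based de-noising pass of the kind described (decompose, soft-threshold the detail coefficients, reconstruct) is sketched below with PyWavelets; the wavelet family, decomposition level, and universal-threshold rule are common defaults, assumed rather than taken from the paper.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold detail coefficients using the universal threshold."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise sigma estimated from the finest detail level (robust MAD estimator).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

# 38 years of monthly runoff = 456 samples; synthetic stand-in series.
t = np.arange(456)
rng = np.random.default_rng(4)
runoff = 50 + 30 * np.sin(2 * np.pi * t / 12) + 8 * rng.normal(size=456)
clean = wavelet_denoise(runoff)
```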
Procedia PDF Downloads 468
39277 An Experimental Study of Scalar Implicature Processing in Chinese
Authors: Liu Si, Wang Chunmei, Liu Huangmei
Abstract:
A prominent component of the semantic versus pragmatic debate, scalar implicature (SI) has been gaining great attention ever since it was proposed by Horn. The constant debate is between the structural and the pragmatic approach. The former claims that generation of SI is costless, automatic, and dependent mostly on the structural properties of sentences, whereas the latter advocates both that such generation is largely dependent upon context and that the process is costly. Many experiments, among which Katsos’s text comprehension experiments are influential, have been designed and conducted in order to verify these views, but the results are not conclusive. Besides, most of the experiments were conducted with English language materials. Katsos conducted one off-line and three on-line text comprehension experiments, in which the previous shortcomings were addressed to a certain extent, and the conclusion was in favor of the pragmatic approach. We intend to test the results of Katsos’s experiments on Chinese scalar implicature. Four experiments in both off-line and on-line conditions, examining the generation and response time of SI for the Chinese scalar terms "yixie" (some) and "quanbu (dou)" (all), will be conducted in order to find out whether the structural or the pragmatic approach can be sustained. The study mainly aims to answer the following questions: (1) Can SI be generated in the upper- and lower-bound contexts, as Katsos confirmed, when Chinese language materials are used in the experiment? (2) Can SI first be generated and then cancelled, as the default view claims, or can it not be generated in a neutral context when Chinese language materials are used? (3) Is SI generation costless or costly in terms of processing resources? (4) In line with the SI generation process, what conclusion can be made about the cognitive processing model of language meaning: is it a parallel model, a linear model, or a dynamic and hierarchical model? According to previous theoretical debates and experimental conflicts, it can be presumed that SI in Chinese might be generated in upper-bound contexts, that response times might be faster in upper-bound than in lower-bound contexts, and that SI generation in neutral contexts might be the slowest. Finally, a conclusion would be drawn that the processing model of SI cannot be verified by either the absolute structural or the pragmatic approach; it is, rather, a dynamic and complex processing mechanism in which the interaction of language forms, ad hoc context, mental context, background knowledge, speakers' interaction, etc. is involved.
Keywords: cognitive linguistics, pragmatics, scalar implicature, experimental study, Chinese language
Procedia PDF Downloads 361
39276 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach
Authors: Kristina Pflug, Markus Busch
Abstract:
Being able to predict polymer properties and processing behavior based on the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with numerical modelling of the complex reaction network of the LDPE polymerization, taking into consideration the actual reaction conditions. While this gives average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE, i.e., its melt flow behavior, is determined as a function of the previously determined polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscometry and a multi-angle light scattering detector is applied; it serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be extraordinary, especially considering that the applied multi-scale modelling approach does not involve fitting parameters to the data. This validates the suggested approach and proves its universality at the same time. In a next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to industrial scale. Moreover, sensitivity analysis for systematically varied process conditions is easily feasible. The developed multi-scale modelling approach finally gives the opportunity to predict and design LDPE processing behavior simply based on process conditions such as feed streams and inlet temperatures and pressures.
Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology
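To give a flavour of the Monte Carlo step, the toy sampler below grows chains monomer by monomer and inserts a long-chain branch with a fixed per-monomer probability, then reports the branching frequency per 1000 carbons. A real kinetic Monte Carlo would derive these probabilities from the reaction network and operating conditions; the values here are invented for illustration.

```python
import random

def sample_chain(p_lcb=2e-4, p_term=1e-3, rng=random.Random(5)):
    """Grow one chain; return (monomer units, number of long-chain branches)."""
    length, branches = 0, 0
    while True:
        length += 1
        r = rng.random()
        if r < p_term:            # chain transfer/termination ends growth
            return length, branches
        if r < p_term + p_lcb:    # transfer-to-polymer event creates an LCB
            branches += 1

chains = [sample_chain() for _ in range(20000)]
units = sum(l for l, _ in chains)
lcb = sum(b for _, b in chains)
# LDPE convention: branches per 1000 carbons (2 backbone C per ethylene unit).
print(f"LCB per 1000 C: {1000 * lcb / (2 * units):.3f}")
```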
Procedia PDF Downloads 125
39275 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features
Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis
Abstract:
Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly due to the fact that it is diagnosed via polysomnography, which is a time- and resource-intensive procedure. Screening the disease's symptoms at home could be used as an alternative approach in order to alert individuals who potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool; herein we present the approach, the selection of specific sound features that discriminate snoring from environmental sounds, and the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) is built upon the snore detection tool and employed on whole-night sleep sound recordings, resulting in a large dataset of snoring sound excerpts that is made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight into an individual's sleep quality or as an independent component of OSAHS screening applications in future developments.
Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks
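A common recipe for this kind of detector (frame the audio, extract spectral features such as MFCCs, feed a small neural network) can be sketched with librosa and scikit-learn as below. The feature set, file names, and network size are our assumptions; the paper selects its own sound features.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def frame_features(wav_path, sr=16000, frame_s=0.5):
    """Mean MFCC vector per half-second frame of a recording."""
    y, _ = librosa.load(wav_path, sr=sr)
    hop = int(frame_s * sr)
    feats = []
    for start in range(0, len(y) - hop, hop):
        mfcc = librosa.feature.mfcc(y=y[start:start + hop], sr=sr, n_mfcc=13)
        feats.append(mfcc.mean(axis=1))
    return np.array(feats)

# Hypothetical labelled recordings: snore excerpts vs. environmental sounds.
snore = frame_features("snore_examples.wav")      # placeholder file names
ambient = frame_features("ambient_examples.wav")
X = np.vstack([snore, ambient])
y = np.concatenate([np.ones(len(snore)), np.zeros(len(ambient))])

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500).fit(X, y)
# clf.predict(frame_features("night_recording.wav")) flags snore frames.
```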
Procedia PDF Downloads 207
39274 Status of Sensory Profile Score among Children with Autism in Selected Centers of Dhaka City
Authors: Nupur A. D., Miah M. S., Moniruzzaman S. K.
Abstract:
Autism is a neurobiological disorder that affects the physical, social, and language skills of a person. A child with autism has difficulty processing, integrating, and responding to sensory stimuli. Current estimates have shown that 45% to 96% of children with Autism Spectrum Disorder demonstrate sensory difficulties. As autism is a worldwide burning issue, it has become a highly prioritized and important service provision in Bangladesh. A sensory deficit not only hampers the normal development of a child, it also hampers the learning process and functional independence. The purpose of this study was to find out the prevalence of sensory dysfunction among children with autism and to recognize common patterns of sensory dysfunction. A cross-sectional study design was chosen to carry out this research work. This study enrolled eighty children with autism and their parents using the systematic sampling method. Data were collected through the Short Sensory Profile (SSP) assessment tool, which consists of a 38-item questionnaire, and qualified graduate occupational therapists were directly involved in interviewing parents as well as observing the children's responses to sensory-related activities at four selected autism centers in Dhaka, Bangladesh. Item analyses were conducted to identify the items yielding the highest reported sensory processing dysfunction among those children, using the SSP and the Statistical Package for the Social Sciences (SPSS) version 21.0 for data analysis. This study revealed that almost 78.25% of children with autism had significant sensory processing dysfunction based on their sensory responses to the relevant activities. Under-responsiveness/sensory seeking and auditory filtering were the least common problems among them. On the other hand, most of the children (95%) showed definite to probable differences in sensory processing, including under-responsiveness/sensory seeking, auditory filtering, and tactile sensitivity. The results also show that a definite difference in sensory processing was found in 64 children; these children suffered from sensory difficulties that had a great impact on their Activities of Daily Living (ADLs) as well as their social interaction with others. Almost 95% of the children with autism require intervention to overcome or normalize the problem. The results give insight regarding the types of sensory processing dysfunction to consider during diagnosis and when determining treatment. Early identification of sensory problems is therefore very important and will help to provide appropriate sensory input to minimize maladaptive behavior and help reach the normal range of adaptive behavior.
Keywords: autism, sensory processing difficulties, sensory profile, occupational therapy
Procedia PDF Downloads 65
39273 How Hormesis Impacts Practice of Ecological Risk Assessment and Food Safety Assessment
Authors: Xiaoxian Zhang
Abstract:
The guidelines for ecological risk assessment (ERA) and food safety assessment (FSA) used nowadays, based on an S-shaped threshold dose-response curve (SDR), fail to consider hormesis, a reproducible biphasic dose-response model, represented as a J-shaped or an inverted U-shaped curve, that occurs in the real-life environment across multitudinous compounds acting on cells, organisms, populations, and even ecosystems. Specifically, in SDR-based ERA and FSA practice, the predicted no-effect concentration (PNEC) is calculated separately for individual substances from the no-observed-effect concentration (NOEC, usually equivalent to the 10% effect concentration (EC10) of a contaminant or food condiment) divided by an assessment factor greater than 1. Experienced researchers have suspected that hormesis in the real-life environment might lead to a waste of limited human and material resources in ERA and FSA practice, but related data are scarce. In this study, the hormetic effects of sulfachloropyridazine (SCP) on the bioluminescence of Aliivibrio fischeri (A. f) were investigated under 40 conditions simulating the real-life scenario, and hormetic effects of brown sugar and muscovado sugar on the growth of human MCF-7 cells were likewise found. After comparison of the related parameters, it has for the first time been proved that there is a 50% probability for the safe concentration (SC) of contaminants and food condiments to fall within the hormetic stimulatory range (HSR) or to the left of the HSR, revealing the unreliability of traditional parameters in standardized (eco)toxicological studies and supporting, qualitatively and quantitatively, the over-strictness of ERA and FSA resulting from the misuse of the SDR. This study provides a novel perspective for ERA and FSA practitioners: hormesis should dominate, and conditions where the SDR works should only be singled out on a case-specific basis.
Keywords: dose-response relationship, food safety, ecological risk assessment, hormesis
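The contrast the authors draw can be reproduced numerically by fitting a standard sigmoidal (log-logistic) model and a hormetic Brain-Cousens model to the same dose-response data, then checking where an EC10-based safe concentration lands. The sketch below does this on synthetic data; the data, the reduced Brain-Cousens form, and the assessment factor of 100 are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, b, e):
    """S-shaped model: response falls from 1 to 0 around EC50 = e."""
    return 1.0 / (1.0 + (c / e) ** b)

def brain_cousens(c, b, e, f):
    """Hormetic model: low-dose stimulation term f*c on top of the decline."""
    return (1.0 + f * c) / (1.0 + (c / e) ** b)

conc = np.logspace(-2, 2, 12)  # mg/L, synthetic concentration gradient
rng = np.random.default_rng(6)
resp = brain_cousens(conc, 2.5, 10.0, 0.15) + rng.normal(0, 0.02, conc.size)

p_sdr, _ = curve_fit(log_logistic, conc, resp, p0=[2.0, 10.0])
p_horm, _ = curve_fit(brain_cousens, conc, resp, p0=[2.0, 10.0, 0.1])

# PNEC from the SDR fit: EC10 / assessment factor (factor of 100 assumed).
grid = np.logspace(-3, 2, 2000)
ec10 = grid[np.argmin(np.abs(log_logistic(grid, *p_sdr) - 0.9))]
print("EC10 (SDR fit):", ec10, "-> PNEC:", ec10 / 100)
print("stimulation present in hormetic fit? f =", p_horm[2])
```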
Procedia PDF Downloads 146
39272 A Survey of Recognizing Daily Living Activities in Multi-User Smart Home Environments
Authors: Kulsoom S. Bughio, Naeem K. Janjua, Gordana Dermody, Leslie F. Sikos, Shamsul Islam
Abstract:
The advancement of information and communication technologies (ICT) and wireless sensor networks has played a pivotal role in the design and development of real-time healthcare solutions, mainly targeting the elderly living in health-assistive smart homes. Such smart homes are equipped with sensor technologies to detect and record activities of daily living (ADL). This survey reviews and evaluates existing approaches and techniques based on real-time sensor-based modeling and reasoning in single-user and multi-user environments. It classifies the approaches into three main categories (learning-based, knowledge-based, and hybrid) and evaluates how they handle temporal relations, granularity, and uncertainty. The survey also highlights open challenges across various disciplines (including computer and information sciences and health sciences) to encourage interdisciplinary research into the detection and recognition of ADLs, and discusses future directions.
Keywords: daily living activities, smart homes, single-user environment, multi-user environment
Procedia PDF Downloads 141