Search results for: multivariate time series data
36768 A Constitutive Model for Time-Dependent Behavior of Clay
Authors: T. N. Mac, B. Shahbodaghkhan, N. Khalili
Abstract:
A new elastic-viscoplastic (EVP) constitutive model is proposed for the analysis of the time-dependent behavior of clay. The proposed model is based on bounding surface plasticity and the viscoplastic consistency framework to establish a continuous transition from plasticity to rate-dependent viscoplasticity. Unlike overstress-based models, this model satisfies the consistency condition in formulating the constitutive equation for the EVP model. The procedure for deriving the constitutive relationship is also presented. Simulation results and comparisons with experimental data are then presented to demonstrate the performance of the model.
Keywords: bounding surface, consistency theory, constitutive model, viscosity
Procedia PDF Downloads 492
36767 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some legal requirements can now be demonstrated by using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation-time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. Firstly, in order to demonstrate the reliability of this tool for calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study is done to calculate the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of computational fluid dynamics. This work presents a multi-scale validation approach for numerical simulations of standardized tests with the Fire Dynamics Simulator software developed by the National Institute of Standards and Technology (NIST). It first addresses the cone calorimeter test, described in the standard ISO 5660, in order to characterize the fire reaction of materials. The aim of this process is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the standard EN 45545-2. The data set obtained through the validation of the cone calorimeter test was used in the fire seat test. To conclude, in the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real part of a train is carried out.
Keywords: fire safety engineering, numerical tools, rolling stock, multi-scales validation
Procedia PDF Downloads 303
36766 High Accuracy Analytic Approximation for Special Functions Applied to Bessel Functions J₀(x) and Its Zeros
Authors: Fernando Maass, Pablo Martin, Jorge Olivares
Abstract:
The Bessel function J₀(x) is very important in electrodynamics and physics, as are its zeros. In this work, a method to obtain high-accuracy approximations is presented through an application to that function. In most applications of this function, the values of the zeros are very important. In this work, analytic approximations for this function have been obtained that are valid for all positive values of the variable x and have high accuracy both for the function and for its zeros. The approximation is determined by the simultaneous use of the power series and the asymptotic expansion. The structure of the approximation is a combination of two rational functions with elementary functions such as trigonometric functions and fractional powers. As in the Padé method, rational functions are used, but here they are combined with elementary functions such as fractional powers, hyperbolic or trigonometric functions, and others. The reason for this is that the power series of the exact function is now used together with the asymptotic expansion, which usually includes fractional powers, trigonometric functions, and other types of elementary functions. The approximation must be a bridge between both expansions, and this cannot be accomplished using rational functions alone. In the simplest approximation using 4 parameters, the maximum absolute error is less than 0.006 at x ∼ 4.9. In this case, the maximum relative error for the zeros is less than 0.003, which occurs for the second zero, but that value decreases rapidly for the other zeros. The same kind of behaviour holds for the relative error of the maxima and minima of the function. Approximations with higher accuracy and more parameters will also be shown. All the approximations are valid for any positive value of x, and they can be calculated easily.
Keywords: analytic approximations, asymptotic approximations, Bessel functions, quasirational approximations
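As an illustration of how such approximations are checked, the Python sketch below compares the leading asymptotic form of J₀(x) against SciPy's reference implementation and measures the relative error at the first few zeros. It is only an evaluation harness around a stand-in approximant, not the authors' quasirational approximation, whose coefficients are not given in the abstract.

```python
import numpy as np
from scipy.special import j0, jn_zeros

def j0_asymptotic(x):
    """Leading-order large-x form: sqrt(2/(pi*x)) * cos(x - pi/4)."""
    return np.sqrt(2.0 / (np.pi * x)) * np.cos(x - np.pi / 4.0)

x = np.linspace(0.5, 40.0, 4000)
abs_err = np.max(np.abs(j0_asymptotic(x) - j0(x)))
print(f"max |error| of the leading asymptotic term on [0.5, 40]: {abs_err:.4f}")

# Relative error in the zeros: locate the approximant's zero crossing near
# each true zero and compare.
for z in jn_zeros(0, 5):
    grid = np.linspace(z - 0.5, z + 0.5, 20001)
    approx_zero = grid[np.argmin(np.abs(j0_asymptotic(grid)))]
    print(f"zero near {z:.4f}: relative error {abs(approx_zero - z) / z:.2e}")
```

A quasirational approximant of the kind described in the abstract would be dropped in place of j0_asymptotic and scored with the same harness.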
Procedia PDF Downloads 251
36765 Geomagnetic Jerks Observed in Geomagnetic Observatory Data Over Southern Africa Between 2017 and 2023
Authors: Sanele Lionel Khanyile, Emmanuel Nahayo
Abstract:
Geomagnetic jerks are abrupt changes observed in the second derivative of the main magnetic field that occur on annual to decadal timescales. Understanding these jerks is crucial, as they provide valuable insights into the complex dynamics of the Earth's liquid outer core. In this study, we investigate the occurrence of geomagnetic jerks in data collected at the southern African magnetic observatories Hermanus (HER), Tsumeb (TSU), Hartebeesthoek (HBK) and Keetmanshoop (KMH) between 2017 and 2023. The observatory data were processed and analyzed by retaining quiet night-time data recorded during quiet geomagnetic activity, with the help of the Kp, Dst, and ring current (RC) indices. The results confirm the occurrence of the 2019-2020 geomagnetic jerk in the region and identify the recent 2021 jerk, detected as V-shaped secular variation changes in the X and Z components at all four observatories. The highest estimated 2021 jerk secular acceleration amplitudes in the X and Z components were found at HBK, at 12.7 nT/year² and 19.1 nT/year², respectively. Notably, the global CHAOS-7 model aptly identifies this 2021 jerk in the Z component at all magnetic observatories in the region.
Keywords: geomagnetic jerks, secular variation, magnetic observatory data, South Atlantic Anomaly
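The detection idea can be sketched numerically: secular variation (SV) is the first time derivative of a field component's monthly means, secular acceleration (SA) the second, and a jerk appears as a V in SV, i.e. a step in SA. The Python sketch below uses a synthetic series with an idealized 2021 jerk; the one-year averaging windows are an assumption, not the authors' exact processing.

```python
import numpy as np

# Synthetic monthly means of the X component (nT): a quadratic trend whose
# curvature flips sign at 2021.0, i.e. an idealized jerk.
t = np.arange(2017.0, 2023.0, 1.0 / 12)
x = 10000 + 30 * (t - 2017) + np.where(t < 2021, 2.0, -4.0) * (t - 2021) ** 2

sv = np.gradient(x, t)   # secular variation, nT/yr
sa = np.gradient(sv, t)  # secular acceleration, nT/yr^2

# Estimate the jerk amplitude as the SA difference across the candidate epoch.
epoch = 2021.0
before = sa[(t > epoch - 1.0) & (t < epoch)].mean()
after = sa[(t >= epoch) & (t < epoch + 1.0)].mean()
print(f"estimated SA jump at {epoch}: {after - before:.1f} nT/yr^2")
```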
Procedia PDF Downloads 73
36764 Generative AI: A Comparison of Conditional Tabular Generative Adversarial Networks and Conditional Tabular Generative Adversarial Networks with Gaussian Copula in Generating Synthetic Data with Synthetic Data Vault
Authors: Lakshmi Prayaga, Chandra Prayaga, Aaron Wade, Gopi Shankar Mallu, Harsha Satya Pola
Abstract:
Synthetic data generated by generative adversarial networks and autoencoders is becoming more common as a way to combat the problem of insufficient data for research purposes. However, generating synthetic data is a tedious task requiring an extensive mathematical and programming background. Open-source platforms such as the Synthetic Data Vault (SDV) and Mostly AI offer platforms that are user-friendly and accessible to non-technical professionals for generating synthetic data to augment existing data for further analysis. The SDV also provides additions to the generic GAN, such as the Gaussian copula. We present the results from two synthetic data sets (CTGAN data and CTGAN with Gaussian copula) generated by the SDV and report the findings. The results indicate that the ROC curves and AUC values for the data generated by adding the Gaussian copula layer are much higher than for the data generated by CTGAN alone.
Keywords: synthetic data generation, generative adversarial networks, conditional tabular GAN, Gaussian copula
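For readers who want to reproduce this kind of comparison, the sketch below fits the two SDV synthesizers to the same table, assuming the SDV 1.x single-table API (CopulaGANSynthesizer is SDV's CTGAN-with-Gaussian-copula variant); the dataset file name is a placeholder.

```python
import pandas as pd
from sdv.metadata import SingleTableMetadata
from sdv.single_table import CopulaGANSynthesizer, CTGANSynthesizer

real = pd.read_csv("research_data.csv")  # placeholder for the study's dataset

metadata = SingleTableMetadata()
metadata.detect_from_dataframe(real)

for cls in (CTGANSynthesizer, CopulaGANSynthesizer):
    synth = cls(metadata)
    synth.fit(real)
    synthetic = synth.sample(num_rows=len(real))
    synthetic.to_csv(f"synthetic_{cls.__name__}.csv", index=False)
```

Each synthetic table can then be used to train the downstream classifier whose ROC/AUC performance is being compared.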
Procedia PDF Downloads 82
36763 Effectiveness of Group Therapy Based on Acceptance and Commitment on Self-Criticism and Coping Mechanism in People with Addiction
Authors: Mohamad Reza Khodabakhsh
Abstract:
Drug use and addiction are major biological, psychological, and social problems. In drug abuse treatment, it is important to pay attention to the personality problems and coping methods of patients. Today, third-wave treatments in psychotherapy emphasize people's awareness and acceptance of feelings, emotions, cognitions, and behaviors instead of challenging cognitions. For this reason, this research investigated the effectiveness of group therapy based on acceptance and commitment on the self-criticism and coping strategies of people with drug use disorder. This was a quasi-experimental study (pre-test/post-test design with a non-equivalent control group). The statistical population included all men with drug use disorder in Mashhad; of these 174 men, 75 were eligible for this research, and 30 of them were selected by convenience sampling and randomly assigned to an experimental and a control group. Gilbert's self-criticism scale was used to measure self-criticism, and Endler and Parker's coping strategies questionnaire was used to measure coping strategies. The therapeutic intervention (treatment based on acceptance and commitment) was administered to the experimental group for eight 90-minute sessions, after which post-tests were taken from both groups, and multivariate analysis of covariance (MANCOVA) was used to analyze the data. The results showed that treatment based on acceptance and commitment significantly reduced self-criticism and improved the coping strategies used by patients with drug use disorder (p < 0.01). Therefore, treatment based on acceptance and commitment has been effective in reducing self-criticism and improving the coping strategies of patients with drug use disorder by teaching clients to accept thoughts and conditions.
Keywords: treatment based on acceptance and commitment, self-criticism, coping strategies, addiction
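As a sketch of the analysis step, a MANCOVA of this design can be run in Python with statsmodels: post-test scores enter as dependent variables, group as the factor, and pre-test scores as covariates. The data below are synthetic stand-ins with hypothetical column names, not the study's measurements.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Synthetic scores: ACT group vs. control, pre and post.
rng = np.random.default_rng(0)
n = 15
df = pd.DataFrame({
    "group": ["act"] * n + ["control"] * n,
    "selfcrit_pre": rng.normal(50, 8, 2 * n),
    "coping_pre": rng.normal(40, 6, 2 * n),
})
df["selfcrit_post"] = (df["selfcrit_pre"]
                       - np.where(df["group"] == "act", 10, 1)
                       + rng.normal(0, 4, 2 * n))
df["coping_post"] = (df["coping_pre"]
                     + np.where(df["group"] == "act", 8, 0)
                     + rng.normal(0, 3, 2 * n))

# MANCOVA: post-test scores as dependent variables, group as the factor,
# pre-test scores as covariates.
m = MANOVA.from_formula(
    "selfcrit_post + coping_post ~ group + selfcrit_pre + coping_pre", data=df)
print(m.mv_test())
```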
Procedia PDF Downloads 88
36762 Facies Sedimentology and Astronomic Calibration of the Reinech Member (Lutetian)
Authors: Jihede Haj Messaoud, Hamdi Omar, Hela Fakhfakh Ben Jemia, Chokri Yaich
Abstract:
The Upper Lutetian alternating marl-limestone succession of the Reineche Member was deposited over a warm, shallow carbonate platform that permitted Nummulites proliferation. High-resolution studies of the 30-meter-thick Nummulites-bearing Reineche Member, cropping out in Central Tunisia (Jebel Siouf), have been undertaken, with regard to its pronounced cyclical sedimentary sequences, in order to investigate the periodicity of the cycles and the related orbital-scale oceanic and climatic changes. The palaeoenvironmental and palaeoclimatic data are preserved in several proxies obtainable through high-resolution sampling and laboratory measurement and analysis, such as magnetic susceptibility (MS) and carbonate content, in conjunction with wireline logging tools. Time series analysis of the proxies permits establishing the cyclicity orders present in the studied intervals, which can be linked to orbital cycles. MS records provide high-resolution proxies for relative sea-level change in Late Lutetian strata. The spectral analysis of MS fluctuations confirmed the orbital forcing through the presence of the complete suite of orbital frequencies: the precession of 23 ka, the obliquity of 41 ka, and notably the two modes of eccentricity of 100 and 405 ka. Based on the two periodic sedimentary cycles detected by wavelet analysis of the proxy fluctuations, which coincide with the long-term 405 ka eccentricity cycle, the Reineche Member spanned 0.8 Myr. Wireline logging tools such as gamma ray and sonic were used as proxies to decipher cyclicity and trends in sedimentation and contribute to identifying and correlating units. They were used to constrain the highest-frequency cyclicity, modulated by a long-wavelength cycle apparently controlled by clay content. Interpreted as the result of variations in carbonate productivity, it has been suggested that the marl-limestone couplets represent the sedimentary response to orbital forcing. The calculation of cycle durations through the Reineche Member serves as a geochronometer and permits the astronomical calibration of the geologic time scale. Furthermore, MS coupled with carbonate content and fossil occurrences provides strong evidence for combined detrital input and marine surface carbonate productivity cycles. These two synchronous processes were driven by the precession index and 'fingerprinted' in the basic marl-limestone couplets, modulated by orbital eccentricity.
Keywords: magnetic susceptibility, cyclostratigraphy, orbital forcing, spectral analysis, Lutetian
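The core of such a cyclostratigraphic workflow, detecting dominant wavelengths in a depth-sampled MS series and converting them to durations via an assumed tuning to the 405 ka eccentricity cycle, can be sketched as follows; the synthetic series and the sampling interval are assumptions, not the study's data.

```python
import numpy as np
from scipy.signal import periodogram

dx = 0.05                      # sampling interval in metres (assumed)
depth = np.arange(0, 30, dx)   # a 30 m section, like the Reineche Member
# Synthetic MS: two sedimentary wavelengths standing in for eccentricity
# and precession, plus noise.
ms = (np.sin(2 * np.pi * depth / 12.0)
      + 0.4 * np.sin(2 * np.pi * depth / 0.7)
      + 0.2 * np.random.default_rng(1).standard_normal(depth.size))

f, p = periodogram(ms - ms.mean(), fs=1.0 / dx)  # f in cycles per metre
top = np.argsort(p)[::-1][:3]
for i in top:
    if f[i] > 0:
        print(f"wavelength {1 / f[i]:.2f} m, power {p[i]:.2f}")

# Tuning the dominant wavelength L to the 405 ka eccentricity cycle gives a
# sedimentation rate L/405 m/kyr, which converts the other spectral peaks to
# durations for comparison with precession and obliquity.
```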
Procedia PDF Downloads 294
36761 Dysfunctional Behavior of External Auditors, The Collision of Time Budget and Time Deadline
Authors: Rabih Nehme, Abdullah Al Mutawa
Abstract:
The general goal behind this research is to gain a better understanding of the factors leading to dysfunctional behavior of auditors. Recent accounting scandals -Enron, Waste Management Inc., WorldCom, Xerox Corporation, etc.- provide ample proof of how the role of auditors has become the basis of controversial debate in many circles and instances in our modern time. The majority of lawsuits and accounting scandals seem to have a central topic in focus, namely the question, 'Where were the auditors?' The survey we offer for research is made up of 34 questions designed to analyse the perceptions of auditors and the causes of dysfunctional behavior. The subjects of this research are auditors positioned and employed at the Big Four audit firms in Kuwait. Dysfunctional behavior (DB) is measured against two signal proxies of dysfunctional behavior: premature sign-off and underreporting of chargeable time. DB is analysed against time budget pressure and time deadline pressure. The research results suggest that the general belief among auditors is that the profession of accountancy predetermines their tendency to commit certain patterns of dysfunctional behavior. Having conducted our investigation at the Big Four audit firms, we conclude that there is a general difference between perceptions of dysfunctional behavior and normal skeptical professional behavior.
Keywords: big four, dysfunctional behavior, time budget, time deadline
Procedia PDF Downloads 471
36760 An Evaluation of Neuropsychiatric Manifestations in Systemic Lupus Erythematosus Patients in Saudi Arabia and Their Associated Factors
Authors: Yousef M. Alammari, Mahmoud A. Gaddoury, Reem A. Almohaini, Sara A. Alharbi, Lena S. Alsaleem, Lujain H. Allowaihiq, Maha H. Alrashid, Abdullah H. Alghamdi, Abdullah A. Alaryni
Abstract:
Objective: The goal of this study was to establish the prevalence of neuropsychiatric manifestations in systemic lupus erythematosus (NPSLE) patients in Saudi Arabia and the variables linked to them. Methods: During June 2021, this cross-sectional study was carried out among SLE patients in Saudi Arabia. The Saudi Rheumatism Association used social media platforms to provide a self-administered online questionnaire to SLE patients. All data analyses were performed using the Statistical Package for the Social Sciences (SPSS) version 26. Results: Two hundred and five SLE patients participated in the study (females 91.3% vs. males 8.7%). In addition, 13.5% of patients had a family history of SLE, and 26% had had SLE for one to three years. Alteration or loss of sensation (53.4%), fear (52.4%), and headache (48.1%) were the most prevalent neuropsychiatric symptoms in NPSLE patients. The prevalence of patients with NPSLE was 40%. In a multivariate regression model, fear, altered sensations, cerebrovascular illness, sleep disruption, and diminished interest in routine activities were identified as independent risk variables for NPSLE. Conclusion: Nearly half of SLE patients demonstrated NP manifestations, with significant symptoms including fear, alteration of sensation, cerebrovascular disease, sleep disturbance, and reduced interest in normal activities. To elucidate the pathophysiology of NPSLE, it is necessary to understand the relationship between neuropsychiatric morbidity and other relevant rheumatic disorders in the SLE population.
Keywords: neuropsychiatric, systemic lupus erythematosus, NPSLE, prevalence, SLE patients
Procedia PDF Downloads 75
36759 Forecasting Unemployment Rate in Selected European Countries Using Smoothing Methods
Authors: Ksenija Dumičić, Anita Čeh Časni, Berislav Žmuk
Abstract:
The aim of this paper is to select the most accurate forecasting method for predicting the future values of the unemployment rate in selected European countries. In order to do so, several forecasting techniques adequate for forecasting time series with a trend component were selected, namely: double exponential smoothing (also known as Holt's method) and the Holt-Winters method, which accounts for trend and seasonality. The results of the empirical analysis showed that the optimal model for forecasting the unemployment rate in Greece was the Holt-Winters additive method. In the case of Spain, according to MAPE, the optimal model was the double exponential smoothing model. Furthermore, for Croatia and Italy, the best forecasting model for the unemployment rate was the Holt-Winters multiplicative model, whereas in the case of Portugal, the best model to forecast the unemployment rate was the double exponential smoothing model. Our findings are in line with European Commission unemployment rate estimates.
Keywords: European Union countries, exponential smoothing methods, forecast accuracy, unemployment rate
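A minimal Python sketch of the model comparison described here, using statsmodels' Holt and Holt-Winters implementations and MAPE on a 12-month holdout, is shown below; the series is a synthetic stand-in for the unemployment data used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing, Holt

# Synthetic monthly unemployment-like series with trend and seasonality.
idx = pd.date_range("2005-01", periods=120, freq="MS")
rng = np.random.default_rng(0)
y = pd.Series(10 + 0.03 * np.arange(120)
              + 1.2 * np.sin(2 * np.pi * np.arange(120) / 12)
              + rng.normal(0, 0.2, 120), index=idx)

train, test = y[:-12], y[-12:]
models = {
    "double exp. smoothing (Holt)": Holt(train).fit(),
    "Holt-Winters additive": ExponentialSmoothing(
        train, trend="add", seasonal="add", seasonal_periods=12).fit(),
    "Holt-Winters multiplicative": ExponentialSmoothing(
        train, trend="add", seasonal="mul", seasonal_periods=12).fit(),
}
for name, fit in models.items():
    mape = np.mean(np.abs((test.values - fit.forecast(12).values) / test.values)) * 100
    print(f"{name}: MAPE = {mape:.2f}%")
```

The model with the lowest MAPE on the holdout is retained for forecasting, which is exactly the selection criterion the paper applies per country.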
Procedia PDF Downloads 369
36758 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform
Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu
Abstract:
Given a fixed fund, purchasing fewer hosts of higher capability or, inversely, more hosts of lower capability is an unavoidable trade-off in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing involves SQL queries with aggregates, joins, and space-time condition selections executed upon massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of candidate host clusters for the intended typical big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem, HDFS+Hive+Spark. A suitable metric was defined to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on three kinds of typical SQL query tasks. Tests were conducted with respect to the factors of CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing.
Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance predicting formula, typical SQL query tasks
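The abstract does not give the empirical formula itself, so the sketch below only illustrates the regression approach under an assumed power-law form: query time is regressed, in log-log space, on CPU benchmark, memory, and host count, and the fit is used to score a candidate configuration. The numbers are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical benchmark records: one row per tested cluster configuration.
# Features: CPU benchmark score, memory (GB) per host, number of physical hosts.
X = np.array([[820, 64, 4], [820, 128, 4], [1100, 64, 6],
              [1100, 128, 6], [1400, 128, 8], [1400, 256, 8]], dtype=float)
y = np.array([96.0, 81.0, 62.0, 54.0, 41.0, 36.0])  # mean query time (s)

# Assumed form: t = exp(b0) * cpu^b1 * mem^b2 * hosts^b3, fitted in log space.
model = LinearRegression().fit(np.log(X), np.log(y))
print("exponents:", model.coef_, "intercept:", model.intercept_)

# Predicted query time for a candidate configuration within the fixed fund.
cand = np.log(np.array([[1100, 256, 10]]))
print("predicted time:", np.exp(model.predict(cand))[0], "s")
```

Evaluating the fitted formula over all configurations affordable at the fixed fund then points to the optimal purchase.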
Procedia PDF Downloads 232
36757 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real coliving case study: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms; the first step is to curate the data and understand what insights each device can provide according to the objectives of the study. This generates an analysis framework to be scaled for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
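As a sketch of the data-curation step, device logs can be reduced to an occupancy proxy with a few lines of pandas; the event-log schema below (timestamp, floor, user_id) is a hypothetical stand-in for a smart-lock export.

```python
import pandas as pd

# Hypothetical smart-lock events: one row per door opening.
events = pd.DataFrame({
    "timestamp": pd.to_datetime(["2022-03-01 08:05", "2022-03-01 08:40",
                                 "2022-03-01 09:10", "2022-03-01 09:55"]),
    "floor": [1, 1, 2, 1],
    "user_id": ["u01", "u02", "u01", "u01"],
})

# Hourly occupancy proxy: unique users seen per hour and floor.
hourly = (events.set_index("timestamp")
                .groupby("floor")
                .resample("1h")["user_id"]
                .nunique())
print(hourly)
```

The same pattern extends to Wi-Fi association logs or electrical-sensor readings, each contributing its own occupancy or comfort proxy to the framework.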
Procedia PDF Downloads 197
36756 A Series of Teaching Modules to Prepare International Students for Real-World China
Authors: Jui-Chien Wang
Abstract:
Because of China’s continued economic growth and dominance, increasingly many students of Chinese from western countries are interested in pursuing careers related to China. Unless we do more to teach them about contemporary Chinese society and Chinese cultural codes, however, few will be able to do so successfully. Most traditional language textbooks treat these topics only cursorily, and, because of the rapid pace of China’s social and economic development, what they do cover is frequently outdated and insufficient. However, understanding contemporary Chinese society and Chinese cultural codes is essential to successfully negotiating real-world China. The current paper details one of the main ways in which the presenter has dealt with this educational lacuna: the development and implementation of a series of teaching modules for advanced Chinese language classes. Each module explores a particular area, provides resources, and raises questions to engage students in strengthening their language and cultural competencies. The teaching modules address four main areas: (1) Chinese behavioral culture; (2) critical issues in contemporary China; (3) current events in China; and (4) great social transformations in contemporary China. The presenter will also discuss lessons learned and insights gained during the development and implementation process as well as the benefits of using these modules. In addition, the presenter will offer suggestions for the application of these modules, so that other language teachers will be able to make better use of them in their own classrooms.
Keywords: behavioral culture, contemporary Chinese society, cultural code, teaching module
Procedia PDF Downloads 266
36755 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to the evolving patterns of fraud in recent times. Against this background, the present study probes distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, like gradient boosting, random forests, and neural networks. To this end, we recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. Using actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability against conventional models. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain. They underscore the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
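The GAN and graph-network components are beyond a short sketch, but the baseline comparison the study describes, conventional ensemble classifiers scored by AUC on imbalanced synthetic transactions, can be illustrated as follows.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, highly imbalanced "transactions" (~1% fraud) standing in for
# the study's generated data.
X, y = make_classification(n_samples=20_000, n_features=20, weights=[0.99],
                           flip_y=0.01, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for model in (GradientBoostingClassifier(random_state=0),
              RandomForestClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{type(model).__name__}: AUC = {auc:.3f}")
```

In the integrated system the abstract proposes, these baselines would sit alongside GAN- and GNN-based detectors, with AUC, latency, and drift-adaptability compared across all of them.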
Procedia PDF Downloads 42
36754 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts
Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár
Abstract:
The dynamic conditional correlation model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices. It has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model which incorporates high and low prices into the dynamic conditional correlation framework. Empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. The utilisation of realized variances and covariances as proxies for the true variances and covariances allows us to reach the strong conclusion that our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting
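The range-based ingredient of such a model can be illustrated with the Parkinson (1980) estimator, which converts the daily high-low range into a variance estimate; a minimal sketch with invented prices follows. The full model embeds ranges in a GARCH recursion inside the DCC framework, which is beyond this sketch.

```python
import numpy as np

def parkinson_var(high, low):
    """Parkinson (1980) variance estimate from the daily high-low range."""
    return (np.log(high / low) ** 2) / (4.0 * np.log(2.0))

# Invented daily prices for one asset.
high = np.array([101.2, 102.5, 101.9, 103.1])
low = np.array([99.4, 100.8, 100.2, 101.6])
close = np.array([100.5, 102.0, 101.0, 102.8])

r = np.diff(np.log(close))  # close-to-close log returns
print("close-based variance:", r.var(ddof=1))
print("range-based variance:", parkinson_var(high, low).mean())
```

Because the range uses intraday information, the range-based estimate is markedly less noisy than the squared close-to-close return, which is what motivates feeding it into the DCC recursion.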
Procedia PDF Downloads 183
36753 Performance Enhancement of Hybrid Racing Car by Design Optimization
Authors: Tarang Varmora, Krupa Shah, Karan Patel
Abstract:
Environmental pollution and the shortage of conventional fuel are the main concerns in the transportation sector. Most vehicles use an internal combustion engine (ICE) powered by gasoline fuels, which results in the emission of toxic gases. A hybrid electric vehicle (HEV), powered by an electric machine and an ICE, is capable of reducing both toxic emissions and fuel consumption. However, to build an HEV, the motor and batteries must be accommodated in the vehicle along with the engine and fuel tank, so the overall weight of the vehicle increases. To improve fuel economy and acceleration, the weight of the HEV can be minimized. In this paper, a design methodology to reduce the weight of a hybrid racing car is proposed. To this end, the chassis design is optimized. Further, an attempt is made to obtain maximum strength with minimum material weight. The best configuration out of the three main configurations, series, parallel, and dual-mode (series-parallel), is chosen. Moreover, the most suitable type of motor, battery, braking system, steering system, and suspension system are identified. The racing car is designed and analyzed in simulation software. The safety of the vehicle is assured by performing static and dynamic analyses of the chassis frame. From the results, it is observed that the weight of the racing car is reduced by 11% without compromising safety or cost. It is believed that the proposed design and specifications can be implemented practically for manufacturing a hybrid racing car.
Keywords: design optimization, hybrid racing car, simulation, vehicle, weight reduction
Procedia PDF Downloads 294
36752 The System-Dynamic Model of Sustainable Development Based on the Energy Flow Analysis Approach
Authors: Inese Trusina, Elita Jermolajeva, Viktors Gopejenko, Viktor Abramov
Abstract:
Global challenges require a transition from the existing linear economic model to a model that considers nature as a life-support system for development towards social well-being, within the paradigm of ecological economics. The objective of the article is to present the results of an analysis of socio-economic systems in the context of sustainable development, using the method of analyzing changes in system power (energy flows) and Kaldor's structural model of GDP. In accordance with the principles of the development of life and the ecological concept, the tasks of sustainable development of open, non-equilibrium, stable socio-economic systems were formalized using the energy flow analysis method. The methodology for monitoring sustainable development and the standard of living was considered in the study of the interactions in the system 'human - society - nature', using the theory of a unified system of space-time measurements. Based on the results of the analysis, time series of energy consumption and an economic structural model were formulated for the level, degree, and tendencies of sustainable development of the system, and the conditions of growth, degrowth, and stationarity were formalized. During the research, the authors calculated and used a system of universal indicators of sustainable development in an invariant coordinate system in energy units. In order to design the future state of socio-economic systems, a concept was formulated, and the first models of energy flows in systems were created using the tools of system dynamics. In the context of the proposed approach and methods, universal sustainable development indicators were calculated as models of development for the USA and China. The calculations used data from the World Bank database for the period from 1960 to 2019. Main results: 1) In accordance with the proposed approach, the heterogeneous energy resources of countries were reduced to universal power units, summarized, and expressed as a unified number. 2) The values of universal indicators of the standard of living were obtained and compared with generally accepted similar indicators. 3) The system of indicators, in accordance with the requirements of sustainable development, can be considered a basis for monitoring development trends. This work can make a significant contribution to overcoming the difficulties of forming socio-economic policy, which are largely due to the lack of information that would give an idea of the course and trends of socio-economic processes. Existing methods for monitoring change do not fully meet this requirement, since the indicators have different units of measurement from different areas and, as a rule, are the reaction of socio-economic systems to actions already taken, moreover with a time lag. Currently, the incommensurability of measures of heterogeneous social, economic, environmental, and other systems is the reason that social systems are managed in isolation from the general laws of living systems, which can ultimately lead to a systemic crisis.
Keywords: sustainability, system dynamic, power, energy flows, development
Procedia PDF Downloads 58
36751 The AI Method and System for Analyzing Wound Status in Wound Care Nursing
Authors: Ho-Hsin Lee, Yue-Min Jiang, Shu-Hui Tsai, Jian-Ren Chen, Mei-Yu Xu, Wen-Tien Wu
Abstract:
This project presents an AI-based method and system for wound status analysis. The system uses a three-in-one sensor device to analyze wound status, combining color and temperature sensing with a 3D sensor that provides wound information up to 2 mm below the surface, such as redness, heat, and blood circulation. The system has a 90% accuracy rate, requiring only one manual correction in 70% of cases, with a one-second delay. The system also provides an offline application that allows manual correction of the wound bed range using color-based guidance, estimating wound bed size with 96% accuracy and at most one manual correction in 96% of cases, with a one-second delay. Additionally, AI-assisted wound bed range selection handles 100% of cases without manual intervention, with an accuracy rate of 76%, while AI-based wound tissue type classification achieves an 85.3% accuracy rate across five categories. The AI system also includes similar-case search and expert recommendation capabilities. For AI-assisted wound range selection, the system uses WiFi 6 technology, increasing data transmission speeds by 22 times. The project aims to save up to 64% of the time required for manual wound record keeping and to reduce the estimated time to assess wound status by 96%, with an 80% accuracy rate. Overall, the proposed AI method and system integrate multiple sensors to provide accurate wound information and offer offline and online AI-assisted wound bed size estimation and wound tissue type classification. The system decreases the delay time to one second, reduces the number of manual corrections required, saves time on wound record keeping, and increases data transmission speed, all of which have the potential to significantly improve wound care and management efficiency and accuracy.
Keywords: wound status analysis, AI-based system, multi-sensor integration, color-based guidance
Procedia PDF Downloads 115
36750 The Optical OFDM Equalization Based on the Fractional Fourier Transform
Authors: A. Cherifi, B. S. Bouazza, A. O. Dahman, B. Yagoubi
Abstract:
Transmission over optical channels introduces inter-symbol interference (ISI) as well as inter-channel (or inter-carrier) interference (ICI). To decrease the effects of ICI, this paper proposes an equalizer for the optical OFDM system based on the fractional Fourier transform (FrFT). In this FrFT-OFDM system, the traditional Fourier transform is replaced by the fractional Fourier transform to modulate and demodulate the data symbols. The proposed equalizer samples the received signal at different time instants within each symbol period. Theoretical analysis and numerical simulation are discussed.
Keywords: OFDM, fractional fourier transform, internet and information technology
Procedia PDF Downloads 406
36749 Reliability Modeling of Repairable Subsystems in Semiconductor Fabrication: A Virtual Age and General Repair Framework
Authors: Keshav Dubey, Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta
Abstract:
In the semiconductor capital equipment industry, effective modeling of repairable system reliability is crucial for optimizing maintenance strategies and ensuring operational efficiency. However, repairable-system reliability modeling using a renewal process is not as popular in the semiconductor equipment industry as it is in the locomotive and automotive industries, and utilizing this approach will help optimize maintenance practices. This paper presents a structured framework that leverages both parametric and non-parametric approaches to model the reliability of repairable subsystems based on operational data, maintenance schedules, and system-specific conditions. Data are organized at the equipment-ID level, facilitating trend testing to uncover failure patterns and system degradation over time. For non-parametric modeling, the Mean Cumulative Function (MCF) approach is applied, offering a flexible method to estimate the cumulative number of failures over time without assuming an underlying statistical distribution. This allows for empirical insights into subsystem failure behavior based on historical data. On the parametric side, virtual age modeling, along with Homogeneous and Non-Homogeneous Poisson Process (HPP and NHPP) models, is employed to quantify the effect of repairs and the aging process on subsystem reliability. These models allow for a more structured analysis by characterizing repair effectiveness and system wear-out trends over time. A comparison of various Generalized Renewal Process (GRP) approaches highlights their utility in modeling different repair-effectiveness scenarios. These approaches provide a robust framework for assessing the impact of maintenance actions on system performance and reliability. By integrating both parametric and non-parametric methods, this framework offers a comprehensive toolset for reliability engineers to better understand equipment behavior, assess the effectiveness of maintenance activities, and make data-driven decisions that enhance system availability and operational performance in semiconductor fabrication facilities.
Keywords: reliability, maintainability, homogeneous Poisson process, repairable system
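The non-parametric MCF estimate mentioned above is straightforward to compute: at each failure age, the MCF increases by one divided by the number of systems still under observation at that age. A minimal sketch with toy failure histories follows.

```python
import numpy as np

def mcf(failure_times, censor_times):
    """Nonparametric Mean Cumulative Function for repairable systems.

    failure_times: list of per-system sequences of failure ages (hours)
    censor_times:  per-system observation-end ages (hours)
    """
    events = sorted(t for sys_fails in failure_times for t in sys_fails)
    censor_times = np.asarray(censor_times)
    ages, values, cum = [], [], 0.0
    for t in events:
        at_risk = np.sum(censor_times >= t)  # systems still under observation
        cum += 1.0 / at_risk
        ages.append(t)
        values.append(cum)
    return np.array(ages), np.array(values)

# Toy data: three tools, each with its own failure history and end of window.
fails = [[120, 340, 600], [200, 810], [95, 400, 770]]
ends = [900, 850, 1000]
age, m = mcf(fails, ends)
print(np.column_stack([age, m]))
```

A concave-up MCF indicates wear-out and a concave-down one improvement, which is the trend test that guides the subsequent choice between HPP, NHPP, and virtual-age models.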
Procedia PDF Downloads 19
36748 Association between Severe Acidemia before Endotracheal Intubation and the Lower First Attempt Intubation Success Rate
Authors: Keiko Naito, Y. Nakashima, S. Yamauchi, Y. Kunitani, Y. Ishigami, K. Numata, M. Mizobe, Y. Homma, J. Takahashi, T. Inoue, T. Shiga, H. Funakoshi
Abstract:
Background: The presence of severe acidemia, defined as pH < 7.2, is common during endotracheal intubation of critically ill patients in the emergency department (ED). Severe acidemia is widely recognized as a predisposing factor for intubation failure. However, it is unclear whether the acidemic condition itself actually makes endotracheal intubation more difficult. We aimed to evaluate whether the presence of severe acidemia before intubation is associated with a lower first-attempt intubation success rate in the ED. Methods: This is a retrospective observational cohort study in the ED of an urban hospital in Japan. The collected data included patient demographics, such as age, sex, and body mass index, the presence of one or more factors of the modified LEMON criteria for predicting difficult intubation, reasons for intubation, blood gas levels, airway equipment, intubation by an emergency physician or not, and the use of the rapid sequence intubation technique. Those with any of the following were excluded from the analysis: (1) no blood gas drawn before intubation, (2) cardiopulmonary arrest, and (3) under 18 years of age. The primary outcome was the first-attempt intubation success rate in a severely acidemic patient (SA) group versus a non-severely acidemic patient (NA) group. Logistic regression analysis was used to test the first-attempt success rates for intubations between those two groups. Results: Over 5 years, a total of 486 intubations were performed: 105 in the SA group and 381 in the NA group. The univariate analysis showed that the first-attempt intubation success rate was lower in the SA group than in the NA group (71.4% vs 83.5%, p < 0.01). The multivariate logistic regression analysis identified that severe acidemia was significantly associated with first-attempt intubation failure (OR 1.9, 95% CI 1.03-3.68, p = 0.04). Conclusions: The presence of severe acidemia before endotracheal intubation lowers the first-attempt intubation success rate in the ED.
Keywords: acidemia, airway management, endotracheal intubation, first-attempt intubation success rate
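As a sketch of the multivariate step, the adjusted odds ratio for severe acidemia can be obtained from a logistic regression on first-attempt failure. The data below are synthetic (generated so that the acidemia coefficient happens to match the reported OR of 1.9) and the covariate names are hypothetical stand-ins for the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the registry: one row per first intubation attempt.
rng = np.random.default_rng(0)
n = 486
df = pd.DataFrame({
    "severe_acidemia": rng.binomial(1, 105 / 486, n),
    "difficult_airway": rng.binomial(1, 0.3, n),
    "rsi": rng.binomial(1, 0.7, n),
})
logit_p = (-1.6 + 0.64 * df.severe_acidemia
           + 0.8 * df.difficult_airway - 0.5 * df.rsi)
df["failure"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("failure ~ severe_acidemia + difficult_airway + rsi",
                  data=df).fit(disp=0)
print(np.exp(model.params).rename("OR"))  # exp(0.64) ~ 1.9 for acidemia
print(np.exp(model.conf_int()))           # 95% CIs on the OR scale
```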
Procedia PDF Downloads 248
36747 Comparison of Gait Variability in Individuals with Trans-Tibial and Trans-Femoral Lower Limb Loss: A Pilot Study
Authors: Hilal Keklicek, Fatih Erbahceci, Elif Kirdi, Ali Yalcin, Semra Topuz, Ozlem Ulger, Gul Sener
Abstract:
Objectives and Goals: The stride-to-stride fluctuation in gait, known as gait variability, is a determinant of locomotion quality. Gait variability is an important predictor of fall risk and is useful for monitoring the effects of therapeutic interventions and rehabilitation. The aim of the study was to compare gait variability in individuals with trans-tibial and trans-femoral lower limb loss. Methods: Ten individuals with traumatic unilateral trans-femoral limb loss (TF), 12 individuals with traumatic trans-tibial lower limb loss (TT), and 12 healthy individuals (HI) participated in the study. Gait characteristics, including mean step length, step length variability, ambulation index, and time on each foot, were evaluated on a treadmill. Participants walked at their preferred speed for six minutes. Data from the 4th to the 6th minute were selected for statistical analyses to eliminate the learning effect. Results: There were differences between the groups in intact-limb step length variation, time on each foot, ambulation index, and mean age (p < .05) according to the Kruskal-Wallis test. Pairwise analyses showed differences between the TT and TF groups in residual-limb variation (p = .041), time on the intact foot (p = .024), time on the prosthetic foot (p = .024), and ambulation index (p = .003) in favor of the TT group. There were differences between the TT and HI groups in intact-limb variation (p = .002), time on the intact foot (p < .001), time on the prosthetic foot (p < .001), and ambulation index (p < .001) in favor of the HI group. There were differences between the TF and HI groups in intact-limb variation (p = .001), time on the intact foot (p = .01), and ambulation index (p < .001) in favor of the HI group. The groups differed in mean age, as the HI group was younger (p < .05). The groups were similar in step length (p > .05) and, among the individuals with lower limb loss, in duration of prosthesis use (p > .05). Conclusions: This pilot study provided basic data about gait stability in individuals with traumatic lower limb loss. The results showed that, to evaluate gait differences between different amputation levels, long-range gait analysis methods may be useful for obtaining more valuable information. On the other hand, the similarity in step length may result from effective prosthetic use or effective gait rehabilitation; notably, all participants with lower limb loss had already been trained. The differences between the TT and HI, and TF and HI, groups may result from age-related features; therefore, an age-matched HI population is recommended for future studies. Increasing the number of participants and comparing age-matched groups are also recommended to generalize these results.
Keywords: lower limb loss, amputee, gait variability, gait analyses
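Step length variability of the kind reported here is commonly expressed as a coefficient of variation across strides; a minimal sketch follows, assuming CV is the variability measure, which the abstract does not state explicitly, and using invented step lengths.

```python
import numpy as np

def variability_cv(series):
    """Coefficient of variation (%) of a stride-parameter series."""
    series = np.asarray(series, dtype=float)
    return 100.0 * series.std(ddof=1) / series.mean()

# Hypothetical intact-limb step lengths (cm) from minutes 4-6 of the walk.
steps = [61.2, 60.8, 62.0, 59.9, 61.5, 60.4]
print(f"step length variability: {variability_cv(steps):.2f}% CV")
```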
Procedia PDF Downloads 280
36746 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy and precision, low acquisition time, and high efficiency in capturing intricate geological features in small- to medium-size outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data were then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their good conformity with the actual field data.
Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
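Once the discontinuity parameters have been extracted from the scans, RMR is obtained by summing the individual parameter ratings, and a GSI estimate can follow from a common empirical conversion. The sketch below assumes RMR89 ratings and the GSI ≈ RMR89 − 5 relation (generally quoted for RMR89 > 23 with the dry groundwater rating and no orientation adjustment); the rating values are invented.

```python
def rmr89(strength, rqd, spacing, condition, groundwater, orientation_adj=0):
    """RMR89: sum of the five Bieniawski parameter ratings plus the
    (zero or negative) joint-orientation adjustment."""
    return strength + rqd + spacing + condition + groundwater + orientation_adj

# Hypothetical ratings read off the scanner-derived discontinuity data.
rmr = rmr89(strength=7, rqd=17, spacing=10, condition=20, groundwater=15)
gsi = rmr - 5  # common conversion under the stated assumptions
print("RMR =", rmr, "-> GSI estimate:", gsi)
```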
Procedia PDF Downloads 66
36745 Objective Evaluation on Medical Image Compression Using Wavelet Transformation
Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah
Abstract:
The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image-generating techniques is vast. This data can be a problem from a storage point of view or when sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program was designed to evaluate the medical image storage and transmission time problem at Sebha Medical Center, Libya. In this paper, three different computed tomography images, of the abdomen, brain, and chest, were selected and compressed using the wavelet transform. An objective evaluation was performed to measure the quality of the compressed images. The results show that the Peak Signal to Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is achievable with acceptable quality.
Keywords: medical image, Matlab, image compression, wavelets, objective evaluation
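The study's MATLAB implementation is not reproduced in the abstract; the Python/PyWavelets sketch below shows the generic recipe instead: decompose, discard small coefficients, reconstruct, and score with PSNR. The wavelet, level, keep-fraction, and the random stand-in for a CT slice are all assumptions.

```python
import numpy as np
import pywt

def compress(img, wavelet="bior4.4", level=3, keep=0.05):
    """Zero all but the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)

def psnr(orig, recon):
    recon = recon[:orig.shape[0], :orig.shape[1]]
    mse = np.mean((orig.astype(float) - recon) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

# Random array standing in for a CT slice; real slices compress far better.
img = np.random.default_rng(0).integers(0, 256, (256, 256)).astype(float)
print("PSNR:", round(psnr(img, compress(img)), 2), "dB")
```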
Procedia PDF Downloads 285
36744 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. This research integrates several datasets, which cuts down on query processing time and creates predictive visual artifacts, making it possible to analyze post-COVID-19 health and well-being outcomes and to evaluate the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Spreading the datasets across a sharded database and indexing the individual shards enables effective data retrieval and analysis. The key goal is to analyze the connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being
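In pymongo, the indexing and sharding steps described here reduce to an index build plus the standard enableSharding/shardCollection admin commands, which must be issued against a mongos router in a sharded deployment; the database, collection, and key names below are illustrative, not the study's schema.

```python
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # a mongos router
db = client["covid_analytics"]

# Index the fields the analytical queries filter on.
db.outcomes.create_index([("country", ASCENDING), ("reported_date", ASCENDING)])

# Shard the collection on a compound key so related documents stay together
# while writes and reads spread across shards.
client.admin.command("enableSharding", "covid_analytics")
client.admin.command("shardCollection", "covid_analytics.outcomes",
                     key={"country": 1, "reported_date": 1})
```

Choosing a shard key that matches the dominant query predicates is what lets the router target a single shard instead of broadcasting, which is where the query-time reduction comes from.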
Procedia PDF Downloads 70
36743 A Privacy Protection Scheme Supporting Fuzzy Search for NDN Routing Cache Data Name
Authors: Feng Tao, Ma Jing, Guo Xian, Wang Jing
Abstract:
Named Data Networking (NDN) replaces the IP addresses of traditional networks with data names and adopts a dynamic cache mechanism. In the existing mechanism, however, only one-to-one search can be achieved, because every data item has a unique name corresponding to it. There is a certain mapping relationship between data content and data name, so if the data name is intercepted by an adversary, the privacy of the data content and of the user's interest can hardly be guaranteed. In order to solve this problem, this paper proposes a one-to-many fuzzy search scheme based on order-preserving encryption, which reduces the query overhead by optimizing the caching strategy. In this scheme, we use hash values to keep the user's query safe from each node in the search process, and likewise to protect the privacy of the requested data content.
Keywords: NDN, order-preserving encryption, fuzzy search, privacy
Procedia PDF Downloads 484
36742 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims at production machines that automatically perceive cutting data and alter cutting parameters. The two most important parameters to be controlled by the machine control unit are the feed rate and the speeds. These parameters are to be controlled based on the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data are received and converted by Matlab software into numerical values. According to these values, the feed rate and speeds are decreased or increased at a certain rate until the optimum sound is acquired. The cutting process is then carried out with the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of material being cut, the cutting parameters, and the machine used all affect various quantities. Instead of measuring quantities such as temperature, vibration, and tool wear that emerge during the cutting process, a detailed analysis of the sound emitted during cutting provides detection of the various data involved in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is thereby identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
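A minimal sketch of such a sound-based control loop is given below: a band energy is computed from each audio frame's spectrum and compared with the energy of the recorded optimum sound, nudging the feed rate up or down. The frequency band, tolerance, and step size are assumptions, not the study's calibrated values.

```python
import numpy as np

RATE = 44_100        # audio sampling rate, Hz
TARGET = 1.0e6       # band energy of the recorded optimum sound (assumed)

def band_energy(frame, rate, lo, hi):
    """Energy of the frame's spectrum between lo and hi Hz."""
    spec = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), 1.0 / rate)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

def adjust_feed(frame, feed, step=0.02):
    """Nudge the feed rate toward the optimum-sound band energy."""
    e = band_energy(frame, RATE, 2_000, 8_000)  # band chosen as an assumption
    if e > 1.1 * TARGET:    # cutting too aggressively -> back off
        return feed * (1.0 - step)
    if e < 0.9 * TARGET:    # under-loaded -> speed up
        return feed * (1.0 + step)
    return feed

frame = np.random.default_rng(0).standard_normal(4096)  # stand-in audio frame
print("new feed rate:", round(adjust_feed(frame, 100.0), 1), "mm/min")
```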
Procedia PDF Downloads 334
36741 Temperature-Dependent Post-Mortem Changes in Human Cardiac Troponin-T (cTnT): An Approach in Determining Postmortem Interval
Authors: Sachil Kumar, Anoop Kumar Verma, Wahid Ali, Uma Shankar Singh
Abstract:
Globally, approximately 55.3 million people die each year. In India, there were 95 lakh annual deaths in 2013, and the number of deaths resulting from homicides, suicides, and unintentional injuries in the same period was about 5.7 lakh. The ever-increasing crime rate necessitates the development of methods for determining time since death. An erroneous time-of-death window can lead investigators down the wrong path or possibly focus a case on an innocent suspect. In this regard, research was carried out analyzing the temperature-dependent postmortem degradation of the cardiac troponin-T (cTnT) protein in the myocardium as a marker for time since death. Cardiac tissue samples were collected from (n=6) medico-legal autopsies (in the Department of Forensic Medicine and Toxicology, King George's Medical University, Lucknow, India) after informed consent from the relatives, and post-mortem degradation was studied by incubation of the cardiac tissue at room temperature (20±2 °C), 12 °C, 25 °C, and 37 °C for different time periods (~5, 26, 50, 84, 132, 157, 180, 205, and 230 hours). The cases included were subjects of road traffic accidents (RTA) without any prior history of disease who died in the hospital, and their exact time of death was known. The analysis involved extraction of the protein, separation by denaturing gel electrophoresis (SDS-PAGE), and visualization by Western blot using cTnT-specific monoclonal antibodies. The area of the bands within a lane was quantified by scanning and digitizing the image using a Gel Doc system. The data show a distinct temporal profile corresponding to the degradation of cTnT by proteases found in cardiac muscle. The disappearance of intact cTnT and the appearance of lower-molecular-weight bands are easily observed. The Western blot data clearly showed the intact protein at 42 kDa, two major fragments (27 kDa, 10 kDa), additional minor fragments (32 kDa), and the formation of low-molecular-weight fragments as time increases. At 12 °C, the intensity of the intact cTnT band decreased more gradually than at room temperature, 25 °C, and 37 °C. Overall, both PMI and temperature had a statistically significant effect, and the greatest amount of protein breakdown was observed within the first 38 h and at the highest temperature, 37 °C. The combination of high temperature (37 °C) and long postmortem interval (105.15 hrs) had the most drastic effect on the breakdown of cTnT. If the percent intact cTnT is calculated from the total area integrated within a Western blot lane, then the percent intact cTnT shows a pseudo-first-order relationship when plotted against the log of the time postmortem. These plots show a good coefficient of correlation, r = 0.95 (p = 0.003), for the regression of the human heart under the different temperature conditions. The data presented demonstrate that this technique can provide an extended time range during which the postmortem interval can be more accurately estimated.
Keywords: degradation, postmortem interval, proteolysis, temperature, troponin
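The pseudo-first-order calibration described here amounts to regressing percent intact cTnT on the logarithm of time and inverting the fit to estimate PMI from a new measurement; the sketch below uses invented densitometry values, not the study's data.

```python
import numpy as np
from scipy.stats import linregress

# Invented densitometry readings at 37 C: % intact cTnT vs. hours postmortem.
hours = np.array([5, 26, 50, 84, 132, 157, 180, 205, 230], dtype=float)
pct_intact = np.array([92, 71, 55, 38, 22, 17, 13, 10, 8], dtype=float)

fit = linregress(np.log10(hours), pct_intact)
print(f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.4f}")

# Inverting the fitted line turns a measured % intact value into a PMI estimate.
measured = 30.0
pmi_hours = 10 ** ((measured - fit.intercept) / fit.slope)
print(f"estimated PMI at {measured}% intact: {pmi_hours:.0f} h")
```

In practice a separate calibration line would be fitted per storage temperature, since the abstract reports a strong temperature effect on the degradation rate.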
Procedia PDF Downloads 386
36740 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator
Authors: Gabrielle Holt
Abstract:
This paper seeks to demonstrate the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The study was completed using manually input data in the Barrett Toric Calculator, recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion. That data is then compared to the number of minutes taken by the Schulte Holt ITCP to complete the calculations, also using the Barrett method, as well as the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a major advantage for the Schulte Holt ITCP, which notably reduces the time spent doing toric calculations as well as the number of errors. With the ever-growing number of cataract surgeries taking place around the world and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way forward to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.
Keywords: Toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett
Procedia PDF Downloads 94
36739 Modeling and Performance Evaluation of an Urban Corridor under Mixed Traffic Flow Condition
Authors: Kavitha Madhu, Karthik K. Srinivasan, R. Sivanandan
Abstract:
Indian traffic can be considered mixed and heterogeneous due to the presence of various types of vehicles that operate with weak lane discipline. Consequently, vehicles can position themselves anywhere in the traffic stream depending on the availability of gaps. The choice of lateral positioning is an important component in representing and characterizing mixed traffic. The field data provide evidence that the trajectories of vehicles on Indian urban roads have significantly varying longitudinal and lateral components. Further, the notion of headway, which is widely used for homogeneous traffic simulation, is not well defined in conditions lacking lane discipline. From field data, it is clear that following is not as strict as in homogeneous, lane-disciplined conditions, and neighbouring vehicles ahead of a given vehicle, as well as those adjacent to it, can also influence the subject vehicle's choice of position, speed, and acceleration. Given these empirical features, the suitability of using headway distributions to characterize mixed traffic in Indian cities is questionable, and the approach needs to be modified appropriately. To address these issues, this paper attempts to analyze the time gap distribution between consecutive vehicles (in a time sense) crossing a section of roadway. More specifically, to characterize the complex interactions noted above, the influence of composition, manoeuvre type, and lateral placement characteristics on the time gap distribution is quantified in this paper. The developed model is used for evaluating various performance measures such as link speed, midblock delay, and intersection delay, which further helps to characterize vehicular fuel consumption and emissions on the urban roads of India. Identifying and analyzing the exact interactions between the various classes of vehicles in the traffic stream is essential for increasing the accuracy and realism of microscopic traffic flow modelling. In this regard, this study aims to develop and analyze time gap distribution models, quantified by lead-lag pair, manoeuvre type, and lateral position characteristics, in heterogeneous, non-lane-based traffic. Once the modelling scheme is developed, it can be used for estimating the vehicle kilometres travelled for the entire traffic system, which helps to determine vehicular fuel consumption and emissions. The approach to this objective involves: data collection, statistical modelling and parameter estimation, simulation using the calibrated time-gap distributions and its validation, empirical analysis of the simulation results and associated traffic flow parameters, and application to analyze illustrative traffic policies. In particular, videographic methods are used for data extraction from urban mid-block sections in Chennai, where the data comprise vehicle type, vehicle position (both longitudinal and lateral), speed, and time gap. Statistical tests are carried out to compare the simulated data with the actual data, and the model performance is evaluated. The effect of integrating the above-mentioned factors in vehicle generation is studied by comparing performance measures like density, speed, flow, capacity, area occupancy, etc., under various traffic conditions and policies. The implications of the quantified distributions and the simulation model for estimating PCU (Passenger Car Units), capacity, and level of service of the system are also discussed.
Keywords: lateral movement, mixed traffic condition, simulation modeling, vehicle following models
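As a sketch of the statistical-modelling step, candidate distributions can be fitted to the observed time gaps for one lead-lag pair and screened with a Kolmogorov-Smirnov test; the synthetic gap sample and the candidate set below are assumptions, not the study's calibrated results.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for observed time gaps (s) of one lead-lag pair,
# e.g. car followed by motorized two-wheeler at a mid-block section.
gaps = stats.lognorm.rvs(s=0.8, scale=1.5, size=500, random_state=0)

for name in ("lognorm", "gamma", "expon"):
    dist = getattr(stats, name)
    params = dist.fit(gaps)
    ks = stats.kstest(gaps, name, args=params)
    print(f"{name}: KS = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")

# The best-fitting distribution is then sampled in the microscopic simulator
# to generate arrivals for that lead-lag pair and lateral position class.
```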
Procedia PDF Downloads 342