Search results for: real time data processing
39420 Impact of Audit Committee on Real Earnings Management: Cases of Netherlands
Authors: Sana Masmoudi Mardassi, Yosra Makni Fourati
Abstract:
Regulators highlight the importance of the Audit Committee (AC) as a key internal corporate governance mechanism. One of the most important roles of this committee is to oversee the financial reporting process. The purpose of this paper is to examine the link between audit committee characteristics and financial reporting quality by investigating whether those characteristics are associated with improved reporting quality, especially reduced Real Earnings Management (REM). The study uses panel data on 80 nonfinancial companies listed on the Amsterdam Stock Exchange between 2010 and 2017. Audit committee characteristics are measured with four proxies: AC independence, financial expertise, gender diversity, and AC meetings. A linear regression model is used to identify the influence of these characteristics on real earnings management after controlling for AC size, leverage, firm size, loss, growth, and board size. The research provides empirical evidence on the association between AC independence, financial expertise, gender diversity, and meetings and REM as a proxy of financial reporting quality. The study finds that independence and AC gender diversity are strongly related to financial reporting quality; in fact, these two characteristics constrain REM. The results also suggest that AC financial expertise reduces, to some extent, the likelihood of engaging in REM. These conclusions thus support the audit committee requirements of the Dutch Corporate Governance Code regarding gender diversity and AC meetings.
Keywords: audit committee, financial expertise, independence, real earnings management
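As a concrete illustration of the regression design, the sketch below estimates REM on the four AC proxies plus the stated controls. The column names, data file, and firm-clustered standard errors are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of the panel regression described above.
# Column names, file name, and the clustering choice are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ac_panel.csv")  # hypothetical firm-year panel, 2010-2017

model = smf.ols(
    "rem ~ ac_indep + ac_expertise + ac_gender_div + ac_meetings"
    " + ac_size + leverage + firm_size + loss + growth + board_size",
    data=df,
)
# Cluster standard errors by firm to respect the panel structure.
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})
print(result.summary())
```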
Procedia PDF Downloads 167

39419 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are only indirectly observable and must be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that accommodates this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to the Alzheimer's Disease Neuroimaging Initiative study is presented.
Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
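The three parts can be written compactly as follows; the notation is a hedged reconstruction from the abstract, not the authors' exact specification. With $\mathbf{y}_{it}$ the observed indicators of subject $i$ at time $t$, $\boldsymbol{\xi}_i(t)$ the latent variables, $\mathbf{x}_i$ the time-invariant predictors, and $h_i(t)$ the hazard:

$$
\begin{aligned}
\mathbf{y}_{it} &= \boldsymbol{\Lambda}\,\boldsymbol{\xi}_i(t) + \boldsymbol{\epsilon}_{it} && \text{(dynamic factor analysis)}\\
\boldsymbol{\xi}_i(t) &= \mathbf{b}_{i0} + \mathbf{b}_{i1}\,t && \text{(random coefficient trajectory)}\\
h_i(t) &= h_0(t)\,\exp\!\left\{\boldsymbol{\gamma}^{\top}\mathbf{x}_i + \boldsymbol{\beta}^{\top}\boldsymbol{\xi}_i(t)\right\} && \text{(proportional hazards)}
\end{aligned}
$$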
Procedia PDF Downloads 144

39418 Influence of Travel Time Reliability on Elderly Drivers' Crash Severity
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes. The major contributing factors to severe crashes include frailty and medical complications. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several challenges including hearing difficulties, declining processing skills, and cognitive problems while driving, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety, with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of travel time, and the probability of congestion. Four years of information on crashes occurring on these freeway links was acquired. A binary logit model, estimated using the Markov Chain Monte Carlo (MCMC) sampling technique, was used to evaluate variables that could influence elderly crash severity. Preliminary results of the analysis suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver. The results suggest that a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling
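To read the reported effect size, recall the binary logit form; the coefficient below is a hedged back-calculation from the stated 22% figure, not a number quoted in the abstract:

$$
P(\text{severe}\mid\mathbf{x}) = \frac{1}{1+e^{-\boldsymbol{\beta}^{\top}\mathbf{x}}},
\qquad
e^{\beta_{\text{congestion}}} \approx 1 - 0.22 = 0.78,
$$

i.e., a coefficient of roughly $\ln 0.78 \approx -0.25$ on the probability-of-congestion term corresponds to the reported reduction in the odds of a severe crash.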
Procedia PDF Downloads 493

39417 Enhancing Athlete Training Using Real-Time Pose Estimation with Neural Networks
Authors: Jeh Patel, Chandrahas Paidi, Ahmed Hambaba
Abstract:
Traditional methods for analyzing athlete movement often lack the detail and immediacy required for optimal training. This project addresses this limitation by developing a real-time human pose estimation system specifically designed to enhance athlete training across various sports. The system leverages convolutional neural networks (CNNs) to provide a comprehensive and immediate analysis of an athlete's movement patterns during training sessions. The core architecture uses dilated convolutions to capture crucial long-range dependencies within video frames, combined with a robust encoder-decoder architecture that further refines pose estimation accuracy. This capability is essential for precise joint localization across the diverse range of athletic poses encountered in different sports. Furthermore, by quantifying movement efficiency, power output, and range of motion, the system provides data-driven insights that can be used to optimize training programs. Pose estimation data can also be used to develop personalized training plans that target specific weaknesses identified in an athlete's movement patterns. To overcome the limitations posed by outdoor environments, the project employs strategies such as multi-camera configurations and depth-sensing techniques, which can enhance pose estimation accuracy in challenging lighting and occlusion scenarios. A dataset was collected from the Martin Luther King labs at San Jose State University. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing different poses, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system's ability to operate in varied environmental conditions and further expanding the training dataset were identified and discussed. Future work will refine the model's adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced pose detection model and lays the groundwork for future innovations in assistive enhancement technologies.
Keywords: computer vision, deep learning, human pose estimation, U-NET, CNN
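A minimal sketch of the kind of dilated-convolution encoder-decoder the abstract describes is given below; the layer sizes, depth, and number of joints are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative dilated-convolution encoder-decoder for heatmap-based pose
# estimation; channel counts and depth are assumptions, not the paper's.
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    def __init__(self, num_joints=17):
        super().__init__()
        # Encoder: dilated convs grow the receptive field without pooling,
        # capturing long-range dependencies between distant joints.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, padding=2, dilation=2), nn.ReLU(),
            nn.Conv2d(128, 256, 3, padding=4, dilation=4), nn.ReLU(),
        )
        # Decoder: refines features back to one heatmap per joint.
        self.decoder = nn.Sequential(
            nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, num_joints, 1),  # per-joint heatmaps
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

heatmaps = PoseNet()(torch.randn(1, 3, 256, 256))  # -> (1, 17, 256, 256)
```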
Procedia PDF Downloads 56

39416 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata
Authors: Pavan K. Rallabandi, Kailash C. Patidar
Abstract:
In this paper, we present a learning algorithm based on a hybrid architecture that combines two of the most popular sequence recognition models: Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). To improve sequence or pattern recognition/classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using the Real-Time Recurrent Learning (RTRL) algorithm for recurrent neural networks to process the knowledge represented in trained Hidden Markov Models. The developed hybrid algorithm is evaluated on problems from automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.
Keywords: hybrid systems, hidden Markov models, recurrent neural networks, deterministic finite state automata
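For readers unfamiliar with RTRL, the sketch below shows its core online sensitivity update on a toy DFA task (parity of a bit stream, a 2-state automaton). The task, network size, and hyperparameters are illustrative assumptions, not the paper's experimental setup; sensitivities for the input weights are omitted for brevity.

```python
# Minimal sketch of Real-Time Recurrent Learning (RTRL) on a toy DFA task.
import numpy as np

rng = np.random.default_rng(0)
H = 8                              # hidden units
W = rng.normal(0, 0.5, (H, H))     # recurrent weights
U = rng.normal(0, 0.5, (H, 1))     # input weights
v = rng.normal(0, 0.5, H)          # readout weights
lr = 0.1

h = np.zeros(H)
P = np.zeros((H, H, H))            # P[k,i,j] = dh[k]/dW[i,j], kept online

parity = 0
for x in rng.integers(0, 2, 5000):
    parity ^= x                                   # target: DFA (parity) state
    a = W @ h + U @ np.array([x], float)
    h_new = np.tanh(a)
    d = 1.0 - h_new ** 2                          # tanh'(a)
    # RTRL update: P'[k,i,j] = d[k]*(delta_ki*h[j] + sum_m W[k,m] P[m,i,j])
    P_next = np.einsum("km,mij->kij", W, P)
    P_next[np.arange(H), np.arange(H), :] += h    # delta_ki * h_prev[j]
    P = d[:, None, None] * P_next
    h = h_new
    y = 1.0 / (1.0 + np.exp(-v @ h))              # sigmoid readout
    err = y - parity                              # dLoss/dlogit (log loss)
    W -= lr * np.einsum("k,kij->ij", err * v, P)  # chain rule via P
    v -= lr * err * h
```

Unlike backpropagation through time, RTRL carries the sensitivities forward, so the weights can be updated at every time step — the "real-time" property the abstract relies on.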
Procedia PDF Downloads 388

39415 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times
Authors: Ali Allahverdi
Abstract:
The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time subject to the constraint that the makespan value is not greater than a certain value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem. An extensive computational analysis has been conducted for the evaluation of the proposed algorithms. The computational analysis indicates that the best proposed algorithm performs significantly better than the earlier existing best algorithm.
Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan
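To make the objective concrete, the sketch below computes the total completion time of a given job sequence in an m-machine no-wait flowshop using the standard minimum-delay recursion. Separate setup times are omitted for brevity, so this is a simplified illustration rather than the paper's algorithm.

```python
# Sketch: total completion time of a job sequence in an m-machine no-wait
# flowshop (setup times omitted; the paper treats them separately).
def total_completion_time(seq, p):
    """seq: job order; p[j][k]: processing time of job j on machine k."""
    m = len(p[seq[0]])
    start, total = 0.0, 0.0
    for idx, j in enumerate(seq):
        if idx > 0:
            i = seq[idx - 1]
            # Minimum delay between consecutive starts so job j never waits:
            delay = max(sum(p[i][:k + 1]) - sum(p[j][:k]) for k in range(m))
            start += delay
        total += start + sum(p[j])  # completion time of job j
    return total

p = [[2, 3, 1], [4, 1, 2], [1, 2, 3]]          # 3 jobs x 3 machines
print(total_completion_time([0, 1, 2], p))     # -> 27.0
```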
Procedia PDF Downloads 340

39414 Novel Recommender Systems Using Hybrid CF and Social Network Information
Authors: Kyoung-Jae Kim
Abstract:
Collaborative Filtering (CF) is a popular personalization technique in the E-commerce domain for reducing information overload. In general, CF recommends a list of items based on the preferences of similar users drawn from the user-item matrix, and predicts the focal user's preference for particular items from those preferences. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and high dimensionality in the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems that use social network information and singular value decomposition (SVD) to alleviate these limitations. The purpose of this study is to reduce the dimensionality of the data set using SVD and to improve the performance of CF by using emotional information from the focal user's social network data. We test the usability of the hybrid CF, SVD, and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models.
Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition
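A toy sketch of the hybrid idea follows: a low-rank SVD reconstruction of the user-item matrix blended with ratings from the user's social connections. The blend weight, data shapes, and social links are illustrative assumptions, not the paper's model.

```python
# Sketch: SVD-based CF prediction blended with social-network information.
import numpy as np

R = np.array([[5, 4, 0, 1],      # user-item ratings, 0 = unrated
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], float)
friends = {0: [1], 1: [0], 2: [3], 3: [2]}   # hypothetical social links

# Rank-2 truncated SVD reduces dimensionality and densifies predictions.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = U[:, :2] @ np.diag(s[:2]) @ Vt[:2, :]

def predict(u, i, alpha=0.7):
    """Blend the SVD prediction with friends' average rating of item i."""
    rated = [R[f, i] for f in friends[u] if R[f, i] > 0]
    social = np.mean(rated) if rated else R_hat[u, i]
    return alpha * R_hat[u, i] + (1 - alpha) * social

print(round(predict(0, 2), 2))   # predicted rating of user 0 for item 2
```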
Procedia PDF Downloads 289

39413 Financial Assessment of the Hard Coal Mining in the Chosen Region in the Czech Republic: Real Options Methodology Application
Authors: Miroslav Čulík, Petr Gurný
Abstract:
This paper is aimed at the financial assessment of hard coal mining in a given region through the application of real option methodology. Hard coal mining in this mine has produced a net loss for the owner in recent years due to long-term unfavourable mining conditions and a significant drop in coal prices. Management intends to shut down the operation and abandon the project to reduce the company's losses. The goal is to assess whether shutting down the operation is the only correct solution to the problem. Due to the uncertainty in future hard coal price evolution, production might be restarted if the price rises enough to cover the cost of production. For the assessment, real option methodology is applied, which captures two important aspects of financial decision-making: risk and flexibility. The paper is structured as follows: first, the current state is described and the problem is analysed. Next, the real options methodology is described. Finally, the project is evaluated by applying real option methodology. The results are commented on and recommendations are provided.
Keywords: real option, investment, option to abandon, option to shut down and restart, risk, flexibility
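The flexibility argument can be illustrated with a binomial lattice on the coal price, where at each node management takes the better of continuing or abandoning. All numbers below (volatility, salvage value, the cash-flow rule) are illustrative assumptions, not the mine's actual data or the authors' model.

```python
# Sketch: valuing an option to abandon via a binomial price lattice.
import math

S0, sigma, r, T, n = 50.0, 0.30, 0.03, 5, 5      # price lattice parameters
salvage = 20.0                                    # value if abandoned
dt = T / n
u = math.exp(sigma * math.sqrt(dt)); d = 1 / u
q = (math.exp(r * dt) - d) / (u - d)              # risk-neutral probability

def project_value(price):
    return 10.0 * (price - 45.0)                  # stylized annual cash flow

# Backward induction: at each node management picks the better of
# continuing or abandoning -- the flexibility real options capture.
values = [max(project_value(S0 * u**j * d**(n - j)), salvage)
          for j in range(n + 1)]
for step in range(n - 1, -1, -1):
    values = [
        max(
            salvage,  # abandon now
            project_value(S0 * u**j * d**(step - j))
            + math.exp(-r * dt) * (q * values[j + 1] + (1 - q) * values[j]),
        )
        for j in range(step + 1)
    ]
print(f"Project value with abandonment option: {values[0]:.1f}")
```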
Procedia PDF Downloads 548

39412 A Forward-Looking View of the Intellectual Capital Accounting Information System
Authors: Rbiha Salsabil Ketitni
Abstract:
A company is, in effect, a network of information flows in which each piece of information serves several events and activities; collectively, this amounts to a very large set of data. The sheer volume of information creates the risk of occasionally losing some of it, a risk an institution must avoid, especially for information that has a significant impact on it. In most cases, information systems are used to avoid such losses and to keep the information reasonably correct. At present, it is impossible to find a company without information systems, since such systems organize and preserve information and save their owners time thanks to the speed with which they operate. This study aims to outline an accounting information system for intellectual capital and to open a forward-looking line of study for its construction and development by researchers, scientists, and professionals. The motivation is the contradiction many observers see in current practice: information systems for intellectual capital do not provide real values when intellectual capital is measured, and its disclosure in financial reports is not distinguished by transparency.
Keywords: accounting, intellectual capital, intellectual capital accounting, information system
Procedia PDF Downloads 85

39411 Comparison of Due Date Assignment Rules in a Dynamic Job Shop
Authors: Mumtaz Ipek, Burak Erkayman
Abstract:
In scheduling problems, the due date is assigned as an input; in real-time scheduling applications, it is treated as a decision variable. Correct determination of due dates increases shop floor performance and the number of jobs completed on time. This subject has been discussed widely in the literature, and rules for due date determination have been derived analytically. When a job arrives at the shop floor, a due date is assigned for delivery. Various due date determination methods are used in the literature. In this study, six different due date methods are implemented for a hypothetical dynamic job shop, and the performances of the methods are compared.
Keywords: scheduling, dynamic job shop, due date assignment, management engineering
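The abstract does not name the six rules it compares, but common due date assignment rules from the job-shop literature, such as CON (constant allowance), TWK (total work content), NOP (number of operations), and SLK (equal slack), can be sketched as below; the k parameter values are illustrative assumptions.

```python
# Common due date assignment rules from the job-shop literature (the paper's
# six rules are not named in the abstract; k values are assumptions).
def con(arrival, job, k=50):            # CONstant flow allowance
    return arrival + k

def twk(arrival, job, k=3):             # Total WorK content
    return arrival + k * sum(job["proc_times"])

def nop(arrival, job, k=10):            # Number of OPerations
    return arrival + k * len(job["proc_times"])

def slk(arrival, job, k=20):            # equal SLacK
    return arrival + sum(job["proc_times"]) + k

job = {"proc_times": [4, 6, 3]}         # a job with three operations
for rule in (con, twk, nop, slk):
    print(rule.__name__.upper(), rule(arrival=100, job=job))
```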
Procedia PDF Downloads 553

39410 Analyzing the Market Growth in Application Programming Interface Economy Using Time-Evolving Model
Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata
Abstract:
The API (Application Programming Interface) economy is expected to create new value by converting corporate services, such as information processing and data provision, into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully maximizing the economy's value. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who supply APIs to service providers, participate alongside service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be roughly equal when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that when the market matures, the profits of service providers and API providers decrease significantly due to competition among them, while the profit of the platform increases.
Keywords: API economy, ecosystem, platform, API providers
Procedia PDF Downloads 91

39409 Simulation Analysis of Wavelength/Time/Space Codes Using CSRZ and DPSK-RZ Formats for Fiber-Optic CDMA Systems
Authors: Jaswinder Singh
Abstract:
In this paper, a comparative analysis is carried out to study the performance of wavelength/time/space optical CDMA codes under two well-known modulation formats, CSRZ and DPSK-RZ, using RSoft's OptSIM. The analysis is carried out under a realistic scenario that considers the presence of various non-linear effects such as XPM, SPM, SRS, SBS, and FWM. Fiber dispersion and multiple access interference are also considered. The codes used in this analysis are 3-D wavelength/time/space codes, converted into 2-D wavelength-time codes so that their requirement for space couplers and fiber ribbons is eliminated. Under the conditions simulated, CSRZ is found to perform better than DPSK-RZ for fiber-optic CDMA applications.
Keywords: optical CDMA, multiple access interference (MAI), CSRZ, DPSK-RZ
Procedia PDF Downloads 645

39408 Research on Evaluation Method of Urban Road Section Traffic Safety Status Based on Video Information
Authors: Qiang Zhang, Xiaojian Hu
Abstract:
To address the shortcomings of existing real-time methods for evaluating traffic safety status, a video information-based method for evaluating the traffic safety status of urban road sections was established, and a rapid method for detecting traffic flow parameters from video information is analyzed. The concept of road-section speed dispersion, which affects the traffic safety state of an urban road section, is proposed, and a method for evaluating the traffic safety state of urban road sections based on speed dispersion is established. Experiments show that the proposed method can reasonably evaluate the safety status of urban roads in real time, and the evaluation results can provide a basis for traffic management departments to formulate effective traffic safety improvement plans for urban road sections.
Keywords: intelligent transportation system, road traffic safety, video information, vehicle speed dispersion
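A minimal sketch of the dispersion computation follows. Using the standard deviation of video-derived vehicle speeds as the dispersion measure is an assumption for illustration; the paper defines its own section-level metric.

```python
# Sketch: road-section speed dispersion from video-derived vehicle speeds.
import statistics

def speed_dispersion(speeds_kmh):
    """Mean speed and dispersion for one road section and time window."""
    mean = statistics.fmean(speeds_kmh)
    sd = statistics.stdev(speeds_kmh)   # dispersion measure (assumption)
    return mean, sd

speeds = [48.0, 52.5, 61.0, 39.5, 55.0, 44.0]   # one 5-minute window
mean, sd = speed_dispersion(speeds)
print(f"mean={mean:.1f} km/h, dispersion={sd:.1f} km/h")
```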
Procedia PDF Downloads 164

39407 Validation of a Placebo Method with Potential for Blinding in Ultrasound-Guided Dry Needling
Authors: Johnson C. Y. Pang, Bo Peng, Kara K. L. Reeves, Allan C. L. Fud
Abstract:
Objective: Dry needling (DN) has long been used as a treatment for various musculoskeletal pain conditions. However, the evidence level of existing studies is low due to methodological limitations: lack of randomization and inappropriate blinding are potentially the main sources of bias. A method that can differentiate clinical results due to the targeted experimental procedure from its placebo effect is needed to enhance the validity of such trials. Therefore, this study aimed to validate a placebo method for ultrasound (US)-guided DN in patients with knee osteoarthritis (KOA). Design: This is a randomized controlled trial (RCT). Ninety subjects (25 males and 65 females) aged between 51 and 80 (61.26 ± 5.57) with radiological KOA were recruited and randomly assigned to three groups by a computer program. Group 1 (G1) received real US-guided DN, Group 2 (G2) received placebo US-guided DN, and Group 3 (G3) served as the control group. Both G1 and G2 subjects received the same US-guided DN procedure, except that the US monitor was turned off in G2, blinding the G2 subjects to the incorporation of faux US guidance. This arrangement created the placebo effect intended to permit comparison of their results with those of subjects who received actual US-guided DN. Outcome measures, including the visual analog scale (VAS) and the Knee injury and Osteoarthritis Outcome Score (KOOS) subscales of pain, symptoms, and quality of life (QOL), were analyzed by repeated-measures analysis of covariance (ANCOVA) for time and group effects. Data regarding the perception of receiving real or placebo US-guided DN were analyzed by the chi-squared test. Missing data were to be analyzed with the intention-to-treat (ITT) approach if more than 5% of the data were missing. Results: The placebo US-guided DN (G2) subjects had the same perceptions as those receiving real US guidance in the advancement of DN (p<0.128). G1 had significantly higher pain reduction (VAS and KOOS-pain) than G2 and G3 at 8 weeks (both p<0.05), while there was no significant difference between G2 and G3 at 8 weeks (both p>0.05). Conclusion: The method of turning the US monitor off during the application of DN is credible for blinding participants and allows researchers to incorporate faux US guidance. The validated placebo US-guided DN technique can aid investigations of the short-term pain-reduction effects of US-guided DN for patients with KOA. Acknowledgment: This work was supported by the Caritas Institute of Higher Education [grant number IDG200101].
Keywords: ultrasound-guided dry needling, dry needling, knee osteoarthritis, physiotherapy
Procedia PDF Downloads 120

39406 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring
Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie
Abstract:
Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8- and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement
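One way to see the resolution/computation trade-off reported above is the standard ECT measurement count: an $N$-electrode sensor yields

$$M = \binom{N}{2} = \frac{N(N-1)}{2}$$

independent inter-electrode capacitance measurements, i.e. 28, 66, and 120 measurements for the 8-, 12-, and 16-electrode designs. The 16-electrode sensor thus has over four times the data of the 8-electrode one from which to reconstruct each image, at a correspondingly higher computational cost.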
Procedia PDF Downloads 11

39405 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiple complexities of the physiologic mechanisms of heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of time scale.
Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
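For reference, MSE coarse-grains the RR-interval series at each scale and computes the sample entropy of each coarse-grained series. The compact (unoptimized) sketch below illustrates the procedure; the parameter defaults m = 2 and r = 0.15·SD are common conventions, and the data are synthetic.

```python
# Sketch of multiscale entropy: coarse-grain, then sample entropy per scale.
import numpy as np

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.15):
    tol = r * np.std(x)
    def count(mm):
        templ = np.array([x[i : i + mm] for i in range(len(x) - mm)])
        dist = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (np.sum(dist <= tol) - len(templ)) / 2  # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rr = np.random.default_rng(1).normal(0.8, 0.05, 1000)  # toy RR intervals (s)
mse = [sample_entropy(coarse_grain(rr, s)) for s in range(1, 6)]
print(np.round(mse, 3))
```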
Procedia PDF Downloads 271

39404 Real-Time Big-Data Warehouse: A Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massive amounts of off-line and streaming data, in both structured and unstructured formats, on a daily basis. Effectively extracting useful insights from such large-scale datasets is a challenging task; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a subscriber base of more than 50 million with yearly growth of 7-10%. The end-to-end process, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber, takes no more than 10 minutes.
Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
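The decode-stitch-aggregate flow can be sketched in miniature as below. The call detail record layout and field names are hypothetical stand-ins, not the provider's actual binary format.

```python
# Sketch of the decode -> stitch -> aggregate pipeline described above;
# the CDR layout and field names are hypothetical.
from collections import defaultdict

def decode(binary_record):
    """Binary-to-ASCII decoding of one call detail record (stub)."""
    caller, call_id, duration = binary_record.decode("ascii").split("|")
    return {"caller": caller, "call_id": call_id, "duration": int(duration)}

raw = [b"alice|c1|30", b"alice|c1|45", b"bob|c2|60"]  # toy interrogations
records = [decode(r) for r in raw]

# Stitch: merge all interrogations belonging to the same call.
calls = defaultdict(int)
for rec in records:
    calls[(rec["caller"], rec["call_id"])] += rec["duration"]

# Aggregate: roll calls up to subscriber level for analysis.
per_subscriber = defaultdict(int)
for (caller, _), dur in calls.items():
    per_subscriber[caller] += dur
print(dict(per_subscriber))      # {'alice': 75, 'bob': 60}
```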
Procedia PDF Downloads 175

39403 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series using stochastic differential equations. Taking daily closing prices from developed and emergent stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance via Maximum Likelihood Estimation. While using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying according to a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, and the estimation algorithm scales to large data sets thanks to its good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
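The Ornstein-Uhlenbeck dynamics referred to above are $dX_t = \theta(\mu - X_t)\,dt + \sigma\,dW_t$, whose exact discretization is an AR(1) process; this is what makes maximum likelihood estimation tractable. The sketch below, with illustrative parameter values (not the paper's data), simulates the process and recovers its parameters.

```python
# Sketch: simulate an Ornstein-Uhlenbeck process for log-volatility via its
# exact AR(1) discretization, then recover the parameters by regression.
import numpy as np

theta, mu, sigma, dt, n = 2.0, -1.0, 0.5, 1 / 252, 5000
rng = np.random.default_rng(42)

x = np.empty(n); x[0] = mu
a = np.exp(-theta * dt)                      # exact AR(1) coefficient
sd = sigma * np.sqrt((1 - a**2) / (2 * theta))
for t in range(1, n):
    x[t] = mu + a * (x[t - 1] - mu) + sd * rng.normal()

# Conditional MLE reduces to regressing x[t] on x[t-1] and inverting
# the mapping theta = -ln(slope)/dt, mu = intercept/(1 - slope).
b, c = np.polyfit(x[:-1], x[1:], 1)          # slope b, intercept c
theta_hat = -np.log(b) / dt
mu_hat = c / (1 - b)
resid = x[1:] - (c + b * x[:-1])
sigma_hat = resid.std() * np.sqrt(2 * theta_hat / (1 - b**2))
print(theta_hat, mu_hat, sigma_hat)
```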
Procedia PDF Downloads 242

39402 Investigating Dynamic Transition Process of Issues Using Unstructured Text Analysis
Authors: Myungsu Lim, William Xiu Shun Wong, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Namgyu Kim
Abstract:
The amount of real-time data generated through various mass media has been increasing rapidly. In this study, we performed topic analysis using unstructured text data distributed through news articles. As one of the most prevalent applications of topic analysis, the issue tracking technique investigates changes in the social issues identified through topic analysis. Traditional issue tracking is conducted by identifying the main topics of documents covering an entire period at once and analyzing the occurrence of each topic by period. However, this traditional approach has the limitation that it cannot discover the dynamic mutation process of complex social issues. The purpose of this study is to overcome this limitation. We first derive the core issues of each period and then discover the dynamic mutation process of the various issues. We further analyze the mutation process from the perspective of issue categories, in order to figure out the pattern of issue flow, including the frequency and reliability of the pattern. In other words, this study allows us to understand the components of complex issues by tracking their dynamic history. This methodology can facilitate a clearer understanding of complex social phenomena by providing the mutation history and related category information of the phenomena.
Keywords: data mining, issue tracking, text mining, topic analysis, topic detection, trend detection
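One plausible reading of "derive core issues per period, then trace mutation" is sketched below: fit a topic model per period over a shared vocabulary, then link topics across periods by similarity. The model choice (LDA) and the linking rule are assumptions, not the authors' stated method.

```python
# Sketch: period-by-period topic extraction with cross-period topic linking.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

periods = {                       # toy news snippets grouped by period
    "2015-Q1": ["oil price drops sharply", "central bank cuts rates"],
    "2015-Q2": ["rate cut fuels housing market", "oil supply glut deepens"],
}
vec = CountVectorizer()
vec.fit(d for docs in periods.values() for d in docs)  # shared vocabulary

topic_word = {}
for period, docs in periods.items():
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(vec.transform(docs))
    topic_word[period] = lda.components_          # topics x vocabulary

# Link each Q1 topic to its most similar Q2 topic to trace issue mutation.
sim = cosine_similarity(topic_word["2015-Q1"], topic_word["2015-Q2"])
print(sim.argmax(axis=1), sim.max(axis=1).round(2))
```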
Procedia PDF Downloads 408

39401 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare, and transportation markets, as the emphasis on intelligent devices grows. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. It will compare the weight estimated at the collection site with the actual weight recorded at the plantation and will raise the necessary alerts when there is a discrepancy, thus enabling better decision making. In current practice, data are collected from various locations in various forms, and it is a challenge to consolidate them to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections make it difficult to obtain timely, accurate information. To overcome these challenges, the algorithm is embedded on a portable device that also assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate, uncompromised data using an embedded system and will provide alerts, thereby enabling effective operational management decision-making at the site.
Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
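The comparison-and-alert logic at the heart of the framework can be sketched in a few lines; the 2% tolerance is an illustrative assumption, not a figure from the paper.

```python
# Sketch of the embedded weight comparison with an alert on discrepancy;
# the tolerance value is an assumption for illustration.
TOLERANCE = 0.02   # fraction of the site estimate

def compare_weights(site_estimate_kg, weighbridge_kg):
    """Alert when the weighbridge weight deviates from the estimate
    recorded at the collection site by more than the tolerance."""
    diff = abs(site_estimate_kg - weighbridge_kg)
    if diff > TOLERANCE * site_estimate_kg:
        return {"alert": True, "discrepancy_kg": diff}
    return {"alert": False, "discrepancy_kg": diff}

print(compare_weights(10_000, 9_700))  # {'alert': True, 'discrepancy_kg': 300}
```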
Procedia PDF Downloads 285

39400 Exploring the Potential of Phase Change Materials in Construction Environments
Authors: A. Ait Ahsene F., B. Boughrara S.
Abstract:
The buildings sector accounts for a significant portion of global energy consumption, much of it used to heat and cool indoor spaces. In this context, the integration of innovative technologies such as phase change materials (PCMs) holds promise for improving the energy efficiency and thermal comfort of buildings. This research explores the benefits and challenges associated with the use of PCMs in buildings, focusing on their ability to store and release thermal energy to regulate indoor temperature. We investigate the different types of PCM available, their thermal properties, and their potential applications in various climate zones and building types. To evaluate and compare the performance of PCMs, our methodology includes a series of laboratory and field experiments. In the laboratory, we measure the thermal storage capacity, melting and solidification temperatures, latent heat, and thermal conductivity of various PCMs. These measurements quantify the capacity of each PCM to store and release thermal energy, as well as its capacity to transfer this energy through construction materials. Additionally, field studies are conducted to evaluate the performance of PCMs in real-world environments: we install PCM systems in real buildings and monitor their operation over time, measuring energy savings, occupant thermal comfort, and material durability. These empirical data allow us to compare the effectiveness of different types of PCMs under real-world conditions. By combining the results of laboratory and field experiments, we provide a comprehensive analysis of the advantages and limitations of PCMs in buildings, as well as recommendations for their effective application in practice.
Keywords: energy saving, phase change materials, material sustainability, buildings sector
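The thermal storage capacity quantified in the laboratory tests follows the standard sensible-plus-latent heat balance (a textbook relation, not a result of this study): for a PCM of mass $m$ heated from $T_i$ below its melting point $T_m$ to $T_f$ above it,

$$Q = m\left[c_{p,s}\,(T_m - T_i) + L + c_{p,l}\,(T_f - T_m)\right],$$

where $c_{p,s}$ and $c_{p,l}$ are the solid- and liquid-phase specific heats and $L$ is the latent heat of fusion. The latent term $mL$ is what lets a PCM absorb large amounts of heat at nearly constant temperature.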
Procedia PDF Downloads 41

39399 Data Management System for Environmental Remediation
Authors: Elizaveta Petelina, Anton Sizo
Abstract:
Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decision making regarding required mitigation measures and the assessment of remediation success. The EDMS is a combination of enterprise and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. EDMS consists of seven main components: a Geodatabase containing a spatial database to store and query spatially distributed data; a GIS and Web GIS component combining desktop and server-based GIS solutions; a Field Data Collection component containing tools for field work; a Quality Assurance (QA)/Quality Control (QC) component combining operational procedures for QA and measures for QC; a Data Import and Export component including tools and templates to support project data flow; a Lab Data component providing a connection between EDMS and laboratory information management systems; and a Reporting component including server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada. The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.
Keywords: data management, environmental remediation, geographic information system, GIS, decision making
Procedia PDF Downloads 161

39398 Exploring the Potential of Replika: An AI Chatbot for Mental Health Support
Authors: Nashwah Alnajjar
Abstract:
This research paper provides an overview of Replika, an AI chatbot application that uses natural language processing technology to engage in conversations with users. The app was developed to provide users with a virtual AI friend who can converse with them on various topics, including mental health. This study explores the experiences of Replika users using quantitative research methodology. A survey was conducted with 12 participants to collect data on their demographics, usage patterns, and experiences with the Replika app. The results showed that Replika has the potential to play a role in mental health support and well-being.
Keywords: Replika, chatbot, mental health, artificial intelligence, natural language processing
Procedia PDF Downloads 86

39397 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference
Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov
Abstract:
Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for carrying out power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, nonparametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA (Supervisory Control and Data Acquisition) system from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, measured from January 1, 2016, to December 31, 2016. Finally, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the proposed methods show that the power performance of the specific wind farm under assessment was acceptable.
Keywords: canonical correlation analysis, power curve, power performance, wind energy
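The abstract does not name the specific nonparametric test, so the following is only a plausible illustration of the idea: compare monthly SCADA-measured energy against power-curve-expected energy with a paired rank test, flagging a turbine whose performance differs significantly from expectation.

```python
# Illustrative nonparametric check of power performance (the paper's actual
# test is not named in the abstract; the Wilcoxon test here is an assumption).
from scipy.stats import wilcoxon

measured = [310, 295, 350, 280, 330, 360, 340, 300, 320, 315, 290, 345]
expected = [320, 300, 340, 300, 335, 355, 350, 310, 330, 320, 300, 350]

stat, p = wilcoxon(measured, expected)   # paired monthly energy (MWh)
print(f"p = {p:.3f};",
      "performance differs from expectation" if p < 0.05 else "acceptable")
```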
Procedia PDF Downloads 336

39396 On Board Measurement of Real Exhaust Emission of Light-Duty Vehicles in Algeria
Authors: R. Kerbachi, S. Chikhi, M. Boughedaoui
Abstract:
The study presents an analysis of the Algerian vehicle fleet and the resulting emissions. The emission measurement of air pollutants emitted by road transportation (CO, THC, NOx, and CO2) was conducted on 17 light-duty vehicles in real traffic. This sample is representative of Algerian light vehicles in terms of fuel quality (gasoline, diesel, and liquefied petroleum gas) and technology (injection system and emission control). The experimental methodology for measuring unit emissions of vehicles in real traffic is based on the use of a mini-Constant Volume Sampler for gas sampling and a set of gas analyzers for CO2, CO, NOx, and THC, with instrumentation to measure kinematics and gas temperature and pressure. The apparatus is also equipped with data logging and data transfer instruments. The results were compared with the database of European light vehicles (Artemis). It was shown that liquefied petroleum gas (LPG) injection technology has a significant impact on air pollutant emissions. With the exception of nitrogen oxide compounds, uncatalyzed LPG vehicles are therefore more effective in reducing unit emissions of air pollutants than uncatalyzed gasoline vehicles. LPG performance seems to be lower under real driving conditions than expected on a chassis dynamometer. On the other hand, the results show that uncatalyzed gasoline vehicles emit high levels of carbon monoxide and nitrogen oxides. Overall, and in the absence of standards in Algeria, unit emissions are much higher than Euro 3. The enforcement of pollutant emission standards in developing countries is an important step towards introducing cleaner technology and reducing vehicular emissions.
Keywords: on-board measurements of unit emissions of CO, HC, NOx and CO2, light vehicles, mini-CVS, LPG-fuel, Artemis, Algeria
Procedia PDF Downloads 275

39395 Automatic Music Score Recognition System Using Digital Image Processing
Authors: Yuan-Hsiang Chang, Zhong-Xian Peng, Li-Der Jeng
Abstract:
Music has always been an integral part of daily human life. But for most people, reading a musical score and turning it into a melody is not easy. This study aims to develop an automatic music score recognition system using digital image processing, which can read and analyze musical score images automatically. The technical approach comprises: (1) staff region segmentation; (2) image preprocessing; (3) note recognition; and (4) accidental and rest recognition. Digital image processing techniques (e.g., horizontal/vertical projections, connected component labeling, morphological processing, template matching, etc.) were applied to the musical notes, accidentals, and rests in staff notation. Preliminary results showed that our system could achieve detection and recognition rates of 96.3% and 91.7%, respectively. In conclusion, we present an effective automated musical score recognition system that could be integrated with a media player to play music/songs given input images of a musical score. Ultimately, this system could also be incorporated into applications for mobile devices as a learning tool for playing music/songs.
Keywords: connected component labeling, image processing, morphological processing, optical musical recognition
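The first step listed above, staff region segmentation via horizontal projection, can be sketched as follows; the file name and the 0.5 row-coverage threshold are illustrative assumptions, not the paper's parameters.

```python
# Sketch of staff-line detection via horizontal projection, the first
# step in the pipeline described above.
import cv2
import numpy as np

img = cv2.imread("score.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
_, binary = cv2.threshold(img, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Staff lines are long horizontal runs, so their rows contain a large
# number of foreground pixels in the horizontal projection.
projection = binary.sum(axis=1) / 255
staff_rows = np.where(projection > 0.5 * binary.shape[1])[0]
print("candidate staff-line rows:", staff_rows)
```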
Procedia PDF Downloads 419

39394 Expansive-Restrictive Style: Conceptualizing Knowledge Workers
Authors: Ram Manohar Singh, Meenakshi Gupta
Abstract:
Various terms, such as 'learning style', 'cognitive style', 'conceptual style', 'thinking style', and 'intellectual style', are used in the literature to refer to an individual's characteristic and consistent approach to organizing and processing information. However, style concepts have been criticized for mutually overlapping definitions and confusing classification. This confusion should be addressed at both the conceptual and the empirical level. This paper is an attempt to bridge this gap in the literature by proposing a new concept, the expansive-restrictive intellectual style, based on phenomenological analysis of an auto-ethnography and interviews with 26 information technology (IT) professionals working in knowledge-intensive organizations (KIOs) in India. Expansive style is an individual's preference to expand his or her horizon of knowledge and understanding by grasping the real meaning and structure of his or her work. Restrictive style, on the contrary, is characterized by an individual's preference for a minimalist approach at work, reflected in executing a job efficiently without attempting to understand its real meaning and structure. The analysis suggests that expansive-restrictive style has three dimensions: (1) field dependence-independence, (2) cognitive involvement, and (3) epistemological beliefs.
Keywords: expansive, knowledge workers, restrictive, style
Procedia PDF Downloads 424

39393 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain
Authors: Bita Payami-Shabestari, Dariush Eslami
Abstract:
The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated in the framework of a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, which are compared in terms of the optimal values of the two objective functions and the central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented epsilon-constraint method performs better than the global criteria and goal programming methods on both criteria. Sensitivity analysis is conducted to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory
Procedia PDF Downloads 129

39392 Modified CUSUM Algorithm for Gradual Change Detection in Time Series Data
Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin
Abstract:
The main objective in a change detection problem is to develop algorithms for the efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series. In this paper, we present a modified cumulative sum (MCUSUM) algorithm, based on a likelihood ratio test procedure, to detect the start and end of a time-varying linear drift in the mean of a time series. The design, implementation, and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart, designed for abrupt shift detection, using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detecting a gradual change in the mean. The algorithm is then applied and tested on randomly generated time series data with a gradual linear trend in the mean to demonstrate its usefulness.
Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test
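For reference, the standard CUSUM chart that the MCUSUM is benchmarked against accumulates deviations from the in-control mean and alarms when the statistic crosses a threshold. A minimal one-sided version is sketched below; the reference value k and threshold h are the usual design parameters, chosen here purely for illustration (this is the benchmark chart, not the authors' modified algorithm).

```python
# Standard one-sided CUSUM for an upward mean shift -- the benchmark chart
# the MCUSUM is compared against; k and h values are illustrative.
import numpy as np

def cusum(x, mu0, k=0.5, h=5.0):
    """Return the first index where the CUSUM statistic exceeds h, or -1."""
    g = 0.0
    for t, xt in enumerate(x):
        g = max(0.0, g + (xt - mu0) - k)   # accumulate positive drift
        if g > h:
            return t                       # change detected
    return -1

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(0, 1, 200),
                    rng.normal(0, 1, 200) + np.linspace(0, 2, 200)])  # drift
print("alarm at index:", cusum(x, mu0=0.0))
```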
Procedia PDF Downloads 299

39391 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform
Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo
Abstract:
The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.
Keywords: energy-efficient, fog computing, IoT, telehealth
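A toy sketch of the fog-versus-cloud dispatch idea described above follows; the thresholds, task attributes, and dispatch rule are illustrative assumptions, not the paper's model.

```python
# Toy dispatch rule: latency-critical or small tasks stay on the fog node;
# resource-intensive tasks go to the cloud. All numbers are assumptions.
FOG_CPU_LIMIT_MCYCLES = 50

def dispatch(task):
    if task["latency_critical"] or task["cpu_mcycles"] <= FOG_CPU_LIMIT_MCYCLES:
        return "fog"      # local processing: lower latency and transmit energy
    return "cloud"        # heavy analytics: offload to elastic resources

tasks = [
    {"name": "ecg_alert", "cpu_mcycles": 10, "latency_critical": True},
    {"name": "weekly_trend_model", "cpu_mcycles": 900, "latency_critical": False},
]
for t in tasks:
    print(t["name"], "->", dispatch(t))
```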
Procedia PDF Downloads 86