Search results for: multivariate time series data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38836

35626 IoT Continuous Monitoring of Biochemical Oxygen Demand in Wastewater Effluent Quality: Machine Learning Algorithms

Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claudecir Biazoli

Abstract:

Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of the local water system. Of the pollutants monitored, the biochemical oxygen demand (BOD) poses one of the greatest challenges. Delayed BOD5 results from the laboratory, which take 7 to 8 analysis days, hinder a WWTP's ability to react to different situations and meet treatment goals; this work presents a solution for wastewater treatment plants (WWTPs). Reducing BOD turnaround time from days to hours is our quest. The solution is based on a system of two BOD bioreactors associated with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision making. A DT is a virtual and dynamic replica of a production process. A DT requires the ability to collect and store real-time sensor data related to the operating environment. Furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process in order to catch anomalies sooner. In our system for continuous-time monitoring of the BOD suppressed by the effluent treatment process, the DT algorithm for analyzing the data applies ML to a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors. Each bioreactor contains input/output access for the wastewater sample (influent and effluent); hydraulic conduction tubes, pumps, and valves for the batch sample and dilution water; an air supply for dissolved oxygen (DO) saturation; a cooler/heater for sample thermal stability; an optical DO sensor based on fluorescence quenching; pH, ORP, temperature, and atmospheric pressure sensors; and a local PLC/CPU with a TCP/IP data transmission interface. The dynamic BOD monitoring range covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data is transmitted to and received from the digital platform, which in turn performs analyses at periodic intervals to feed the learning process. BOD bulletins and their credibility intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, the biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically from its initial conditions: saturated DO and zero initial products of the kinetic oxidation process (CO₂ = H₂O = 0). The initial values for organic matter and biomass are estimated by minimizing the mean square deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in São Paulo, Brazil.
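
A minimal sketch of the kinetic estimation step described above: four coupled first-order ODEs for DO, organic matter, biomass, and products, with the initial organic matter and biomass recovered by minimizing the mean square deviation against a measured DO curve. The abstract does not give the rate law or constants, so the first-order form and all parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

K1, YLD, ALPHA = 0.05, 0.5, 1.4     # assumed rate constant, yield, O2 stoichiometry

def kinetics(t, y):
    do, s, x, p = y
    r = K1 * s * x                  # assumed first-order oxidation rate
    return [-ALPHA * r,             # dissolved oxygen consumed
            -r,                     # organic matter consumed
            YLD * r,                # biomass growth
            (1.0 - YLD) * r]        # products (CO2 + H2O) formed

def simulate(s0, x0, t_meas, do_sat):
    sol = solve_ivp(kinetics, (t_meas[0], t_meas[-1]),
                    [do_sat, s0, x0, 0.0], t_eval=t_meas)
    return sol.y[0]                 # modelled DO trace

def fit_initial_state(t_meas, do_meas):
    # Minimize the mean square deviation between measured and modelled DO.
    loss = lambda q: np.mean((simulate(q[0], q[1], t_meas, do_meas[0]) - do_meas) ** 2)
    res = minimize(loss, x0=[3.0, 0.3], method="Nelder-Mead")
    return res.x                    # estimated organic matter (~BOD proxy) and biomass

t = np.linspace(0.0, 48.0, 25)                  # hours
do_obs = simulate(5.0, 0.5, t, 9.0)             # synthetic "measured" DO curve
print("estimated [organic matter, biomass]:", fit_initial_state(t, do_obs))
```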

Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning

Procedia PDF Downloads 73
35625 Data Science in Military Decision-Making: A Semi-Systematic Literature Review

Authors: H. W. Meerveld, R. H. A. Lindelauf

Abstract:

In contemporary warfare, data science is crucial for the military in achieving information superiority. Yet, to the authors' knowledge, no extensive literature survey on data science in military decision-making has been conducted so far. In this study, 156 peer-reviewed articles were analysed through an integrative, semi-systematic literature review to gain an overview of the topic. The study examined to what extent the literature focuses on the opportunities or the risks of data science in military decision-making, differentiated per level of war (i.e., the strategic, operational, and tactical levels). A relatively large focus on the risks of data science was observed in the social science literature, implying that political and military policymakers are disproportionately influenced by a pessimistic view of the application of data science in the military domain. The perceived risks of data science are, however, hardly addressed in the formal science literature. This means that the concerns about the military application of data science are not addressed to the audience that can actually develop and enhance data science models and algorithms. Cross-disciplinary research on both the opportunities and risks of military data science can address the observed research gaps. Considering the levels of war, relatively little attention to the operational level compared to the other two levels was observed, suggesting a research gap with reference to military operational data science. Opportunities for military data science mostly arise at the tactical level. On the contrary, studies examining strategic issues mostly emphasise the risks of military data science. Consequently, domain-specific requirements for military strategic data science applications are hardly expressed. Lacking such applications may ultimately lead to suboptimal strategic decisions in today's warfare.

Keywords: data science, decision-making, information superiority, literature review, military

Procedia PDF Downloads 167
35624 Management of Acute Biliary Pathology at Gozo General Hospital

Authors: Kristian Bugeja, Upeshala A. Jayawardena, Clarissa Fenech, Mark Zammit Vincenti

Abstract:

Introduction: Biliary colic, acute cholecystitis, and gallstone pancreatitis are some of the most common surgical presentations at Gozo General Hospital (GGH). National Institute for Health and Care Excellence (NICE) guidelines advise that suitable patients with acute biliary problems should be offered a laparoscopic cholecystectomy within one week of diagnosis. There has traditionally been difficulty in achieving this, mainly due to the reluctance of some surgeons to operate in the acute setting, limited timely access to MRCP and ERCP, and organizational issues. Methodology: A retrospective study was performed involving all biliary pathology-related admissions to GGH during the two-year period of 2019 and 2020. Patients' files and electronic case summaries (ECS) were used for data collection, which included demographic data, primary diagnosis, co-morbidities, management, waiting time to surgery, length of stay, readmissions, and reasons for readmission. NICE clinical guideline 188 (Gallstone disease) was used as the standard. Results: 51 patients were included in the study. The mean age was 58 years, and 35 (68.6%) were female. The main diagnoses on admission were biliary colic in 31 (60.8%) and acute cholecystitis in 10 (19.6%). Others included gallstone pancreatitis in 3 (5.88%), chronic cholecystitis in 2 (3.92%), gall bladder malignancy in 4 (7.84%), and ascending cholangitis in 1 (1.96%). Management included laparoscopic cholecystectomy in 34 (66.7%), conservative management in 8 (15.7%), and ERCP in 6 (11.7%). The mean waiting time for laparoscopic cholecystectomy for patients with acute cholecystitis was 74 days (range 3 to 146 days from the date of diagnosis). Only one patient diagnosed with acute cholecystitis and managed with laparoscopic cholecystectomy underwent surgery within the 7-day time frame. Hospital re-admissions were reported in 5 patients (9.8%), due to vomiting (1), ascending cholangitis (1), and gallstone pancreatitis (3). Discussion: Guidelines were not met for patients presenting to Gozo General Hospital with acute biliary pathology. This resulted in 5 patients being re-admitted to hospital while waiting for definitive surgery. The local issues causing the delay to surgery need to be identified and steps taken to facilitate the provision of urgent cholecystectomy for suitable patients.

Keywords: biliary colic, acute cholecystitis, laparoscopic cholecystectomy, conservative management

Procedia PDF Downloads 161
35623 Impact of Crises on Official Statistics: Environmental Statistics at Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf during the COVID-19 Pandemic: A Case Study

Authors: Ibtihaj Al-Siyabi

Abstract:

The COVID-19 crisis posed enormous challenges to statistical providers. While official statistics were disrupted by the pandemic and the related containment measures, there was a growing and pressing need for real-time data and statistics to inform decisions. This paper gives an account of how the pandemic impacted the operations of National Statistical Offices (NSOs) in general, in terms of data collection and the methods used, and of the main challenges they encountered, based on international surveys. It highlights the performance of the Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf (GCC-STAT) and its responsiveness to the pandemic, placing special emphasis on environmental statistics. The paper concludes by confirming the GCC-STAT's resilience and success in facing the challenges.

Keywords: NSO, COVID-19, statistics, crisis, pandemic

Procedia PDF Downloads 139
35622 Recovery of Fried Soybean Oil Using Bentonite as an Adsorbent: Optimization, Isotherm and Kinetics Studies

Authors: Prakash Kumar Nayak, Avinash Kumar, Uma Dash, Kalpana Rayaguru

Abstract:

Soybean oil is one of the most widely consumed cooking oils worldwide. Deep-fat frying of foods at higher temperatures adds a unique flavour, golden brown colour, and crispy texture to foods, but it also induces various changes in the oil, such as hydrolysis, oxidation, hydrogenation, and thermal alteration. The peroxide value (PV) is one of the most important factors affecting the quality of deep-fat fried oil. Using bentonite as an adsorbent, the PV can be reduced, thereby improving the quality of the soybean oil. In this study, operating parameters such as heating time of the oil (10, 15, 20, 25 and 30 h), contact time (5, 10, 15, 20 and 25 h), and adsorbent concentration (0.25, 0.5, 0.75, 1.0 and 1.25 g/100 ml of oil) were optimized by response surface methodology (RSM), with the percentage reduction in PV as the response. Adsorption data were analysed by fitting the Langmuir and Freundlich isotherm models. The results show that the Langmuir model gives the best fit compared to the Freundlich model. The adsorption process was also found to follow a pseudo-second-order kinetic model.
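
A minimal sketch of the isotherm comparison step: fit the Langmuir and Freundlich models to equilibrium adsorption data with scipy and compare the coefficients of determination. The data points below are placeholders, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

ce = np.array([2.0, 4.0, 8.0, 16.0, 32.0])      # equilibrium concentration (placeholder)
qe = np.array([1.1, 1.9, 2.9, 3.8, 4.3])        # amount adsorbed (placeholder)

langmuir   = lambda c, qm, kl: qm * kl * c / (1 + kl * c)
freundlich = lambda c, kf, n: kf * c ** (1 / n)

def r_squared(model, params):
    residuals = qe - model(ce, *params)
    return 1 - np.sum(residuals ** 2) / np.sum((qe - qe.mean()) ** 2)

p_l, _ = curve_fit(langmuir, ce, qe, p0=[5.0, 0.1])
p_f, _ = curve_fit(freundlich, ce, qe, p0=[1.0, 2.0])
print("Langmuir   R^2:", r_squared(langmuir, p_l))
print("Freundlich R^2:", r_squared(freundlich, p_f))
```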

Keywords: bentonite, Langmuir isotherm, peroxide value, RSM, soybean oil

Procedia PDF Downloads 375
35621 Legal Regulation of Personal Information Data Transmission Risk Assessment: A Case Study of the EU’s DPIA

Authors: Cai Qianyi

Abstract:

In the midst of the global digital revolution, the flow of data poses security threats that call China's existing legislative framework for protecting personal information into question. As a preliminary procedure for risk analysis and prevention, the risk assessment of personal data transmission lacks detailed supporting guidelines. Existing provisions reveal unclear responsibilities for network operators and weakened rights for data subjects. Furthermore, the regulatory system's weak operability and the lack of industry self-regulation heighten data transmission hazards. This paper compares the regulatory pathways for data transmission risks in China and Europe from the perspectives of legal framework and content. It draws on the EU's Data Protection Impact Assessment (DPIA) guidelines to empower multiple stakeholders, including data processors, controllers, and subjects, while also defining their obligations. In conclusion, this paper intends to address China's digital security shortcomings by developing a more mature regulatory framework and industry self-regulation mechanisms, resulting in a win-win situation for personal data protection and the development of the digital economy.

Keywords: personal information data transmission, risk assessment, DPIA, internet service provider

Procedia PDF Downloads 60
35620 Wavelets Contribution on Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has encouraged researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to this type of data by providing various graphic visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table, keeping only those that carry information. This step may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
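
A minimal sketch of the denoising idea, assuming a 2-D wavelet transform of the (forms x documents) contingency table with soft thresholding of the detail coefficients; the wavelet family and threshold rule are assumptions, not the paper's choices.

```python
import numpy as np
import pywt

def denoise_contingency(table, wavelet="haar", level=2):
    coeffs = pywt.wavedec2(table, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients.
    finest = np.concatenate([c.ravel() for c in coeffs[-1]])
    sigma = np.median(np.abs(finest)) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(table.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(denoised, wavelet)
    return rec[:table.shape[0], :table.shape[1]]   # trim padding, keep shape

rng = np.random.default_rng(0)
noisy_table = rng.poisson(3.0, size=(64, 40)).astype(float)   # toy forms x docs
print(denoise_contingency(noisy_table).shape)
```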

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 277
35619 Spectroscopic Characterization Approach to Study Ablation Time on Zinc Oxide Nanoparticles Synthesis by Laser Ablation Technique

Authors: Suha I. Al-Nassar, K. M. Adel, F. Zainab

Abstract:

This work was devoted to producing ZnO nanoparticles by pulsed laser ablation (PLA) of a Zn metal plate in an aqueous environment of cetyl trimethyl ammonium bromide (CTAB), using a Q-switched Nd:YAG pulsed laser (wavelength 1064 nm, repetition rate 10 Hz, pulse duration 6 ns, laser energy 50 mJ). The nanoparticle solution was found to be stable in colloidal form for a long time. The effect of ablation time on the optical properties and structure of ZnO was studied by UV-visible absorption spectroscopy. The UV-visible absorption spectra show four peaks at 256, 259, 265 and 322 nm for ablation times of 5, 10, 15 and 20 s, respectively. Our results show that the UV-vis spectra exhibit a blue shift in the presence of CTAB as the ablation time decreases, and the blue shift indicates a smaller nanoparticle size. The blue shift in the absorption edge indicates the quantum confinement property of nanoparticles. Also, FTIR transmittance spectra of ZnO nanoparticles prepared under these conditions show a characteristic ZnO absorption at 435–445 cm⁻¹.

Keywords: zinc oxide nanoparticles, CTAB solution, pulsed laser ablation technique, spectroscopic characterization

Procedia PDF Downloads 378
35618 Retrospective Demographic Analysis of Patients Lost to Follow-Up from Antiretroviral Therapy in Mulanje Mission Hospital, Malawi

Authors: Silas Webb, Joseph Hartland

Abstract:

Background: Long-term retention of patients on ART has become a major health challenge in Sub-Saharan Africa (SSA). In 2010, a systematic review of 39 papers found that 30% of patients were no longer taking their ARTs two years after starting treatment. In the same review, it was noted that there was a paucity of data as to why patients become lost to follow-up (LTFU) in SSA. This project was performed in Mulanje Mission Hospital (MMH) in Malawi as part of Swindon Academy's Global Health eSSC. The HIV prevalence for Malawi is 10.3%, one of the highest rates in the world; however, prevalence soars to 18% in Mulanje. It is therefore essential that patients at risk of being LTFU are identified early and managed appropriately to help them continue to participate in the service. Methodology: All patients on adult antiretroviral formulations at MMH who were classified as 'defaulters' (patients missing a scheduled follow-up visit by more than two months) over the last 12 months were included in the study. Demographic variables were collected from Mastercards for data analysis. A comparison group of patients not lost to follow-up was created from all patients who attended the HIV clinic between 18th and 22nd July 2016 and had never defaulted from ART. Data were analysed using the chi-squared (χ²) test, as the data collected were categorical, with alpha levels set at 0.05. Results: Overall, 136 patients had defaulted from ART over the past 12 months at MMH. Of these, 43 patients had missing Mastercards, so 93 defaulter datasets were analysed. In the comparison group, 93 datasets were also analysed, with statistical analysis done using chi-squared testing. A higher proportion of men was noted in the defaulting group (χ² test, p=0.034), and defaulters tended to be younger (χ² test, p=0.052). 94.6% of patients who defaulted were taking Tenofovir, Lamivudine and Efavirenz, the standard first-line ART therapy in Malawi. The mean length of time on ART was 39.0 months (RR: -22.4-100.4) in the defaulters group and 47.3 months (RR: -19.71-114.23) in the control group, a mean difference of 8.3 months less in the defaulters group (p=0.056). Discussion: The findings in this study echo the literature; this review expands on that and shows that the demographic at most risk of defaulting and being LTFU would be a young male who has missed more than 4 doses of ART and is within his first year of treatment. For the hospital, this data is important as it identifies significant areas for public health focus. For instance, fear of disclosure and stigma may be disproportionately affecting younger men, so interventions can be aimed specifically at them to improve their health outcomes. The mean length of time on medication was 8.3 months less in the defaulters group, with a p-value of 0.056, emphasising the need for more intensive follow-up in the early stages of treatment, when patients are at the highest risk of defaulting.

Keywords: anti-retroviral therapy, ART, HIV, lost to follow up, Malawi

Procedia PDF Downloads 186
35617 Investigation of Chord Protocol in Peer to Peer Wireless Mesh Network with Mobility

Authors: P. Prasanna Murali Krishna, M. V. Subramanyam, K. Satya Prasad

Abstract:

File sharing in networks is generally achieved using Peer-to-Peer (P2P) applications. Structured P2P approaches are widely used in ad hoc networks due to their distributed and scalable nature. Efficient mechanisms are required to handle the huge amount of data distributed among all peers. The intrinsic characteristics of P2P systems make content distribution easier compared to client-server architectures. All the nodes in a P2P network act as both client and server; thus, distributing data takes less time than with the client-server method. Chord is a resource-routing protocol in which nodes and data items are structured into a one-dimensional ring. The structured lookup algorithm of Chord is advantageous for distributed P2P networking applications. Although the structured approach improves lookup performance in a high-bandwidth wired network, it can contribute to unnecessary overhead in overlay networks, leading to degradation of network performance. In this paper, the performance of the existing Chord protocol on a Wireless Mesh Network (WMN) with static and mobile nodes is investigated.
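
A minimal in-memory sketch of Chord's ring lookup (m-bit identifier space, finger tables, successor search). Network transport, node joins, and failure handling are omitted; this only illustrates the O(log N) routing idea that the paper evaluates on a WMN.

```python
import hashlib

M = 8                                    # identifier bits -> ring of 256 slots
RING = 2 ** M

def chord_id(key: str) -> int:
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % RING

def in_interval(x, a, b):                # x in (a, b] going clockwise
    return (a < x <= b) if a < b else (x > a or x <= b)

class Node:
    def __init__(self, ident, all_ids):
        self.id = ident
        ids = sorted(all_ids)
        # finger[i] = first live node at or after (id + 2^i) mod RING
        self.finger = [next((n for n in ids if n >= (ident + 2 ** i) % RING),
                            ids[0]) for i in range(M)]

    def find_successor(self, nodes, key_id):
        if key_id == self.id:            # this node owns the key
            return self.id
        if in_interval(key_id, self.id, self.finger[0]):
            return self.finger[0]        # key belongs to our successor
        for f in reversed(self.finger):  # closest preceding finger first
            if in_interval(f, self.id, key_id):
                return nodes[f].find_successor(nodes, key_id)
        return self.finger[0]

ids = sorted({chord_id(f"node{i}") for i in range(10)})
nodes = {i: Node(i, ids) for i in ids}
key = chord_id("file.txt")
print(f"key {key} is stored on node {nodes[ids[0]].find_successor(nodes, key)}")
```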

Keywords: wireless mesh network (WMN), structured P2P networks, peer to peer resource sharing, CHORD Protocol, DHT

Procedia PDF Downloads 480
35616 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility

Authors: Akash Verma, Sujit Kumar Samanta

Abstract:

This study examines a discrete-time Geo/G/1 queueing-inventory system operating under an (s, Q) inventory policy. Customers are assumed to arrive according to a Bernoulli process. Each customer demands a single item, with arbitrarily distributed service time. The inventory is replenished by an outside supplier, and the lead time for replenishment follows a geometric distribution. There is a single server and infinite waiting space in this facility. Demands must wait in the designated waiting area during a stock-out period. Customers are served on a first-come-first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix analytic approach. We relate the system length distributions at post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch. We use probability distributions at random epochs to determine the waiting-time distribution. We obtain the performance measures needed to construct the cost function. The optimum values of the order quantity and reorder point are found numerically for a variety of model parameters.
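
A slot-by-slot Monte Carlo sketch of this queueing-inventory system, useful as a sanity check on the analytical results. Geometric service stands in here for the general service law, and all parameter values are illustrative.

```python
import random

P_ARR, P_SRV, P_LEAD = 0.2, 0.3, 0.1      # arrival, service-end, lead-time probs
S_REORDER, Q_ORDER = 3, 10                 # (s, Q) policy
SLOTS = 10 ** 6

queue, stock, on_order, in_service = 0, Q_ORDER, False, False
area_q = 0

for _ in range(SLOTS):
    if random.random() < P_ARR:                 # Bernoulli arrival
        queue += 1
    if in_service and random.random() < P_SRV:  # service completion uses one item
        queue -= 1
        stock -= 1
        in_service = False
    if on_order and random.random() < P_LEAD:   # replenishment arrives
        stock += Q_ORDER
        on_order = False
    if stock <= S_REORDER and not on_order:     # place order at reorder point s
        on_order = True
    if not in_service and queue > 0 and stock > 0:
        in_service = True                       # serve next customer if stock on hand
    area_q += queue

print("mean number in system ~", area_q / SLOTS)
```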

Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization

Procedia PDF Downloads 42
35615 The Association of Excessive Work Stress with Job Satisfaction and Turnover Intention in Operating Room Nurses: A Cross-Sectional Study in a Metropolitan Teaching Hospital in Southern Taiwan

Authors: Chia Yu Chen, Shu Fen Wu, Chen-Fuh Lam, I-Ling Tsai, Shu Jiuan Chen, Yen Ling Liu

Abstract:

Aim: It remains undetermined whether increased work stress affects job satisfaction and career loyalty among nursing staff in the operating room. The long-term goal of this study is to lengthen the professional life of operating room nurses by attenuating work stress and enhancing their contentment at work. Method: This was a cross-sectional, descriptive study performed in a metropolitan teaching hospital in southern Taiwan from May 2017 to July 2017. A structured, self-administered questionnaire, modified from the Occupational Stress Indicator-2 (OSI-2) and the Maslach Burnout Inventory (MBI) manual, was collected from the operating room nurses. The chi-square test was used to analyze the categorical data, and Pearson correlation was used to analyze the association between numerical datasets (SPSS version 20.0). Results: The response rate was 80% (80/100), and a total of 73 (73%) completed forms were eventually processed for analysis. The average scores for work stress and job satisfaction of the operating room nurses were 145.96±32.91 and 47.38±6.07, respectively. The correlation coefficients of work stress versus job satisfaction and organizational identity were r=-0.338 (p=0.003) and r=-0.354 (p=0.002), respectively. More nurses who worked rotating shifts quit work in the operating room than those who worked only day shifts (χ²=5.176, p<0.05). Nurses who reported lower job satisfaction showed significantly higher turnover intention (t=3.714, p<0.01). Following multivariate regression analysis, rotating shifts and low job satisfaction were identified as the two independent predictors of intention to quit working in the operating room. Conclusion: Our study clearly demonstrates that increased work stress significantly attenuates job satisfaction and organizational identity. Rotating shift work is associated with higher work stress, lower job satisfaction, and higher turnover intention, which is consistent with previous surveys carried out in the department of medical technology. Therefore, improvement of working quality in the operating rooms is essential to increase the retention of well-trained nursing staff. Further investigation into types of work shifts and other strategies for attenuating stress in the workplace is currently being undertaken in order to improve job satisfaction and decrease turnover intention in the operating room.

Keywords: rotating shift, work stress, job satisfaction, turnover intention

Procedia PDF Downloads 197
35614 Graph-Based Semantical Extractive Text Analysis

Authors: Mina Samizadeh

Abstract:

In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous data necessitates the adoption of effective computational tools to explore it. This has led to a growing interest in the research community in developing computational methods for processing this text data. One line of study focuses on condensing text so that we are able to reach a higher level of understanding in a shorter time. The two important tasks for this are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key, important words in a text; this familiarizes us with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised learning method that extends PageRank (the algorithm underlying the Google search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and return them as a result. However, it neglects the semantic similarity between different parts of the text. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used individually or as part of generating the summary to overcome coverage problems.
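
A minimal sketch of semantically weighted TextRank for extractive summarization: sentences are graph nodes, edge weights are cosine similarities between sentence vectors, and PageRank scores select the top sentences. TF-IDF stands in here for whatever semantic similarity measure the paper uses.

```python
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def summarize(sentences, n_keep=2):
    tfidf = TfidfVectorizer().fit_transform(sentences)
    sim = cosine_similarity(tfidf)                # semantic edge weights
    graph = nx.from_numpy_array(sim)              # sentences as nodes
    scores = nx.pagerank(graph, weight="weight")  # TextRank = PageRank on graph
    top = sorted(scores, key=scores.get, reverse=True)[:n_keep]
    return [sentences[i] for i in sorted(top)]    # keep original order

doc = ["The TextRank algorithm ranks sentences with PageRank.",
       "PageRank was designed to rank web pages for search.",
       "Semantic similarity between sentences weights the graph edges.",
       "Highly ranked sentences form the extractive summary."]
print(summarize(doc))
```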

Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis

Procedia PDF Downloads 70
35613 Customer Churn Analysis in Telecommunication Industry Using Data Mining Approach

Authors: Burcu Oralhan, Zeki Oralhan, Nilsun Sariyer, Kumru Uyar

Abstract:

Data mining has become more and more important and has found a wide range of applications in recent years. Data mining is the process of finding hidden and unknown patterns in big data. One of the applied fields of data mining is Customer Relationship Management (CRM). Understanding the relationships between products and customers is crucial for every business. Customer Relationship Management is an approach focused on developing customer relationships, improving retention, and increasing customer satisfaction. In this study, we applied data mining methods to the customer relationship management side of telecommunications. This study aims to determine the profile of customers who are likely to leave the system, and to develop marketing strategies and customized campaigns for those customers. Data are clustered, and classification techniques are applied to identify the churners. As a result of this study, we obtain knowledge from the international telecommunication industry and contribute to the understanding and development of this subject in Customer Relationship Management.
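
A minimal sketch of the churn-classification step: train a classifier on customer usage features and rank customers by churn risk for targeted campaigns. The synthetic dataset and column names below are hypothetical stand-ins for the telecom data.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for the telecom dataset; real column names will differ.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "tenure_months": rng.integers(1, 72, n),
    "monthly_charges": rng.uniform(20, 120, n),
    "support_calls": rng.poisson(1.5, n),
})
logit = 0.4 * df["support_calls"] - 0.03 * df["tenure_months"] + 0.5
df["churned"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X, y = df.drop(columns="churned"), df["churned"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))

df["churn_risk"] = model.predict_proba(X)[:, 1]   # rank customers for campaigns
print(df.sort_values("churn_risk", ascending=False).head())
```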

Keywords: customer churn analysis, customer relationship management, data mining, telecommunication industry

Procedia PDF Downloads 316
35612 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis

Authors: N. R. N. Idris, S. Baharom

Abstract:

A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the level of data on overall meta-analysis estimates based on IPD-only, AD-only and combined IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean square error (RMSE) and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not significantly improve the accuracy of the estimates. Additionally, combining IPD and AD moderates the bias of the treatment effect estimates, as the IPD tends to overestimate the treatment effects, while the AD tends to produce underestimated effect estimates. These results may provide some guidance in deciding whether significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
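
A toy Monte Carlo sketch of the pooling comparison: each study's effect is estimated either from simulated individual patient data or from a reported aggregate summary, then combined by inverse-variance weighting, and PRB and RMSE are computed. The simulation settings are illustrative, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(1)
TRUE_EFFECT, N_STUDIES, N_PATIENTS, REPS = 0.5, 10, 100, 2000

def one_meta(n_ipd):
    effs, variances = [], []
    for k in range(N_STUDIES):
        patients = rng.normal(TRUE_EFFECT, 1.0, N_PATIENTS)
        if k < n_ipd:        # IPD study: variance estimated from raw data
            eff, var = patients.mean(), patients.var(ddof=1) / N_PATIENTS
        else:                # AD study: only a summary estimate is available
            eff, var = patients.mean(), 1.0 / N_PATIENTS
        effs.append(eff)
        variances.append(var)
    w = 1.0 / np.asarray(variances)
    return np.sum(w * effs) / np.sum(w)   # fixed-effect pooled estimate

for n_ipd in (0, 5, 10):                  # AD-only, mixed data, IPD-only
    est = np.array([one_meta(n_ipd) for _ in range(REPS)])
    prb = 100 * (est.mean() - TRUE_EFFECT) / TRUE_EFFECT
    rmse = np.sqrt(np.mean((est - TRUE_EFFECT) ** 2))
    print(f"IPD studies={n_ipd:2d}  PRB={prb:6.2f}%  RMSE={rmse:.4f}")
```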

Keywords: aggregate data, combined-level data, individual patient data, meta-analysis

Procedia PDF Downloads 375
35611 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. Long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, the simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. The trend analysis results showed an increase in the long-term averages of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit to the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
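
A minimal sketch of the joint frequency analysis: fit GEV marginals to annual maxima, couple them with a Gaussian copula (standing in for whichever copula family fits the data best), and estimate the joint exceedance probability of the marginal 100-year events. The annual-maxima series below are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic annual maxima standing in for the rainfall and water level series.
rain = stats.genextreme.rvs(-0.1, loc=80, scale=20, size=60, random_state=rng)
level = stats.genextreme.rvs(-0.1, loc=2.0, scale=0.5, size=60, random_state=rng)

gev_r = stats.genextreme.fit(rain)
gev_l = stats.genextreme.fit(level)

# Kendall's tau -> correlation parameter of a Gaussian copula.
tau, _ = stats.kendalltau(rain, level)
rho = np.sin(np.pi * tau / 2)

# Sample the copula, map to the fitted marginals, count joint exceedances.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=200_000)
u = stats.norm.cdf(z)
r_sim = stats.genextreme.ppf(u[:, 0], *gev_r)
l_sim = stats.genextreme.ppf(u[:, 1], *gev_l)

r100 = stats.genextreme.ppf(1 - 1 / 100, *gev_r)   # marginal 100-year levels
l100 = stats.genextreme.ppf(1 - 1 / 100, *gev_l)
print("P(joint 100-yr exceedance):", np.mean((r_sim > r100) & (l_sim > l100)))
```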

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 359
35610 GeneNet: Temporal Graph Data Visualization for Gene Nomenclature and Relationships

Authors: Jake Gonzalez, Tommy Dang

Abstract:

This paper proposes a temporal graph approach to visualize and analyze the evolution of gene relationships and nomenclature over time. An interactive web-based tool implements this temporal graph, enabling researchers to traverse a timeline and observe coupled dynamics in network topology and naming conventions. Analysis of a real human genomic dataset reveals the emergence of densely interconnected functional modules over time, representing groups of genes involved in key biological processes. For example, the antimicrobial peptide gene DEFA1A3 shows increased connections to related alpha-defensins involved in infection response. Tracking degree and betweenness centrality shifts over timeline iterations also quantitatively highlights the reprioritization of certain genes' topological importance as knowledge advances. Examination of the CNR1 gene, encoding the cannabinoid receptor CB1, demonstrates changing synonymous relationships and consolidating naming patterns over time, reflecting the discovery of its unique functional role. The integrated framework interconnecting these topological and nomenclature dynamics provides richer contextual insights than isolated analysis methods. Overall, this temporal graph approach enables a more holistic study of knowledge evolution to elucidate complex biology.
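
A minimal sketch of the centrality-tracking idea: given one relationship snapshot per year, compute a gene's degree and betweenness centrality over time. The edge lists below are hypothetical stand-ins for the genomic dataset.

```python
import networkx as nx

# Hypothetical yearly snapshots of gene-relationship edges.
snapshots = {
    2019: [("DEFA1A3", "DEFA4"), ("CNR1", "GPR55")],
    2020: [("DEFA1A3", "DEFA4"), ("DEFA1A3", "DEFA5"), ("CNR1", "GPR55")],
    2021: [("DEFA1A3", "DEFA4"), ("DEFA1A3", "DEFA5"),
           ("DEFA1A3", "DEFA6"), ("CNR1", "GPR55"), ("CNR1", "FAAH")],
}

def track(gene):
    for year, edges in sorted(snapshots.items()):
        g = nx.Graph(edges)
        deg = nx.degree_centrality(g).get(gene, 0.0)
        btw = nx.betweenness_centrality(g).get(gene, 0.0)
        print(f"{year}: {gene} degree={deg:.3f} betweenness={btw:.3f}")

track("DEFA1A3")   # rising centrality mirrors its growing module
```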

Keywords: temporal graph, gene relationships, nomenclature evolution, interactive visualization, biological insights

Procedia PDF Downloads 61
35609 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To promote Industry 4.0 and Society 5.0, it is important to connect and share data so that every member can trust it. Blockchain (BC) technology is currently attracting attention as the most advanced tool and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain that handle sensitive data such as product quality and manufacturing conditions. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first reason is that manufacturing information is top secret and a source of profit for companies. It is difficult to disclose data even between companies that transact within the supply chain. In blockchain mechanisms such as Bitcoin that use PKI (Public Key Infrastructure), the plaintext must be shared between companies in order to confirm the identity of the company that sent the data. The other reason is that the merits (scenarios) of data collaboration between companies are not concretely specified in the industrial supply chain. For these problems, this paper proposes a Business-to-Business (B2B) collaboration system using homomorphic encryption and BC techniques. Using the proposed system, each company in the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which has been difficult to share because it is top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can track changes in quality while hiding the numerical values of the quality data.
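
A minimal sketch of the encrypted quality-comparison idea using the additively homomorphic Paillier scheme via the `phe` (python-paillier) package; this library choice is an assumption, as the abstract does not name the cryptosystem or the exact protocol. A supplier shares only ciphertexts, and the partner computes the change in quality between lots without ever seeing the underlying values.

```python
from phe import paillier   # assumed library choice (python-paillier)

pub, priv = paillier.generate_paillier_keypair(n_length=2048)

# Supplier side: encrypt two confidential quality measurements.
q_lot1, q_lot2 = 93.4, 91.1
enc1, enc2 = pub.encrypt(q_lot1), pub.encrypt(q_lot2)

# Partner side: homomorphic subtraction performed on ciphertexts only.
enc_delta = enc2 - enc1                       # still encrypted

# The key holder decrypts only the change, never the raw values.
print("quality change between lots:", priv.decrypt(enc_delta))   # ~ -2.3
```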

Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption

Procedia PDF Downloads 136
35608 Analysing Trends in Rice Cropping Intensity and Seasonality across the Philippines Using 14 Years of Moderate Resolution Remote Sensing Imagery

Authors: Bhogendra Mishra, Andy Nelson, Mirco Boschetti, Lorenzo Busetto, Alice Laborte

Abstract:

Rice is grown on over 100 million hectares in almost every country in Asia. It is the most important staple crop for food security and has high economic and cultural importance in Asian societies. The combination of genetic diversity and management options, coupled with the large geographic extent, means that there is large variation in seasonality (when it is grown) and cropping intensity (how often it is grown per year on the same plot of land), even over relatively small distances. Seasonality and intensity can and do change over time depending on climatic, environmental and economic factors. Detecting where and when these changes happen can provide information to better understand trends in regional and even global rice production. Remote sensing offers a unique opportunity to estimate these trends. We apply the recently published PhenoRice algorithm to 14 years of moderate resolution remote sensing (MODIS) data (utilizing 250 m resolution, 16-day composites from Terra and Aqua) to estimate seasonality and cropping intensity per year and their changes over time. We compare the results to survey data collected by the International Rice Research Institute (IRRI). The study results in a unique and validated dataset on the extent and change of extent, the seasonality and change in seasonality, and the cropping intensity and change in cropping intensity between 2003 and 2016 for the Philippines. Observed trends and their implications for food security and trade policies are also discussed.

Keywords: rice, cropping intensity, moderate resolution remote sensing (MODIS), phenology, seasonality

Procedia PDF Downloads 306
35607 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform

Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail

Abstract:

The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important factor in the generation of this data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, it is described by specific parameters for position, speed and acceleration, and constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. The human factor is predominant in deciding the trajectory parameters of the vehicle on the road, so monitoring it, by knowing the events reported by the DiaMOTO device over time, will generate a guide to target any potentially high-risk driving behavior and reward those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure between the DiaMOTO device and the traffic data collection server, the infrastructure through which the database to be used for complex AI/DLM analysis is built. The central element of this description is the data string in Codec-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental model stage, by installing DiaMOTO devices with unique codes on 50 vehicles, integrating ADAS and GPS functions, through which vehicle trajectories can be monitored 24 hours a day.
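
A minimal sketch of decoding the position part of a Codec-8 AVL frame as it arrives at the collection server. The abstract names the CODEC-8 format but does not reproduce its byte layout; the offsets and field widths below follow the widely documented telematics AVL framing (preamble, length, codec ID 0x08, record count, then per-record timestamp, priority and GPS element) and should be checked against the device manual. IO elements and CRC handling are omitted.

```python
import struct
from datetime import datetime, timezone

def parse_codec8_first_record(frame: bytes):
    preamble, length = struct.unpack_from(">II", frame, 0)
    codec_id, n_records = frame[8], frame[9]
    assert preamble == 0 and codec_id == 0x08, "not a Codec-8 frame"

    off = 10
    ts_ms, priority = struct.unpack_from(">QB", frame, off)
    off += 9
    lon, lat, alt, angle, sats, speed = struct.unpack_from(">iihHBH", frame, off)

    return {
        "records": n_records,
        "time": datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc),
        "priority": priority,
        "lon": lon / 1e7, "lat": lat / 1e7,     # degrees
        "altitude_m": alt, "heading_deg": angle,
        "satellites": sats, "speed_kmh": speed,
    }

# Synthetic single-record frame for illustration.
demo = (struct.pack(">II", 0, 0) + bytes([0x08, 1]) +
        struct.pack(">QB", 1_700_000_000_000, 1) +
        struct.pack(">iihHBH", 254_000_000, 425_000_000, 120, 90, 9, 54))
print(parse_codec8_first_record(demo))
```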

Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring

Procedia PDF Downloads 78
35606 Optimization by Means of Genetic Algorithm of the Equivalent Electrical Circuit Model of Different Order for Li-ion Battery Pack

Authors: V. Pizarro-Carmona, S. Castano-Solis, M. Cortés-Carmona, J. Fraile-Ardanuy, D. Jimenez-Bermejo

Abstract:

The purpose of this article is to optimize Equivalent Electrical Circuit Models (EECMs) of different orders to obtain greater precision in the modeling of Li-ion battery packs. The optimization considers circuits based on 1RC, 2RC and 3RC networks, with a dependent voltage source and a series resistor. The parameters are obtained experimentally using tests in the time domain and in the frequency domain. Due to the highly non-linear behavior of the battery pack, a Genetic Algorithm (GA) was used to solve for and optimize the parameters of each EECM considered (1RC, 2RC and 3RC). The objective of the estimation is to minimize the mean square error between the impedance measured on the real battery pack and that generated by simulating the different proposed circuit models. The results have been verified by comparing the Nyquist plots of the estimated complex impedance of the pack. As a result of the optimization, the 2RC and 3RC circuit alternatives are considered viable representations of the battery behavior. These battery pack models are experimentally validated using a hardware-in-the-loop (HIL) simulation platform that reproduces the well-known New York City Cycle (NYCC) and Federal Test Procedure (FTP) driving cycles for electric vehicles. The results show that GA optimization allows obtaining EECMs with 2RC or 3RC networks with high precision in representing the dynamic behavior of a battery pack in vehicular applications.
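
A minimal sketch of the parameter-identification step for the 2RC EECM, whose impedance is Z(w) = R0 + R1/(1 + jwR1C1) + R2/(1 + jwR2C2). scipy's differential_evolution, an evolutionary optimizer, stands in here for the paper's GA; the bounds and the synthetic "measured" spectrum are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

freqs = np.logspace(-2, 3, 60)               # Hz
w = 2 * np.pi * freqs

def z_2rc(p, w):
    r0, r1, c1, r2, c2 = p
    return (r0 + r1 / (1 + 1j * w * r1 * c1)
               + r2 / (1 + 1j * w * r2 * c2))

true_p = [0.01, 0.02, 500.0, 0.015, 5000.0]
z_meas = z_2rc(true_p, w)                    # stand-in for measured impedance

def mse(p):
    return np.mean(np.abs(z_2rc(p, w) - z_meas) ** 2)   # complex MSE

bounds = [(1e-4, 0.1), (1e-4, 0.1), (1, 1e4), (1e-4, 0.1), (1, 1e5)]
result = differential_evolution(mse, bounds, seed=0, tol=1e-12)
print("estimated [R0, R1, C1, R2, C2]:", result.x)
```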

Keywords: Li-ion battery packs modeling optimized, EECM, GA, electric vehicle applications

Procedia PDF Downloads 123
35605 Data Collection Techniques for Robotics to Identify the Facial Expressions of Traumatic Brain Injured Patients

Authors: Chaudhary Muhammad Aqdus Ilyas, Matthias Rehm, Kamal Nasrollahi, Thomas B. Moeslund

Abstract:

This paper presents an investigation of the data collection procedures associated with robots placed with traumatic brain injured (TBI) patients for rehabilitation purposes through facial expression and mood analysis. Rehabilitation after TBI is crucial due to the nature of the injury and the variation in recovery time. It is advantageous to analyze these emotional signals in a contactless manner due to the non-supportive behavior of patients, limited muscle movements, and an increase in negative emotional expressions. This work aims at developing a framework in which robots can recognize TBI patients' emotions through facial expressions in order to perform rehabilitation tasks through physical, cognitive or interactive activities. The results of these studies show that, with customized data collection strategies, the proposed framework identifies facial and emotional expressions more accurately, which can be utilized to enhance recovery treatment and social interaction in a robotic context.
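
An illustrative sketch of the CNN-LSTM architecture named in the keywords, applied to frame sequences for facial expression recognition: a small convolutional encoder applied per frame (TimeDistributed), an LSTM over the frame features, and a softmax over expression classes. The input shape and class count are assumptions, not the paper's configuration.

```python
from tensorflow.keras import layers, models

N_FRAMES, H, W, C, N_CLASSES = 16, 64, 64, 1, 6   # assumed input and classes

model = models.Sequential([
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu"),
                           input_shape=(N_FRAMES, H, W, C)),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Conv2D(64, 3, activation="relu")),
    layers.TimeDistributed(layers.GlobalAveragePooling2D()),
    layers.LSTM(128),                      # temporal dynamics across frames
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```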

Keywords: computer vision, convolution neural network- long short term memory network (CNN-LSTM), facial expression and mood recognition, multimodal (RGB-thermal) analysis, rehabilitation, robots, traumatic brain injured patients

Procedia PDF Downloads 155
35604 Dynamic Analysis of the Heat Transfer in the Magnetically Assisted Reactor

Authors: Tomasz Borowski, Dawid Sołoducha, Rafał Rakoczy, Marian Kordas

Abstract:

The application of magnetic fields is essential for a wide range of technologies and processes (e.g., magnetic hyperthermia, bioprocessing). From a practical point of view, bioprocess control is often limited to the regulation of temperature at constant values favourable to microbial growth. The main aim of this study is to determine the effect of various types of electromagnetic fields (i.e., static or alternating) on heat transfer in a self-designed magnetically assisted reactor. The experimental set-up is equipped with a measuring instrument that controls the temperature of the liquid inside the container and supervises the real-time acquisition of all the experimental data coming from the sensors. Temperature signals are also sampled from the generator of the magnetic field. The obtained temperature profiles were mathematically described and analyzed. The parameters characterizing the response of a first-order dynamic system to a step input were obtained and discussed. For example, a higher time constant means a slower increase of the signal (in this case, temperature). After a period equal to about five time constants, the sample temperature nearly reaches its asymptotic value. This dynamical analysis allowed us to understand the heating effect under the action of various types of electromagnetic fields. Moreover, the proposed mathematical description can be used to compare the influence of different types of magnetic fields on heat transfer operations.
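
A minimal sketch of the dynamical analysis described above: fit the first-order step response T(t) = T0 + K(1 - exp(-t/tau)) to a recorded temperature trace and read off the time constant. The trace below is synthetic; after about five time constants the temperature has essentially reached its asymptote T0 + K.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, t0, gain, tau):
    return t0 + gain * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 600, 121)                       # s
temp = first_order(t, 22.0, 8.0, 90.0) + np.random.normal(0, 0.05, t.size)

(t0, gain, tau), _ = curve_fit(first_order, t, temp, p0=[20, 5, 60])
print(f"T0={t0:.2f} C, K={gain:.2f} C, tau={tau:.1f} s")
print(f"~settled after 5*tau = {5 * tau:.0f} s")
```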

Keywords: heat transfer, magnetically assisted reactor, dynamical analysis, transient function

Procedia PDF Downloads 171
35603 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

An early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is delayed result reporting: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carrousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carrousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed via a 50% HPGe detector inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters, so that the device can run autonomously for several months, depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide. The prototyped device proved able to detect atmospheric contamination at the level of mBq/m³ for an 8 h sampling period.

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 148
35602 A Predictive Model of Supply and Demand in the State of Jalisco, Mexico

Authors: M. Gil, R. Montalvo

Abstract:

Business Intelligence (BI) has become a major source of competitive advantage for firms around the world. BI has been defined as the process of data visualization and reporting for understanding what happened and what is happening. Moreover, BI has been studied for its predictive capabilities in the context of trade and financial transactions. The current literature has identified that BI permits managers to identify market trends, understand customer relations, and predict demand for their products and services. This last capability of BI has been of special concern to academics, specifically due to its power to build predictive models adaptable to specific time horizons and geographical regions. However, the existing literature on BI focuses on predicting specific markets and industries because the impact of such predictive models has been relevant to specific industries or organizations. To date, the literature has not developed a predictive BI model that takes into consideration the whole economy of a geographical area. This paper seeks to create a predictive BI model that shows the bigger picture of a geographical area. It uses a dataset from the Secretary of Economic Development of the state of Jalisco, Mexico, which includes data from all the commercial transactions that occurred in the state in recent years. By analyzing this dataset, it is possible to generate a BI model that predicts supply and demand for specific industries around the state of Jalisco. This research makes at least three contributions. Firstly, a methodological contribution to the BI literature by generating the predictive supply and demand model. Secondly, a theoretical contribution to the current understanding of BI: the model presented in this paper incorporates the whole picture of the economic field instead of focusing on a specific industry. Lastly, a practical contribution that might be relevant to local governments seeking to improve their economic performance by implementing BI in their policy planning.
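
A minimal sketch of the forecasting idea: aggregate transaction records into a monthly demand series per industry and regress next month's demand on lagged values. The synthetic series and industry names below are hypothetical stand-ins for the Jalisco transaction dataset, and a simple lagged linear regression stands in for whatever model the paper builds.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic monthly demand per industry (stand-in for aggregated transactions).
idx = pd.period_range("2015-01", "2020-12", freq="M")
n = len(idx)
rng = np.random.default_rng(1)
monthly = pd.DataFrame({
    "retail": 100 + 0.5 * np.arange(n)
              + 10 * np.sin(np.arange(n) * 2 * np.pi / 12)
              + rng.normal(0, 2, n),
    "manufacturing": 80 + 0.3 * np.arange(n) + rng.normal(0, 3, n),
}, index=idx)

def forecast_next(series, n_lags=6):
    df = pd.concat({f"lag{k}": series.shift(k) for k in range(1, n_lags + 1)},
                   axis=1).assign(y=series).dropna()
    model = LinearRegression().fit(df.drop(columns="y"), df["y"])
    latest = series.iloc[-n_lags:][::-1].to_numpy().reshape(1, -1)
    return model.predict(latest)[0]

for industry in monthly.columns:
    print(industry, "next-month demand ~", round(forecast_next(monthly[industry]), 1))
```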

Keywords: business intelligence, predictive model, supply and demand, Mexico

Procedia PDF Downloads 123
35601 Runtime Monitoring Using Policy-Based Approach to Control Information Flow for Mobile Apps

Authors: Mohamed Sarrab, Hadj Bourdoucen

Abstract:

Mobile applications are verified to check their correctness or evaluated to check their performance with respect to specific security properties such as availability, integrity, and confidentiality. Ensuring these properties before applications are made available to end users is achievable only to a limited degree using static software engineering verification techniques. The more sensitive the information being processed by a mobile application, such as credit card data, personal medical information or personal emails, the more important it is to ensure its confidentiality. Monitoring a non-trusted mobile application during execution in an environment where sensitive information is present is difficult and unnerving. This paper addresses the issue of monitoring and controlling the flow of confidential information during the execution of non-trusted mobile applications. The approach concentrates on providing a dynamic and usable information security solution by interacting with mobile users during the run-time of the mobile application in response to information flow events.
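
A conceptual sketch of the run-time information-flow gate described above: values carry confidentiality labels, and any flow of a confidential value to a non-trusted sink is intercepted so the user can allow or deny it at run time. This is an illustration of the policy-based idea, not the paper's implementation; the whitelist and sink names are hypothetical.

```python
TRUSTED_HOSTS = {"backup.local"}           # hypothetical policy whitelist

class Labeled:
    def __init__(self, value, confidential=False):
        self.value, self.confidential = value, confidential

def user_decides(event: str) -> bool:      # run-time interaction with the user
    return input(f"Allow flow: {event}? [y/N] ").strip().lower() == "y"

def send_to_network(data: Labeled, host: str):
    if data.confidential and host not in TRUSTED_HOSTS:
        if not user_decides(f"confidential data -> {host}"):
            raise PermissionError(f"flow to {host} blocked by user policy")
    print(f"sending to {host}: {data.value}")    # stand-in for real I/O

card = Labeled("4111 1111 1111 1111", confidential=True)
send_to_network(card, "analytics.example.com")   # triggers a run-time prompt
```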

Keywords: mobile application, run-time verification, usable security, direct information flow

Procedia PDF Downloads 381
35600 Experimental Study on Ultrasonic Shot Peening Forming and Surface Properties of AALY12

Authors: Shi-hong Lu, Chao-xun Liu, Yi-feng Zhu

Abstract:

Ultrasonic shot peening (USP) of AALY12 sheet was studied. Several parameters (arc height, surface roughness, surface topography and microhardness) were measured under different USP process parameters. The research shows that the radius of curvature of the shot-peened sheet increases as processing time and electric current decrease, while it increases with increasing pin diameter, and the radius of curvature reaches a saturation level after a specific processing time and electric current. An empirical model of the relationship between the radius of curvature and pin diameter, electric current and time was also obtained. The research shows that the increase in the surface and vertical microhardness of the material is more pronounced with longer time and higher electric current, reaching up to 20% and 28%, respectively.

Keywords: USP forming, surface properties, radius of curvature, residual stress

Procedia PDF Downloads 517
35599 Research on Level Adjusting Mechanism System of Large Space Environment Simulator

Authors: Han Xiao, Zhang Lei, Huang Hai, Lv Shizeng

Abstract:

A space environment simulator is a device for spacecraft testing. The KM8 large space environment simulator built in Tianjin Space City is the largest as well as the most advanced space environment simulator in China. A large deviation in spacecraft level will lead to abnormal operation of the thermal control devices in the spacecraft during the thermal vacuum test. In order to avoid thermal vacuum test failure, a level adjusting mechanism system was developed in the KM8 large space environment simulator as one of its most important subsystems. According to the level adjusting requirements of spacecraft thermal vacuum tests, a four-fulcrum adjusting model is established. By collecting data from level instruments and displacement sensors, stepping motors controlled by a PLC drive the four supporting legs in simultaneous movement. In addition, a PID algorithm is used to control the temperature of the supporting legs and level instruments, which work for long periods in the vacuum, cold and black environment of the KM8 large space environment simulator during thermal vacuum tests. Based on the above methods, the data acquisition and processing, analysis and calculation, real-time adjustment, and fault alarming of the level adjusting mechanism system are implemented. The level adjusting accuracy reaches 1 mm/m, and the carrying capacity is 20 tons. Debugging showed that the level adjusting mechanism system of the KM8 large space environment simulator can meet the thermal vacuum test requirements of the new generation of spacecraft. The performance and technical indicators of the level adjusting mechanism system, which provides important support for the development of spacecraft in China, are ahead of similar equipment in the world.
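
A minimal sketch of the discrete PID temperature loop described above, as applied to a supporting leg that starts cold in the vacuum chamber. The gains, setpoint, and the toy thermal model are illustrative, not the simulator's tuned values.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid, temp, dt = PID(kp=8.0, ki=0.4, kd=1.0, dt=1.0), -40.0, 1.0
for step in range(300):                   # leg starts cold in the chamber
    heater_power = max(0.0, min(100.0, pid.update(20.0, temp)))   # clamp 0-100%
    temp += dt * (0.02 * heater_power - 0.01 * (temp - (-60.0)))  # toy thermal model
print(f"temperature after {step + 1} s: {temp:.1f} C")
```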

Keywords: space environment simulator, thermal vacuum test, level adjusting, spacecraft, parallel mechanism

Procedia PDF Downloads 247
35598 Assessing the Self-Directed Learning Skills of the Undergraduate Nursing Students in a Medical University in Bahrain: A Quantitative Study

Authors: Catherine Mary Abou-Zaid

Abstract:

This quantitative study discusses concerns with the self-directed learning (SDL) skills of undergraduate nursing students at a medical university in Bahrain. The study was conducted across all 4 years of the nursing undergraduate programme, compiling data collected from the students themselves by survey questionnaire. The aim of the study is to understand and change the attitudes toward self-directed learning among the undergraduate students. The SDL skills of the undergraduate student nurses have been noticed to be lacking, and motivation to perform without supervision outside the classroom is very low. Their use of the resources available in the virtual learning environment and within the university is not as good as it should be for university students at this level, and they do not use these resources to their own advantage. They are not prepared for the transition from high school to an academic environment such as a university or college. For some students, it is the first time in their academic lives that they have shared a classroom with the opposite sex; for some this is a major issue, and we as academics need to be aware of all the issues they bring to higher education. Design Methodology: The methodology chosen was a quantitative design using convenience sampling of students who were asked to complete a survey questionnaire. This sampling method was chosen because of the time constraint. The questionnaire was completed by the undergraduate students themselves while in class, analyzed with the Statistical Package for the Social Sciences (SPSS), the results interpreted by the researcher, and the findings published in this paper. The analyzed data will also be reported on, and from this information we as educators will be able to see the students' weaknesses regarding self-directed learning. The aims and objectives of the research will be used as recommendations for improving resources to help students develop their SDL skills. Conclusion: The results will give educators insight into how to change the self-directed learning techniques of the students and enable them to embrace these skills and focus more on being self-directed in their studies, rather than having to be put onto an SDL pathway by the educators themselves. This evidence will come from the analysis of the statistical data. It may even change the way students are selected for the nursing programme. These recommendations will be reported to the head of school and to the nursing faculty.

Keywords: self-directed learning, undergraduate students, transition, statistical package for social sciences (SPSS), higher education

Procedia PDF Downloads 315
35597 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when applied to large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
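
A minimal sketch of the kind of change needed to adapt CART for big data: the best-split search, CART's dominant cost, is evaluated per feature in parallel workers, and the split with the lowest weighted Gini impurity wins. A full extension would also distribute the data itself; this only illustrates the parallelization idea, not the paper's algorithm.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from functools import partial

def gini(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split_for_feature(j, X, y):
    col, best = X[:, j], (np.inf, None)
    for thr in np.unique(col)[:-1]:           # candidate thresholds
        left = col <= thr
        w = left.mean()
        score = w * gini(y[left]) + (1 - w) * gini(y[~left])
        if score < best[0]:
            best = (score, thr)
    return j, best

def parallel_best_split(X, y, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(partial(best_split_for_feature, X=X, y=y),
                           range(X.shape[1]))
    j, (score, thr) = min(results, key=lambda r: r[1][0])
    return j, thr, score

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((10_000, 8))
    y = (X[:, 2] > 0.6).astype(int)           # true split: feature 2 at 0.6
    print(parallel_best_split(X, y))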

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 142