Search results for: operational data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26150

25310 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
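
For illustration only, a minimal sketch of the general idea (the colour scale and gauge readings below are assumptions, not the authors' procedure): pixel values sampled from a rainfall map are converted to rainfall amounts via an interpolated colour-scale lookup, and the approximation is scored with RMSE.

```python
import numpy as np

# Hypothetical colour scale: representative pixel grey values -> rainfall (mm/day)
SCALE_PIXELS = np.array([250, 200, 150, 100, 50], dtype=float)
SCALE_RAIN = np.array([0.0, 5.0, 20.0, 50.0, 90.0])

def pixels_to_rainfall(pixels):
    """Interpolate rainfall amounts from map pixel values."""
    order = np.argsort(SCALE_PIXELS)  # np.interp needs ascending x values
    return np.interp(pixels, SCALE_PIXELS[order], SCALE_RAIN[order])

def rmse(estimated, observed):
    return float(np.sqrt(np.mean((np.asarray(estimated) - np.asarray(observed)) ** 2)))

# Toy example: pixel values sampled at gauge locations vs. gauge readings
pixels_at_gauges = np.array([240, 180, 120, 60])
gauge_mm = np.array([1.0, 9.5, 33.0, 78.0])

est = pixels_to_rainfall(pixels_at_gauges)
print("estimated:", est.round(1), "RMSE:", round(rmse(est, gauge_mm), 3))
```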

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 388
25309 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel

Authors: F. M. Pisano, M. Ciminello

Abstract:

Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies in terms of maintenance cost reduction and operational life improvements, as well as for safety purposes. The challenge is to detect so-called “barely visible impact damage” (BVID), due to low/medium energy impacts, that can progressively compromise the structure's integrity. The occurrence of any local change in material properties that can degrade the structure's performance is to be monitored using so-called Structural Health Monitoring (SHM) systems, which are in charge of comparing the structure's states before and after damage occurs. SHM searches for any "anomalous" response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables, and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structure's health conditions examined by a Principal Component Analysis (PCA) based algorithm.
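
As a hedged illustration of the kind of PCA-based screening such dashboards are built on (the sensor layout, baseline data, and threshold rule below are assumptions, not the authors' algorithm), a reading is flagged when its reconstruction error against the healthy subspace exceeds a baseline percentile:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Baseline: strain readings from a network of 8 fibre-optic sensors, healthy panel
baseline = rng.normal(0.0, 1.0, size=(500, 8))
pca = PCA(n_components=3).fit(baseline)

def reconstruction_error(samples):
    """Distance between a reading and its projection onto the healthy subspace."""
    recon = pca.inverse_transform(pca.transform(samples))
    return np.linalg.norm(samples - recon, axis=1)

# Flag readings whose residual exceeds the 99th percentile of baseline residuals
threshold = np.percentile(reconstruction_error(baseline), 99)

new_reading = rng.normal(0.0, 1.0, size=(1, 8))
new_reading[0, 2] += 6.0  # simulated local anomaly near sensor 3
print("outlier" if reconstruction_error(new_reading)[0] > threshold else "healthy")
```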

Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics

Procedia PDF Downloads 125
25308 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. We investigate different distributions of output measurements from several dynamical systems. By processing the variance of the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as the variance, on identification, and the limitations of this approach, are explained.
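
A minimal sketch of the variance-processing step, assuming a sliding-window variance and a synthetic system output (not the paper's data or exact procedure): the region where local variance departs from the baseline is flagged as nonlinear.

```python
import numpy as np

def rolling_variance(y, window=50):
    """Variance of y over a sliding window (same length as y, edges padded)."""
    pad = window // 2
    ypad = np.pad(y, pad, mode="edge")
    return np.array([ypad[i:i + window].var() for i in range(len(y))])

# Toy system output: linear regime, then a flatter but much noisier regime
u = np.linspace(0, 10, 1000)
rng = np.random.default_rng(1)
trend = np.where(u < 6, 2.0 * u, 12.0 + np.tanh(u - 6))
y = trend + rng.normal(0, np.where(u < 6, 0.1, 0.8))

var = rolling_variance(y)
nonlinear_region = np.where(var > 3 * np.median(var))[0]
if nonlinear_region.size:
    print("nonlinearity flagged for samples",
          nonlinear_region.min(), "to", nonlinear_region.max())
```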

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 516
25307 Who Am I at Work: Work Identity Formation

Authors: Carol Belle-Hallsworth

Abstract:

Human interaction at work evolves over time and, with it, work identity. Social identity is built upon the development of its underpinning and preceding stages. Work identity can be viewed in the same way and will shift based on changes in the work environment and challenges to the work identity (threats to the four stages). This paper provides an analysis of how the stages of trust, autonomy, industry, and initiative are related to employee identity at work, describing how they are related to each other and to the development of identity. It has become common to notice changes in employee behavior during and after major operational changes in an organization. Previous studies suggest that there are emotional triggers that result in the new behaviors displayed. This study seeks to test a theoretical model by testing the relationships between the first four Erikson stages as constructs. A randomized sample of participants undertook a self-administered survey to capture information on trust, autonomy, initiative, and industry.

Keywords: work identity, change management, organizational management, technology implementation

Procedia PDF Downloads 307
25306 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on-premises or in the cloud, users can leverage the power of R at scale without having to move their data around.

Keywords: predictive maintenance, machine learning, big data, cloud-based, on-premises solution, R

Procedia PDF Downloads 379
25305 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry among the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are verified using various interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and show potential research areas.

Keywords: trust, data mining, CRISP-DM, stakeholder management

Procedia PDF Downloads 94
25304 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for the secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay, and destination nodes. Big data has to be transmitted from source to relay and from relay to destination, with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selection of the data transmitting region, segmenting the selected region, determining the probability ratio for each node (capture node, non-capture node, and eavesdropper node) in every segment, and evaluating the probability using binary-based evaluation. If the transmission is secure, it resumes with the two-hop transmission of big data; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted over the two hops.
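
The decision flow can be sketched as follows; the per-segment eavesdropper-ratio test below is a deliberately simplified stand-in for the paper's binary-based probability evaluation, and all node counts and thresholds are assumptions:

```python
import random

random.seed(7)

# Hypothetical nodes in the selected transmission region, one role per node
ROLES = ["capture", "non-capture", "eavesdropper"]

def make_segment(n_nodes):
    return [random.choices(ROLES, weights=[5, 4, 1])[0] for _ in range(n_nodes)]

def segment_secure(segment, threshold=0.15):
    """Binary evaluation: a segment passes if its eavesdropper ratio is low."""
    ratio = segment.count("eavesdropper") / len(segment)
    return ratio <= threshold

segments = [make_segment(20) for _ in range(5)]  # segmented transmission region

for i, seg in enumerate(segments):
    if segment_secure(seg):
        print(f"segment {i}: secure -> resume two-hop transmission")
    else:
        print(f"segment {i}: at risk -> apply cooperative jamming, then transmit")
```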

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 491
25303 The Effect of Culture and Managerial Practices on Organizational Leadership Towards Performance

Authors: Anyia Nduka, Aslan Bin Amad Senin, Ayu Azrin Bte Abdul Aziz

Abstract:

A management practice characterised by a value chain and a relatively flexible culture is replacing the old bureaucratic model of organisational practice that was built on dominance. Using a management practice fruition paradigm, the study delves into the implications of organisational culture and leadership, developing a theory of leadership called the “cultural model” of organisational leadership by explaining how the shift from bureaucracy to management practices altered the roles and interactions of leaders. This model is well-grounded in leadership theory, considering the concept's adaptability to different leadership ideologies. It applies in organisations where operational procedures and borders are not clearly defined, hierarchies are flattened, and work collaborations are sometimes based on contracts rather than employment. This cultural model of organizational leadership is intended to be a useful tool for predicting how effectively a leader will perform.

Keywords: leadership, organizational culture, management practices, efficiency

Procedia PDF Downloads 84
25302 Modelling of the Fire Pragmatism in the Area of Military Management and Its Experimental Verification

Authors: Ivana Mokrá

Abstract:

The article deals with modelling of fire pragmatism in the area of military management and its experimental verification. Potential approaches are based on the synergy of mathematical and theoretical ideas, operational and tactical requirements, and the military decision-making process. This issue has taken on importance in recent times, particularly with the increasing digitization of the battlefield, the development of C4ISR systems, and the intention to streamline the command and control process at the lowest levels of command. From a fundamental and philosophical point of view, these new approaches seek to significantly upgrade and enhance the decision-making process of tactical commanders.

Keywords: military management, decision-making process, strike modeling, experimental evaluation, pragmatism, tactical strike modeling

Procedia PDF Downloads 390
25301 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These devices make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users but is mostly in an unreadable format which needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data. The data is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs to accomplish this, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place. Because of this, it decreases the overall performance of the database server and therefore of the system. This paper follows on from a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques in order to establish the impact this may have on CPU performance, storage, and processing time.
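
A minimal sketch of the three-step pull-process-push variant the paper builds on, using SQLite and a plain Python list as the "array list" (the table layout and hex-encoded payloads are assumptions): the database is kept busy only while pulling and pushing, not while processing.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO raw (payload) VALUES (?)",
                 [("4a4f484e",), ("4d415259",)])  # hex-encoded device payloads
conn.commit()

# Step 1: PULL - read everything into an in-memory list, releasing the table quickly
rows = conn.execute("SELECT id, payload FROM raw").fetchall()

# Step 2: PROCESS - decode in memory; the database is free for other clients here
decoded = [(bytes.fromhex(p).decode("ascii"), i) for i, p in rows]

# Step 3: PUSH - write the readable form back in one batch
conn.execute("ALTER TABLE raw ADD COLUMN readable TEXT")
conn.executemany("UPDATE raw SET readable = ? WHERE id = ?", decoded)
conn.commit()

print(conn.execute("SELECT id, readable FROM raw").fetchall())
```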

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 245
25300 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw data and simulated data) and two models (stationary and non-stationary) of the GEV distribution, return level analysis is carried out. It was found that in the stationary model, the return values are constant over time with the raw data, while with the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature beyond which there is no exceedance. The results of this paper are vital for agricultural and environmental research.
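
For reference, the block-maxima workflow can be sketched with SciPy on synthetic annual maxima (a stationary fit only; the data are not the CDC series): the T-year return level is the (1 - 1/T) quantile of the fitted GEV.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic annual maximum temperatures (degrees C), one block maximum per year
annual_max = 33 + 2.5 * rng.gumbel(size=40)

# Stationary GEV fit; note SciPy's shape parameter c is the negated GEV shape
c, loc, scale = genextreme.fit(annual_max)

for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.2f} C")
```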

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 482
25299 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data based on its location in memory has received much attention in recent years due to its different properties, which offer important aspects for cache utilization. Stack data and non-stack data may interfere with each other’s locality in the data cache. One of the important aspects of stack data is that it has high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate in different caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, more than 99% stack cache accuracy is achieved using 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
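
The split-cache idea can be prototyped without SimpleScalar; the sketch below, with made-up address streams and a direct-mapped model, routes stack and non-stack accesses to two separate caches and reports their hit rates.

```python
import random

class DirectMappedCache:
    def __init__(self, size_bytes, block_bytes=32):
        self.n_sets = size_bytes // block_bytes
        self.block = block_bytes
        self.tags = [None] * self.n_sets
        self.hits = self.accesses = 0

    def access(self, addr):
        block_no = addr // self.block
        index, tag = block_no % self.n_sets, block_no // self.n_sets
        self.accesses += 1
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.tags[index] = tag  # miss: fill the line

    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0

STACK_BASE = 0x7FFF_0000
stack_cache = DirectMappedCache(2 * 1024)   # 2KB, 1-way, as in the study
data_cache = DirectMappedCache(16 * 1024)

# Made-up trace: stack accesses cluster near the stack pointer (high locality)
random.seed(0)
trace = [STACK_BASE - random.randrange(0, 256) for _ in range(9000)]   # stack
trace += [random.randrange(0, 1 << 20) for _ in range(1000)]           # heap/global

for addr in trace:
    (stack_cache if addr >= STACK_BASE - 4096 else data_cache).access(addr)

print(f"stack cache hit rate: {stack_cache.hit_rate():.3f}")
print(f"non-stack cache hit rate: {data_cache.hit_rate():.3f}")
```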

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 304
25298 Utilization of Process Mapping Tool to Enhance Production Drilling in Underground Metal Mining Operations

Authors: Sidharth Talan, Sanjay Kumar Sharma, Eoin Joseph Wallace, Nikita Agrawal

Abstract:

Underground mining is at the core of the rapidly evolving metals and minerals sector due to increasing mineral consumption globally. Even though surface mines are still more abundant on earth, the scales of the industry are slowly tipping towards underground mining due to the rising depth and complexity of orebodies. Thus, the efficient and productive functioning of underground operations depends significantly on the synchronized performance of key elements such as the operating site, mining equipment, manpower, and mine services. Production drilling is the process of conducting long-hole drilling for the purpose of charging and blasting these holes for the production of ore in underground metal mines. Thus, production drilling is a crucial segment in the underground metal mining value chain. This paper presents a process mapping tool to evaluate the production drilling process in underground metal mining operations by dividing the given process into three segments, namely input, process, and output. The three segments are further segregated into factors and sub-factors. As per the study, the major input factors crucial for the efficient functioning of the production drilling process are power, drilling water, geotechnical support of the drilling site, skilled drilling operators, a services installation crew, oils and drill accessories for the drilling machine, survey markings at the drill site, proper housekeeping, regular maintenance of the drill machine, suitable transportation for reaching the drilling site, and proper ventilation. The major outputs of the production drilling process are ore, waste as a result of dilution, timely reporting and investigation of unsafe practices, optimized process time, and well-fragmented blasted material within the specifications set by the mining company. The paper also exhibits the drilling loss matrix, which is utilized to appraise the loss in planned production meters per day in a mine on account of availability loss in the machine due to breakdowns, underutilization of the machine, and productivity loss in the machine, measured in drilling meters per unit of percussion hour with respect to its planned productivity for the day. These three losses are essential to detect the bottlenecks in the process map of the production drilling operation, so as to instigate action plans to suppress or prevent the causes leading to operational performance deficiency. The given tool is beneficial to mine management to focus on the critical factors negatively impacting the production drilling operation and to design the necessary operational and maintenance strategies to mitigate them.
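
The drilling loss matrix reduces to simple arithmetic; a worked sketch under assumed daily figures (not the paper's data) shows how the planned-meter shortfall decomposes into availability, utilization, and productivity losses.

```python
# Assumed daily plan for one production drill
planned_hours = 10.0          # planned percussion hours per day
planned_rate = 30.0           # planned drilling meters per percussion hour
planned_meters = planned_hours * planned_rate  # 300 m/day

# Observed figures for the day
breakdown_hours = 1.5         # availability loss (machine down)
idle_hours = 1.0              # utilization loss (available but not drilling)
actual_rate = 26.0            # actual meters per percussion hour

availability_loss_m = breakdown_hours * planned_rate                  # 45 m
utilization_loss_m = idle_hours * planned_rate                        # 30 m
drilling_hours = planned_hours - breakdown_hours - idle_hours         # 7.5 h
productivity_loss_m = drilling_hours * (planned_rate - actual_rate)   # 30 m

actual_meters = drilling_hours * actual_rate
print(f"planned {planned_meters:.0f} m, drilled {actual_meters:.0f} m")
print(f"losses: availability {availability_loss_m:.0f} m, "
      f"utilization {utilization_loss_m:.0f} m, "
      f"productivity {productivity_loss_m:.0f} m")
```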

Keywords: process map, drilling loss matrix, SIPOC, productivity, percussion rate

Procedia PDF Downloads 216
25297 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, there is a concern that data should be accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever the data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing is modelled on computer systems after taking inspiration from the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from execution and, in case a malicious transaction affects the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 505
25296 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts of search engines into useful information. This represents a new method of data gathering compared with traditional methods. When a query is submitted for multiple keywords, it takes a long time and effort; hence, we developed a user interface program that searches automatically by taking multiple keywords at the same time, leaving the program to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 430
25295 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and among the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques that are used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it; steganography, however, hides that data in some cover file so that the presence of communication is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques. These techniques could be used in banks, intelligence agencies such as RAW, etc., where highly confidential data is transferred. Finally, comparisons of the two techniques are also given in tabular form.
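
The image-steganography half of the pipeline is straightforward to sketch with NumPy (the authors' implementation is in MATLAB; here the ciphertext is a placeholder byte string standing in for DES or RSA output): the payload is hidden in the least significant bits of a cover image.

```python
import numpy as np

def embed_lsb(image, payload):
    """Hide payload bytes in the least significant bits of a grayscale image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = image.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(image.shape)

def extract_lsb(image, n_bytes):
    bits = image.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(3).integers(0, 256, size=(64, 64), dtype=np.uint8)
ciphertext = b"\x8f\x11\xa0\x42placeholder-cipher-output"  # stands in for DES/RSA

stego = embed_lsb(cover.copy(), ciphertext)
assert extract_lsb(stego, len(ciphertext)) == ciphertext
print("payload recovered; max pixel change:",
      int(np.abs(stego.astype(int) - cover).max()))  # at most 1 grey level
```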

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 291
25294 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is collecting, storing, and analysing consumers’ data in order to use the data that is generated for profits, revenue, etc. Data monetisation enables e-commerce companies to gain better business opportunities, innovative products and services, and a competitive edge over others, and to generate millions in revenue. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of the data of e-commerce consumers. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of The Constitution of India. The Supreme Court of India recognized the right to privacy as a fundamental right in the landmark judgment of Justice K.S. Puttaswamy (Retd.) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals’ right to privacy by using the data they collect and store for economic gain and monetisation, and the issue of protection of that data. The researcher has mainly focused on e-commerce companies, such as online shopping websites, to analyse the legal issue of data monetisation. In the age of the Internet of Things, people have shifted to online shopping as it is convenient, easy, flexible, comfortable, time-saving, etc. But at the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or generating more data from the data stored with them. This violates individuals’ right to privacy because consumers do not know what happens to their data when they provide it online. Many times, data is also collected without the consent of individuals. Data can be structured, unstructured, etc., and is used by analytics for monetisation. Indian legislation, such as The Information Technology Act, 2000, does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill can make a huge impact on data monetisation. The paper further studies the European Union General Data Protection Regulation and how this legislation can be helpful in the Indian scenario concerning e-commerce businesses with respect to data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 120
25293 Design and Control Algorithms for Power Electronic Converters for EV Applications

Authors: Ilya Kavalchuk, Mehdi Seyedmahmoudian, Ben Horan, Aman Than Oo, Alex Stojcevski

Abstract:

The power electronic components within electric vehicles (EVs) need to operate in several important modes. Some modes directly influence safety, while others influence vehicle performance. Given the variety of functions and operational modes required of the power electronics, they need to meet efficiency requirements to minimize power losses. Another challenge in the control and construction of such systems is the ability to support bidirectional power flow. This paper considers the construction, operation, and feasibility of available converters for electric vehicles with feasible configurations of electrical buses and loads. The paper describes the logic and control signals for the converters under different operating conditions, based on efficiency and energy usage.

Keywords: electric vehicles, electrical machines control, power electronics, power flow regulation

Procedia PDF Downloads 561
25292 QIP: Introducing a Dedicated Ozurdex Clinic

Authors: Vaisnavy Govindasamy, Saba Ishrat

Abstract:

Introduction: The Dexamethasone Intravitreal Implant 0.7 mg (Ozurdex™, Allergan®) is a biodegradable corticosteroid implant approved by the FDA for managing diabetic macular edema (DMO), macular edema following branch retinal vein occlusion (BRVO) or central retinal vein occlusion (CRVO), and posterior segment non-infectious uveitis. This implant can release dexamethasone over a six-month period, exhibiting peak effectiveness between 60 and 90 days post-administration. The intravitreal injection should be performed under sterile conditions. At James Cook University Hospital (JCUH), Ozurdex injections are currently administered in the Vitreo-Retinal (VR) theatre. This study aimed to evaluate the feasibility and potential advantages of establishing a dedicated clinic for Ozurdex administration separate from the VR theatre setting. Method: Retrospectively, data on all Ozurdex injections administered between October 2021 and October 2022 was collected from operating theatre registers at JCUH. Data pertaining to the indications for Ozurdex, waiting times from referral date to date of injection, duration of theatre time consumed, and post-injection complications were collected from electronic notes. The resources needed to establish a dedicated Ozurdex clinic were evaluated. Over a six-month period from October 2023 to March 2024, we gathered data on the utilization of theatre 28. Results: A total of 135 Ozurdex injections were administered. Among the indications, uveitis represented 47.3% of cases, DMO 23.6%, and RVO 22.9%; the remaining cases lacked sufficient data. Each Ozurdex injection procedure consumed 15 minutes on the VR theatre list. Complications arose in 5% of injections, totalling 7 cases. These included glaucoma, ocular hypertension, subconjunctival haemorrhage, and implant migration. Waiting times averaged 6 weeks from date of referral to procedure date. We also found that, on average, theatre 28 was offered but remained unused for 4 days, totalling eight sessions a month. Analysis: Establishing a separate Ozurdex clinic would improve the quality of patient care in the following ways: 1. Decrease injection waiting times (currently averaging 6 weeks), leading to better visual outcomes. 2. Free up approximately three hours of theatre time in Vitreo-Retinal theatres each month, allowing for 3-4 additional surgeries, reducing waiting times for critical retinal surgeries and enhancing visual outcomes. 3. Provide additional training opportunities for trainees and retina fellows, improving their skills. 4. Optimize the use of empty theatre slots (theatre 28) currently experiencing underutilization of resources. Conclusion: These findings support the implementation of a separate clinic for administering Ozurdex injections at JCUH. It is evident that introducing a dedicated clinic will enhance operational efficiency, optimise resource utilisation, and improve the overall quality of care for patients undergoing this treatment.

Keywords: ophthalmology, Ozurdex, efficiency, complications

Procedia PDF Downloads 22
25291 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
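
The label-noise simulation is easy to reproduce in outline (a sketch with synthetic data and scikit-learn, not the authors' clinical corpus): a large fraction of training labels is flipped, yet the model scored against clean labels can exceed the accuracy of its own training labels.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Roughly linearly separable data, then corrupt 40% of the training labels
X, y = make_classification(n_samples=4000, n_features=20,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

noise = rng.random(y_tr.size) < 0.40
y_noisy = np.where(noise, 1 - y_tr, y_tr)

model = LinearSVC(dual=False).fit(X_tr, y_noisy)
print(f"training-label accuracy: {1 - noise.mean():.2f}")
print(f"model accuracy on clean test labels: {model.score(X_te, y_te):.2f}")
```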

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 200
25290 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support the operational decision of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (EDs). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology begins with a review of the technical-scientific literature concerning DSSs to support EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a request. Therefore, all the ED-related issues are excluded and considered part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the transport time and releases the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that considers information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. The next steps of the research consisted of the development of a general simulation architecture, its implementation in the AnyLogic software, and its validation on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the beginning of the clinical evaluation by the doctor in the ED. Finally, two approaches were compared: a static approach, based on a retrospective estimate of the TTP, and a dynamic approach, based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that adopting the minimization of TTP as a hospital selection policy brings several benefits. It significantly reduces service throughput times in the ED with a minimal increase in travel time. Furthermore, it produces an immediate view of the saturation state of the EDs and takes into account the case-mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, the predictive approach is certainly more reliable for TTP estimation than the retrospective approach, but entails a more difficult application. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
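
A stripped-down version of the dynamic policy (simple exponential smoothing stands in for the Winters model, and the EDs, travel times, and update rule are assumptions): dispatch to the ED minimizing predicted TTP, then update the forecast with the observed waiting time.

```python
# Dynamic hospital selection: choose the ED minimizing predicted Time To Provider,
# where predicted TTP = travel time + forecast door-to-doctor waiting time.

TRAVEL_MIN = {"ED_A": 8.0, "ED_B": 14.0, "ED_C": 11.0}   # minutes from scene
ALPHA = 0.3                                              # smoothing constant

# Current smoothed estimate of each ED's waiting time (minutes)
wait_forecast = {"ED_A": 55.0, "ED_B": 20.0, "ED_C": 35.0}

def choose_ed():
    return min(TRAVEL_MIN, key=lambda ed: TRAVEL_MIN[ed] + wait_forecast[ed])

def record_outcome(ed, observed_wait):
    """Update the forecast after handover (simple exponential smoothing)."""
    wait_forecast[ed] += ALPHA * (observed_wait - wait_forecast[ed])

ed = choose_ed()
print("dispatch to", ed)          # ED_B: 14 + 20 = 34 min predicted TTP
record_outcome(ed, observed_wait=28.0)
print("updated forecast for", ed, "=", round(wait_forecast[ed], 1), "min")
```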

Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection

Procedia PDF Downloads 92
25289 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
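
To make the homomorphic-encryption idea concrete, here is a toy additively homomorphic scheme in the Paillier style (textbook construction with demo-sized primes, far too small for real use): the product of two ciphertexts decrypts to the sum of the plaintexts, so an aggregator can total patient readings it cannot read.

```python
import math
import random

# Toy Paillier keypair (demo-sized primes; real deployments use 2048+ bit moduli)
p, q = 1789, 1861
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # valid because g = n + 1 makes L(g^lam mod n^2) = lam mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    u = pow(c, lam, n2)
    return ((u - 1) // n * mu) % n   # L(u) = (u - 1) / n, then multiply by mu

# Homomorphic addition: multiply ciphertexts, decrypt the sum
readings = [120, 135, 128]           # e.g. systolic blood pressure values
total_ct = 1
for m in readings:
    total_ct = (total_ct * encrypt(m)) % n2

assert decrypt(total_ct) == sum(readings)
print("aggregated without decrypting individual readings:", decrypt(total_ct))
```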

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 20
25288 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads

Authors: Dražen Cvitanić, Biljana Maljković

Abstract:

This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed from continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km long section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speed data from the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed on spot speed data. Thus, it can be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.

Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency

Procedia PDF Downloads 452
25287 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications, and they are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-stabilizing mechanism in a dynamic environment without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
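
ART2 itself is involved to implement; as a stand-in for the cluster-then-impute pattern, the sketch below uses k-means (an assumption made for brevity, not the paper's ART2 network): incomplete rows inherit missing values from the centroid nearest on their observed columns.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Two clusters of complete numeric records
complete = np.vstack([
    rng.normal([0, 0, 0], 0.5, size=(100, 3)),
    rng.normal([5, 5, 5], 0.5, size=(100, 3)),
])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(complete)

def impute(row):
    """Fill NaNs from the centroid nearest on the observed columns."""
    observed = ~np.isnan(row)
    dists = np.linalg.norm(km.cluster_centers_[:, observed] - row[observed], axis=1)
    centroid = km.cluster_centers_[np.argmin(dists)]
    return np.where(observed, row, centroid)

incomplete = np.array([4.8, np.nan, 5.2])
print(impute(incomplete).round(2))   # NaN filled from the [5, 5, 5] cluster
```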

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 275
25286 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast

Authors: Ruixia Liu

Abstract:

The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data has its own advantages, and a numerical prediction model that assimilates remote sensing (RS) data will perform better than one that does not. We obtained assimilation data from MHS, terrestrial, and sounding observations through GSI, introduced the result into WRF, and obtained relative humidity (RH) and precipitation forecasts. By comparing the results at 1 h, 6 h, 12 h, and 24 h, we found that assimilating the MHS, terrestrial, and sounding data made the forecast of precipitation amount, area, and center more accurate. Analyzing the difference in the initial field, we found that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.

Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI

Procedia PDF Downloads 234
25285 The Quest for Institutional Independence to Advance Police Pluralism in Ethiopia

Authors: Demelash Kassaye Debalkie

Abstract:

The primary objective of this study is to report the factors that are significantly impeding the Ethiopian police's ability to provide quality services to the people. Policing in Ethiopia started in the medieval period; however, modern policing was introduced in place of vigilantism in the early 1940s. The progress made since the police became modernized is, however, contested when viewed from the standpoint of officers’ development and 21st-century technologies. The police in Ethiopia are struggling to be set free from any form of political interference by the government and to be loyal to impartiality, equity, and justice in enforcing the law. Moreover, the institutional competence of the police in Ethiopia is currently losing the power derived from the constitution as a legitimate enforcement agency, due to the country’s political landscape encouraging ethnic-based politics. According to studies, the impact of ethnic politics has been a significant challenge for the police in controlling conflicts between ethnic groups. The study used qualitative techniques, and data was gathered from purposively selected key informants. The findings indicate that governments in past decades were skeptical about establishing a constitutional police force in the country. This has certainly been one of the challenges of pluralizing the police: building police-community relations based on trust. The study conducted to uncover these obstructions finally reports that the government’s commitment to forming a non-partisan, functionally decentralized, and operationally demilitarized police force is minimal and appalling; governments mainly intend to formulate the missions of the police in accordance with their interests and political will to remain in power. The study therefore reminds policymakers, law enforcement officials, and the government in power to revise the policies and working procedures already in operation, in order to strengthen the police in Ethiopia on the basis of public participation and engagement.

Keywords: community, constitution, Ethiopia, law enforcement

Procedia PDF Downloads 87
25284 Application of Multidimensional Model of Evaluating Organisational Performance in Moroccan Sport Clubs

Authors: Zineb Jibraili, Said Ouhadi, Jorge Arana

Abstract:

Introduction: Organizational performance is recognized by some theorists as a one-dimensional concept and by others as multidimensional. This concept, which is already difficult to apply in traditional companies, is even harder to identify, measure, and manage where voluntary organizations are concerned, essentially because of the complexity of that form of organization; sport clubs in particular are characterized by multiple goals and multiple constituencies. Indeed, the new culture of professionalization and modernization around organizational performance creates new pressures from the state, sponsors, members, and other stakeholders, which have required sport organizations to become more performance-oriented, or to build their capacity in order to better manage their organizational performance. The evaluation of performance can be made by evaluating the input (e.g., available resources), throughput (e.g., processing of the input), and output (e.g., goals achieved) of the organization. In non-profit organizations (NPOs), questions of performance have become increasingly important in the world of practice. To our knowledge, most studies have used the same methods to evaluate performance in non-profit sport organizations (NPSOs), but no recent study has proposed a club-specific model. Based on a review of the studies that specifically addressed the organizational performance (and effectiveness) of NPSOs at the operational level, the present paper aims to provide a multidimensional framework to understand, analyse, and measure the organizational performance of sport clubs. This paper combines all the dimensions found in the literature and selects those best suited to the model we develop for the case of Moroccan sport clubs. Method: We apply our unified model of evaluating organizational performance, which takes into account all the limitations found in the literature, to a sample of Moroccan sport clubs (football, basketball, handball, and volleyball); for this purpose, we use a qualitative study. The sample comprises data from clubs participating in the first division of the professional league over the period from 2011 to 2016. Each club had to meet specific criteria in order to be included in the sample: 1. each club must have full financial data published in its annual financial statements, audited by an independent chartered accountant; 2. each club must have sufficient data regarding its sport and financial performance; 3. each club must have participated at least once in the first division of the professional league. Result: The study showed that the dimensions that constitute the model exist in the field, with some small modifications. The correlations between the different dimensions are positive. Discussion: The aim of this study is to test, for the Moroccan case, the unified model that emerged from earlier and narrower approaches. Using the input-throughput-output model as the sketch of efficiency, it was possible to identify and define five dimensions of organizational effectiveness applied to this field of study.

Keywords: organisational performance, multidimensional model, evaluation of organizational performance, sport clubs

Procedia PDF Downloads 325
25283 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, there are still many organizations that have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM) to look into several constructs of the TAM together with additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study will be tested on the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Empirical research is used in this study by conducting a survey to collect data.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 363
25282 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with actual data of various sizes. Experimental results comparing the performance of Rhipe with the stats and biglm packages based on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing in which the number of map tasks increases with the size of the data. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and that the computing speed of the fully-distributed mode increases with the number of data nodes.

Keywords: big data, Hadoop, Parallel regression analysis, R, Rhipe

Procedia PDF Downloads 498
25281 Security in Resource Constraints Network Light Weight Encryption for Z-MAC

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

A wireless sensor network is formed by a combination of nodes that systematically transmit data to their base stations; this transmitted data can easily be compromised given the limited processing power of these nodes and the need for data consistency, and there is an ongoing discussion about how to achieve secure data transfer in real time. We present a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, by utilizing the battery resources available in the sensor node. Our methodology takes advantage of the Z-MAC protocol for its efficiency, and it provides a unique key through a sharing mechanism that uses the neighbor node's MAC address. We present a lightweight data integrity layer, embedded in the Z-MAC protocol, and show that our protocol performs better than Z-MAC when different attack scenarios are introduced.
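
A sketch of the neighbor-based key idea using standard primitives (HMAC-SHA256 for key derivation, a hash-based XOR keystream as the "lightweight" cipher, and a truncated HMAC as the integrity tag; the addresses, nonce handling, and network secret are assumptions, not the Z-MAC extension itself):

```python
import hmac
import hashlib

def derive_pairwise_key(mac_a: str, mac_b: str, network_secret: bytes) -> bytes:
    """Both neighbors derive the same key: sort the MACs so order doesn't matter."""
    material = "|".join(sorted((mac_a.lower(), mac_b.lower()))).encode()
    return hmac.new(network_secret, material, hashlib.sha256).digest()

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Lightweight keystream cipher: expand key+nonce with SHA-256 and XOR."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

secret = b"pre-shared network secret"
key = derive_pairwise_key("00:1A:2B:3C:4D:5E", "00:1A:2B:3C:4D:5F", secret)

nonce = b"\x00" * 8  # per-frame nonce; must never repeat for the same key
frame = b"sensor reading: 23.4C"
ct = xor_stream(key, nonce, frame)
tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()[:8]  # integrity layer

assert xor_stream(key, nonce, ct) == frame  # decryption is the same operation
print("ciphertext:", ct.hex(), "tag:", tag.hex())
```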

Keywords: hybrid MAC protocol, data integrity, lightweight encryption, neighbor-based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 144