Search results for: continuous data
24964 The SEMONT Monitoring and Risk Assessment of Environmental EMF Pollution
Authors: Dragan Kljajic, Nikola Djuric, Karolina Kasas-Lazetic, Danka Antic
Abstract:
Wireless communications have expanded rapidly in recent decades. This technology relies on an extensive network of base stations and antennas, using radio frequency signals to transmit information. Devices that use wireless communication, while offering various services, essentially act as sources of non-ionizing electromagnetic fields (EMF). Such devices are permanently present in the human vicinity and radiate almost constantly, causing EMF pollution of the environment. This fact has prompted the development of modern systems for observing EMF pollution, as well as for risk assessment. This paper presents the Serbian electromagnetic field monitoring network – SEMONT, designed for automated, remote and continuous broadband monitoring of EMF in the environment. Measurement results of the SEMONT monitoring at one of the test locations, within the main campus of the University of Novi Sad, are presented and discussed, along with the corresponding exposure assessment of the general population with respect to Serbian legislation.
Keywords: EMF monitoring, exposure assessment, sensor nodes, wireless network
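As a hedged illustration of the exposure-assessment step, the snippet below compares broadband readings against per-band reference levels and combines the bands into the usual ICNIRP-style exposure quotient, a sum of squared field ratios; the band names, readings, and limits are invented, not Serbian regulatory values.

```python
# Illustrative multi-band exposure check; all figures are invented.
measured = {"GSM 900 MHz": 1.8, "UMTS 2.1 GHz": 1.1, "WiFi 2.4 GHz": 0.6}      # E field, V/m
reference = {"GSM 900 MHz": 16.5, "UMTS 2.1 GHz": 24.4, "WiFi 2.4 GHz": 24.4}  # assumed limits, V/m

# Exposure quotient: the sum of (E_i / L_i)^2 over bands must stay below 1
ratio = sum((e / reference[band]) ** 2 for band, e in measured.items())
print(f"total exposure ratio: {ratio:.4f}")
print("compliant" if ratio <= 1.0 else "exceeds reference levels")
```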
Procedia PDF Downloads 264
24963 Optimizing Microgrid Operations: A Framework of Adaptive Model Predictive Control
Authors: Ruben Lopez-Rodriguez
Abstract:
In a microgrid, diverse energy sources (both renewable and non-renewable) are combined with energy storage units to form a localized power system. Microgrids function as independent entities, capable of meeting the energy needs of specific areas or communities. This paper introduces a Model Predictive Control (MPC) approach tailored for grid-connected microgrids, aiming to optimize their operation. The formulation employs Mixed-Integer Programming (MIP) to find optimal trajectories. This entails the fulfillment of continuous and binary constraints while accounting for commutations between various operating conditions, such as storage unit charge/discharge, import from/export to the main grid, and asset connection/disconnection. To validate the proposed approach, a microgrid case study is conducted, and the simulation results are compared with those obtained using a rule-based strategy.
Keywords: microgrids, mixed logical dynamical systems, mixed-integer optimization, model predictive control
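A minimal sketch of what such a mixed-integer formulation can look like for a single period, written in PuLP; the loads, tariffs, and equipment limits are invented assumptions, not the paper's case study.

```python
import pulp

m = pulp.LpProblem("microgrid_dispatch", pulp.LpMinimize)

# Continuous decisions (kW); bounds are assumed equipment limits
grid_import = pulp.LpVariable("grid_import", 0, 100)
grid_export = pulp.LpVariable("grid_export", 0, 100)
charge = pulp.LpVariable("charge", 0, 50)
discharge = pulp.LpVariable("discharge", 0, 50)

# Binary commutation variables: storage mode and grid direction
is_charging = pulp.LpVariable("is_charging", cat="Binary")
is_importing = pulp.LpVariable("is_importing", cat="Binary")

load, pv = 120.0, 30.0               # assumed demand and renewable output (kW)
price_buy, price_sell = 0.20, 0.08   # assumed tariffs

# Objective: net cost of trading energy with the main grid
m += price_buy * grid_import - price_sell * grid_export

# Power balance: supply equals demand
m += pv + grid_import + discharge == load + grid_export + charge

# Big-M exclusivity: charge XOR discharge, import XOR export
m += charge <= 50 * is_charging
m += discharge <= 50 * (1 - is_charging)
m += grid_import <= 100 * is_importing
m += grid_export <= 100 * (1 - is_importing)

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[m.status], grid_import.value(), discharge.value())
```

In a receding-horizon MPC, a multi-period version of this problem would be rebuilt with updated forecasts and re-solved at every control step, applying only the first move.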
Procedia PDF Downloads 53
24962 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency
Authors: Rania Alshikhe, Vinita Jindal
Abstract:
Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis equipped with global positioning system (GPS) devices, can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper explores big trajectory data to measure the travel efficiency of road networks across an entire city using the proposed statistical travel efficiency measure (STEM). Further, it identifies the causes of low travel efficiency with the proposed least-squares approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.
Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE
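The STEM statistic itself is not specified in the abstract, so the hypothetical sketch below uses a simple proxy: mean free-flow travel time divided by mean observed travel time per road segment. The trip table is invented.

```python
import pandas as pd

trips = pd.DataFrame({
    "segment_id": [1, 1, 2, 2, 2],
    "travel_time_s": [65, 80, 140, 150, 200],     # observed from GPS trajectories
    "free_flow_time_s": [50, 50, 120, 120, 120],  # from the digital map
})

agg = trips.groupby("segment_id").agg(observed=("travel_time_s", "mean"),
                                      free_flow=("free_flow_time_s", "mean"))
agg["efficiency"] = agg["free_flow"] / agg["observed"]  # 1.0 = free-flowing
print(agg)
print("segment to optimize first:", agg["efficiency"].idxmin())
```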
Procedia PDF Downloads 157
24961 Design of Collaborative Web System: Based on Case Study of PBL Support Systems
Authors: Kawai Nobuaki
Abstract:
This paper describes the design and implementation of a web system for continuous and viable collaboration, and proposes improvements to the system based on the results of practical use. As contemporary higher-education information environments transform, this study highlights the significance of university identity and college identity, which are formed continuously through the independent activities of students. Based on these discussions, the present study proposes a practical media environment design that facilitates the processes of organizational identity formation based on a continuous and cyclical model. Even as users change, the system sustains communication and cooperation: each activity becomes an archive and produces new activity. Based on the results, this study elaborates a redesign plan for the system from the viewpoint of second-order cybernetics. Systems theory is the theoretical foundation of the study.
Keywords: collaborative work, learning management system, second-order cybernetics, systems theory, user generated contents, viable system model
Procedia PDF Downloads 211
24960 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain
Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel
Abstract:
The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. On the other hand, supply chain risk management is crucial for practitioners because risk can potentially disrupt supply chain operations. Predicting and addressing risk caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of this research is to explore the application of big data and predictive analytics in successfully mitigating supply chain social risk, and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and survey. Later, we use a case study to illustrate the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that a company can predict various social issues through big data and predictive analytics and mitigate the social risk. We also discuss the implications of this research for the body of knowledge and practice.
Keywords: big data, sustainability, supply chain social sustainability, social risk, case study
Procedia PDF Downloads 408
24959 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as most researchers accept, there are outputs produced by a DMU in one period to be used as inputs in a future period. Those outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of those inputs, outputs or intermediates, assuming that the distribution of their virtual value does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs
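To make the piecewise idea concrete, the sketch below contrasts a single global line with a piecewise-linear representation of an input trajectory; the breakpoints and data are invented, not the paper's model.

```python
import numpy as np

t = np.arange(12)                                            # periods
x = np.array([3, 4, 6, 9, 13, 14, 14, 13, 11, 8, 6, 5.0])   # observed input

# One global line (the classical linearity assumption) vs. three linear pieces
global_fit = np.polyval(np.polyfit(t, x, 1), t)
knots = [0, 4, 7, 11]                                        # assumed breakpoints
piecewise_fit = np.interp(t, knots, x[knots])

for label, fit in [("single line", global_fit), ("piecewise", piecewise_fit)]:
    print(f"{label}: max abs error = {np.abs(fit - x).max():.2f}")
```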
Procedia PDF Downloads 161
24958 The Importance of the Fluctuation in Blood Sugar and Blood Pressure of Insulin-Dependent Diabetic Patients with Chronic Kidney Disease
Authors: Hitoshi Minakuchi, Izumi Takei, Shu Wakino, Koichi Hayashi, Hiroshi Itoh
Abstract:
Objectives: Among type 2 diabetic patients with CKD (chronic kidney disease), insulin resistance, impaired renal gluconeogenesis and reduced degradation of insulin are recognized, and we observed different fluctuation patterns of blood sugar between CKD patients and non-CKD patients. On the other hand, a non-dipper pattern of blood pressure change is a risk factor for organ damage and mortality. We performed a cross-sectional study to elucidate the characteristics of the fluctuation of blood glucose and blood pressure in insulin-treated diabetic patients with chronic kidney disease. Methods: From March 2011 to April 2013, at the Ichikawa General Hospital of Tokyo Dental College, we recruited 20 outpatients. All participants were insulin-treated type 2 diabetics with CKD. We collected serum and urine samples for several hormone measurements, performed CGMS (continuous glucose measurement system), ABPM (ambulatory blood pressure monitoring), brain computed tomography, carotid artery thickness, ankle brachial index, PWV and CVR-R, and analyzed these data statistically. Results: Among all 20 participants, hypoglycemia, defined as blood glucose below 70 mg/dl on CGMS, was detected in 9 participants (45.0%). Participants with hypoglycemia showed lower eGFR (29.8±6.2 ml/min vs. 41.3±8.5 ml/min, P<0.05), lower HbA1c (6.44±0.57% vs. 7.53±0.49%), higher PWV (1858±97.3 cm/s vs. 1665±109.2 cm/s), higher serum glucagon (194.2±34.8 pg/ml vs. 117.0±37.1 pg/ml), higher urinary free cortisol (53.8±12.8 μg/day vs. 34.8±7.1 μg/day), and higher urinary metanephrine (0.162±0.031 mg/day vs. 0.076±0.029 mg/day). A non-dipper pattern of blood pressure change on ABPM was detected in 8 of the 9 participants with hypoglycemia (88.9%) and in 4 of the 11 participants without hypoglycemia (36.4%). Multiple logistic-regression analysis revealed that the event of hypoglycemia is an independent factor for the non-dipper pattern of blood pressure change. Conclusions: Among insulin-treated type 2 diabetic patients with CKD, hypoglycemic events were frequently detected and may be associated with organ derangements mediated by the non-dipper pattern of blood pressure change.
Keywords: chronic kidney disease, hypoglycemia, non-dipper type blood pressure change, diabetic patients
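A toy sketch of the two measurements discussed above, with invented readings: hypoglycemia is flagged at the study's < 70 mg/dl threshold, and dipping status is classified assuming the common definition of a non-dipper as a nocturnal systolic fall below 10%.

```python
day_sbp = [138, 142, 135, 140]       # daytime systolic readings (mmHg)
night_sbp = [132, 130, 134]          # nighttime systolic readings (mmHg)
glucose = [110, 95, 68, 72, 64, 90]  # CGMS trace (mg/dl)

hypo_events = [g for g in glucose if g < 70]   # study threshold
day_mean = sum(day_sbp) / len(day_sbp)
night_mean = sum(night_sbp) / len(night_sbp)
nocturnal_fall = (day_mean - night_mean) / day_mean

print("hypoglycemia readings:", hypo_events)
print("nocturnal SBP fall: %.1f%%" % (100 * nocturnal_fall))
print("non-dipper" if nocturnal_fall < 0.10 else "dipper")
```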
Procedia PDF Downloads 415
24957 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects
Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim
Abstract:
Large-scale construction projects are continuously increasing, and the need for tools to monitor and evaluate project success is growing. At the construction-industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and in systems that can objectively evaluate and manage performance. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and deriving improvements. In this study, we propose Key Performance Indicators (KPIs) that enable performance evaluation reflecting the increased diversity of construction sites and the unstructured data generated, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed based on 6 areas (Time, Cost, Quality, Safety, Environment, Productivity) and 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and have been successfully completed, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded into a performance evaluation system considering the scale and type of construction project. The indicators are also expected to be used as a comprehensive measure for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation
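As a purely hypothetical illustration of rolling the six areas up into one site score, the snippet below applies a weighted average; the normalized area scores and weights are invented, not the study's calibrated standards.

```python
# Invented normalized area scores (0-1) and area weights for one site
areas = {"Time": 0.78, "Cost": 0.85, "Quality": 0.92,
         "Safety": 0.88, "Environment": 0.70, "Productivity": 0.81}
weights = {"Time": 0.20, "Cost": 0.20, "Quality": 0.15,
           "Safety": 0.20, "Environment": 0.10, "Productivity": 0.15}

composite = sum(areas[a] * weights[a] for a in areas)
print("site composite KPI: %.3f" % composite)
print("weakest area:", min(areas, key=areas.get))
```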
Procedia PDF Downloads 42
24956 Comprehensive Literature Review of the Humanistic Burden of Clostridium (Clostridioides) difficile Infection
Authors: Caroline Seo, Jennifer Stephens, Kirstin H. Heinrich
Abstract:
Background: Clostridioides (formerly Clostridium) difficile is an anaerobic, spore-forming bacterium; infection (CDI) has manifestations including diarrhea, pseudomembranous colitis and toxic megacolon. Despite a general understanding that CDI may be associated with a marked burden on patients' health, there has been limited information available on the humanistic burden of CDI. The objective of this literature review was to summarize the published data on the humanistic burden of CDI globally, in order to better inform future research efforts and increase awareness of the patient perspective in this disease. Methods: A comprehensive literature review of the past 15 years (2002-2017) was conducted using MEDLINE, Embase and the Cumulative Index of Nursing and Allied Health Literature. Additional searches were conducted in conference proceedings (2015-2017). Articles selected were studies specifically designed to examine the humanistic burden of illness associated with adult patients with CDI. Results: Of 3,325 articles or abstracts identified, 33 remained after screening and full-text review. Sixty percent (60%) were published in 2016 or 2017. Data from the United States or Western Europe were most common; data from Brazil, Canada, China and Spain also exist. Thirteen (13) studies used validated patient-reported outcome instruments, mostly the EQ-5D utility and SF-36 generic instruments. Three (3) studies used CDI-specific instruments (CDiff32, CDI-DaySyms). The burden of CDI impacts patients in multiple health-related quality of life (HRQOL) domains. The SF-36 domains with the largest decrements compared to other GI diarrheal diseases (IBS-D and Crohn's) were role physical, physical functioning, vitality, social functioning, and role emotional. Reported EQ-5D utilities for CDI ranged from 0.35-0.42, compared to 0.65 in Crohn's and 0.72 in IBS-D. The majority of papers addressed physical functioning and mental health domains (67% for both). Across various studies, patients reported weakness, lack of appetite, sleep disturbance, functional dependence, and decreased activities of daily living due to the continuous diarrhea. Due to lack of control over this infection, CDI also impacts the psychological and emotional quality of life of patients, who reported feelings of fear, anxiety, frustration, depression, and embarrassment. Additionally, the type of disease (primary vs. recurrent) may impact mental health: one study indicated a decrement in SF-36 mental scores in patients with recurrent CDI compared to patients with primary CDI. Other domains highlighted by these studies include pain (27%), social isolation (27%), vitality and fatigue (24%), self-care (9%), and caregiver burden (0%). Two studies addressed work productivity, with one reporting that CDI patients had the highest work productivity and activity impairment scores among the gastrointestinal diseases. No study specifically included caregiver self-report; however, 3 studies did mention patients' worry about how their diagnosis of CDI would impact family, caregivers, and/or friends. Conclusions: Despite CDI being a serious public health issue, there has been a paucity of research on HRQOL among those with CDI. While progress is being made, gaps exist in understanding the burden on patients, caregivers, and families. Future research is warranted to aid understanding of the CDI patient perspective.
Keywords: burden, Clostridioides, difficile, humanistic, infection
Procedia PDF Downloads 136
24955 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data
Authors: N. Borjalilu, P. Rabiei, A. Enjoo
Abstract:
A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess and address operational safety risks, in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator's Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data, focused on different event categories, based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the qualitative safety analysis of flight data it provides. This research applies the opinions of aviation experts, gathered through questionnaires, to flight data in four categories of occurrence that can take place during an accident or an incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each one (by F-TOPSIS) and applying it to the number of risks of the event, the safety risk of each related event can be obtained.
Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety
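A crisp TOPSIS pass over the four occurrence categories is sketched below; the fuzzy variant replaces the crisp scores with fuzzy numbers before computing distances. The criteria, scores, and weights are invented stand-ins for the experts' questionnaire responses.

```python
import numpy as np

categories = ["RE", "CFIT", "MAC", "LOC-I"]
# rows: categories; columns: assumed criteria (severity, likelihood, detectability)
X = np.array([[7.0, 6.0, 4.0],
              [9.0, 3.0, 6.0],
              [9.0, 2.0, 7.0],
              [8.0, 4.0, 5.0]])
w = np.array([0.5, 0.3, 0.2])                 # assumed criterion weights
cost = np.array([False, False, True])         # detectability treated as a cost criterion

V = w * X / np.linalg.norm(X, axis=0)         # vector-normalized, weighted matrix
ideal = np.where(cost, V.min(0), V.max(0))    # positive ideal solution
worst = np.where(cost, V.max(0), V.min(0))    # negative ideal solution
d_best = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)      # rank by closeness to the ideal

for c, s in sorted(zip(categories, closeness), key=lambda p: -p[1]):
    print(f"{c}: {s:.3f}")
```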
Procedia PDF Downloads 168
24954 Preservation of Near-Extinct African Culture: The Case of Yoruba Proverbs
Authors: Makinde David Olajide
Abstract:
Proverbs are an important aspect of most indigenous cultures in Africa, including that of the Yoruba people of southwestern Nigeria. As revealed by recent studies, Yoruba proverbs, an important cultural heritage, are threatened and near extinction. This fear of proverb extinction in Yoruba cultural growth has been observed and expressed at different fora by many researchers and professionals, including art historians, culture patrons, social critics and teachers, among others. Investigation revealed that the intangible nature of the proverb is largely responsible for its continuous disappearance from the language structure and creative speech that give a unique identity to the Yoruba people. Some of the factors responsible for this cultural extinction include the absence of moonlight stories by the elderly, the nuclear family system, total assimilation of western culture, the concept of modernity, and the urban nature of Yoruba towns, among others. Therefore, to preserve this creative heritage (the proverb), there is a need for a conscious shift from the traditional role of proverbs in speech development to their use as a tool for artistic creations and expressions in visual form. The study was carried out between June 2013 and February 2015 in three Yoruba towns: Ilorin, Ede and Ogbomoso, selected from Kwara, Osun and Oyo states respectively. The data used in this study were collected through oral and structured interviews; fifteen interviewees were purposively selected in each of the study areas. The study also employed electronic and printed media to generate relevant literature on the subject matter. The study revealed that many Yoruba proverbs are preserved or hidden in textbooks, monographs, home videos, films and pastoral messages. However, this has not stopped the problems of lack of understanding of their usage and meaning, or of the reasons for their extinction, which may hinder their preservation for incoming generations. This study concludes that indigenous culture can be revived and preserved for future generations when there is a conscious attempt to integrate or convert its traditional roles for present-day realities and relevance to our social and educational needs.
Keywords: culture, assimilation, extinct, heritage, preservation
Procedia PDF Downloads 333
24953 From Modeling of Data Structures towards Automatic Programs Generating
Authors: Valentin P. Velikov
Abstract:
Automatic program generation saves time and human resources, and yields syntactically clean and logically correct modules. Fourth-generation programming languages are concerned with modeling the data and processes of the subject area, as well as with obtaining a frame of the respective information system. The application can be separated into interface and business logic; that means interactive generation of the needed system requires either using an already existing toolkit or creating a new one.
Keywords: computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling
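As a toy illustration of generating a module from a data model, the snippet below emits a Python class from a dictionary description; the entity definition and the template are invented for this sketch, not the paper's generator.

```python
# A hypothetical data-model entity; a real generator would read this from a model file
entity = {"name": "Customer", "fields": {"ident": "int", "name": "str", "email": "str"}}

lines = [f"class {entity['name']}:"]
params = ", ".join(f"{f}: {t}" for f, t in entity["fields"].items())
lines.append(f"    def __init__(self, {params}):")
lines += [f"        self.{f} = {f}" for f in entity["fields"]]
source = "\n".join(lines)

print(source)   # syntactically valid Python generated from the model
exec(source)    # at script top level, the generated class is immediately usable
print(Customer(1, "Ada", "ada@example.com").name)
```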
Procedia PDF Downloads 306
24952 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features
Authors: Bo Wang
Abstract:
The geometric processing of multi-source remote sensing data using control data of different scales and accuracies is an important research direction for multi-platform earth observation systems. In existing block bundle adjustment methods, the approach of using a single observation scale and precision for the controlling information in the adjustment system is unable to screen the control information or to assign reasonable and effective corresponding weights, which reduces the convergence and reliability of the adjustment results. Referring to the relevant theory and technology of quotient space, several subjects are researched in this project. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and, based on the normalized scale factor, a strategy is realized to optimize the weight selection of control data that is less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiclass control data to verify the theoretical research results. This research is expected to break through the limitation of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry. Thus the problem of processing multi-source remote sensing data will be solved both theoretically and practically.
Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection
Procedia PDF Downloads 284
24951 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector
Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar
Abstract:
Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHR) in local databases causes them to be fragmented, a problem that is aggravated as patients visit multiple healthcare providers in their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to be able to increase the interoperability of health data by implementing digital access rules, enabling uniform patient identity, and providing data aggregation. Consortium blockchains were found to offer high read throughput, greater trustworthiness, better security against external disruption, and transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations such as hospitals and medical bureaus. Patients serve as the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients' generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake
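The on-/off-chain split in the last sentence can be sketched in a few lines; the ledger list and storage dictionary below are stand-ins for the consortium chain and the distributed cloud storage, not a real blockchain or cloud API, and the record would be encrypted before upload.

```python
import hashlib, json

ledger = []          # stand-in for the consortium chain
cloud_storage = {}   # stand-in for distributed cloud storage

record = json.dumps({"patient": "P-1001", "visit": "2023-05-02",
                     "note": "routine checkup"}).encode()
digest = hashlib.sha256(record).hexdigest()

cloud_storage[digest] = record                   # encrypted payload would go here
ledger.append({"patient": "P-1001", "hash": digest})

# Verification by an authorized reader: recompute and compare the hash
fetched = cloud_storage[ledger[-1]["hash"]]
assert hashlib.sha256(fetched).hexdigest() == ledger[-1]["hash"]
print("record integrity verified against the on-chain hash")
```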
Procedia PDF Downloads 138
24950 Finding the Free Stream Velocity Using Flow Generated Sound
Authors: Saeed Hosseini, Ali Reza Tahavvor
Abstract:
Sound processing is one of the subjects that has newly attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, flow-generated sound is used to estimate the speed of free flows. Many sound samples were gathered. After analyzing the data, a parameter named wave power was chosen. For all samples, the wave power was calculated and averaged for each flow speed. A curve was fitted to the averaged data, and a correlation between wave power and flow speed was found. Test data were used to validate the method, and errors for all test data were under 10 percent. The speed of a flow can thus be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation.
Keywords: the flow generated sound, free stream, sound processing, speed, wave power
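A synthetic end-to-end sketch of that pipeline follows; mean squared amplitude stands in for the paper's wave power (its exact definition is not given here), and the microphone samples are simulated with amplitude proportional to flow speed, which is an assumption of the toy model.

```python
import numpy as np

rng = np.random.default_rng(0)
speeds = np.array([2.0, 4.0, 6.0, 8.0, 10.0])      # calibration flow speeds (m/s)

def wave_power(v):
    samples = 0.1 * v * rng.standard_normal(8000)  # toy mic signal, amplitude ~ speed
    return np.mean(samples ** 2)                   # stand-in "wave power"

power = np.array([wave_power(v) for v in speeds])

# In this toy the RMS amplitude is linear in speed, so fit speed vs. sqrt(power)
coeff = np.polyfit(np.sqrt(power), speeds, 1)

test = wave_power(5.0)                             # unseen flow
print("estimated speed: %.2f m/s" % np.polyval(coeff, np.sqrt(test)))
```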
Procedia PDF Downloads 415
24949 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves
Authors: Shengnan Chen, Shuhua Wang
Abstract:
Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority of society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data and production data for thousands of wells is first established to discover the valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery and greater economic return from future wells in unconventional oil reserves.
Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves
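A compressed stand-in for the PCA-plus-clustering step on a toy well table; the feature list and the distributions below are invented for the sketch, not the study's database.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# assumed columns: porosity, net pay (m), frac stages, proppant (t), 12-mo oil (10^3 m3)
wells = rng.normal([0.08, 12, 20, 2500, 18], [0.02, 3, 5, 600, 6], size=(200, 5))

X = StandardScaler().fit_transform(wells)
scores = PCA(n_components=2).fit_transform(X)     # strongest variation axes
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} wells, "
          f"mean 12-mo oil = {wells[labels == k, 4].mean():.1f}")
```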
Procedia PDF Downloads 283
24948 Efficiency of DMUs in Presence of New Inputs and Outputs in DEA
Authors: Esmat Noroozi, Elahe Sarfi, Farha Hosseinzadeh Lotfi
Abstract:
Examining the impact of data modifications is known as sensitivity analysis, and many studies have considered the modification of input and output data in DEA. An issue that has not heretofore been considered in DEA sensitivity analysis is modification of the number of inputs and/or outputs, and the determination of the impact of this modification on the efficiency status of DMUs. This paper presents systems that show the impact of adding one or multiple inputs or outputs on the efficiency status of DMUs. Furthermore, a model is presented for recognizing the minimum number of inputs and/or outputs, from among specified inputs and outputs, which can be added such that an inefficient DMU becomes efficient. Finally, the presented systems and model are applied to a set of real data and the results are reported.
Keywords: data envelopment analysis, efficiency, sensitivity analysis, input, output
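For reference, the baseline efficiency status discussed above comes from a standard DEA model; a minimal input-oriented CCR envelopment program in PuLP is sketched below with an invented two-input/one-output dataset.

```python
import pulp

inputs = [[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [5.0, 2.0]]   # x[j] for four DMUs
outputs = [[1.0], [1.0], [1.0], [1.5]]                      # y[j]
o = 2  # DMU under evaluation

theta = pulp.LpVariable("theta", lowBound=0)
lam = [pulp.LpVariable(f"lam_{j}", lowBound=0) for j in range(4)]

m = pulp.LpProblem("ccr_input_oriented", pulp.LpMinimize)
m += theta  # minimize the radial input contraction factor
for i in range(2):   # the composite unit must use no more than theta * x_o
    m += pulp.lpSum(lam[j] * inputs[j][i] for j in range(4)) <= theta * inputs[o][i]
for r in range(1):   # and must produce at least y_o
    m += pulp.lpSum(lam[j] * outputs[j][r] for j in range(4)) >= outputs[o][r]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("efficiency of DMU", o, "=", round(theta.value(), 3))  # < 1 means inefficient
```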
Procedia PDF Downloads 450
24947 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The projected model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance their quality through alleviation of missing data, noisy data and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with the K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation) and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal ones are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The ensemble uses deep learning models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
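The imbalance-handling phase can be sketched as follows; plain SMOTE is used for brevity (imbalanced-learn also ships a KMeansSMOTE variant closer to the paper's method), the transaction data are synthetic, and a logistic regression stands in for the LSTM/CNN/QDNN ensemble.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))                  # synthetic transaction features
y = (rng.random(2000) < 0.03).astype(int)       # ~3% fraud: heavily imbalanced
X[y == 1] += 1.2                                # give the fraud class some signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
print("class counts before:", np.bincount(y_tr), "after:", np.bincount(y_bal))

clf = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
print("test accuracy:", round(clf.score(X_te, y_te), 3))
```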
Procedia PDF Downloads 131
24946 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as being prevalent against web applications. They affect network security and user data, which leads to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning, with a method that classifies login inputs into "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of the decision on whether an operation is an attack or a valid operation. A method of web-app auto-generated twin data structure replication for shielding against SQLi attacks (WebAppShield) has been developed; it verifies all users and admits only the inputs that the machine learning module predicts as "Non-SQLi", preventing attackers from entering or accessing the database. A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated: up to 99% of SQLi attacks have been prevented.
Keywords: SQL injection, attacks, web application, accuracy, database
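A toy version of the classification step: character n-gram features over raw input strings, with a linear model labeling each login input "SQLi" or "Non-SQLi". The training strings are invented examples, not the paper's dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train = ["alice", "bob42", "' OR '1'='1", "admin'--",
         "carol_smith", "1; DROP TABLE users", "dave", "' UNION SELECT null--"]
labels = ["Non-SQLi", "Non-SQLi", "SQLi", "SQLi",
          "Non-SQLi", "SQLi", "Non-SQLi", "SQLi"]

# Character n-grams capture the punctuation patterns typical of injections
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),
    LogisticRegression(max_iter=1000),
).fit(train, labels)

for s in ["eve99", "' OR 1=1 --"]:
    print(repr(s), "->", model.predict([s])[0])
```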
Procedia PDF Downloads 151
24945 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.
Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications
Procedia PDF Downloads 93
24944 Regression for Doubly Inflated Multivariate Poisson Distributions
Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta
Abstract:
Dependent multivariate count data occur in several research studies. These data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, when the number of observations in some cells is much larger than in other cells, the copula-based multivariate Poisson (or negative binomial) distribution may not fit well and is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher compared to the other cells, and develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for the doubly inflated multivariate count data. To illustrate the proposed methodologies, we present real data containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation of the unknown parameters of the models.
Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions
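A simulation sketch of the double-inflation idea follows: two chosen cells receive extra probability mass on top of a base bivariate count model. For brevity the base draws are independent Poissons rather than the paper's Gaussian-copula construction, and the inflated cells and probabilities are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p1, p2 = 10_000, 0.15, 0.10       # assumed inflation probabilities
cell1, cell2 = (0, 0), (2, 1)        # the two assumed inflated cells

u = rng.random(n)
base = rng.poisson([1.5, 1.0], size=(n, 2))          # base bivariate counts
counts = np.where(u[:, None] < p1, cell1,
                  np.where(u[:, None] < p1 + p2, cell2, base))

# Observed cell frequencies combine the inflation mass and the base mass
for cell in (cell1, cell2):
    freq = np.mean((counts == cell).all(axis=1))
    print(f"observed frequency of {cell}: {freq:.3f}")
```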
Procedia PDF Downloads 156
24943 Psychological Intervention for Partners Post-Stroke: A Case Study
Authors: Natasha Yasmin Felles, Gerard Riley
Abstract:
Background and Aims: Relationship breakdown is common when one partner lives with an acquired brain injury caused by an event such as a stroke. Research has found that perceived relationship satisfaction decreases among non-injured partners following such an injury. Non-injured partners are also found to experience caregiver stress/burden, as they immediately have to take on the role of caregiver alongside being a partner of the injured person. Research has also found that the perception of a continuous relationship, i.e., the perception that the relationship is essentially the same as it was before the injury, changes among those caregiving partners. However, there is a lack of available intervention strategies that can help those partners with both individual and relationship difficulties. The aim of this case study was to pilot an intervention exploring whether it is possible to support a partner to experience greater continuity within the relationship post-stroke, and what benefits such a change might have. Method: A couple, in which one partner experienced an acquired brain injury post-stroke, was provided with Integrated Behavioural Couples Therapy for 3 months. The intervention addressed goals identified as necessary by the couple and by the formulation of their individual and relationship difficulties, alongside the goal of promoting relationship continuity. Before and after measures were taken using a battery of six questionnaires to evaluate changes in perceptions of continuity, stress, and other aspects of the relationship. Results: Both quantitative and qualitative data showed that relationship continuity improved after the therapy, as did the measures of stress and other aspects of the relationship. The stress felt by the person with the acquired brain injury also showed some evidence of improvement. Conclusion: The study found that perceptions of relationship continuity can be improved by therapy, and that improving these might have a beneficial impact on the stress felt by the carer, their satisfaction with the relationship, and overall levels of conflict and closeness within the relationship. The study suggested the value of further research on enhancing perceptions of continuity in the relationship after an acquired brain injury. The findings have since been used to develop a pilot feasibility study to collect substantive evidence on the impact of the intervention on couples and to assess its feasibility and acceptability, which will help in further developing a generalized relationship continuity intervention to prevent relationship breakdown in the future.
Keywords: acquired brain injury, couples therapy, relationship continuity, stroke
Procedia PDF Downloads 124
24942 A Case Study of Meningoencephalitis following Le Fort I Osteotomy
Authors: Ryan Goh, Nicholas Beech
Abstract:
Introduction: Although Le Fort I osteotomies are common procedures in oral and maxillofacial surgery, they carry a degree of risk of unfavourable propagation of the down-fracture of the maxilla. This may be the first reported case in the literature of meningoencephalitis occurring following a Le Fort I osteotomy. Case: A 32-year-old female was brought into the Emergency Department four days after a Le Fort I osteotomy, with a Glasgow Coma Scale (GCS) score of 8 (E3V1M4). A computed tomography (CT) head scan showed a skull base fracture at the right sphenoid sinus. A lumbar puncture was completed, and Klebsiella oxytoca was found in the cerebrospinal fluid (CSF). She was treated with meropenem and rapidly improved thereafter. CSF rhinorrhoea was identified when she was extubated, and was successfully managed via a continuous lumbar drain. She was discharged on day 14 without any neurological deficits. Conclusion: The step of the Le Fort I osteotomy most likely to produce a skull base fracture is the pterygomaxillary disjunction. Care should always be taken to avoid the significant risks of skull base fracture, CSF rhinorrhoea, meningitis and encephalitis.
Keywords: meningitis, orthognathic surgery, post-operative complication, skull base, rhinorrhea
Procedia PDF Downloads 125
24941 An Exploratory Research of Human Character Analysis Based on Smart Watch Data: Distinguish the Drinking State from Normal State
Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing
Abstract:
Smart watches, as handy devices with rich functionality, have become one of the most popular wearable devices all over the world. Among their various functions, the most basic is health monitoring. The monitoring data can serve as effective evidence or a clue in the detection of crime cases. For instance, step-count data can help to determine whether the watch wearer was stationary or moving during a given time period. There is, however, still little research on the analysis of human character based on these data. The purpose of this research is to analyze health monitoring data to distinguish the drinking state from the normal state. The analysis result may play a role in cases involving drinking, such as drunk driving. The experiment mainly focused on finding the health monitoring variables that change with drinking and quantifying the extent of that change. The chosen subjects were mostly in their 20s, each of whom had been wearing the same smart watch for a week. Each subject drank several times during the week and noted down the beginning and end time points of each drinking session. The researcher then extracted and analyzed the health monitoring data from the watch. According to the descriptive statistical analysis, the heart rate changes when drinking: the average heart rate is about 10% higher than normal, and the coefficient of variation is less than about 30% of that in the normal state. Though more research needs to be carried out, this experiment and analysis provide a direction for the application of data from smart watches.
Keywords: character analysis, descriptive statistics analysis, drink state, heart rate, smart watch
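The comparison just described reduces to two summary statistics; the sketch below computes them on invented heart-rate windows, not the study's recordings.

```python
import numpy as np

normal = np.array([68, 72, 70, 75, 66, 71, 69, 74], dtype=float)    # bpm samples
drinking = np.array([78, 81, 76, 80, 79, 77, 82, 80], dtype=float)  # bpm samples

for label, hr in [("normal", normal), ("drinking", drinking)]:
    cv = hr.std(ddof=1) / hr.mean()   # coefficient of variation
    print(f"{label}: mean={hr.mean():.1f} bpm, CV={cv:.3f}")
# The study reports a ~10% higher mean and a markedly lower CV when drinking.
```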
Procedia PDF Downloads 167
24940 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data
Authors: Necati Içer
Abstract:
Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage on crop production. Predicting insurance premium rates from short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach for establishing equitable premium rates in crop-hail insurance for nations with short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high and zero loss costs and to enhance their credibility. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the ultimate premium rates for insurance contracts.
Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters
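The paper's exact smoothing is not given here; as an assumed illustration of how the credibility of high- and zero-loss villages can be enhanced, the sketch below applies a standard actuarial credibility blend that pulls each village's observed loss cost toward a regional mean. All figures are invented.

```python
def credibility_premium(village_loss_costs, regional_mean, full_cred_years=30):
    # Classical square-root credibility rule: weight grows with years of data
    n = len(village_loss_costs)
    z = min(1.0, (n / full_cred_years) ** 0.5)
    village_mean = sum(village_loss_costs) / n if n else regional_mean
    return z * village_mean + (1 - z) * regional_mean

# A high-loss village and a zero-loss village, each with 5 years of data
print(credibility_premium([0.0, 0.0, 0.12, 0.0, 0.31], regional_mean=0.06))
print(credibility_premium([0.0] * 5, regional_mean=0.06))  # not priced at zero
```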
Procedia PDF Downloads 55
24939 A Multilevel-Synthesis Approach with Reduced Number of Switches for 99-Level Inverter
Authors: P. Satish Kumar, V. Ramu, K. Ramakrishna
Abstract:
In this paper, an efficient multilevel waveform synthesis technique is proposed and applied to a 99-level inverter. The basic principle of the proposed scheme is that continuous output voltage levels can be synthesized by the addition or subtraction of the instantaneous voltages generated at different voltage levels. This synthesis technique can be realized by an array of switching devices composed of full-bridge inverter modules with proper mixing by bi-directional switch modules. The most distinctive aspect, compared to the conventional approach, in the synthesis of the multilevel output waveform is the utilization of a combination of bidirectional switches and full-bridge inverter modules with a reduced number of components. The 99-level inverter consists of three full-bridge modules and six bi-directional switch modules. The validity of the proposed scheme is verified by simulation.
Keywords: cascaded connection, multilevel inverter, synthesis, total harmonic distortion
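To see the add/subtract principle at work, the snippet below enumerates the distinct levels reachable when each full-bridge module contributes -V, 0, or +V of its DC source. The 1:3:9 source ratio is an illustrative assumption: three cascaded modules alone give 27 levels, and the paper's 99 levels additionally rely on the bidirectional switch mixing.

```python
from itertools import product

sources = [1, 3, 9]  # assumed per-module DC voltages (p.u.)

# Each module's full bridge selects -1, 0, or +1 of its source voltage
levels = sorted({sum(s * state for s, state in zip(sources, combo))
                 for combo in product((-1, 0, 1), repeat=len(sources))})
print(len(levels), "distinct levels:", levels)
```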
Procedia PDF Downloads 532
24938 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa
Authors: Samy A. Khalil, U. Ali Rahoma
Abstract:
Measurements of solar radiation from satellite data have been routinely utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observational data from a variety of sources, including ground stations, satellites, ships, and aircraft. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface measurements using statistical analysis. The distribution of global solar radiation (GSR) over five chosen areas in North Africa was estimated over the ten-year period from 2011 to 2020. To investigate seasonal changes in dataset performance, a seasonal statistical analysis was conducted, which showed considerable differences in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. The average values of the mean bias error (MBE), root mean square error (RMSE) and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R²) varies from 0.93 to 0.99 over the study period. The objective of this research is to provide a reliable representation of the world's solar radiation to aid the use of solar energy in all sectors.
Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa
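The four validation statistics quoted above are straightforward to compute; the sketch below evaluates them on invented daily GSR pairs, not the study's measurements.

```python
import numpy as np

obs = np.array([5.1, 6.3, 7.0, 6.8, 5.9, 4.7])    # ground GSR (kWh/m2/day)
era5 = np.array([5.3, 6.1, 7.3, 6.9, 6.2, 4.9])   # reanalysis GSR

mbe = np.mean(era5 - obs)                         # mean bias error
rmse = np.sqrt(np.mean((era5 - obs) ** 2))        # root mean square error
mae = np.mean(np.abs(era5 - obs))                 # mean absolute error
r2 = np.corrcoef(era5, obs)[0, 1] ** 2            # squared correlation

print(f"MBE={mbe:.3f}  RMSE={rmse:.3f}  MAE={mae:.3f}  R2={r2:.3f}")
```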
Procedia PDF Downloads 98
24937 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems
Authors: Ali Hosseini
Abstract:
Parallelization is a mechanism to decrease the time necessary to execute programs. Sorting is one of the important operations used in different systems, in that the proper functioning of many algorithms and operations depends on sorted data. The CRCW_SORT algorithm sorts 'N' elements in O(1) time on SIMD (Single Instruction Multiple Data) computers with n^2/2-n/2 processors. In this article, by presenting a mechanism that divides the input string into two smaller strings around a hinge (pivot) element, the number of processors needed to sort 'N' elements in O(1) time is decreased to n^2/8-n/4 in the best case; with this mechanism, the best case occurs when the hinge element is the median and the worst case when it is the minimum. The findings from assessing the proposed algorithm against other methods, over data collections and processor counts, indicate that the proposed algorithm uses fewer processors during execution than the other methods.
Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of the processors
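The processor counts can be checked by counting element pairs, since the CRCW scheme assigns one processor per pair; in the sketch below, splitting at the median hinge and letting the two halves share one processor pool reproduces the stated n^2/8-n/4 best-case figure. The pool-sharing assumption is ours, made to match that count.

```python
def pairs(n):
    # one processor per element pair: n^2/2 - n/2
    return n * (n - 1) // 2

n = 100
half = (n - 1) // 2                              # median hinge removed, ~n/2 per side
largest_half = max(pairs(half), pairs(n - 1 - half))

print("no split:     ", pairs(n))                # 4950 = n^2/2 - n/2
print("largest half: ", largest_half)            # 1225
print("n^2/8 - n/4 = ", n * n // 8 - n // 4)     # 1225, the paper's best case
```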
Procedia PDF Downloads 310
24936 Increasing the System Availability of Data Centers by Using Virtualization Technologies
Authors: Chris Ewe, Naoum Jamous, Holger Schrödl
Abstract:
Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company's reputation, or simply continued existence on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement, and a central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections and to consider whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.
Keywords: availability, cloud computing IT service, quality of service, service level agreement, virtualization
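A minimal numerical sketch of the kind of connection such a model captures, under textbook assumptions: steady-state availability A = MTBF / (MTBF + MTTR), series components multiply, and n-fold redundancy yields 1 - (1 - A)^n. The MTBF/MTTR figures are invented.

```python
def availability(mtbf_h, mttr_h):
    # steady-state availability from mean time between failures / to repair
    return mtbf_h / (mtbf_h + mttr_h)

host = availability(mtbf_h=8000, mttr_h=4)
storage = availability(mtbf_h=20000, mttr_h=8)

single = host * storage                           # series: both must be up
with_failover = (1 - (1 - host) ** 2) * storage   # VMs can restart on a second host

print(f"single host:   {single:.5f}")
print(f"with failover: {with_failover:.5f}")      # virtualization-enabled redundancy
```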
Procedia PDF Downloads 536
24935 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions
Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente
Abstract:
Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks, managing resources and measuring performance. In addition, outsourcing is a strategic IT service solution that complements the IT services provided internally in organizations. This paper proposes the measurement tools of a new holistic maturity model based on the standards ISO/IEC 20000 and ISO/IEC 38500 and on the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model to facilitate adaptation to universities and the achievement of excellence in the outsourcing of IT services.
Keywords: IT governance, IT management, IT services, outsourcing, maturity model, measurement tools
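As a purely hypothetical illustration of how questionnaire answers could roll up into maturity scores, the sketch below averages 0-5 responses per domain; the domains, answers, and the improvement rule are invented, not the model's published metrics tables.

```python
# Invented questionnaire responses on an assumed 0-5 maturity scale
responses = {
    "service level management": [3, 4, 3],
    "supplier management": [2, 3, 2],
    "governance of outsourcing": [4, 4, 3],
}

scores = {domain: sum(a) / len(a) for domain, a in responses.items()}
for domain, score in scores.items():
    print(f"{domain}: {score:.1f} / 5")

overall = sum(scores.values()) / len(scores)
weakest = min(scores, key=scores.get)
print(f"overall maturity: {overall:.1f}; improvement plan targets '{weakest}'")
```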
Procedia PDF Downloads 592