Search results for: 99.95% IoT data transmission savings
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26882

24692 Finding the Free Stream Velocity Using Flow Generated Sound

Authors: Saeed Hosseini, Ali Reza Tahavvor

Abstract:

Sound processing is a subject that has recently attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, the flow-generated sound is used to estimate the flow speed of free flows. Many sound samples are gathered. After analyzing the data, a parameter named wave power is chosen. For all samples, the wave power is calculated and averaged for each flow speed. A curve is fitted to the averaged data, and a correlation between the wave power and flow speed is found. Test data are used to validate the method, and the errors for all test data are under 10 percent. The speed of the flow can thus be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation.
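
For illustration, a minimal Python sketch of the pipeline the abstract describes follows. The definition of wave power as mean squared amplitude, the sample values, and the quadratic fit are editorial assumptions; the abstract does not spell these out.

```python
# Sketch: compute a "wave power" figure for a flow-sound recording and fit a
# power-vs-speed correlation. Assumption: wave power = mean squared amplitude.
import numpy as np

def wave_power(samples: np.ndarray) -> float:
    """Average signal power of one sound recording."""
    samples = samples.astype(float)
    return float(np.mean(samples ** 2))

# Hypothetical averaged data: one mean wave power per known flow speed.
flow_speeds = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # m/s (illustrative)
mean_powers = np.array([0.8, 2.9, 6.5, 11.8, 18.4])   # arbitrary units

# Fit a curve (here a quadratic) mapping wave power back to flow speed.
coeffs = np.polyfit(mean_powers, flow_speeds, deg=2)
estimate_speed = np.poly1d(coeffs)

# Estimate the free-stream speed of a new recording from its wave power.
test_power = wave_power(np.random.default_rng(0).normal(size=44100))
print(f"estimated speed: {estimate_speed(test_power):.2f} m/s")
```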

Keywords: the flow generated sound, free stream, sound processing, speed, wave power

Procedia PDF Downloads 415
24691 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

Successful production of hydrocarbon from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority of society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including the reservoir geological data, reservoir geophysical data, well completion data, and production data for thousands of wells is first established to discover the valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance the knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery, and greater economic return for future wells in unconventional oil reserves.
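
As a sketch of the clustering and dimension-reduction steps named in the abstract, the following Python snippet runs K-means and principal component analysis on a hypothetical per-well feature table; all column names and values are illustrative stand-ins, not the study's database.

```python
# Minimal sketch: K-means partitioning and PCA exploration of well features.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
wells = pd.DataFrame({                       # hypothetical per-well features
    "porosity": rng.normal(0.08, 0.02, 500),
    "permeability_md": rng.lognormal(-3.0, 0.5, 500),
    "lateral_length_m": rng.normal(1500, 300, 500),
    "frac_stages": rng.integers(10, 40, 500),
    "proppant_tonnes": rng.normal(3000, 800, 500),
})

X = StandardScaler().fit_transform(wells)

# K-means partitions the wells into groups of similar design/geology.
wells["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# PCA highlights the directions of largest variation for visualization.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)
```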

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 283
24690 Measure the Gas to Dust Ratio Towards Bright Sources in the Galactic Bulge

Authors: Jun Yang, Norbert Schulz, Claude Canizares

Abstract:

Knowing the dust content of interstellar matter is necessary to understand the composition and evolution of the interstellar medium (ISM). The metal composition of the ISM enables us to study the cooling and heating processes that dominate the star formation rates in our Galaxy. The Chandra High Energy Transmission Grating (HETG) Spectrometer provides a unique opportunity to measure element dust compositions through X-ray edge absorption structure. We measure gas to dust optical depth ratios towards 9 bright Low-Mass X-ray Binaries (LMXBs) in the Galactic Bulge with the highest precision so far. Well-calibrated and pile-up-free optical depths are measured with the HETG spectrometer with respect to broadband hydrogen-equivalent absorption in bright LMXBs: 4U 1636-53, Ser X-1, GX 3+1, 4U 1728-34, 4U 1705-44, GX 340+0, GX 13+1, GX 5-1, and GX 349+2. From the optical depth results, we deduce gas to dust ratios for various silicates in the ISM and present our results for the Si K edge in different lines of sight towards the Galactic Bulge.

Keywords: low-mass X-ray binaries, interstellar medium, gas to dust ratio, spectrometer

Procedia PDF Downloads 143
24689 Efficiency of DMUs in Presence of New Inputs and Outputs in DEA

Authors: Esmat Noroozi, Elahe Sarfi, Farha Hosseinzadeh Lotfi

Abstract:

Examining the impacts of data modification is known as sensitivity analysis. Many studies have considered the modification of input and output data in DEA. An issue which has not heretofore been considered in DEA sensitivity analysis is modification of the number of inputs and (or) outputs and determining the impact of this modification on the efficiency status of DMUs. This paper presents systems that show the impact of adding one or multiple inputs or outputs on the efficiency status of DMUs; furthermore, a model is presented for recognizing the minimum number of inputs and (or) outputs, from among specified inputs and outputs, which can be added such that an inefficient DMU becomes efficient. Finally, the presented systems and model are applied to a set of real data and the results are reported.
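
For context, the sketch below implements the standard input-oriented CCR efficiency model that such sensitivity systems build on, using scipy's linear programming solver; the data, and the exercise of adding a column and re-solving, are illustrative assumptions, not the paper's new systems.

```python
# Input-oriented CCR model: min theta s.t. X'lambda <= theta*x_o, Y'lambda >= y_o.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """CCR score of DMU o. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)                 # variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0                          # minimize theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[o]                 # sum_j lambda_j x_ij - theta x_io <= 0
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T                 # -sum_j lambda_j y_rj <= -y_ro
    b_ub[m:] = -Y[o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0]])  # illustrative inputs
Y = np.array([[1.0], [2.0], [1.5], [1.0]])                       # illustrative outputs
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```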

Keywords: data envelopment analysis, efficiency, sensitivity analysis, input, output

Procedia PDF Downloads 450
24688 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) is pre-processed to enhance its quality through alleviation of missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature, and therefore they are handled effectively with the K-means clustering-based SMOTE model. From the balanced class data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA). This SI-AOA model is a conceptual improvement of the standard Arithmetic Optimization Algorithm. The detection stage employs deep learning models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the Self-improved Arithmetic Optimization Algorithm (SI-AOA).
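
As a hedged illustration of the imbalance-handling phase only, the sketch below applies K-means clustering-based SMOTE via the imbalanced-learn package; the synthetic data and the cluster-balance threshold are editorial stand-ins for the paper's pre-processed transactions.

```python
# Sketch: balance a fraud-like class distribution with K-means-based SMOTE.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import KMeansSMOTE

# Synthetic stand-in: 2% "fraud" minority class.
X, y = make_classification(n_samples=5000, n_features=10,
                           weights=[0.98, 0.02], random_state=0)
print("before:", Counter(y))

# KMeansSMOTE clusters the feature space first, then applies SMOTE inside
# clusters where the minority class is sufficiently represented.
sampler = KMeansSMOTE(random_state=0, cluster_balance_threshold=0.01)
X_bal, y_bal = sampler.fit_resample(X, y)
print("after: ", Counter(y_bal))
```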

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 131
24687 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time

Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl

Abstract:

In recent years, SQL injection attacks have been identified as being prevalent against web applications. They affect network security and user data, which leads to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning, using a method to classify login data by filtering inputs into "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of results in terms of deciding whether an operation is an attack or a valid operation. A method, WebAppShield (Web-App auto-generated twin data structure replication, shielding against SQLi attacks), has been developed that verifies all users and prevents attackers (SQLi attacks) from entering or accessing the database, admitting only requests that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated; up to 99% of SQLi attacks were prevented.
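
A minimal sketch of the classification idea follows: label login inputs "SQLi" or "Non-SQLi" with a learned model. The tiny inline dataset and the choice of TF-IDF plus logistic regression are illustrative assumptions; the paper's system trains on a much larger labeled corpus.

```python
# Sketch: classify raw login inputs as SQLi / Non-SQLi.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_inputs = [
    "alice", "bob_smith", "user42", "mary.o'hara",
    "' OR '1'='1", "admin'--", "1; DROP TABLE users--",
    "' UNION SELECT password FROM users--",
]
train_labels = ["Non-SQLi"] * 4 + ["SQLi"] * 4

# Character n-grams capture the quote/comment/keyword patterns typical of SQLi.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(1, 3)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_inputs, train_labels)

for attempt in ["carol", "' OR 1=1--"]:
    print(attempt, "->", clf.predict([attempt])[0])
```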

Keywords: SQL injection, attacks, web application, accuracy, database

Procedia PDF Downloads 151
24686 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.

Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications

Procedia PDF Downloads 93
24685 Regression for Doubly Inflated Multivariate Poisson Distributions

Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta

Abstract:

Dependent multivariate count data occur in several research studies. These data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, the number of observations in some cells is much larger than in other cells, then the copula-based multivariate Poisson (or negative binomial) distribution may not fit well, and it is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher compared to the other cells, and develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for the doubly inflated multivariate count data. For illustrating the proposed methodologies, we present real data containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation to estimate the unknown parameters of the models.
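
The sketch below illustrates the underlying building block, sampling dependent bivariate Poisson counts through a Gaussian copula; the parameters are illustrative, and the double inflation itself (extra point masses at two cells) is only noted in a comment rather than implemented.

```python
# Sketch: dependent bivariate Poisson counts via a Gaussian copula.
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(1)
n, rho = 10_000, 0.6          # sample size and latent normal correlation
mu1, mu2 = 3.0, 5.0           # Poisson means (illustrative)

# Draw correlated standard normals, map to uniforms, then to Poisson counts.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n)
u = norm.cdf(z)
y1 = poisson.ppf(u[:, 0], mu1).astype(int)
y2 = poisson.ppf(u[:, 1], mu2).astype(int)

print("sample means:", y1.mean(), y2.mean())
print("count correlation:", np.corrcoef(y1, y2)[0, 1])
# Double inflation would mix this copula model with additional point masses
# at the two over-represented cells, e.g. (0, 0) and (1, 1).
```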

Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions

Procedia PDF Downloads 156
24684 An Exploratory Research of Human Character Analysis Based on Smart Watch Data: Distinguish the Drinking State from Normal State

Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing

Abstract:

Smart watches, as handy devices with rich functionality, have become some of the most popular wearable devices all over the world. Among their various functions, the most basic is health monitoring. The monitoring data can be provided as effective evidence or a clue for the detection of crime cases. For instance, the step-counting data can help to determine whether the watch wearer was quiet or moving during a given time period. There is, however, still little research on the analysis of human character based on these data. The purpose of this research is to analyze the health monitoring data to distinguish the drinking state from the normal state. The analysis results may play a role in cases involving drinking, such as drunk driving. The experiment mainly focused on finding which figures in smart watch health monitoring data change with drinking, and on determining the extent of the change. The chosen subjects are mostly in their 20s, each of whom had been wearing the same smart watch for a week. Each subject drank several times during the week and noted the beginning and end time points of the drinking. The researchers then extracted and analyzed the health monitoring data from the watch. According to the descriptive statistical analysis, it can be found that the heart rate changes when drinking. The average heart rate is about 10% higher than normal, and the coefficient of variation is less than about 30% of that in the normal state. Though more research needs to be carried out, this experiment and analysis provide a line of thought for the application of data from smart watches.
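
The descriptive comparison in the abstract reduces to two simple statistics per window, sketched below; the heart-rate readings are invented illustrations, not the study's measurements.

```python
# Sketch: mean heart rate and coefficient of variation, drinking vs. normal.
import numpy as np

normal = np.array([68, 72, 70, 75, 69, 71, 73, 70])       # bpm (illustrative)
drinking = np.array([77, 80, 78, 79, 78, 77, 80, 76])     # bpm (illustrative)

def cv(x: np.ndarray) -> float:
    """Coefficient of variation: sample std / mean."""
    return float(np.std(x, ddof=1) / np.mean(x))

print(f"mean uplift: {drinking.mean() / normal.mean() - 1:.1%}")  # about +10%
print(f"CV normal:   {cv(normal):.3f}")
print(f"CV drinking: {cv(drinking):.3f}")
```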

Keywords: character analysis, descriptive statistics analysis, drink state, heart rate, smart watch

Procedia PDF Downloads 167
24683 The Effect of TiO₂ Nano-Thin Films on Light Transmission and Self-Cleaning Capabilities of Glass Surface

Authors: Ahmad Alduweesh

Abstract:

Self-cleaning surfaces have become essential in various applications. For instance, in photovoltaics, they provide an easy, cost-effective way to keep solar cells clean. Titanium dioxide (TiO₂) nanoparticle films were fabricated at different thicknesses to study the effect of thickness on the hydrophilicity behavior of TiO₂, eventually leading to customizing hydrophilicity levels to desired values under natural light. As a result, a remarkable increase was noticed in surface hydrophilicity after applying thermal annealing to the as-deposited TiO₂ thin films, with the contact angle dropping from around 85.4° for as-deposited thin films down to 5.1° for one of the annealed samples. The produced thin films were exposed to the outside environment to observe the effect of dust. The transmittance of light using UV-VIS spectroscopy will be measured on the lowest and highest thicknesses (5-40 nm); this will show whether the titania has successfully enabled more sunlight to penetrate the glass or not. Surface characterizations, including AFM and contact angle measurements, have been included in this test.

Keywords: physical vapor deposition, TiO₂, nano-thin films, hydrophobicity, hydrophilicity, self-cleaning surfaces

Procedia PDF Downloads 114
24682 The Creative Unfolding of “Reduced Descriptive Structures” in Musical Cognition: Technical and Theoretical Insights Based on the OpenMusic and PWGL Long-Term Feedback

Authors: Jacopo Baboni Schilingi

Abstract:

We describe here the theoretical and philosophical understanding gained from the long-term use and development of algorithmic computer-based tools applied to music composition. The findings of our research lead us to interrogate some specific processes and systems of communication engaged in the discovery of specific cultural artworks: artistic creation in the sono-musical domain. Our hypothesis is that the patterns of auditory learning cannot be understood only in terms of social transmission, but would benefit from being questioned in terms of how they rely on various ranges of acoustic stimuli and modes of consciousness, and how the different types of memories engaged in the percept-action expressive systems of our cultural communities also rely on the shadowy conscious entities we have named "Reduced Descriptive Structures".

Keywords: algorithmic sonic computation, corrected and self-correcting learning patterns in acoustic perception, morphological derivations in sensorial patterns, social unconscious modes of communication

Procedia PDF Downloads 155
24681 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data

Authors: Necati Içer

Abstract:

Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage on crop production. Predicting insurance premium rates with short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach for establishing equitable premium rates in crop-hail insurance for nations with short-term insurance data. The primary goal of the rate-making process is to determine premium rates for high and zero loss costs of villages and enhance their credibility. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the ultimate premium rates for insurance contracts.

Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters

Procedia PDF Downloads 55
24680 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa

Authors: Samy A. Khalil, U. Ali Rahoma

Abstract:

Alongside ground measurements of solar radiation, satellite data have been routinely utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa across the ten-year period from 2011 to 2020. To investigate seasonal change in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable difference in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: the data's monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. The average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R²) varies from 0.93 to 0.99 over the study period. The objective of this research is to provide a reliable representation of global solar radiation to aid the use of solar energy in all sectors.
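
The validation statistics the study reports are straightforward to compute; a sketch with invented illustrative values follows.

```python
# Sketch: MBE, RMSE, MAE and R^2 between reanalysis and measured GSR.
import numpy as np

measured = np.array([5.1, 6.3, 7.0, 6.8, 5.9, 4.7])  # kWh/m^2/day (illustrative)
era5 = np.array([5.3, 6.1, 7.4, 6.6, 6.2, 4.9])      # reanalysis estimates

diff = era5 - measured
mbe = diff.mean()                       # mean bias error
rmse = np.sqrt((diff ** 2).mean())      # root mean square error
mae = np.abs(diff).mean()               # mean absolute error
r2 = np.corrcoef(measured, era5)[0, 1] ** 2

print(f"MBE={mbe:.3f}  RMSE={rmse:.3f}  MAE={mae:.3f}  R^2={r2:.3f}")
```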

Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa

Procedia PDF Downloads 98
24679 A Reconfigurable Microstrip Patch Antenna with Polyphase Filter for Polarization Diversity and Cross Polarization Filtering Operation

Authors: Lakhdar Zaid, Albane Sangiovanni

Abstract:

A reconfigurable microstrip patch antenna with a polyphase filter for polarization diversity and cross-polarization filtering operation is presented in this paper. In our approach, a polyphase filter is used to obtain the four 90° phase-shifted outputs that feed a square microstrip patch antenna. The antenna can be switched between four states of polarization in transmission as well as in receiving mode. Switches are interconnected with the polyphase filter network to produce left-hand circular polarization, right-hand circular polarization, horizontal linear polarization, and vertical linear polarization. An additional advantage of using the polyphase filter is its filtering capability for cross-polarization filtering in right-hand and left-hand circular polarization operation. The theoretical and simulated results demonstrate that the polyphase filter is a good candidate for driving a microstrip patch antenna to accomplish polarization diversity and cross-polarization filtering operation.

Keywords: active antenna, polarization diversity, patch antenna, polyphase filter

Procedia PDF Downloads 411
24678 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems

Authors: Ali Hosseini

Abstract:

Parallelization is a mechanism to decrease the time necessary to execute programs. Sorting is one of the important operations used in different systems, in that the proper function of many algorithms and operations depends on sorted data. The CRCW_SORT algorithm sorts 'N' elements in O(1) time on SIMD (Single Instruction Multiple Data) computers with n^2/2-n/2 processors. In this article, by presenting a mechanism that divides the input string at a hinge (pivot) element into two smaller strings, the number of processors needed to sort 'N' elements in O(1) time is decreased to n^2/8-n/4 in the best case; with this mechanism, the best case occurs when the hinge element is the median and the worst case when it is the minimum. The findings from assessing the proposed algorithm against other methods, over the data collection and the number of processors, indicate that the proposed algorithm uses fewer processors to sort during execution than other methods.

Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of the processors

Procedia PDF Downloads 310
24677 Increasing the System Availability of Data Centers by Using Virtualization Technologies

Authors: Chris Ewe, Naoum Jamous, Holger Schrödl

Abstract:

Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply continuing to exist on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections, and to consider whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.
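
The availability arithmetic behind such a model can be made concrete with a short sketch: steady-state availability from MTBF/MTTR, and how redundancy (for example, the ability to migrate VMs between two hosts) changes the combined figure. All numbers are illustrative assumptions.

```python
# Sketch: steady-state availability and series/parallel composition.
def availability(mtbf_h: float, mttr_h: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_h / (mtbf_h + mttr_h)

host = availability(mtbf_h=4000.0, mttr_h=8.0)
storage = availability(mtbf_h=10000.0, mttr_h=4.0)

# Series: the service fails if either component fails (host AND storage needed).
series = host * storage

# Parallel: the service survives if at least one of two redundant hosts is up,
# e.g. because virtualization allows VMs to fail over to the second host.
parallel = 1 - (1 - host) ** 2

print(f"single host:             {host:.5f}")
print(f"host + storage (series): {series:.5f}")
print(f"two redundant hosts:     {parallel:.7f}")
```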

Keywords: availability, cloud computing IT service, quality of service, service level agreement, virtualization

Procedia PDF Downloads 537
24676 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt

Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer

Abstract:

Crowd-sourced data refers to data that is collected and shared by a large number of individuals or organizations, often through the use of digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas where the majority of road crashes occur. This study is, to the best of our knowledge, the first to develop safety performance functions using crowd-sourced data, adopting a negative binomial structure model and a Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) as well as a graphical method using GIS tools, for comparison and validation. Lastly, recommendations were suggested for policymakers to ensure safer roads.
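
A minimal sketch of a negative binomial safety performance function follows, fitting segment crash counts to exposure and geometry with statsmodels; the synthetic data and the chosen covariates are illustrative stand-ins for the crowd-sourced attributes.

```python
# Sketch: negative binomial SPF, crashes ~ log(AADT) + log(segment length).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
aadt = rng.uniform(5e3, 6e4, n)        # annual average daily traffic
length_km = rng.uniform(0.3, 3.0, n)   # segment length

# Synthetic crash counts: log-linear mean plus gamma overdispersion.
mu = np.exp(-9.0 + 0.9 * np.log(aadt) + 1.0 * np.log(length_km))
crashes = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length_km)]))
spf = sm.GLM(crashes, X, family=sm.families.NegativeBinomial()).fit()
print(spf.params)   # intercept and the elasticities of AADT and length
```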

Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening

Procedia PDF Downloads 53
24675 Enhanced Dielectric Properties of La Substituted CoFe2O4 Magnetic Nanoparticles

Authors: M. Vadivel, R. Ramesh Babu

Abstract:

Spinel ferrite magnetic nanomaterials have received a great deal of attention in recent years due to their wide range of potential applications in various fields such as magnetic data storage and microwave device applications. Among the family of spinel ferrites, cobalt ferrite (CoFe2O4) has been widely used in the field of high-frequency applications because of its remarkable material qualities, such as moderate saturation magnetization, high coercivity, large permeability at higher frequency, and high electrical resistivity. For the aforementioned applications, the materials should have improved electrical properties, especially an enhancement in the dielectric properties. It is well known that the substitution of rare earth metal cations at the Fe3+ site of CoFe2O4 nanoparticles leads to structural distortion and thus significantly influences the structural and morphological properties, while greatly modifying the electrical and magnetic properties of the material. In the present investigation, we report on the influence of lanthanum (La3+) ion substitution on the structural, morphological, dielectric, and magnetic properties of CoFe2O4 magnetic nanoparticles prepared by the co-precipitation method. Powder X-ray diffraction patterns reveal the formation of an inverse cubic spinel structure with the signature of a LaFeO3 phase at higher La3+ ion concentrations. Raman and Fourier transform infrared spectral analyses also confirm the formation of the inverse cubic spinel structure and the Fe-O symmetrical stretching vibrations of CoFe2O4 nanoparticles, respectively. Transmission electron microscopy reveals that the size of the particles gradually increases with increasing La3+ ion concentration, whereas agglomeration is slightly reduced for La3+ ion substituted CoFe2O4 nanoparticles compared to undoped CoFe2O4 nanoparticles. Dielectric properties such as dielectric constant and dielectric loss were recorded as a function of frequency and temperature, revealing that the dielectric constant gradually increases with increasing temperature as well as La3+ ion concentration. The increased dielectric constant might be due to the formation of the LaFeO3 secondary phase at higher La3+ ion concentrations. Magnetic measurements demonstrate that the saturation magnetization gradually decreases from 61.45 to 25.13 emu/g with increasing La3+ ion concentration, which is due to the nonmagnetic nature of the substituted La3+ ions.

Keywords: cobalt ferrite, co-precipitation, dielectric properties, saturation magnetization

Procedia PDF Downloads 317
24674 Review of Different Machine Learning Algorithms

Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui

Abstract:

Classification is a data mining technique based on Machine Learning (ML) algorithms. It is used to classify individual items in a body of information into a set of predefined modules or groups. Web mining is also a part of that family of data mining methods. The main purpose of this paper is to analyze and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbor (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The paper discusses these ML algorithms with their advantages and disadvantages, and also identifies open research issues.
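
A sketch of the comparison the paper describes follows: the same five classifier families evaluated on one dataset with cross-validation. The Iris dataset is an illustrative stand-in, and an MLP stands in for the generic ANN.

```python
# Sketch: cross-validated comparison of the five surveyed classifiers.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
models = {
    "Naive Bayes":  GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "KNN":          KNeighborsClassifier(n_neighbors=5),
    "ANN (MLP)":    MLPClassifier(max_iter=2000, random_state=0),
    "SVM":          SVC(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:13s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```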

Keywords: data mining, web mining, classification, ML algorithms

Procedia PDF Downloads 303
24673 Using Genetic Algorithms and Rough Set Based Fuzzy K-Modes to Improve Centroid Model Clustering Performance on Categorical Data

Authors: Rishabh Srivastav, Divyam Sharma

Abstract:

We propose an algorithm to cluster categorical data, named 'Genetic algorithm initialized rough set based fuzzy K-Modes for categorical data'. We propose an amalgamation of the simple K-modes algorithm, the rough and fuzzy set based K-modes, and the genetic algorithm to form a new algorithm which, we hypothesise, will provide better centroid model clustering results than existing standard algorithms. In the proposed algorithm, the initialization and updating of modes is done by the use of genetic algorithms, while the membership values are calculated using rough sets and fuzzy logic.

Keywords: categorical data, fuzzy logic, genetic algorithm, K modes clustering, rough sets

Procedia PDF Downloads 247
24672 Forecasting Amman Stock Market Data Using a Hybrid Method

Authors: Ahmad Awajan, Sadam Al Wadi

Abstract:

In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the forecast values are aggregated together to get the forecast value of the stock market data. Empirical results showed that EMD-HW outperforms individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without any need for a transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
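
A sketch of the hybrid scheme follows, using the PyEMD package for the decomposition and statsmodels for Holt-Winters; the random-walk series stands in for the Amman index, and the additive-trend smoothing configuration is an editorial assumption.

```python
# Sketch of EMD-HW: decompose into IMFs + residue, forecast each component
# with Holt-Winters exponential smoothing, and sum the component forecasts.
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)
prices = np.cumsum(rng.normal(0.1, 1.0, 500)) + 100.0   # stand-in series
horizon = 10

emd = EMD()
emd.emd(prices)
imfs, residue = emd.get_imfs_and_residue()
components = list(imfs) + [residue]

forecast = np.zeros(horizon)
for comp in components:
    model = ExponentialSmoothing(comp, trend="add").fit()
    forecast += model.forecast(horizon)     # aggregate component forecasts

print("EMD-HW forecast:", np.round(forecast, 2))
```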

Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series

Procedia PDF Downloads 129
24671 BER of the Leaky Feeder under Rayleigh Fading Multichannel Reception with Imperfect Phase Estimation

Authors: Hasan Farahneh, Xavier Fernando

Abstract:

The Leaky Feeder (LF) has been a proven technology for many decades; it promises broadband wireless access at short range but has been overlooked until now. The LF is a natural MIMO transceiver ideal for micro and pico cells. In this work, the LF is considered as a linear antenna array in a Multi-Input-Single-Output (MISO) configuration, and the average bit error rate (BER) is derived in a Rayleigh fading channel, assuming ideal and independent (iid) paths, that is, no correlation or mutual coupling between the transmit antennas (slots) or the receiver antenna, with QPSK modulation and imperfect phase estimation. We consider maximal ratio transmission (MRT) at the transmit end and maximal ratio combining (MRC) at the receiving end. Analytical expressions are derived for the BER with radiating cable transmitters. The effects of slot spacing and carrier frequency on the BER are also studied. Numerical evaluations show that the radiating cable transmitter offers a much lower BER than a single-antenna transmitter at the same SNR.
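
The analytical BER can be cross-checked by Monte Carlo simulation; the sketch below models QPSK over an N-slot iid Rayleigh MISO channel with MRT and a Gaussian phase-estimation error at the receiver. The slot count, SNR, and phase-error standard deviation are illustrative, not the paper's parameters.

```python
# Monte Carlo sketch: QPSK BER with MRT over N Rayleigh branches and phase error.
import numpy as np

rng = np.random.default_rng(0)
n_slots, n_bits, snr_db, phase_std = 4, 200_000, 5.0, 0.1  # phase_std in rad

bits = rng.integers(0, 2, n_bits)
# Gray-mapped, unit-energy QPSK symbols.
sym = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

h = (rng.normal(size=(n_slots, sym.size)) +
     1j * rng.normal(size=(n_slots, sym.size))) / np.sqrt(2)
# MRT weights each slot by h*/||h||, so the effective channel gain is ||h||.
g = np.linalg.norm(h, axis=0)

noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)
noise = noise_std * (rng.normal(size=sym.size) + 1j * rng.normal(size=sym.size))
r = g * sym + noise

# Imperfect phase estimation rotates the decision variable.
r *= np.exp(1j * rng.normal(0.0, phase_std, sym.size))

bits_hat = np.empty(n_bits, dtype=int)
bits_hat[0::2] = (r.real < 0).astype(int)
bits_hat[1::2] = (r.imag < 0).astype(int)
print(f"BER at {snr_db} dB, {n_slots} slots: {np.mean(bits != bits_hat):.2e}")
```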

Keywords: leaky feeder, BER, QPSK, Rayleigh fading, channel gain, phase mismatch

Procedia PDF Downloads 381
24670 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems

Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell

Abstract:

Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information to perform their day-to-day activities and to provide their senior management with accurate information for the decision-making process. Currently, various technology platforms, data repositories, or database systems such as Computer-Aided Facility Management (CAFM) are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for owners' and facility managers' use. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-in applications are capable of generating only a limited amount of COBie data, and considerable time is still required to enter the remaining data manually to complete the COBie spreadsheet. Some of the data which cannot be generated by these COBie add-ins is essential for the facilities manager's day-to-day activities, such as the job sheet, which includes preventive maintenance schedules. To facilitate seamless data transfer between BIM models and facilities management systems, we developed a framework that enables automated data generation using the data extracted directly from BIM models to an external web database, and then enables different stakeholders to access the external web database to enter the required asset data directly, generating a rich COBie spreadsheet that contains most of the asset data required for efficient facilities management operations. The proposed framework is part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, the proposed framework supplements the existing body of knowledge in the facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.

Keywords: building information modeling, BIM, facilities management systems, interoperability, information management

Procedia PDF Downloads 116
24669 Investigating Cloud Forensics: Challenges, Tools, and Practical Case Studies

Authors: Noha Badkook, Maryam Alsubaie, Samaher Dawood, Enas Khairallah

Abstract:

Cloud computing has introduced transformative benefits in data storage and accessibility while posing unique forensic challenges. This paper explores cloud forensics, focusing on investigating and analyzing evidence from cloud environments to address issues such as unauthorized data access, manipulation, and breaches. The research highlights the practical use of open-source forensic tools like Autopsy and Bulk Extractor in real-world scenarios, including unauthorized data sharing via Google Drive and the misuse of personal cloud storage for sensitive information leaks. This work underscores the growing importance of robust forensic procedures and accessible tools in ensuring data security and accountability in cloud ecosystems.

Keywords: cloud forensic, tools, challenge, autopsy, bulk extractor

Procedia PDF Downloads 3
24668 Software-Defined Radio Based Channel Measurement System of Wideband HF Communication System in Low-Latitude Region

Authors: P. H. Mukti, I. Kurniawati, F. Oktaviansyah, A. D. Adhitya, N. Rachmadani, R. Corputty, G. Hendrantoro, T. Fukusako

Abstract:

The HF communication system is one of the attractive fields among many researchers since it can reach long-distance areas at low cost. This long-distance communication can be achieved by exploiting the ionosphere as a transmission medium for the HF radio wave. However, due to the dynamic nature of the ionosphere, the channel characteristics of HF communication have to be investigated in order to give better performance. Many techniques to characterize the HF channel are available in the literature. However, none of those techniques describe the HF channel characteristics in low-latitude regions, especially equatorial areas. Since the ionosphere around the equatorial region exhibits the equatorial spread-F (ESF) phenomenon, characterizing the wideband HF channel in the low-latitude region becomes an important investigation. On the other hand, the emergence of software-defined radio has attracted the interest of many researchers. Accordingly, in this paper an SDR-based channel measurement system is proposed for characterizing the HF channel in the low-latitude region.

Keywords: channel characteristic, HF communication system, LabVIEW, software-defined radio, universal software radio peripheral

Procedia PDF Downloads 488
24667 A Statistical Energy Analysis Model of an Automobile for the Prediction of the Internal Sound Pressure Level

Authors: El Korchi Ayoub, Cherif Raef

Abstract:

Interior noise in vehicles is an essential factor affecting occupant comfort. Over recent decades, much work has been done to develop simulation tools for vehicle NVH. In the medium-to-high frequency range, the statistical energy analysis (SEA) method shows significant effectiveness in predicting the noise and vibration responses of mechanical systems. In this paper, the evaluation of the sound pressure level (SPL) inside an automobile cabin has been performed numerically using the statistical energy analysis (SEA) method. A test was performed in a car cabin using a monopole source as the sound source. The decay rate method was employed to obtain the damping loss factor (DLF) of each subsystem of the developed SEA model. These parameters were then used to predict the sound pressure level in the interior cabin. The results show satisfactory agreement with the directly measured SPL. The developed SEA vehicle model can be used in early design phases and allows the engineer to identify sources contributing to the total noise and the transmission paths.
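
To make the SEA power-balance step concrete, the sketch below solves a minimal two-subsystem model (source space / cabin) for the subsystem energies and converts the cabin's acoustic energy to an SPL; all loss factors, the injected power, and the cabin volume are illustrative assumptions, not the paper's values.

```python
# Sketch: two-subsystem SEA power balance, then SPL from acoustic energy.
import numpy as np

f = 1000.0                      # band centre frequency, Hz
omega = 2 * np.pi * f
eta1, eta2 = 0.02, 0.05         # damping loss factors (e.g. from decay rates)
eta12, eta21 = 0.004, 0.002     # coupling loss factors
P_in = 0.01                     # power injected by the monopole source, W

# Steady-state balance:
#   P_in = omega*(eta1+eta12)*E1 - omega*eta21*E2
#   0    = -omega*eta12*E1 + omega*(eta2+eta21)*E2
A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,        eta2 + eta21]])
E1, E2 = np.linalg.solve(A, np.array([P_in, 0.0]))

rho, c, V = 1.21, 343.0, 3.0    # air density, speed of sound, cabin volume m^3
p2 = E2 * rho * c**2 / V        # mean-square pressure from acoustic energy
spl = 10 * np.log10(p2 / (20e-6) ** 2)
print(f"E1={E1:.3e} J, E2={E2:.3e} J, cabin SPL = {spl:.1f} dB")
```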

Keywords: SEA, SPL, DLF, NVH

Procedia PDF Downloads 91
24666 Preventative Maintenance, Impact on the Optimal Replacement Strategy of Secondhand Products

Authors: Pin-Wei Chiang, Wen-Liang Chang, Ruey-Huei Yeh

Abstract:

This paper investigates optimal replacement and preventive maintenance policies for secondhand products under a Finite Planning Horizon (FPH). Any consumer wishing to replace their product under FPH would have it undergo minimal repairs. The replacement product would be required to undergo periodic preventive maintenance to avoid product failure. Then, a mathematical formula for the disbursement cost of products under FPH can be derived, and optimal policies are obtained to minimize cost. In the first of two segments of the paper, a model for the initial product purchase of either new or secondhand products is used. This model is built by analyzing the product purchase price, the surplus value of the product, and the minimal repair cost. The second segment uses a model for replacement products, which are also secondhand products with no limit on usage. This model analyzes the same components as the first, as well as the expected preventive maintenance cost. Using these two models, a formula for the expected final total cost can be developed. The formula requires four variables (optimal preventive maintenance level, preventive maintenance frequency, replacement timing, and age of the replacement product) to find the minimal cost requirement. Based on analysis of the variables using the expected total final cost model, it was found that the purchase price and length of ownership are directly related. Also, consumers should choose the secondhand product with the higher usage for replacement. Products with higher initial usage upon acquisition require an earlier replacement schedule; in this case, replacements should be made with a secondhand product with less usage. In addition, preventive maintenance also significantly reduces cost. Consumers that plan to use products for longer periods of time replace their products later; hence these consumers should choose the secondhand product with lesser initial usage for replacement. Preventive maintenance also creates significant total cost savings in this case. This study provides consumers with a method of calculating both the ideal amount of usage of the products they should purchase and the frequency and level of preventive maintenance that should be conducted in order to minimize cost and maintain product function.

Keywords: finite planning horizon, second hand product, replacement, preventive maintenance, minimal repair

Procedia PDF Downloads 473
24665 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. As of now, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, for example Amazon EC2, to provide a closed-box execution environment that ensures confidential execution of guest virtual machines. Also, it permits clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes a Trusted Cloud Computing Platform (TCCP) for guaranteeing the privacy and integrity of computed data that are outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a client's VM, ensuring that no cloud provider's privileged administrator can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP permits a client to reliably and remotely verify that the backend provider is running a trusted TCCP. This capability extends attestation to the entire service, and hence permits a client to verify that its data operations run in a secure mode.

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 299
24664 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged due to its close representation of real-world data and its volume. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
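
The node-classification setup can be sketched in PyTorch Geometric as follows; the toy graph, feature sizes, and a plain supervised GCN (rather than the paper's autoencoder-based self-supervised embeddings) are editorial simplifications.

```python
# Sketch: GCN node classification on a toy patient graph with PyG.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy graph: 6 patient nodes, 4 features each; edges listed in both directions.
x = torch.randn(6, 4)
edge_index = torch.tensor([[0, 1, 1, 2, 3, 4, 4, 5],
                           [1, 0, 2, 1, 4, 3, 5, 4]], dtype=torch.long)
y = torch.tensor([0, 0, 0, 1, 1, 1])        # condition class per node
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(4, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, d):
        h = F.relu(self.conv1(d.x, d.edge_index))
        return self.conv2(h, d.edge_index)   # per-node class logits

model = GCN()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(200):
    opt.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    opt.step()

pred = model(data).argmax(dim=1)
print("training accuracy:", (pred == data.y).float().mean().item())
```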

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 86
24663 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector to utilize. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues. Many security issues stem from distributed systems in the healthcare industry, particularly information security. People's data is especially sensitive in the healthcare industry. If important information gets leaked (e.g., identity card number, credit card number, address), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a lot of money in compensating these people, with even more resources expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, the usage of a blockchain network is explored to store the encryption key of the patient's data. The actual data is encrypted, and the resulting ciphertext is stored in a cloud storage platform. Furthermore, there are some issues that have to be emphasized and tackled for future improvements, such as proposing a multi-user scheme, tackling authentication issues, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
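
The framework's split between ciphertext storage and key registry can be sketched as follows; the two dicts stand in for the cloud store and the blockchain, and the access-control scheme is an editorial simplification of the proposal.

```python
# Sketch: encrypt the record, keep ciphertext in "cloud storage", register
# the key (with an authorization list) on a simulated "blockchain".
from cryptography.fernet import Fernet

cloud_storage = {}   # stand-in for the cloud object store
blockchain = {}      # stand-in for the key-registry chain

def store_record(record_id: str, plaintext: bytes, authorized: set) -> None:
    key = Fernet.generate_key()
    cloud_storage[record_id] = Fernet(key).encrypt(plaintext)  # ciphertext only
    blockchain[record_id] = {"key": key, "authorized": authorized}

def read_record(record_id: str, user: str) -> bytes:
    entry = blockchain[record_id]
    if user not in entry["authorized"]:          # read access is gated
        raise PermissionError(f"{user} is not authorized")
    return Fernet(entry["key"]).decrypt(cloud_storage[record_id])

store_record("patient-001", b"dx: hypertension; rx: amlodipine", {"dr_lee"})
print(read_record("patient-001", "dr_lee"))
```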

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 184