Search results for: data comparison
26547 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves
Authors: Shengnan Chen, Shuhua Wang
Abstract:
Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, yielding data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data, and production data for thousands of wells is first established to discover valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a large dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data.
Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery, and greater economic return for future wells in unconventional oil reserves.
Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves
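As a rough illustration of the clustering step described in this abstract, the sketch below implements a minimal K-means from scratch and applies it to a handful of hypothetical (porosity, initial production rate) well records; the data, feature choice, and cluster count are illustrative assumptions, not values from the study.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-means: partition observations into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old center if a cluster empties
                centers[i] = tuple(sum(col) / len(members) for col in zip(*members))
    return centers, clusters

# Hypothetical (porosity, initial production rate) records for six wells.
wells = [(0.05, 100), (0.06, 110), (0.05, 95),
         (0.20, 400), (0.22, 420), (0.21, 390)]
centers, clusters = kmeans(wells, k=2)
```

In practice a library implementation would be used, but the loop above shows the assign-then-update structure the abstract relies on.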
Procedia PDF Downloads 283
26546 Efficiency of DMUs in Presence of New Inputs and Outputs in DEA
Authors: Esmat Noroozi, Elahe Sarfi, Farha Hosseinzadeh Lotfi
Abstract:
Examining the impact of data modification is known as sensitivity analysis. Many studies have considered the modification of input and output data in DEA. An issue that has not heretofore been considered in DEA sensitivity analysis is modification in the number of inputs and/or outputs, and determining the impact of this modification on the efficiency status of DMUs. This paper presents systems that show the impact of adding one or multiple inputs or outputs on the efficiency status of DMUs; furthermore, a model is presented for recognizing the minimum number of inputs and/or outputs, from among specified inputs and outputs, that can be added so that an inefficient DMU becomes efficient. Finally, the presented systems and model have been applied to a set of real data and the results reported.
Keywords: data envelopment analysis, efficiency, sensitivity analysis, input, output
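The core idea, that adding an output can change a DMU's efficiency status, can be illustrated with a deliberately simplified ratio model (a single input, efficiency as each DMU's output/input ratio scaled by the best ratio). This toy is not the paper's DEA formulation, and the DMU data are hypothetical.

```python
def ratio_efficiency(dmus, output_key):
    """Toy efficiency: each DMU's output/input ratio scaled by the best ratio."""
    ratios = {name: d[output_key] / d["input"] for name, d in dmus.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Two hypothetical DMUs with one input and two candidate outputs.
dmus = {
    "A": {"input": 10, "out1": 50, "out2": 5},
    "B": {"input": 10, "out1": 30, "out2": 20},
}

eff_out1 = ratio_efficiency(dmus, "out1")
# After adding out2, score each DMU by its best ratio over both outputs.
eff_both = {name: max(ratio_efficiency(dmus, key)[name] for key in ("out1", "out2"))
            for name in dmus}
```

With out1 alone, B scores below 1 (inefficient); once out2 is considered, B attains the best out2 ratio and becomes efficient, mirroring the kind of status change the paper's systems detect.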
Procedia PDF Downloads 450
26545 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance data quality by alleviating missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with the K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The detection stage employs deep learning models: Long Short-Term Memory (LSTM), a Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
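The imbalance-handling step rests on SMOTE-style interpolation between minority-class neighbors. A minimal sketch (plain Python, hypothetical two-feature fraud records, and without the K-means grouping the paper adds on top) might look like:

```python
import random

def smote(minority, n_new, k=2, seed=1):
    """Create synthetic minority samples by interpolating toward a random
    one of each point's k nearest minority-class neighbors."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        p = rng.choice(minority)
        neighbors = sorted((q for q in minority if q is not p),
                           key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))[:k]
        q = rng.choice(neighbors)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    return synthetic

# Hypothetical two-feature fraud (minority-class) records.
fraud = [(1.0, 0.9), (1.2, 1.1), (0.9, 1.0)]
new_points = smote(fraud, n_new=5)
```

Every synthetic point lies on a segment between two real minority samples, which is what keeps the oversampled class plausible.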
Procedia PDF Downloads 131
26544 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as prevalent against web applications. They affect network security and user data, leading to considerable losses of money and data every year. This paper presents the use of classification algorithms in machine learning to classify login data, filtering inputs into "SQLi" or "Non-SQLi," thus increasing the reliability and accuracy of results in deciding whether an operation is an attack or a valid operation. A method, WebAppShield, based on web-app auto-generated twin data structure replication, has been developed to shield against SQLi attacks: it verifies all users and prevents attackers (SQLi attacks) from entering or accessing the database, admitting only inputs that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated; up to 99% of SQLi attacks were prevented.
Keywords: SQL injection, attacks, web application, accuracy, database
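A full twin-data-structure shield is beyond the scope of an abstract, but the filtering idea, scoring a login input against suspicious SQL tokens before it reaches the database, can be sketched with a toy rule-based stand-in for the paper's machine learning classifier; the patterns and threshold below are illustrative assumptions.

```python
import re

# Illustrative patterns only; the paper's approach uses a learned model instead.
SUSPICIOUS = [r"(?i)\bunion\b", r"(?i)\bselect\b", r"--",
              r"(?i)\bor\b\s+1\s*=\s*1", r"['\";]"]

def classify(login_input, threshold=2):
    """Label an input "SQLi" when enough suspicious tokens are present."""
    score = sum(1 for pattern in SUSPICIOUS if re.search(pattern, login_input))
    return "SQLi" if score >= threshold else "Non-SQLi"
```

For example, classify("alice") yields "Non-SQLi", while classify("' OR 1=1 --") trips several patterns and is labeled "SQLi"; a learned classifier replaces the hand-written pattern list with features extracted from labeled traffic.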
Procedia PDF Downloads 151
26543 Comparison of the Effect of Semi-Rigid Ankle Bracing Performance among Ankle Injured Versus Non-Injured Adolescent Female Hockey Players
Authors: T. J. Ellapen, N. Acampora, S. Dawson, J. Arling, C. Van Niekerk, H. J. Van Heerden
Abstract:
Objectives: To determine the comparative proprioceptive performance of injured versus non-injured adolescent female hockey players when wearing an ankle brace. Methods: Data were collected from 100 high school players who belonged to the Highway Secondary School KZN Hockey league, via voluntary parental informed consent and player assent. Players completed an injury questionnaire probing the prevalence and nature of hockey injuries (March-August 2013). Subsequently, players completed a Biodex proprioceptive test with and without an ankle brace. Probability was set at p≤0.05. Results: Twenty-two players sustained ankle injuries within the six months (p<0.001). Injured players performed similarly without bracing (Right Anterior Posterior Index (RAPI): 2.8±0.9; Right Medial Lateral Index (RMLI): 1.9±0.7; Left Anterior Posterior Index (LAPI): 2.7; Left Medial Lateral Index (LMLI): 1.7±0.6) as compared to bracing (RAPI: 2.7±1.4; RMLI: 1.8±0.6; LAPI: 2.6±1.0; LMLI: 1.5±0.6) (p>0.05). However, bracing (RAPI: 2.2±0.8; RMLI: 1.5±0.5; LAPI: 2.4±0.9; LMLI: 1.5±0.5) improved the ankle stability of the non-injured group as compared to their unbraced performance (RAPI: 2.5±1.0; RMLI: 1.8±0.8; LAPI: 2.8±1.1; LMLI: 1.8±0.6) (p<0.05). Conclusion: Ankle bracing did not enhance the stability of injured ankles. However, ankle bracing has an ergogenic effect, enhancing the stability of healthy ankles.
Keywords: hockey, proprioception, ankle, bracing
Procedia PDF Downloads 349
26542 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. 
It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.
Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications
Procedia PDF Downloads 93
26541 Regression for Doubly Inflated Multivariate Poisson Distributions
Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta
Abstract:
Dependent multivariate count data occur in several research studies. These data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, when the number of observations in some cells is much larger than in other cells, the copula-based multivariate Poisson (or negative binomial) distribution may not fit well and is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher compared to the other cells, and we develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for the doubly inflated multivariate count data. To illustrate the proposed methodologies, we present real data containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation of the unknown parameters of the models.
Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions
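The copula construction described here, dependence injected through correlated Gaussian variables and mapped to Poisson margins by quantile inversion, can be sketched as follows; the rates, correlation, and sample size are hypothetical, and the double-inflation mechanism itself is omitted.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def poisson_ppf(u, lam):
    """Smallest k with P(X <= k) >= u for X ~ Poisson(lam)."""
    k = 0
    pmf = math.exp(-lam)
    cdf = pmf
    while cdf < u:
        k += 1
        pmf *= lam / k
        cdf += pmf
    return k

def bivariate_poisson_copula(lam1, lam2, rho, n, seed=0):
    """Sample dependent Poisson pairs through a Gaussian copula."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        pairs.append((poisson_ppf(norm_cdf(z1), lam1),
                      poisson_ppf(norm_cdf(z2), lam2)))
    return pairs

pairs = bivariate_poisson_copula(lam1=3.0, lam2=5.0, rho=0.7, n=1000)
```

The marginal means stay near the Poisson rates while the Gaussian correlation induces positive dependence between the two counts.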
Procedia PDF Downloads 156
26540 Mathematics Professional Development: Uptake and Impacts on Classroom Practice
Authors: Karen Koellner, Nanette Seago, Jennifer Jacobs, Helen Garnier
Abstract:
Although studies of teacher professional development (PD) are prevalent, surprisingly most have only produced incremental shifts in teachers’ learning and their impact on students. There is a critical need to understand what teachers take up and use in their classroom practice after attending PD, and why we often do not see greater changes in learning and practice. This paper is based on a mixed-methods efficacy study of the Learning and Teaching Geometry (LTG) video-based mathematics professional development materials. The extent to which the materials produce a beneficial impact on teachers’ mathematics knowledge, classroom practices, and their students’ knowledge in the domain of geometry is considered through a group-randomized experimental design. Included is a close-up examination of a small group of teachers to better understand their interpretations of the workshops and their classroom uptake. The participants included 103 secondary mathematics teachers serving grades 6-12 from two US states in different regions. Randomization was conducted at the school level, with 23 schools and 49 teachers assigned to the treatment group and 18 schools and 54 teachers assigned to the comparison group. The case study examination included twelve treatment teachers. PD workshops for treatment teachers began in Summer 2016. Nine full days of professional development were offered to teachers, beginning with the one-week institute (Summer 2016) and four days of PD throughout the academic year. The same facilitator led all of the workshops, after completing a facilitator preparation process that included a multi-faceted assessment of fidelity. The overall impact of the LTG PD program was assessed from multiple sources: two teacher content assessments, two PD embedded assessments, pre-post-post videotaped classroom observations, and student assessments.
Additional data were collected from the case study teachers, including additional videotaped classroom observations and interviews. Repeated-measures ANOVA was used to detect patterns of change in the treatment teachers’ content knowledge before and after completion of the LTG PD, relative to the comparison group. No significant effects were found across the two groups of teachers on the two teacher content assessments. Teachers were rated on the quality of their mathematics instruction captured in videotaped classroom observations using the Math in Common Observation Protocol. On average, teachers who attended the LTG PD intervention improved their ability to engage students in mathematical reasoning and to provide accurate, coherent, and well-justified mathematical content. In addition, the LTG PD intervention and instruction that engaged students in mathematical practices both positively and significantly predicted greater student knowledge gains. Teacher knowledge was not a significant predictor. Twelve treatment teachers self-selected to serve as case study teachers to provide additional videotapes in which they felt they were using something from the PD they learned and experienced. Project staff analyzed the videos, compared them to previous videos, and interviewed the teachers regarding their uptake of the PD related to content knowledge, pedagogical knowledge, and resources used. The full paper will include the case study of Ana to illustrate the factors involved in what teachers take up and use from participating in the LTG PD.
Keywords: geometry, mathematics professional development, pedagogical content knowledge, teacher learning
Procedia PDF Downloads 125
26539 An Exploratory Research of Human Character Analysis Based on Smart Watch Data: Distinguish the Drinking State from Normal State
Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing
Abstract:
Smart watches, as handy devices with rich functionality, have become one of the most popular wearable devices all over the world. Among their various functions, the most basic is health monitoring. The monitoring data can serve as effective evidence or a clue in the detection of criminal cases. For instance, step-counting data can help determine whether the watch wearer was still or moving during a given time period. There is, however, still little research on the analysis of human character based on these data. The purpose of this research is to analyze health monitoring data to distinguish the drinking state from the normal state. The analysis results may play a role in cases involving drinking, such as drunk driving. The experiment mainly focused on finding which figures in smart watch health monitoring data change with drinking and determining the extent of the change. The chosen subjects were mostly in their 20s, each of whom had been wearing the same smart watch for a week. Each subject drank several times during the week and noted down the beginning and ending time points of each drinking episode. The researchers then extracted and analyzed the health monitoring data from the watch. According to the descriptive statistical analysis, it can be found that the heart rate changes when drinking: the average heart rate is about 10% higher than normal, and the coefficient of variation is less than about 30% of that in the normal state. Though more research needs to be carried out, this experiment and analysis suggest possible applications of the data from smart watches.
Keywords: character analysis, descriptive statistics analysis, drink state, heart rate, smart watch
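The descriptive comparison reported here (mean heart rate and coefficient of variation in each state) can be reproduced with the standard-library statistics module; the heart-rate values below are hypothetical stand-ins, chosen only so that the drinking series runs roughly 10% higher than the normal one.

```python
import statistics

def summarize(rates):
    """Mean and coefficient of variation of a heart-rate series."""
    mean = statistics.mean(rates)
    cv = statistics.stdev(rates) / mean  # coefficient of variation
    return mean, cv

normal = [68, 72, 70, 69, 71, 70]     # hypothetical resting heart rates (bpm)
drinking = [76, 79, 78, 77, 78, 78]   # hypothetical heart rates while drinking (bpm)

mean_normal, cv_normal = summarize(normal)
mean_drinking, cv_drinking = summarize(drinking)
```

Comparing the two summaries gives exactly the kind of mean-shift and variability-shift figures the abstract reports.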
Procedia PDF Downloads 167
26538 Propeller Performance Modeling through a Computational Fluid Dynamics Analysis Method
Authors: Maxime Alex Junior Kuitche, Ruxandra Mihaela Botez, Jean-Christophe Maunand
Abstract:
The evolution of aircraft is closely linked to the study and improvement of propulsion systems. Determining the propulsion performance is a real challenge in aircraft modeling and design. In addition to theoretical methodologies, experimental procedures are used to obtain a good estimation of the propulsion performance. For piston-propeller propulsion, the propeller requires several experimental tests, which can be extremely demanding in terms of time and money. This paper presents a new procedure to estimate the performance of a propeller through a numerical approach using computational fluid dynamics analysis. The propeller was first scanned, and its 3D model was then built in CATIA. A structured mesh and the Shear Stress Transport (SST) k-ω turbulence model were applied to describe the flow pattern around the propeller accurately, and the governing partial differential equations were solved using ANSYS FLUENT software. The method was applied to the UAS-S45’s propeller, designed and manufactured by Hydra Technologies in Mexico. An extensive investigation was performed over several flight conditions, in terms of altitudes and airspeeds, with the aim of determining the thrust coefficients, power coefficients, and efficiency of the propeller. The computational fluid dynamics results were compared with experimental data acquired from wind tunnel tests performed at the LARCASE Price-Paidoussis wind tunnel. The results of this comparison demonstrated that our approach is highly accurate.
Keywords: CFD analysis, propeller performance, unmanned aerial system propeller, UAS-S45
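The target quantities follow from the usual propeller nondimensionalization: thrust coefficient C_T = T/(ρn²D⁴), power coefficient C_P = P/(ρn³D⁵), and efficiency η = J·C_T/C_P with advance ratio J = V/(nD). A small sketch of that post-processing step, using a hypothetical operating point rather than UAS-S45 data:

```python
def propeller_coefficients(thrust, power, rho, n, D, V):
    """Nondimensional propeller performance.
    thrust (N), power (W), rho: air density (kg/m^3),
    n: rotational speed (rev/s), D: diameter (m), V: airspeed (m/s)."""
    J = V / (n * D)                         # advance ratio
    CT = thrust / (rho * n ** 2 * D ** 4)   # thrust coefficient
    CP = power / (rho * n ** 3 * D ** 5)    # power coefficient
    eta = J * CT / CP                       # propulsive efficiency
    return J, CT, CP, eta

# Hypothetical operating point (not UAS-S45 data).
J, CT, CP, eta = propeller_coefficients(thrust=40.0, power=1500.0,
                                        rho=1.225, n=50.0, D=0.6, V=20.0)
```

Applied across the simulated altitude/airspeed matrix, these formulas turn the CFD thrust and torque outputs into the coefficient curves that are compared against wind tunnel data.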
Procedia PDF Downloads 353
26537 Islamic Financial Instrument, Standard Parallel Salam as an Alternative to Conventional Derivatives
Authors: Alireza Naserpoor
Abstract:
Derivatives are among the most important innovations of the past decades. When it comes to financial markets, they have changed the whole way the stock, commodity, and currency markets operate. Besides many advantages, conventional derivatives contracts have some disadvantages too: problems attributed to derivatives include raising volatility, increasing bankruptcies, and contributing to financial crises. The Standard Parallel Salam (SPS) contract, as an Islamic financial product, is meanwhile a financing instrument that can be used for risk management by investors. Standard Parallel Salam is a Shari’ah-compliant contract and an alternative to conventional derivatives. Although unstructured forms of it have been used in several Islamic countries, this contract was introduced as a structured and standard financial instrument on the Iran Mercantile Exchange (IME) in 2014. In this paper, after introducing parallel Salam, we examine a collection of international experience and local measures regarding the launch of the standard parallel Salam contract, describe standard scenarios for trading this instrument, including the two main approaches to using SPS, and report practical experience with this instrument on the IME. Afterwards, we make a comparison between SPS and futures contracts as a conventional derivative.
Keywords: futures contracts, hedging, shari’ah compliant instruments, standard parallel salam
Procedia PDF Downloads 392
26536 Synthesis, Structural and Vibrational Studies of a New Lacunar Apatite: LiPb2Ca2(PO4)3
Authors: A. Chari, A. El Bouari, B. Orayech, A. Faik, J. M. Igartua
Abstract:
Phosphate is a natural resource of great importance in Morocco. In order to exploit this wealth, the synthesis and study of a new phosphate-based material were carried out. The apatite structure presents many useful characteristics; one of the main ones is that it allows large and varied substitutions for both cations and anions. Besides their biological importance in hard tissue (bone and teeth), apatites have been extensively studied for their potential use as fluorescent lamp phosphors or laser host materials. Apatites also have interesting possible fields of application, such as bone-filling materials and dental implant coatings in medicine, and artificial fertilizers in agrochemistry. LiPb2Ca2(PO4)3 was synthesized by the solid-state method, and its crystal structure was investigated by Rietveld analysis using XRPD data. This material crystallizes with the structure of an anion-deficient lacunar apatite. LiPb2Ca2(PO4)3 is a hexagonal apatite at room temperature, adopting the space group P63/m (ITA No. 176). Rietveld refinements showed that the 4f site is shared by the three cations Ca, Pb, and Li, while the 6h site is occupied by the Pb and Li cations. The structure can be described as built up from PO4 tetrahedra and sixfold-coordination cavities, which delimit hexagonal tunnels along the c-axis direction. These tunnels are linked by the cations occupying the 4f sites. Raman and infrared spectroscopy analyses were carried out; the observed frequencies were assigned and discussed on the basis of unit-cell group analysis and by comparison with other apatite-type materials.
Keywords: apatite, lacunar, crystal structure, Rietveld method, LiPb2Ca2(PO4)3, phase transition
Procedia PDF Downloads 404
26535 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data
Authors: Necati Içer
Abstract:
Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage on crop production. Predicting insurance premium rates with short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach for establishing equitable premium rates in crop-hail insurance for nations with short-term insurance data. The primary goal of the rate-making process is to determine premium rates for high and zero loss costs of villages and enhance their credibility. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the ultimate premium rates for insurance contracts.
Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters
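One common way to "enhance credibility" for villages with extreme (high or zero) observed loss costs is a credibility-weighted blend of the village's own rate with a wider regional rate. The sketch below is a generic Bühlmann-style illustration under assumed numbers, not the author's actual technique:

```python
def credibility_rate(village_losses, village_exposure, regional_rate, k=2000.0):
    """Blend a village's own loss cost with the regional rate.
    k is an assumed credibility constant: the smaller the village's
    exposure, the more weight shifts to the regional rate."""
    own_rate = village_losses / village_exposure
    z = village_exposure / (village_exposure + k)  # credibility weight in [0, 1)
    return z * own_rate + (1.0 - z) * regional_rate

# A village with zero recorded losses still receives a positive rate...
rate_zero = credibility_rate(0.0, 500.0, regional_rate=0.03)
# ...and one with extreme losses is pulled back toward the regional rate.
rate_high = credibility_rate(100.0, 500.0, regional_rate=0.03)
```

This is exactly the behavior short-term data requires: with only a few years of experience, no village's raw loss cost is trusted at full weight.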
Procedia PDF Downloads 55
26534 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling
Authors: Florin Leon, Silvia Curteanu
Abstract:
Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate carried out in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number average molecular weight, and weight average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values show higher variation. Several machine learning methods are used for the modeling, and their performance is compared: support vector machines, k-nearest neighbor, random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression
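Two ingredients of the approach, a nearest-neighbor regressor and sampling concentrated where the response varies most, can be sketched in miniature; the (time, conversion) samples are hypothetical and one-dimensional, unlike the real process data.

```python
def knn_regress(train, x, k=3):
    """Predict by averaging the targets of the k nearest training samples."""
    nearest = sorted(train, key=lambda s: abs(s[0] - x))[:k]
    return sum(y for _, y in nearest) / k

def next_sample_point(samples):
    """Adaptive sampling: propose the midpoint of the interval where the
    response changes the most between consecutive samples."""
    gaps = [(abs(y2 - y1), (x1 + x2) / 2.0)
            for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return max(gaps)[1]

# Hypothetical (time, monomer conversion) samples from one run.
samples = [(0.0, 0.00), (1.0, 0.10), (2.0, 0.25), (3.0, 0.45), (4.0, 0.60)]
prediction = knn_regress(samples, x=2.5)
new_x = next_sample_point(samples)  # lands where conversion varies fastest
```

Iterating the second function densifies the training set precisely in the steep gel-effect region, which is the intuition behind the paper's adaptive sampling.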
Procedia PDF Downloads 305
26533 Evaluation of Stress Relief using Ultrasonic Peening in GTAW Welding and Stress Corrosion Cracking (SCC) in Stainless Steel, and Comparison with the Thermal Method
Authors: Hamidreza Mansouri
Abstract:
In the construction industry, the lifespan of a metal structure is directly related to the quality of its welding. In most metal structures, the welded area is considered critical and is one of the most important factors in design. To date, many fracture incidents have been caused by cracks in welded areas. Various methods exist to increase the lifespan of welds and prevent failure in the welded area. Among these methods, the application of ultrasonic peening, in addition to providing stress relief, can manually and precisely adjust the geometry of the weld toe and prevent stress concentration in this region. This research examined Gas Tungsten Arc Welding (GTAW) on common structural steels and 316 stainless steel, which require precise welding, to predict the optimal condition. The GTAW method was used to create residual stress; two samples then underwent ultrasonic stress relief and, for comparison, two samples underwent thermal stress relief, while no treatment was applied to the remaining two samples. The residual stress of all six pieces was measured by the X-Ray Diffraction (XRD) method. Then, the two ultrasonically stress-relieved samples and the two untreated samples were exposed to a corrosive environment to initiate cracking and determine the effectiveness of the ultrasonic stress relief method. The residual stress caused by GTAW in the samples decreased by 3.42% with thermal treatment and by 7.69% with ultrasonic peening. Furthermore, the results show that the untreated sample developed cracks after 740 hours, while the ultrasonically stress-relieved piece showed no cracks. Given the high costs of welding and post-weld modification processes, finding an economical, effective, and comprehensive method with the fewest limitations and a broad spectrum of use is of great importance.
Therefore, the impact of the various ultrasonic peening stress relief parameters, and the selection of the best parameters to achieve the longest lifespan for the weld area, are highly significant.
Keywords: GTAW welding, stress corrosion cracking (SCC), thermal method, ultrasonic peening
Procedia PDF Downloads 50
26532 Islamophobia, Years After 9/11: An Assessment of the American Media
Authors: Nasa'i Muhammad Gwadabe
Abstract:
This study seeks to find the extent to which the old Islamophobic prejudice was tilted in a more negative direction in the United States following the 9/11 terrorist attacks. It is hypothesized that the 9/11 attacks in the United States reshaped the old Islamophobic prejudice through the reinforcement of a strong social identity construction of Muslims as an "out-group". Social identity theory and discourse representation theory are used as the framework for analysis. To test the hypothesis, two categories were created: the prejudice (out-group) category and the tolerance (in-group) category. The prejudice (out-group) against Muslims category was coded to include six attributes (terrorist, threat, women's rights violation, undemocratic, backward, and intolerant), while the tolerance (in-group) for Muslims category was coded to include six attributes (peaceful, civilized, educated, partners, trustworthy, and honest). Data were generated from the archives of three American newspapers, the Los Angeles Times, the New York Times, and USA Today, using specific search terms and a specific date range, from 9/11/1996 to 9/11/2006, that is, five years before and five years after 9/11. An aggregate of 20,595 articles was generated from searching the three newspapers throughout the search periods. In conclusion, for both the pre- and post-9/11 periods, the articles generated under the prejudice (out-group) category showed a higher frequency than those under the tolerance (in-group) category. Finally, the comparison between the pre- and post-9/11 periods showed that the increased prejudice (out-group) against Muslims was most driven by labeling them as terrorists, which showed a sharp increase from pre- to post-9/11.
Keywords: in-group, Islam, Islamophobia, Muslims, out-group, prejudice, terrorism, 9/11, tolerance
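The coding scheme, counting mentions of the six prejudice attributes versus the six tolerance attributes per article, can be sketched as a toy counter. The example sentence is invented, and a real pipeline would also handle stemming, multi-word attributes such as "women's rights violation", and context:

```python
# Single-word stand-ins for the study's attribute lists.
PREJUDICE = {"terrorist", "threat", "undemocratic", "backward", "intolerant"}
TOLERANCE = {"peaceful", "civilized", "educated", "trustworthy", "honest"}

def code_article(text):
    """Count distinct prejudice vs. tolerance attribute words in an article."""
    words = {word.strip(".,;:!?").lower() for word in text.split()}
    return len(words & PREJUDICE), len(words & TOLERANCE)

sentence = ("Officials described the suspects as terrorist threat actors, "
            "while community leaders were called peaceful and honest.")
out_group, in_group = code_article(sentence)
```

Aggregating these per-article counts over the 1996-2006 archive is what yields the frequency comparison the study reports.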
Procedia PDF Downloads 306
26531 Analysis of the Environmental Impact of Selected Small Heat and Power Plants Operating in Poland
Authors: M. Stelmachowski, M. Wojtczak
Abstract:
The aim of this work was to assess the environmental impact of selected small and medium-sized companies supplying heat and electricity to cities with populations of about 50,000 inhabitants. The evaluation and comparison of environmental impact were carried out for three heat-only plants and two CHP plants, with particular attention to emissions into the atmosphere and to the effect on these companies of introducing a carbon emissions trading system.
Keywords: CO2 emission, district heating, heat and power plant, impact on environment
Procedia PDF Downloads 479
26530 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems
Authors: Ali Hosseini
Abstract:
Parallelization is a mechanism for decreasing the time needed to execute programs. Sorting is one of the most important such operations, since the proper functioning of many algorithms depends on sorted data. The CRCW_SORT algorithm sorts 'N' elements in O(1) time on SIMD (Single Instruction Multiple Data) computers using n^2/2 - n/2 processors. This article presents a mechanism that divides the input sequence around a hinge (pivot) element into two smaller subsequences, reducing the number of processors needed to sort 'N' elements in O(1) time to n^2/8 - n/4 in the best case; with this mechanism, the best case occurs when the hinge element is the median and the worst case when it is the minimum. An assessment of the proposed algorithm against other methods, on the same data collections and processor counts, indicates that it uses fewer processors during execution than the other methods.
Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of the processors
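The hinge-splitting idea can be sketched outside the SIMD setting. The helper names below are illustrative, not taken from the CRCW_SORT paper; only the two processor-count formulas are the ones quoted in the abstract.

```python
# Splitting the input around a hinge (pivot) element yields two smaller
# subproblems, so a processor budget that grows quadratically with input
# size shrinks sharply.

def processors_unsplit(n):
    """Processor count n^2/2 - n/2 quoted for sorting n elements directly."""
    return n * n // 2 - n // 2

def processors_split_best(n):
    """Best case after splitting at the median hinge: n^2/8 - n/4."""
    return n * n // 8 - n // 4

def split_by_hinge(data, hinge):
    """Partition data into elements below and not below the hinge."""
    left = [x for x in data if x < hinge]
    right = [x for x in data if x >= hinge]
    return left, right

left, right = split_by_hinge([7, 2, 9, 4, 6, 1, 8, 5], hinge=6)
```

For n = 8, the split reduces the best-case budget from 28 processors to 6, matching the abstract's formulas.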
Procedia PDF Downloads 310
26529 Increasing the System Availability of Data Centers by Using Virtualization Technologies
Authors: Chris Ewe, Naoum Jamous, Holger Schrödl
Abstract:
Like most entrepreneurs, data center operators pursue goals such as maximizing profit, improving the company's reputation, or simply staying in the market. Part of achieving those aims is guaranteeing a given quality of service. Quality characteristics are specified in a contract called the service level agreement, a central part of which is the non-functional properties of an IT service. As this paper will show, system availability is one of the most important of these properties. To comply with availability requirements, data center operators can use virtualization technologies. However, a clear model for assessing the effect of virtualization functions on the components of a data center in relation to system availability is still missing. This paper introduces a basic model that shows these connections and considers whether the identified effects are positive or negative. In doing so, the work also points out possible disadvantages of the technology. In consequence, the paper presents the opportunities as well as the risks of data center virtualization in relation to system availability.
Keywords: availability, cloud computing IT service, quality of service, service level agreement, virtualization
Procedia PDF Downloads 537
26528 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt
Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer
Abstract:
Crowd-sourced data refers to data collected and shared by a large number of individuals or organizations, often through digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In such countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas where the majority of road crashes occur. This study is, to our best knowledge, the first to develop safety performance functions from crowd-sourced data, adopting a negative binomial structure and a Full Bayes model to investigate traffic safety on urban road networks and to provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out using two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, for comparison and validation. Lastly, recommendations are offered to policymakers to ensure safer roads.
Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening
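As a rough illustration of what a safety performance function of the negative binomial family looks like in use, the sketch below applies a commonly used functional form, N = exp(b0) * AADT^b1 * L^b2, with invented coefficients; the study's fitted values and its Full Bayes estimation are not reproduced here.

```python
import math

# Hypothetical SPF coefficients (for illustration only, not from the study).
B0, B1, B2 = -7.5, 0.85, 1.0

def predicted_crashes(aadt, length_km):
    """Expected crash frequency (crashes/year) for one road segment."""
    return math.exp(B0) * aadt ** B1 * length_km ** B2

# Simple network screening: rank segments by predicted frequency.
# (The paper uses the HCM PCR method and a GIS-based graphical method.)
segments = {"S1": (12000, 1.2), "S2": (30000, 0.8), "S3": (8000, 2.0)}
ranking = sorted(segments, key=lambda s: predicted_crashes(*segments[s]),
                 reverse=True)
```

With these made-up inputs the busiest segment ranks first, which is the qualitative behavior one expects from an SPF with a positive exposure coefficient.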
Procedia PDF Downloads 53
26527 The Model Development of Caregiver Skills for the End of Life’s Cancer Patients
Authors: Chaliya Wamaloon, Malee Chaisaena, Nusara Prasertsri
Abstract:
Informal caregivers are needed to provide home-based palliative and end-of-life (EOL) care to people with advanced cancer; however, no model for developing caregiver skills for EOL cancer patients has been available. The aim of this research was to develop such a model. Mixed-methods research was conducted in three phases. All subjects were at Ubon Ratchathani Cancer Hospital, including 30 caregivers of EOL cancer patients, 30 EOL cancer patients, 111 health care professionals who provided care for EOL cancer patients, and 30 target participants who had been trained as cancer patient caregivers. The research tools were questionnaires, semi-structured interviews, and caregiver skills questionnaires. Data were analyzed using percentages, means, standard deviations, paired t-tests, and content analysis. The results showed that the model of caregiver skills for cancer patients consisted of nine skill domains: 1. monitoring, 2. interpreting, 3. making decisions, 4. taking action, 5. making adjustments, 6. providing hands-on care, 7. accessing resources, 8. working together with the ill patient, and 9. navigating the healthcare system. The model comprised a skills development curriculum for cancer patient caregivers, a manual of palliative care for caregivers, a diary of health care records for cancer patients, and an evaluation model for the development of caregiver skills for EOL cancer patients. The evaluation showed that caregivers were satisfied with the model at a high level, and a comparison of caregiver skills before and after the training revealed a statistically significant improvement (p < 0.05).
Keywords: caregiver, caregiver skills, cancer patients, end of life
Procedia PDF Downloads 169
26526 Review of Different Machine Learning Algorithms
Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui
Abstract:
Classification is a data mining technique based on Machine Learning (ML) algorithms. It is used to classify individual items in a body of information into a set of predefined classes or groups. Web mining is also part of this family of data mining methods. The main purpose of this paper is to analyze and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbor (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The paper surveys these ML algorithms, their advantages and disadvantages, and also defines open research issues.
Keywords: Data Mining, Web Mining, classification, ML Algorithms
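To make one of the compared algorithms concrete, here is a minimal K-Nearest Neighbor classifier of the kind the paper reviews; this is an illustrative sketch only, since the paper surveys rather than implements these algorithms.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k training points closest to
    query, using squared Euclidean distance."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Toy training set with two well-separated classes.
train = [((1, 1), "A"), ((1, 2), "A"),
         ((8, 8), "B"), ((9, 8), "B"), ((8, 9), "B")]
```

A query near the first cluster is labeled "A", and one near the second cluster "B"; the choice of k trades off noise sensitivity against decision-boundary smoothness, which is one axis on which such reviews compare classifiers.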
Procedia PDF Downloads 303
26525 Using Genetic Algorithms and Rough Set Based Fuzzy K-Modes to Improve Centroid Model Clustering Performance on Categorical Data
Authors: Rishabh Srivastav, Divyam Sharma
Abstract:
We propose an algorithm to cluster categorical data, named 'Genetic algorithm initialized rough set based fuzzy K-Modes for categorical data'. It is an amalgamation of the simple K-modes algorithm, rough and fuzzy set based K-modes, and the genetic algorithm, which we hypothesise will provide better centroid-model clustering results than existing standard algorithms. In the proposed algorithm, the initialization and updating of modes is done by means of genetic algorithms, while the membership values are calculated using rough sets and fuzzy logic.
Keywords: categorical data, fuzzy logic, genetic algorithm, K modes clustering, rough sets
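The two K-modes building blocks the proposal combines with GA initialization and rough/fuzzy memberships can be sketched in their simple crisp form; the function names are illustrative, not from the paper.

```python
from collections import Counter

def dissimilarity(record, mode):
    """K-modes dissimilarity: number of attributes on which a categorical
    record differs from a cluster mode (simple matching)."""
    return sum(a != b for a, b in zip(record, mode))

def update_mode(records):
    """New mode of a cluster: the most frequent category in each
    attribute position across the cluster's records."""
    return tuple(Counter(col).most_common(1)[0][0] for col in zip(*records))

cluster = [("red", "small", "round"),
           ("red", "large", "round"),
           ("blue", "small", "round")]
mode = update_mode(cluster)
```

In the full proposal, a genetic algorithm would search over candidate modes instead of taking them directly from the data, and crisp assignment by minimum dissimilarity would be replaced by rough/fuzzy membership degrees.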
Procedia PDF Downloads 247
26524 Forecasting Amman Stock Market Data Using a Hybrid Method
Authors: Ahmad Awajan, Sadam Al Wadi
Abstract:
In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the component forecasts are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms the individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without any transformation of the data. Moreover, EMD-HW achieves relatively high accuracy compared with eight existing forecasting methods on the five forecast error measures.
Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series
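The hybrid idea, forecasting each decomposed component separately and then summing the component forecasts, can be sketched as follows. A plain Holt linear-trend smoother stands in for the full Holt-Winters method here, and the smoothing parameters are illustrative, not the paper's.

```python
def holt_forecast(series, h, alpha=0.5, beta=0.3):
    """One Holt linear-trend pass over a series; returns the
    h-step-ahead forecast level + h * trend."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + h * trend

def hybrid_forecast(components, h):
    """EMD-HW style aggregation: forecast each IMF/residual component
    independently, then sum the forecasts."""
    return sum(holt_forecast(c, h) for c in components)
```

On a purely linear component the smoother recovers the trend exactly, and adding a zero component leaves the aggregate forecast unchanged, which is the additivity property the EMD decomposition relies on.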
Procedia PDF Downloads 129
26523 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems
Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell
Abstract:
Today’s facilities are ever more sophisticated, and the availability of reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information both for their day-to-day activities and for providing senior management with accurate input to the decision-making process. Currently, various technology platforms, data repositories, and database systems, such as Computer-Aided Facility Management (CAFM) systems, are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for use by owners and facility managers. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-ins generate only a limited amount of COBie data, and considerable time is still required to enter the remaining data manually. Some of the data that cannot be generated by these COBie add-ins is essential for the facilities manager’s day-to-day activities, such as the job sheet, which includes preventive maintenance schedules. To facilitate seamless data transfer between BIM models and facilities management systems, we developed a framework that automatically populates an external web database with data extracted directly from BIM models and then enables different stakeholders to access that database and enter the remaining asset data directly, producing a rich COBie spreadsheet that contains most of the asset data required for efficient facilities management operations.
The proposed framework is part of ongoing research and will be demonstrated and validated on a typical university building. It supplements the existing body of knowledge in the facilities management domain by providing a novel approach that facilitates seamless data transfer between BIM models and facilities management systems.
Keywords: building information modeling, BIM, facilities management systems, interoperability, information management
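A minimal sketch of the kind of export such a framework automates, writing asset records into a COBie-style Component sheet, is shown below. The column subset follows the common COBie worksheet layout, but the field selection and the sample record are invented for illustration and are not the paper's schema.

```python
import csv
import io

# Illustrative subset of COBie Component-sheet columns (hypothetical).
COLUMNS = ["Name", "TypeName", "Space", "SerialNumber", "InstallationDate"]

def write_component_sheet(records):
    """Render a list of asset-record dicts as CSV text with a COBie-style
    header row; a real framework would pull the records from the BIM model
    and push them to the external web database."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

sheet = write_component_sheet([
    {"Name": "AHU-01", "TypeName": "AirHandlingUnit", "Space": "Plant Room",
     "SerialNumber": "SN-0042", "InstallationDate": "2018-06-01"},
])
```

The point of the framework is that fields like SerialNumber and InstallationDate, which no BIM add-in can derive, are filled in by stakeholders through the shared web database rather than re-typed from paper documents.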
Procedia PDF Downloads 116
26522 Micro-Channel Flows Simulation Based on Nonlinear Coupled Constitutive Model
Authors: Qijiao He
Abstract:
Micro-Electro-Mechanical Systems (MEMS) are one of the most rapidly developing frontier research fields, in both theory and applied technology. The micro-channel is a key component of MEMS, and with the research and development of MEMS, micro-devices and micro-channels become ever smaller. Compared with macroscale flow, the characteristics of gas flow in a micro-channel change, and the rarefaction effect appears clearly. For rarefied gas and microscale flows, however, the Navier-Stokes-Fourier (NSF) equations are no longer appropriate, because the continuum hypothesis breaks down. A Nonlinear Coupled Constitutive Model (NCCM) has been derived from the Boltzmann equation to describe the characteristics of both continuum and rarefied gas flows. We apply the present scheme to simulate continuum and rarefied gas flows in a micro-channel structure. For comparison, we also simulate the flows with other widely used methods based on particle simulation or direct solution of the distribution function, namely Direct Simulation Monte Carlo (DSMC), the Unified Gas-Kinetic Scheme (UGKS), and the Lattice Boltzmann Method (LBM). The results show that the present solution agrees better with the experimental data and with the DSMC, UGKS, and LBM results than the NSF results do in rarefied cases, while agreeing well with the NSF results in continuum cases. Characteristics of both continuum and rarefied gas flows are observed and analyzed.
Keywords: continuum and rarefied gas flows, discontinuous Galerkin method, generalized hydrodynamic equations, numerical simulation
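The rarefaction effect the abstract refers to is usually quantified by the Knudsen number, Kn = mean free path / characteristic channel dimension. The regime boundaries below are the commonly quoted textbook values, not thresholds taken from the paper.

```python
def knudsen(mean_free_path, channel_height):
    """Knudsen number for a channel of given height (same units)."""
    return mean_free_path / channel_height

def flow_regime(kn):
    """Conventional flow-regime classification by Knudsen number."""
    if kn < 0.001:
        return "continuum (NSF equations valid)"
    if kn < 0.1:
        return "slip flow"
    if kn < 10.0:
        return "transition (kinetic methods such as NCCM/DSMC needed)"
    return "free molecular"

# Air at ambient conditions has a mean free path of roughly 68 nm;
# in a 1 mm channel the flow is firmly in the continuum regime, while
# in a 500 nm micro-channel it is already in the transition regime.
kn_macro = knudsen(68e-9, 1e-3)
kn_micro = knudsen(68e-9, 500e-9)
```

This is why shrinking MEMS channel sizes pushes the gas out of the range where NSF applies and motivates models like the NCCM that remain valid across both regimes.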
Procedia PDF Downloads 172
26521 Nonlinear Analysis of Torsionally Loaded Steel Fibred Self-Compacted Concrete Beams Reinforced by GFRP Bars
Authors: Khaled Saad Eldin Mohamed Ragab
Abstract:
This paper analytically investigates the torsional behavior of steel-fibered high-strength self-compacting concrete beams reinforced with GFRP bars. Nonlinear finite element analysis of 12 beam specimens was carried out using the ANSYS software, which was chosen for its ability to predict both the response of reinforced concrete beams in the post-elastic range and the ultimate strength of beams produced from steel fiber reinforced self-compacting concrete (SFRSCC) and reinforced with GFRP bars. A general description of the finite element method and the theoretical modeling of concrete and reinforcement are presented. The finite element analyses were first performed to verify the analytical model against the experimental test results. Then, a parametric study investigated the effect of the steel fiber volume fraction in ordinary-strength concrete, the effect of the steel fiber volume fraction in high-strength concrete, and the type of stirrup reinforcement. A comparison between the experimental results and those predicted by the existing models is presented, and results and conclusions that may be useful for designers are drawn and discussed.
Keywords: nonlinear analysis, torsionally loaded, self compacting concrete, steel fiber reinforced self compacting concrete (SFRSCC), GFRP bars and sheets
Procedia PDF Downloads 453
26520 Mapping the Digital Landscape: An Analysis of Party Differences between Conventional and Digital Policy Positions
Authors: Daniel Schwarz, Jan Fivaz, Alessia Neuroni
Abstract:
Although digitization is a buzzword in almost every election campaign, the political parties leave voters largely in the dark about their specific positions on digital issues. In the run-up to the 2019 elections in Switzerland, the ‘Digitization Monitor’ project (DMP) was launched in order to change this situation. Within the framework of the DMP, all 4,736 candidates were surveyed about their digital policy positions and values. The DMP is designed as a digital policy supplement to the existing ‘smartvote’ voting advice application. This enabled a direct comparison of the digital policy attitudes according to the DMP with the topics of the ‘smartvote’ questionnaire which are comprehensive in content but mainly related to conventional policy areas. This paper’s main research goal is to analyze and visualize possible differences between conventional and digital policy areas in terms of response patterns between and within political parties. The analysis is based on dimensionality reduction methods (multidimensional scaling and principal component analysis) for the visualization of inter-party differences, and on standard deviation as a measure of variation for the evaluation of intra-party unity. The results reveal that digital issues show a lower degree of inter-party polarization compared to conventional policy areas. Thus, the parties have more common ground in issues on digitization than in conventional policy areas. In contrast, the study reveals a mixed picture regarding intra-party unity. Homogeneous parties show a lower degree of unity in digitization issues whereas parties with heterogeneous positions in conventional areas have more united positions in digital areas. All things considered, the findings are encouraging as less polarized conditions apply to the debate on digital development compared to conventional politics. 
For the future, it would be desirable for projects similar to the DMP to emerge in other countries, broadening the basis for such conclusions.
Keywords: comparison of political issue dimensions, digital awareness of candidates, digital policy space, party positions on digital issues
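The intra-party unity measure described above, standard deviation of candidate positions as a measure of variation, can be sketched as follows; the answer coding and the sample data are invented for illustration.

```python
import statistics

def party_unity(answers):
    """Mean per-issue population standard deviation across a party's
    candidates; lower values mean a more united party.
    answers: one list of issue positions (e.g. coded 0-100) per candidate."""
    per_issue = zip(*answers)  # transpose: one tuple of answers per issue
    return statistics.mean(statistics.pstdev(issue) for issue in per_issue)

# Three hypothetical candidates of one party, two issues each:
digital = [[80, 70], [80, 90], [80, 80]]        # tightly clustered answers
conventional = [[10, 100], [90, 0], [50, 50]]   # widely scattered answers
```

Here the digital positions yield a much lower score than the conventional ones, illustrating the kind of within-party comparison the study performs across all 'smartvote' and DMP issues.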
Procedia PDF Downloads 186
26519 Investigating Cloud Forensics: Challenges, Tools, and Practical Case Studies
Authors: Noha Badkook, Maryam Alsubaie, Samaher Dawood, Enas Khairallah
Abstract:
Cloud computing has introduced transformative benefits in data storage and accessibility while posing unique forensic challenges. This paper explores cloud forensics, focusing on investigating and analyzing evidence from cloud environments to address issues such as unauthorized data access, manipulation, and breaches. The research highlights the practical use of open-source forensic tools such as Autopsy and Bulk Extractor in real-world scenarios, including unauthorized data sharing via Google Drive and the misuse of personal cloud storage for leaks of sensitive information. This work underscores the growing importance of robust forensic procedures and accessible tools in ensuring data security and accountability in cloud ecosystems.
Keywords: cloud forensic, tools, challenge, autopsy, bulk extractor
Procedia PDF Downloads 3
26518 Improving Contributions to the Strengthening of the Legislation Regarding Road Infrastructure Safety Management in Romania, Case Study: Comparison Between the Initial Regulations and the Clarity of the Current Regulations - Trends Regarding the Efficiency
Authors: Corneliu-Ioan Dimitriu, Gheorghe Frățilă
Abstract:
Romania and Bulgaria have high rates of road deaths per million inhabitants. Directive (EU) 2019/1936, known as the RISM Directive, has been transposed into national law by each Member State. The research focuses on the amendments made to Romanian legislation through Government Ordinance no. 3/2022, which aims to improve road infrastructure safety management. The aim of the research is two-fold: to sensitize the Romanian Government and decision-making entities to develop an integrated and competitive management system, and to establish a safe and proactive mobility system that ensures efficient and safe roads. The research includes a critical analysis of European and Romanian legislation, as well as of subsequent normative acts related to road infrastructure safety management. Public data from the European Union and national authorities are utilized, as well as data from the Romanian Road Authority (ARR) and the Traffic Police database. The research methodology involves comparative analysis, criterion analysis, SWOT analysis, and the use of GANTT and WBS diagrams; Excel is employed to process the road accident databases of Romania and Bulgaria. Collaboration with Bulgarian specialists was established to identify common road infrastructure safety issues. The research concludes that the legislative changes have resulted in a relaxation of road safety management in Romania, leading to decreased control over certain management procedures, and that the amendments to primary and secondary legislation do not meet the current safety requirements for road infrastructure. It highlights the need for legislative changes and a strengthened administrative capacity to enhance road safety, with regional cooperation and the exchange of best practices emphasized for effective road infrastructure safety management. The research contributes to the theoretical understanding of road infrastructure safety management by analyzing legislative changes and their impact on safety measures.
It highlights the importance of an integrated and proactive approach in reducing road accidents and achieving the "zero deaths" objective set by the European Union. The research addresses the effectiveness of the legislative changes to road infrastructure safety management in Romania and their impact on control over management procedures, and it explores the need for strengthened administrative capacity and regional cooperation in addressing road safety issues. It concludes that the legislative changes made in Romania have not strengthened road safety management, emphasizes the need for immediate action, legislative amendments, and enhanced administrative capacity, and recommends collaboration with Bulgarian specialists and the exchange of best practices. The research provides valuable insights for policymakers and decision-makers in Romania.
Keywords: management, road infrastructure safety, legislation, amendments, collaboration
Procedia PDF Downloads 84