Search results for: ground data
25463 The Importance of Localization in Large Construction Projects
Authors: Ali Mohammadi
Abstract:
The basis for the construction of any project is a map: a map on which the surveyor can stake out points on the ground from their coordinates using a total station. In projects such as dams, roads, tunnels, and pipelines, base points whose coordinates were determined using GPS can create challenges for the surveyor during setting-out. We first examine some of the map projections on which maps are designed, summarize their differences, and describe the challenges surveyors face in controlling them; building a project requires true lengths and angles, so we must work in coordinates whose computed results match measurements on the ground. We then work through several examples to illustrate the concept of localization, so that the surveyor can recognize whether this challenge applies and, if so, how to solve it.
Keywords: UTM, scale factor, cartesian, traverse
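The localization challenge described above ultimately comes down to scale factors: a distance computed from UTM grid coordinates differs from the distance measured on the ground. Below is a minimal sketch of the standard grid/ground reduction, using the textbook UTM point-scale approximation and a mean Earth radius; the constants and example values are illustrative, not taken from the paper.

```python
import math

# Combined factor a surveyor needs when reconciling UTM grid distances with
# true ground distances; simplified point-scale formula is a standard
# approximation, not a value from the paper.
R = 6_371_000.0          # mean Earth radius (m)
K0 = 0.9996              # UTM central-meridian scale factor
FALSE_EASTING = 500_000  # UTM false easting (m)

def point_scale_factor(easting_m: float) -> float:
    """Approximate UTM point scale factor from distance to the central meridian."""
    de = easting_m - FALSE_EASTING
    return K0 * (1.0 + de**2 / (2.0 * R**2))

def elevation_factor(height_m: float) -> float:
    """Reduces a ground distance at elevation h down to the ellipsoid."""
    return R / (R + height_m)

def ground_to_grid(ground_dist_m, easting_m, height_m):
    combined = point_scale_factor(easting_m) * elevation_factor(height_m)
    return ground_dist_m * combined

# Example: a 1000 m ground distance measured 120 km from the central meridian
# at 1500 m elevation shrinks noticeably when mapped onto the UTM grid.
print(ground_to_grid(1000.0, 620_000.0, 1500.0))  # ~999.54 m
```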
Procedia PDF Downloads 80
25462 Fuzzy Expert Systems Applied to Intelligent Design of Data Centers
Authors: Mario M. Figueroa de la Cruz, Claudia I. Solorzano, Raul Acosta, Ignacio Funes
Abstract:
This technological development project seeks to create a tool that allows companies that need to implement a data center to intelligently determine the factors for allocating cooling and power supply (UPS) resources at the design stage. The results should clearly show the speed, robustness, and reliability of a system designed for deployment in environments where large volumes of data must be managed and protected.
Keywords: telecommunications, data center, fuzzy logic, expert systems
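As a sketch of how a fuzzy rule base can map a requirement (here, IT load) onto a resource allocation (cooling capacity), the toy below hand-rolls triangular membership functions and centroid-style defuzzification. The breakpoints and rule consequents are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cooling_need(it_load_kw):
    """Defuzzify 'cooling need' (kW) from IT load via three fuzzy rules."""
    low = tri(it_load_kw, 0, 20, 80)        # rule: low load    -> ~30 kW cooling
    med = tri(it_load_kw, 40, 100, 160)     # rule: medium load -> ~120 kW
    high = tri(it_load_kw, 120, 180, 240)   # rule: high load   -> ~240 kW
    den = low + med + high
    return (low * 30 + med * 120 + high * 240) / den if den else 0.0

print(cooling_need(60.0))  # blends the 'low' and 'medium' rules -> 75.0
```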
Procedia PDF Downloads 345
25461 Space Debris Mitigation: Solutions from the Dark Skies of the Remote Australian Outback Using a Proposed Network of Mobile Astronomical Observatories
Authors: Muhammad Akbar Hussain, Muhammad Mehdi Hussain, Waqar Haider
Abstract:
There are tens of thousands of undetected and uncatalogued pieces of space debris in Low Earth Orbit (LEO). Not only are they difficult to detect and track, but their sheer number also puts active satellites and humans in orbit around Earth in danger. With more governments and private companies harnessing Earth's orbit for communication, research, and military purposes, there is an ever-increasing need not only to detect and catalogue these pieces of space debris but also to remove them and clean up the space around Earth. Current optical and radar-based Space Situational Awareness initiatives are useful mostly for detecting and cataloguing larger pieces of debris, mainly for avoidance measures. Pieces smaller than 10 cm lie in a relative detection blind spot, yet they are deadly and capable of destroying satellites and human missions. A network of mobile observatories, connected to each other in real time and working in unison as a single instrument, may be able to detect small pieces of debris and achieve effective triangulation, helping to create a comprehensive database of their trajectories and parameters at the highest level of precision. These data may enable ground-based laser systems to help deorbit individual debris. Such a network of observatories can join current efforts in the detection and removal of space debris in Earth's orbit.
Keywords: space debris, low earth orbit, mobile observatories, triangulation, seamless operability
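To make the triangulation idea concrete, the sketch below estimates a debris position as the least-squares intersection of two observatories' lines of sight. The station baseline and pointing vectors are invented for illustration, not survey data.

```python
import numpy as np

# Each observatory contributes a ray (station position + unit direction toward
# the debris); the debris position is the point closest to all rays.
def closest_point_to_rays(origins, directions):
    """Solve sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) o_i for x."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

stations = [np.array([0.0, 0.0, 0.0]), np.array([800e3, 0.0, 0.0])]  # 800 km baseline
lines_of_sight = [np.array([0.3, 0.2, 1.0]), np.array([-0.5, 0.2, 1.0])]
print(closest_point_to_rays(stations, lines_of_sight))  # -> ~(3e5, 2e5, 1e6) m
```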
Procedia PDF Downloads 166
25460 Understanding the Complexities of Consumer Financial Spinning
Authors: Olivier Mesly
Abstract:
This research presents a conceptual framework termed “Consumer Financial Spinning” (CFS) to analyze consumer behavior in the financial/economic markets. This phenomenon occurs when consumers of high-stakes financial products accumulate unsustainable debt, leading them to detach from their initial financial hierarchy of needs, wealth-related goals, and preferences regarding their household portfolio of assets. The daring actions of these consumers, forming a dark financial triangle, are characterized by three behaviors: overconfidence, the use of rationed rationality, and deceitfulness. We show that CFS can be incorporated into the traditional CAPM and Markowitz portfolio optimization models to create a framework that explains such market phenomena as the global financial crisis, highlighting the antecedents and consequences of ill-conceived speculation. Because this is a conceptual paper, there is no field-study methodology; however, we apply modeling principles derived from the data percolation methodology, which contains tenets explicating how to structure concepts. A simulation test of the proposed framework is conducted; it demonstrates the conditions under which the relationship between expected returns and risk may deviate from linearity. The analysis and conceptual findings are relevant both theoretically and pragmatically, as they shed light on the psychological conditions that drive intense speculation, which can lead to market turmoil. Armed with such understanding, regulators are better equipped to propose solutions before economic problems spiral out of control.
Keywords: consumer financial spinning, rationality, deceitfulness, overconfidence, CAPM
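For context, the linear risk-return relation whose breakdown the simulation explores is the textbook CAPM security market line, shown below together with the Markowitz mean-variance baseline. These are the standard forms the paper builds on, not its CFS-modified versions.

```latex
% Standard CAPM security market line (the linear relation tested above):
\mathbb{E}[R_i] = R_f + \beta_i\,\bigl(\mathbb{E}[R_m] - R_f\bigr),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}

% Markowitz mean-variance baseline:
\min_{w}\; w^{\top}\Sigma\,w
\quad\text{s.t.}\quad
w^{\top}\mu = \mu_{\text{target}},\;\; w^{\top}\mathbf{1} = 1
```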
Procedia PDF Downloads 48
25459 The Effect of Training and Development Practice on Employees’ Performance
Authors: Sifen Abreham
Abstract:
Employees are resources in organizations; as such, they need to be trained and developed properly to achieve an organization's goals and expectations. The initial development of the human resource management concept is based on the effective utilization of people, treating them as resources that lead to the realization of business strategies and organizational objectives. The study aimed to assess the effect of training and development practices on employee performance. The researcher used an explanatory research design, which helps to explain, understand, and predict the relationship between variables. To collect data from the respondents, the study used probability sampling, specifically stratified random sampling, which divides the entire population into homogeneous groups. The results were analyzed and presented using the Statistical Package for the Social Sciences (SPSS) version 26. The major finding of the study was that training has an impact on employees' job performance in achieving organizational objectives. The district has a policy and procedure for training and development, but it is not actively applied and is not suitable; the district is advised to reform this policy and procedure and apply it actively. The district gives training to the majority of its employees, but most of the time the training is theoretical; the district is advised to use practical training methods to see positive change. The district evaluates employees after they take training and development, but the results are not encouraging; the district is advised to assess employees' skill gaps and fill those gaps. The district has a budget, but it is not adequate; the district is advised to strengthen its financial ground.
Keywords: training, development, employees, performance, policy
Procedia PDF Downloads 58
25458 Numerical Model of Low Cost Rubber Isolators for Masonry Housing in High Seismic Regions
Authors: Ahmad B. Habieb, Gabriele Milani, Tavio Tavio, Federico Milani
Abstract:
Housing in developing countries often has inadequate seismic protection, particularly masonry housing. People choose this type of structure since the cost and application are relatively cheap. Seismic protection of masonry remains an interesting issue among researchers. In this study, we develop a low-cost seismic isolation system for masonry using fiber-reinforced elastomeric isolators. The proposed elastomer consists of a few layers of rubber pads and fiber laminae, making it cheaper than conventional isolators. We present a finite element (FE) analysis to predict the behavior of the low-cost rubber isolators undergoing moderate deformations. The FE model of the elastomer uses a hyperelastic material property for the rubber pad. We adopt a Yeoh hyperelasticity model and estimate its coefficients from the available experimental data. Having characterized the shear behavior of the elastomers, we apply the isolation system to a small masonry house. To attach the isolators to the building, we model the shear behavior of the isolation system by means of a damped nonlinear spring model, which keeps the FE analysis computationally inexpensive. Several ground motion records are applied to observe its sensitivity. Roof acceleration and tensile damage of the walls are the parameters used to evaluate the performance of the isolators. A concrete damage plasticity model, available in the standard package of the Abaqus FE software, is used to model masonry in the nonlinear range. Finally, the results show that the proposed low-cost isolators are capable of reducing the roof acceleration and damage level of masonry housing. Through this study, we are also able to monitor the shear deformation of the isolators during seismic motion, which is useful for determining whether an isolator is applicable. According to the results, the deformations of the isolators on the benchmark one-story building are relatively small.
Keywords: masonry, low cost elastomeric isolator, finite element analysis, hyperelasticity, damped non-linear spring, concrete damage plasticity
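To illustrate the coefficient-estimation step, the sketch below fits the three Yeoh constants to uniaxial test data by linear least squares, using the standard incompressible uniaxial stress relation. The stretch-stress pairs are invented placeholders, not the paper's measurements.

```python
import numpy as np

# Yeoh strain energy: W = C10*(I1-3) + C20*(I1-3)^2 + C30*(I1-3)^3.
lam = np.array([1.1, 1.3, 1.6, 2.0, 2.5])        # stretch ratios (placeholder)
sigma = np.array([0.42, 1.05, 1.9, 3.1, 4.9])    # nominal stress, MPa (placeholder)

I1 = lam**2 + 2.0 / lam                          # first invariant, uniaxial
kin = 2.0 * (lam - lam**-2)                      # kinematic factor
# Uniaxial nominal stress: sigma = kin * dW/dI1, with
# dW/dI1 = C10 + 2*C20*(I1-3) + 3*C30*(I1-3)^2  -> linear in the constants.
X = np.column_stack([kin, 2 * kin * (I1 - 3), 3 * kin * (I1 - 3) ** 2])
C10, C20, C30 = np.linalg.lstsq(X, sigma, rcond=None)[0]
print(f"C10={C10:.4f}  C20={C20:.4f}  C30={C30:.4f}  (MPa)")
```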
Procedia PDF Downloads 286
25457 Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems
Authors: Baris Can Yalcin
Abstract:
Motion sensors have been commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. The design of software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU 6050 motion sensor (3-axis gyroscope and accelerometer) and an Arduino Mega2560 microcontroller. The design parameters are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
Keywords: design, mechatronics, motion sensor, data acquisition
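A minimal host-side sketch of the calibration step is shown below. It assumes (our assumption, not the paper's) that the Arduino sketch streams lines like "ax,ay,az,gx,gy,gz" of raw 16-bit MPU-6050 readings over USB; the sensitivity constants are the MPU-6050 datasheet values for the default ±2 g and ±250 °/s ranges.

```python
import serial  # pyserial

ACC_LSB_PER_G = 16384.0   # MPU-6050 datasheet, +/-2 g full scale
GYRO_LSB_PER_DPS = 131.0  # MPU-6050 datasheet, +/-250 deg/s full scale

def read_sample(port):
    """Parse one CSV line of six raw integer readings (assumed format)."""
    return [int(v) for v in port.readline().decode().strip().split(",")]

def calibrate(port, n=500):
    """Average n samples with the sensor held still to estimate zero offsets."""
    sums = [0.0] * 6
    for _ in range(n):
        for i, v in enumerate(read_sample(port)):
            sums[i] += v
    offsets = [s / n for s in sums]
    offsets[2] -= ACC_LSB_PER_G  # z-axis should read +1 g when lying flat
    return offsets

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:  # hypothetical port
    off = calibrate(port)
    ax, ay, az, gx, gy, gz = [v - o for v, o in zip(read_sample(port), off)]
    print(ax / ACC_LSB_PER_G, "g,", gx / GYRO_LSB_PER_DPS, "deg/s")
```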
Procedia PDF Downloads 588
25456 Designing Offshore Pipelines Facing the Geohazard of Active Seismic Faults
Authors: Maria Trimintziou, Michael Sakellariou, Prodromos Psarropoulos
Abstract:
Nowadays, the exploitation of hydrocarbon reserves in deep seas and oceans, in combination with the need to transport hydrocarbons among countries, has made the design, construction, and operation of offshore pipelines very significant. Under this perspective, it is evident that many more offshore pipelines are expected to be constructed in the near future. Since offshore pipelines usually cross extended areas, they may face a variety of geohazards that impose substantial permanent ground deformations (PGDs) on the pipeline and potentially threaten its integrity. When a geohazard area is encountered, there are three options. The first is to avoid the problematic area through rerouting, which is usually regarded as an unfavorable solution due to its high cost. The second is to apply (if possible) mitigation/protection measures in order to eliminate the geohazard itself. The last, appealing option is to allow the pipeline to cross the geohazard area, provided that the pipeline has been verified against the expected PGDs. In areas of moderate or high seismicity, the design of an offshore pipeline is more demanding due to earthquake-related geohazards, such as landslides, soil liquefaction phenomena, and active faults. It is worth mentioning that although there is great worldwide experience in offshore geotechnics and pipeline design, experience in the seismic design of offshore pipelines is rather limited, since most pipelines have been constructed in non-seismic regions (e.g., the North Sea, Western Australia, the Gulf of Mexico). The current study focuses on the seismic design of offshore pipelines against active faults. After an extensive review of the provisions of seismic norms worldwide and of the available analytical methods, the study numerically simulates (through finite-element modeling and strain-based criteria) the distress of offshore pipelines subjected to PGDs induced by active seismic faults at the seabed. Factors such as the geometrical properties of the fault, the mechanical properties of the ruptured soil formations, and the pipeline characteristics are examined. After some interesting conclusions regarding the seismic vulnerability of offshore pipelines, potential cost-effective mitigation measures are proposed, taking constructability issues into account.
Keywords: offshore pipelines, seismic design, active faults, permanent ground deformations (PGDs)
Procedia PDF Downloads 588
25455 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS
Authors: Eunsu Jang, Kang Park
Abstract:
In developing an armored ground combat vehicle (AGCV), analyzing its vulnerability (or survivability) against enemy attack is a very important step. In vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which would damage internal components or injure the crew. Penetration equations are derived from penetration experiments, which require a long time and great effort; moreover, they usually hold only for the specific target material and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option for calculating penetration depth. However, modeling the targets and selecting the input parameters correctly is essential for obtaining an accurate penetration depth. This paper performs a sensitivity analysis of ANSYS input parameters with respect to the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS for penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters was performed and the RMS error against the experimental data was calculated. The input parameters, including mesh size, boundary conditions, material properties, and target diameter, were tested and selected to minimize the error between the simulation results and the experimental data taken from papers on penetration equations. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted for optimized overall performance. The analysis found the following: 1) As the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and the calculation time increase. 2) As the target diameter decreases from 250 mm to 60 mm, both the penetration depth and the calculation time decrease. 3) As the yield stress of the target material decreases, the penetration depth increases. 4) A boundary condition fixing only the side surface of the target gives a greater penetration depth than one fixing both the side and rear surfaces. Using these findings, the input parameters can be tuned to minimize the error between simulation and experiment, and penetration analysis can be done in ANSYS on a computer, without actual experiments. Data from penetration experiments are usually hard to obtain for security reasons, and published papers provide them only for limited target materials. The next step of this research is to generalize this approach to predict the penetration depth by interpolating the known penetration experiments. The result may not be accurate enough to replace penetration experiments, but such simulations can be used in the modelling and simulation stage early in the AGCV design process.
Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis
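The accuracy objective reduces to a simple computation: for each candidate parameter setting, compare the simulated depths against the experimental values and keep the setting with the lowest RMS error. A sketch with invented placeholder depths, not values from the paper:

```python
import numpy as np

experimental = np.array([12.4, 18.9, 25.1])     # depths (mm), placeholder values

simulated = {                                   # simulated depth per mesh size (mm)
    0.9: np.array([11.1, 17.0, 22.8]),
    0.7: np.array([11.9, 18.2, 24.2]),
    0.5: np.array([12.6, 19.3, 25.6]),
}

def rms_error(sim, exp):
    return float(np.sqrt(np.mean((sim - exp) ** 2)))

# Rank candidate mesh sizes by RMS error against the experiments.
best = min(simulated, key=lambda mesh: rms_error(simulated[mesh], experimental))
print(best, rms_error(simulated[best], experimental))
```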
Procedia PDF Downloads 401
25454 Speed Characteristics of Mixed Traffic Flow on Urban Arterials
Authors: Ashish Dhamaniya, Satish Chandra
Abstract:
Speed and traffic volume data were collected on different sections of four-lane and six-lane roads in three metropolitan cities in India. The speed data were analyzed to fit statistical distributions to individual vehicle speeds and to the speeds of all vehicles combined. It is noted that the speed data of individual vehicles generally follow a normal distribution, but the combined speed data of all vehicles at a section of urban road may or may not follow a normal distribution, depending upon the composition of the traffic stream. A new term, the Speed Spread Ratio (SSR), is introduced in this paper: the ratio of the difference between the 85th and 50th percentile speeds to the difference between the 50th and 15th percentile speeds. If the SSR is unity, then the speed data are truly normally distributed. It is noted that on six-lane urban roads, speed data follow a normal distribution only when the SSR is in the range of 0.86 – 1.11. The range of SSR is also validated on four-lane roads.
Keywords: normal distribution, percentile speed, speed spread ratio, traffic volume
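A minimal sketch of the SSR computation on synthetic speed data (not the Indian field data); the 0.86-1.11 acceptance range applied at the end is the paper's criterion for six-lane roads.

```python
import numpy as np

# Synthetic spot speeds (km/h) standing in for field observations.
speeds = np.random.default_rng(0).normal(loc=48.0, scale=7.5, size=500)

# SSR = (V85 - V50) / (V50 - V15), as defined above.
v15, v50, v85 = np.percentile(speeds, [15, 50, 85])
ssr = (v85 - v50) / (v50 - v15)

verdict = "approx. normal" if 0.86 <= ssr <= 1.11 else "skewed"
print(f"SSR = {ssr:.2f} -> {verdict}")
```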
Procedia PDF Downloads 422
25453 An Exploratory Analysis of Brisbane's Commuter Travel Patterns Using Smart Card Data
Authors: Ming Wei
Abstract:
Over the past two decades, Location Based Service (LBS) data have been increasingly applied to urban and transportation studies due to their comprehensiveness and consistency. However, compared to other LBS data, including mobile phone data, GPS, and social networking platforms, smart card data collected from public transport users have arguably yet to be fully exploited in urban systems analysis. Using five weekdays of passenger travel transaction data taken from go card – Southeast Queensland’s transit smart card – this paper analyses the spatiotemporal distribution of passenger movement with regard to the land use patterns in Brisbane. Work and residential places of public transport commuters were identified after extracting journey-to-work patterns. Our results show that the locations of the workplaces identified from the go card data and the residential suburbs are largely consistent with those marked in the land use map. However, the intensity of some residential locations, in terms of population or commuter density, does not match well between the map and the go card data. This points to a certain degree of misalignment between residential areas and workplaces, shedding light on how enhancements to service management and infrastructure expansion might be undertaken.
Keywords: big data, smart card data, travel pattern, land use
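A minimal sketch of journey-to-work extraction from smart card transactions: the first boarding stop of a weekday serves as a proxy for home, and the stop where the rider dwells longest during the day as a proxy for work. The column names, sample taps, and heuristic are our assumptions for illustration, not the go card schema or the paper's exact rules.

```python
import pandas as pd

taps = pd.DataFrame({
    "card_id": [1, 1, 1, 1],
    "time":    pd.to_datetime(["2014-03-03 07:40", "2014-03-03 08:25",
                               "2014-03-03 17:35", "2014-03-03 18:20"]),
    "boarding_stop":  ["Chermside", "CBD", "CBD", "Chermside"],
    "alighting_stop": ["CBD", "CBD", "Chermside", "Chermside"],
})

def home_and_work(day: pd.DataFrame):
    day = day.sort_values("time")
    home = day.iloc[0]["boarding_stop"]          # first boarding of the day
    # Longest gap between consecutive taps marks the workplace dwell.
    gaps = day["time"].shift(-1) - day["time"]
    work = day.iloc[gaps.values[:-1].argmax()]["alighting_stop"]
    return home, work

print(taps.groupby("card_id").apply(home_and_work))  # -> ('Chermside', 'CBD')
```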
Procedia PDF Downloads 285
25452 A Review on Application of Waste Tire in Concrete
Authors: M. A. Yazdi, J. Yang, L. Yihui, H. Su
Abstract:
The application of recycled waste tires to civil engineering practices, namely asphalt paving mixtures and cement-based materials, has been gaining ground across the world. This review summarizes and compares recent achievements in the area of plain rubberized concrete (PRC) in detail. Different treatment methods are discussed that improve the performance of rubberized Portland cement concrete. The review also includes the effects of the size and amount of tire rubber on the mechanical and durability properties of PRC. The microstructural behaviour of rubberized concrete is also detailed.
Keywords: waste rubber aggregates, microstructure, treatment methods, size and content effects
Procedia PDF Downloads 332
25451 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process
Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek
Abstract:
As big data analysis becomes important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die map based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analysis has limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. This study consists of three phases. In the first phase, a die map is created from the fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. The third phase is to find the patterns that affect the final test result. Finally, the proposed three steps are applied to actual industrial data, and the experimental results show the potential for field application.
Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process
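A minimal sketch of the three phases on synthetic data: (1) build a fail-bit die map per die, (2) cluster the flattened maps, (3) inspect clusters against final test results. The grid size, fail-bit distributions, and use of k-means are illustrative choices, not the paper's exact configuration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_dies, grid = 200, 8                        # 8x8 sub-areas per die (assumed)

# Phase 1: fail-bit counts per sub-area; first 50 dies carry an edge pattern.
maps = rng.poisson(0.5, size=(n_dies, grid, grid)).astype(float)
maps[:50, :, :3] += rng.poisson(4.0, size=(50, grid, 3))

# Phase 2: cluster the flattened die maps.
features = maps.reshape(n_dies, -1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Phase 3: relate clusters to (synthetic) final test outcomes.
final_test_pass = np.r_[np.zeros(50), np.ones(150)]
for k in range(2):
    print(f"cluster {k}: pass rate {final_test_pass[labels == k].mean():.2f}")
```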
Procedia PDF Downloads 402
25450 Spatial Integrity of Seismic Data for Oil and Gas Exploration
Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof
Abstract:
Seismic data are the fundamental tool utilized by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data with positional ambiguity would jeopardize the business decision and the millions of dollars of investment that every oil and gas company would like to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying the geometry loading. The direct outcome of the workflow implementation is improved reliability and integrity of the subsurface geological models produced by geoscientists, and important input to potential hazard assessments where positional accuracy is crucial. This workflow development initiative is part of a bigger geospatial integrity management effort, whereby nearly eighty percent of oil and gas data are location-dependent.
Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow
Procedia PDF Downloads 222
25449 An Optimal Hybrid EMS System for a Hyperloop Prototype Vehicle
Authors: J. F. Gonzalez-Rojo, Federico Lluesma-Rodriguez, Temoatzin Gonzalez
Abstract:
Hyperloop, a new mode of transport, is gaining significance. It consists of a ground-based transport system with a levitation system that avoids rolling friction forces, enclosed in a tube whose inner atmosphere is controlled to lower the aerodynamic drag forces. Thus, hyperloop is proposed as a solution to the current limitations of ground transportation. The rolling and aerodynamic problems that limit the speed of traditional high-speed rail, or even maglev systems, are overcome by a hyperloop solution. Zeleros is one of the companies developing hyperloop technology worldwide. It is working on a concept that reduces the infrastructure cost and minimizes the power consumption as well as the losses associated with magnetic drag forces. For this purpose, Zeleros proposes a Hybrid ElectroMagnetic Suspension (EMS) for its prototype. In the present manuscript, an active and optimal electromagnetic suspension levitation method based on nearly-zero-power-consumption individual modules is presented. The system consists of several hybrid permanent magnet-coil levitation units that can be arranged along the vehicle. The proposed unit redirects the magnetic field along a defined direction, forming a magnetic circuit and minimizing the losses due to field dispersion. This is achieved using an electrical steel core. Each module can stabilize the gap distance using the coil current and either linear or nonlinear control methods. The ratio between weight and levitation force for each unit is 1/10, and the quotient between the lifted weight and the power consumption at the target gap distance is 1/3 [kg/W]. One degree of freedom (DoF), along the gap direction, is controlled by a single unit. However, when several units are present, 5-DoF control (2 translational and 3 rotational) can be achieved, leading to full attitude control of the vehicle. The proposed system has been successfully tested in a laboratory test bench, reaching TRL-4, and is currently at TRL-5 if the association of modules to control 5 DoF is considered.
Keywords: active optimal control, electromagnetic levitation, HEMS, high-speed transport, hyperloop
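As a toy illustration of single-DoF gap stabilization with coil current, the sketch below closes a PD loop around a textbook electromagnet force model, F = k·(i/gap)². All constants, gains, and the plant model are invented for illustration; this is not Zeleros' controller or module data.

```python
import numpy as np

m, g, k = 2.0, 9.81, 1e-4            # mass (kg), gravity, force constant (assumed)
gap_ref = 0.010                      # target gap (m)
i0 = gap_ref * np.sqrt(m * g / k)    # bias current that exactly balances gravity
Kp, Kd = 4000.0, 80.0                # PD gains on gap error (A/m, A*s/m)

gap, vel, dt = 0.012, 0.0, 1e-4      # start 2 mm away from the reference
for _ in range(5000):                # 0.5 s of simulated time
    err = gap - gap_ref
    current = max(0.0, i0 + Kp * err + Kd * vel)  # wider gap -> more current
    force = k * (current / gap) ** 2              # attractive force, closes gap
    acc = g - force / m                           # gravity opens the gap
    vel += acc * dt
    gap += vel * dt
print(f"final gap = {gap * 1000:.2f} mm")         # settles near 10 mm
```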
Procedia PDF Downloads 146
25448 Evaluating Data Maturity in Riyadh's Nonprofit Sector: Insights Using the National Data Maturity Index (NDI)
Authors: Maryam Aloshan, Imam Mohammad Ibn Saud, Ahmad Khudair
Abstract:
This study assesses the data governance maturity of nonprofit organizations in Riyadh, Saudi Arabia, using the National Data Maturity Index (NDI) framework developed by the Saudi Data and Artificial Intelligence Authority (SDAIA). Employing a survey designed around the NDI model, data maturity levels were evaluated across 14 dimensions using a 5-point Likert scale. The results reveal a spectrum of maturity levels among the organizations surveyed: while some medium-sized associations reached the ‘Defined’ stage, others, including large associations, fell within the ‘Absence of Capabilities’ or ‘Building’ phases, with no organizations achieving the advanced ‘Established’ or ‘Pioneering’ levels. This variation suggests an emerging recognition of data governance but underscores the need for targeted interventions to bridge the maturity gap. The findings point to a significant opportunity to elevate data governance capabilities in Saudi nonprofits through customized capacity-building initiatives, including training, mentorship, and best practice sharing. This study contributes valuable insights into the digital transformation journey of the Saudi nonprofit sector, aligning with national goals for data-driven governance and organizational efficiency.
Keywords: nonprofit organizations, national data maturity index (NDI), Saudi Arabia, SDAIA, data governance, data maturity
Procedia PDF Downloads 15
25447 Single-Cell Visualization with Minimum Volume Embedding
Authors: Zhenqiu Liu
Abstract:
Visualizing the heterogeneity within cell populations from single-cell RNA-seq data is crucial for studying the functional diversity of cells. However, because of the high level of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, it is computationally inefficient to solve a semi-definite program (SDP) when the sample size is large; MVE is therefore not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm with an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.
Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method
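For reference, the optimization workhorse named above is the accelerated proximal gradient (FISTA-style) iteration: a smooth gradient step, a proximal step, and Nesterov momentum. The paper applies it to the MVE objective; as a stand-in, the sketch below runs the same loop on a small lasso problem, since only the gradient and proximal operators change.

```python
import numpy as np

def fista(grad_f, prox_g, L, x0, iters=300):
    """Generic accelerated proximal gradient loop with Lipschitz constant L."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)      # gradient + prox step
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

rng = np.random.default_rng(0)
A, b, lam = rng.normal(size=(40, 20)), rng.normal(size=40), 0.5
L = np.linalg.norm(A, 2) ** 2                            # spectral norm squared
soft = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)
x_hat = fista(lambda x: A.T @ (A @ x - b), soft, L, np.zeros(20))
print(np.count_nonzero(x_hat), "nonzero coefficients")
```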
Procedia PDF Downloads 228
25446 Cloud Data Security Using Map/Reduce Implementation of Secret Sharing Schemes
Authors: Sara Ibn El Ahrache, Tajje-eddine Rachidi, Hassan Badir, Abderrahmane Sbihi
Abstract:
Recently, there has been increasing confidence in the favorable usage of big data drawn from the huge amount of information deposited in cloud computing systems. Data kept on such systems can be retrieved through the network at the user’s convenience. However, the data that users send include private information, and therefore, information leakage from these data is now a major social problem. The usage of secret sharing schemes for cloud computing has lately been shown to be relevant: users deal out their data to several servers. Notably, in a (k,n) threshold scheme, data security is assured if and only if, throughout the whole life of the secret, the opponent cannot compromise k or more of the n servers. In fact, a number of secret sharing algorithms have been suggested to deal with these security issues. In this paper, we present a MapReduce implementation of Shamir’s secret sharing scheme to increase its performance and to achieve optimal security for cloud data. Different tests were run, demonstrating the contributions of the proposed approach. These contributions are quite considerable in terms of both security and performance.
Keywords: cloud computing, data security, MapReduce, Shamir's secret sharing
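A minimal sketch of the (k, n) primitive the paper parallelizes: Shamir's scheme evaluates a random polynomial to produce shares (a naturally "map"-shaped step) and reconstructs the secret by Lagrange interpolation (a "reduce"-shaped step). The small prime and plain-Python structure are illustrative choices, not the paper's parameters.

```python
import random

P = 2**61 - 1  # a Mersenne prime; the field must exceed the secret

def split(secret, k, n):
    """Evaluate a random degree-(k-1) polynomial at x = 1..n ('map' step)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    eval_at = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, eval_at(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 from any k shares ('reduce' step)."""
    total = 0
    for xj, yj in shares:
        num = den = 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

shares = split(123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789
```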
Procedia PDF Downloads 306
25445 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills
Authors: Kyle De Freitas, Margaret Bernard
Abstract:
Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have few technical data mining skills and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that lets future researchers focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either the use of predefined questions or a guided data mining process, and shows how the developed questions and the analysis conducted can be reused and extended over time.
Keywords: educational data mining, learning management system, learning analytics, EDM framework
Procedia PDF Downloads 326
25444 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction
Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto
Abstract:
Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality data to be abstracted, including data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool created is used by data abstractors to provide data audits and assess the accuracy and inter-rater reliability (IRR) of abstraction performed by the abstractors for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get with the Guidelines. Methodology: The data audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registries. It is being used internally across 20 hospital systems, providing abstraction and audit services for them. Results: The data audit tool showed data accuracy and IRR scores greater than 95% for the PCI registry in 50 PCI registry cases in 2021. Conclusion: The tool is being used internally for surgical societies and across hospital systems. The audit tool enables the abstractor to be assessed by an external abstractor and includes all of the data dictionary fields for each registry.
Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data
Procedia PDF Downloads 105
25443 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models
Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling
Abstract:
Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is of great importance for enterprises. In this study, a new artificial intelligence approach is applied to remove the weaknesses of supplier selection. The paper has three parts. The first is choosing the appropriate criteria for assessing the suppliers’ performance. The next is collecting the data set based on experts. Afterwards, the data set is divided into two parts, the training data set and the testing data set. With the training data set, the best structures of GEP and ANN are selected, and the testing data set is used to evaluate the power of the mentioned methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, a mathematical equation is presented by GEP for supplier selection.
Keywords: supplier selection, automotive supply chains, ANN, GEP
Procedia PDF Downloads 631
25442 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of a 180-Degree Interpolation Method
Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito
Abstract:
In general, dynamic SPECT data acquisition needs a few minutes for one rotation. Thus, the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used for the reconstruction of X-ray CT data. In this study, we applied this 180-degree interpolation method to SPECT and investigated its effectiveness. To briefly describe the 180-degree interpolation method: the 180-degree data in the second half of one rotation are combined with the 180-degree data in the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the first and second rotations. In both a phantom and a patient study, the data points from the interpolated images fell in good agreement with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA
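A minimal sketch of the interleaving described above: the second half-turn of rotation i is combined with the first half-turn of rotation i+1 to form an extra 360-degree data set halfway between rotations, nearly doubling the number of reconstructable time points. The array shapes and synthetic counts are illustrative, not acquisition parameters from the paper.

```python
import numpy as np

n_rot, n_views, n_bins = 6, 120, 64        # 120 views per 360-degree rotation
sino = np.random.default_rng(0).poisson(50, size=(n_rot, n_views, n_bins))

half = n_views // 2
frames = [sino[0]]
for i in range(n_rot - 1):
    # Angles 0-180 deg come from rotation i+1 (acquired later), angles
    # 180-360 deg from rotation i: together a full set at the midpoint time.
    mid = np.concatenate([sino[i + 1, :half], sino[i, half:]], axis=0)
    frames.extend([mid, sino[i + 1]])

print(len(frames), "frames from", n_rot, "rotations")  # 11 frames from 6
```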
Procedia PDF Downloads 493
25441 Investigating Underground Explosion-Like Sounds in Sarableh City and Its Possible Connection with Geological Hazards
Authors: Hosein Almasikia
Abstract:
Sarableh City is located in the west of Iran, in the seismic zone of the Zagros. After the Ezgeleh–Sarpol-e Zahab earthquake of magnitude 7.3 on November 12, 2017, horrible sounds were heard by people in some parts of Sarableh city. Some residents also reported a sound similar to that of a grinding mill. Vibration studies and field investigations showed that these sounds have a geological origin and are transmitted from depth to the surface, and they may be related to geological hazards such as landslides, the collapse of karstic zones, etc. In this study, an attempt has been made to investigate the possible relationship between these abnormal sounds and geological hazards.
Keywords: Sarable, Zagros, landslide, karstic zone
Procedia PDF Downloads 64
25440 Superhydrophobic Materials: A Promising Way to Enhance Resilience of Electric System
Authors: M. Balordi, G. Santucci de Magistris, F. Pini, P. Marcacci
Abstract:
The increase in extreme meteorological events is among the most important causes of damage and blackouts across the electric system. In particular, icing on ground-wires and overhead lines, due to snowstorms or harsh winter conditions, very often gives rise to the collapse of cables and towers, in both cold and warm climates. On the other hand, the high concentration of contaminants in the air, due to natural and/or anthropic causes, is reflected in high levels of pollutants layered on glass and ceramic insulators, causing frequent and unpredictable flashover events. Overhead line and insulator failures lead to blackouts, dangerous and expensive maintenance, and serious inefficiencies in the distribution service. Imparting superhydrophobic (SHP) properties to conductors, ground-wires, and insulators is one way to face all these problems. Indeed, in some cases, an SHP surface can delay the ice nucleation time and decrease the ice nucleation temperature, preventing ice formation. Besides, thanks to the low surface energy, the adhesion force between ice and a superhydrophobic material is low, and the ice can be easily detached from the surface. Moreover, it is well known that superhydrophobic surfaces can have self-cleaning properties: these hinder the deposition of pollution and decrease the probability of flashover phenomena. Here we present three studies imparting superhydrophobicity to aluminum, zinc, and glass specimens, which are the main constituent materials of conductors, ground-wires, and insulators, respectively. The route to impart superhydrophobicity to the metallic surfaces can be summarized as a three-step process: 1) sandblasting treatment, 2) chemical-hydrothermal treatment, and 3) coating deposition. The first step is required to create a micro-roughness. In the chemical-hydrothermal treatment, a nanoscale metal oxide (of Al or Zn) is grown which, together with the sandblasting treatment, brings about a hierarchical micro-nano structure. Depositing an alkylated or fluorinated siloxane coating then lowers the surface energy and gives rise to superhydrophobic surfaces. To functionalize the glass, different superhydrophobic powders, obtained by sol-gel synthesis, were prepared; the specimens were covered with a commercial primer and the powders were deposited on them. All the resulting metallic and glass surfaces showed noticeable superhydrophobic behavior, with very high water contact angles (>150°) and very low roll-off angles (<5°). The three optimized processes are fast, cheap, and safe, and can easily be replicated at industrial scale. The anti-icing and self-cleaning properties of the surfaces were assessed with several indoor lab tests, which evidenced remarkable anti-icing properties and self-cleaning behavior with respect to the bare materials. Finally, to evaluate the anti-snow properties of the samples, some SHP specimens were exposed to real snowfall events at the RSE outdoor test facility located in Vinadio, in the western Alps: the coated samples delayed the formation of snow sleeves and facilitated the detachment of the snow. The good results of both indoor and outdoor tests make these materials promising for further development in large-scale applications.
Keywords: superhydrophobic coatings, anti-icing, self-cleaning, anti-snow, overheads lines
Procedia PDF Downloads 183
25439 Mitigating Acid Mine Drainage Pollution: A Case Study in the Witwatersrand Area of South Africa
Authors: Elkington Sibusiso Mnguni
Abstract:
In South Africa, mining has been a key economic sector since the discovery of gold in 1886 in the Witwatersrand region, where the city of Johannesburg is located. However, some mines have since been decommissioned, and the continuous pumping of acid mine drainage (AMD) has also stopped, causing the AMD to rise towards the ground surface. This poses a serious environmental risk to the groundwater resources and river systems in the region. This paper documents the development and extent of the environmental damage as well as the measures implemented by the government to alleviate it. The study adds to the body of knowledge on AMD treatment to prevent environmental degradation. The method used to gather and collate relevant data and information was a desktop study. The key findings include the social and environmental impacts of the AMD, among them the pollution of water sources used for domestic purposes, leading to skin and other health problems, and the loss of biodiversity in some areas. It was also found that the technical intervention of constructing a plant to pump and treat the AMD using high-density sludge technology was the most effective short-term solution available while a long-term solution was being explored. Some successes and challenges experienced during the implementation of the project are also highlighted. The study is a useful record of the current status of AMD treatment interventions in the region.
Keywords: acid mine drainage, groundwater resources, pollution, river systems, technical intervention, high density sludge
Procedia PDF Downloads 186
25438 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
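A minimal sketch of the duplicate-record detection underlying deduplication results like those cited above, using simple string similarity as a stand-in for the paper's (unspecified) AI matching model. The sample records and threshold are invented for illustration.

```python
from difflib import SequenceMatcher

customers = [
    {"id": 1, "name": "Acme Industries Ltd", "city": "Austin"},
    {"id": 2, "name": "ACME Industries Limited", "city": "Austin"},
    {"id": 3, "name": "Borealis Foods", "city": "Denver"},
]

def similarity(a, b):
    """Compare normalized name+city strings; 1.0 means identical."""
    key = lambda r: f"{r['name']} {r['city']}".lower()
    return SequenceMatcher(None, key(a), key(b)).ratio()

THRESHOLD = 0.80  # in practice, tuned on labeled match/non-match pairs
pairs = [(a["id"], b["id"])
         for i, a in enumerate(customers)
         for b in customers[i + 1:]
         if similarity(a, b) >= THRESHOLD]
print("candidate duplicates:", pairs)  # expect [(1, 2)]
```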
Procedia PDF Downloads 17
25437 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use, for several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation (GDPR) has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework, with each EU member state offering very different solutions. For instance, Denmark considers the data to be personal data of the deceased person for a set period of time, while others, such as Spain, do not consider the data as such but have introduced regulations specifically focused on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that health data are, in a sense, a rara avis among data in general, because they refer not to one person but to several. Hence, it is possible to consider all of them data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives, and the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition on data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. This contribution shows, however, that none of these objections is of sufficient substance to delegitimize the argument presented. The conclusion is that we can indeed build a general framework for the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 154
25436 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria
Authors: Abdullahi Jibrin, Aishetu Abdulkadir
Abstract:
The development of allometric models is crucial to accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models to enable the determination of total tree aboveground biomass for a savannah woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1,816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally among the five most dominant species at the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceus). First, equations were developed for the five individual species; second, the five species were pooled to develop a mixed-species allometric equation. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R² values) ranging from 0.93 to 0.99 (p < 0.001) were realised for the models, with considerably low standard errors of the estimate (SEE), which confirms that total tree aboveground biomass has a significant relationship with dbh. The F-test values for the biomass prediction models were also significant at p < 0.001, which indicates that the models are valid. This study recommends that, for improved biomass estimates at the study site, the site-specific biomass models should preferably be used instead of generic models.
Keywords: allometry, biomass, carbon stock, model, regression equation, woodland, inventory
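A minimal sketch of the standard allometric fit behind models like these: ln(AGB) = a + b·ln(dbh), fitted by ordinary least squares, with R² computed on the log scale. The dbh/biomass pairs are invented placeholders, not the study's destructive-sampling data.

```python
import numpy as np

dbh = np.array([8.0, 12.0, 17.0, 24.0, 31.0, 40.0])        # cm (placeholder)
agb = np.array([14.0, 41.0, 102.0, 260.0, 520.0, 1020.0])  # kg (placeholder)

b, a = np.polyfit(np.log(dbh), np.log(agb), 1)   # slope, intercept (log-log OLS)
pred = np.exp(a + b * np.log(dbh))

ss_res = np.sum((np.log(agb) - np.log(pred)) ** 2)
ss_tot = np.sum((np.log(agb) - np.log(agb).mean()) ** 2)
print(f"AGB = exp({a:.3f}) * dbh^{b:.3f},  R^2 = {1 - ss_res / ss_tot:.3f}")
```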
Procedia PDF Downloads 448
25435 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today involves considerable overlap and conflict, resulting in market confusion and growing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps required of government in the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of experts in the areas of data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of: a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most important of all is change management at the national and organizational level. The outcome of this study can be used by academics and practitioners in planning health data standards, particularly in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 338
25434 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map
Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo
Abstract:
Recently, technologies based on three-dimensional (3D) spatial information have been developed, and quality of life is improving as a result. Research on the real-time digital map (RDM) is now being conducted to provide 3D spatial information. RDM is a service that creates and supplies 3D spatial information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D spatial information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for spatial analysis using 3D spatial information in a U-City and for other spatial-information utilization technologies.
Keywords: RDM, multi-source data, big data, U-City
Procedia PDF Downloads 433