Search results for: enterprise data warehouse
24078 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test
Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea
Abstract:
In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact on the χ² test of transforming initial data sets from any probability distribution into new signals with a uniform distribution using the Spearman rank correlation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
Keywords: chaotic signals, logistic map, Pearson’s test, chi-square test, bivariate distribution, statistical independence
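The bivariate Pearson χ² independence test discussed above can be sketched in a few lines. The sample below uses uniform pseudo-random pairs as a stand-in for the rank-transformed chaotic samples, and the bin count and sample size are illustrative choices, not the paper's settings:

```python
import numpy as np

def chi2_independence(x, y, bins=5):
    """Pearson chi-square statistic for independence of two samples,
    computed from a 2D contingency table of binned (x, y) pairs."""
    table, _, _ = np.histogram2d(x, y, bins=bins)
    n = table.sum()
    # Expected counts under independence: outer product of marginals / n
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (bins - 1) ** 2      # (rows - 1) * (cols - 1)
    return stat, dof

# Surrogate for pairs sampled far enough apart to behave independently
rng = np.random.default_rng(0)
x = rng.uniform(size=10000)
y = rng.uniform(size=10000)
stat, dof = chi2_independence(x, y)
```

For genuinely independent pairs the statistic should fluctuate around its degrees of freedom; a systematic departure of the observed χ² distribution from the theoretical one is exactly the symptom the abstract describes.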
Procedia PDF Downloads 97
24077 Real Time Data Communication with FlightGear Using Simulink Over a UDP Protocol
Authors: Adil Loya, Ali Haider, Arslan A. Ghaffor, Abubaker Siddique
Abstract:
Simulation and modelling of Unmanned Aerial Vehicles (UAVs) has gained wide popularity in the aerospace community. The demand for designing and modelling optimized control systems for UAVs has increased tenfold over the last decade, because next-generation warfare depends on unmanned technologies. This research therefore focuses on simulating nonlinear UAV dynamics in Simulink and integrating the simulation with FlightGear. There has been much research on implementing optimized control in Simulink; however, fewer known techniques simulate these dynamics over FlightGear, and the tedious problem of acquiring data from it is tackled in this research. Sending data to FlightGear is easy, but receiving it in Simulink is not as straightforward, i.e., normally only control data can be received on the output. In this research, however, we managed to get data out of FlightGear by implementing a Level-2 S-Function block within Simulink. The results captured from FlightGear over a User Datagram Protocol (UDP) connection were then compared with the attitude signals that were sent previously. This provides useful information regarding the difference between the outputs obtained from Simulink and FlightGear. The values received in Simulink were in high agreement with the FlightGear output. The complete study was conducted in discrete time.
Keywords: aerospace, flight control, FlightGear, communication, Simulink
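The UDP exchange underlying the Simulink–FlightGear link can be illustrated with a minimal loopback sketch in Python. The attitude string, port handling, and payload format here are placeholders, not the actual FlightGear protocol:

```python
import socket

# Receiver stands in for the Simulink side; sender for FlightGear.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
attitude = "roll=1.5,pitch=-0.3,yaw=87.0"   # hypothetical attitude payload
sender.sendto(attitude.encode(), ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)     # blocking read of one datagram
received = data.decode()
sender.close()
receiver.close()
```

Because UDP is connectionless, each attitude sample travels as an independent datagram, which is why comparing sent and received signals (as the paper does) is a meaningful consistency check.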
Procedia PDF Downloads 286
24076 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping), using e.g. ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured with the device the algorithm is targeted at, for example a mobile phone, and disseminate that data to other researchers. For this reason, we propose an open-source, open-hardware ground-truth system that provides an accurate and precise trajectory together with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used to evaluate various state-of-the-art algorithms (Stella SLAM, ORB-SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
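As a reference for how such ground truth is consumed, a minimal ATE computation (translational RMSE over already-aligned, time-synchronized poses) might look like the sketch below; the trajectories are toy data, not output from the system described:

```python
import numpy as np

def absolute_trajectory_error(estimate, ground_truth):
    """RMSE of per-pose translational error between an estimated
    trajectory and a ground-truth trajectory (both N x 3 arrays of
    positions, assumed aligned and time-synced)."""
    diff = np.asarray(estimate) - np.asarray(ground_truth)
    return float(np.sqrt((np.linalg.norm(diff, axis=1) ** 2).mean()))

# Toy trajectory: ground truth along x, estimate with 0.1 m lateral error
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = gt + np.array([[0.0, 0.1, 0.0], [0.0, -0.1, 0.0], [0.0, 0.1, 0.0]])
ate = absolute_trajectory_error(est, gt)
```

A full evaluation would first align the estimate to the ground truth (e.g. with a rigid-body fit) before computing the RMSE; that step is omitted here.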
Procedia PDF Downloads 69
24075 Prediction of Gully Erosion with Stochastic Modeling by using Geographic Information System and Remote Sensing Data in North of Iran
Authors: Reza Zakerinejad
Abstract:
Gully erosion is a serious problem that threatens the sustainability of agricultural areas, rangelands, and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of the country. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression in the loess deposits of a case study in the Golestan Province. A DEM with 25-meter resolution derived from ASTER data was used, and Landsat ETM data were used for land use mapping. The TreeNet model, a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion, with the data split into training and validation subsets. GIS and satellite image analysis techniques were used to derive the input information for the stochastic model. The result of this study is a highly accurate map of gully erosion potential.
Keywords: TreeNet model, terrain analysis, Golestan Province, Iran
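The GIS-based logistic regression mentioned above can be sketched with a plain gradient-descent fit. The single terrain attribute and labels below are toy values, and this is a simple stand-in illustration, not the TreeNet model itself:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression: returns weights and
    bias for a susceptibility score from terrain attributes."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability
        grad = p - y                             # dLoss/dlogit for log-loss
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Toy data: a single normalized slope attribute, steep cells gully-prone
X = np.array([[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]])
y = np.array([0, 0, 0, 1, 1, 1])
w, b = fit_logistic(X, y)
prob_steep = float(1.0 / (1.0 + np.exp(-(0.85 * w[0] + b))))
prob_gentle = float(1.0 / (1.0 + np.exp(-(0.15 * w[0] + b))))
```

Applied cell-by-cell over a raster of terrain attributes, the fitted probability becomes a susceptibility map of the kind the study produces.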
Procedia PDF Downloads 535
24074 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries tend to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, so companies strive to consider innovation a basic principle. Since the expansion of a product needs to combine different technologies, different innovative projects are run in firms as a basis for technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovation. This article provides an economic-probabilistic analysis of an ERP implementation project in technological innovation (TI) based firms. The model used here assesses risk and economics simultaneously, in view of the probability of each event, combining the economic approach with the risk-investigation approach. To provide an economic-probabilistic analysis of project risk, the activities and milestones in the cash flow were extracted, and the probability of occurrence of each was assessed. Because resource planning in an innovative firm is the object of this project, the various risks related to the innovative project were extracted and then evaluated in the form of cash flow. By considering the risks affecting the project and the probability of each, and assigning them to the project's cash flow categories, the model presents an adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. In effect, the model presents a risk-adjusted economic analysis of the project: it measures the NPV of the project with respect to the risks that most affect technological innovation projects, and then measures the probability associated with the NPV for each category.
As a result of applying the presented model in the information and communication technology (ICT) industry, an appropriate feasibility analysis of the project was provided from the point of view of cash flow, based on the impact of risk on the project. The results can be given to decision makers so that they can carry out a systematic, risk-moderated economic analysis of project feasibility.
Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
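The risk-adjusted NPV idea can be sketched as a small Monte Carlo simulation. The cash flows, risk probabilities, and impact values below are illustrative placeholders, not figures from the case study:

```python
import random

def risk_adjusted_npv(cash_flows, risks, rate, trials=20000, seed=42):
    """Monte Carlo NPV: in each trial, every period's cash flow is reduced
    by each risk's impact when that risk occurs (a Bernoulli draw with its
    probability). `risks` is a list of (probability, impact) tuples."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        npv = 0.0
        for t, cf in enumerate(cash_flows):
            hit = sum(imp for p, imp in risks if rng.random() < p)
            npv += (cf - hit) / (1 + rate) ** t
        results.append(npv)
    mean_npv = sum(results) / trials
    p_negative = sum(1 for v in results if v < 0) / trials
    return mean_npv, p_negative

# Year 0 investment, four years of returns; two hypothetical project risks
flows = [-100.0, 40.0, 40.0, 40.0, 40.0]
risks = [(0.3, 10.0), (0.1, 25.0)]
mean_npv, p_neg = risk_adjusted_npv(flows, risks, rate=0.10)
```

The pair (mean NPV, probability of a negative NPV) corresponds to the "adjusted NPV plus associated probability" output the model described above produces for each cash flow category.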
Procedia PDF Downloads 403
24073 Data Science/Artificial Intelligence: A Possible Panacea for Refugee Crisis
Authors: Avi Shrivastava
Abstract:
In 2021, two heart-wrenching scenes, shown live on television screens across countries, painted a grim picture of refugees. One was of people clinging to an airplane's wings in a desperate attempt to flee war-torn Afghanistan; they ultimately fell to their deaths. The other was of U.S. government authorities separating children from their parents or guardians to deter migrants and refugees from coming to the U.S. These events show the desperation refugees feel when trying to leave their homes in disaster zones. The data, however, paints a grave picture of the current refugee situation and indicates that a bleak future lies ahead for refugees across the globe. Data and information are the two threads that intertwine to weave the fabric of modern society. They are often used interchangeably, but they differ considerably: information analysis reveals rationale and logic, while data analysis reveals patterns. Patterns revealed by data can enable us to create the tools needed to combat the huge problems on our hands, and data analysis paints a clear picture that simplifies decision-making. Geopolitical and economic data can be used to predict future refugee hotspots, and accurately predicting the next hotspots will allow governments and relief agencies to prepare better for future refugee crises. The refugee crisis does not have binary answers. Given the emotionally wrenching nature of the ground realities, experts often shy away from realistically stating things as they are, and this hesitancy can cost lives. When decisions are based solely on data, emotions can be removed from the decision-making process. Data also presents irrefutable evidence of whether a solution exists, responding to a non-binary crisis with a binary answer, which makes the problem easier to tackle. Data science and A.I. can predict future refugee crises. With the recent explosion of data due to the rise of social media platforms, data and insight into data have helped solve many social and political problems. Data science can also help solve many issues refugees face while staying in refugee camps or adopted countries. This paper looks into various ways data science can help solve refugee problems. A.I.-based chatbots can help refugees seek legal help to find asylum in the country they want to settle in, and can point them to a marketplace where they can find assistance from people willing to help. Data science and technology can also help address refugees' many other problems, including food, shelter, employment, security, and assimilation. The refugee problem is one of the most challenging for social and political reasons; data science and machine learning can help prevent refugee crises and solve or alleviate some of the problems refugees face on their journey to a better life.
Keywords: refugee crisis, artificial intelligence, data science, refugee camps, Afghanistan, Ukraine
Procedia PDF Downloads 72
24072 Critical Success Factors of OCOP Business Model in Pattani Province Thailand: A Qualitative Approach
Authors: Poonsook Thatchaopas, Nik Kamariah Nikmat, Nattakarn Eakuru
Abstract:
Since 2003, the Thai Government has implemented several initiatives to encourage and incubate entrepreneurial skills and motivation among its citizens. One of these is the “One College One Product” business model, well known as ‘OCOP’, launched by the Vocational Education Commission to encourage college students to partner in choosing at least one product for a business venture. In line with this mission, several business enterprises were established, such as food products, restaurants, spas, Thai massage, minimarts, computer maintenance, karaoke centres, internet cafés, and mini theaters. Currently, these business incubator projects can be observed at 404 vocational colleges and 21 incubation centres that encourage entrepreneurial small and medium enterprise (SME) development. However, the number of successful OCOP projects is still minimal: of the 404 individual OCOP projects at vocational colleges around Thailand, very few became successful. The objective of this paper is to identify the critical success factors needed to be a successful OCOP business entrepreneur. This study uses a qualitative method, interviewing the business partners of an OCOP business called the Crispy Roti Krua Acheeva Brand (CRKAB), a snack food company developed at Pattani Vocational College in southern Thailand. The project was initiated by three female entrepreneurs who were alumni and owners of the CRKAB. The findings show that the main critical success factors are self-confidence, creativity or innovativeness, knowledge, skills, and perseverance. Additionally, the interviewees reiterated that the keys to business success are product quality, perceived price, promotion, branding, new packaging to increase sales, and continuous development. The results imply that for a student SME to be successful, the company should have credible partners and an effective marketing plan.
Keywords: student entrepreneurship, business incubator, food industry, qualitative, Thailand
Procedia PDF Downloads 392
24071 A Spatial Point Pattern Analysis to Recognize Fail Bit Patterns in Semiconductor Manufacturing
Authors: Youngji Yoo, Seung Hwan Park, Daewoong An, Sung-Shick Kim, Jun-Geol Baek
Abstract:
A yield management system is very important for producing high-quality semiconductor chips in the semiconductor manufacturing process. In order to improve semiconductor quality, various tests are conducted in the post-fabrication (FAB) process. During testing, a large amount of data is collected, containing much information about defects. In general, defects on the wafer are the main cause of yield loss, so analyzing the defect data is necessary to improve the performance of yield prediction. The wafer bin map (WBM), one of the data sets collected in the test process, includes defect information such as fail bit patterns. Fail bits have the characteristics of spatial point patterns; therefore, this paper proposes a feature extraction method based on spatial point pattern analysis. Actual data obtained from the semiconductor process are used for experiments, and the experimental results show that the proposed method recognizes fail bit patterns more accurately.
Keywords: semiconductor, wafer bin map, feature extraction, spatial point patterns, contour map
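One simple spatial point pattern feature of the kind alluded to above is the Clark-Evans nearest-neighbour index, sketched here on toy defect coordinates (the wafer region and points are made up for illustration, and this is not necessarily the feature the paper extracts):

```python
import math

def clark_evans_index(points, area):
    """Clark-Evans ratio of observed vs expected mean nearest-neighbour
    distance: ~1 under complete spatial randomness, <1 for clustered
    fail bits, >1 for regularly spaced ones."""
    n = len(points)
    nn = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)   # CSR expectation
    return observed / expected

# Tightly clustered defects on a 10 x 10 wafer region -> index well below 1
cluster = [(5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1), (5.05, 5.05)]
r_clustered = clark_evans_index(cluster, area=100.0)
```

Scalar summaries like this, computed per wafer or per region, can serve as input features for classifying fail bit patterns.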
Procedia PDF Downloads 384
24070 The Measurement of the Multi-Period Efficiency of the Turkish Health Care Sector
Authors: Erhan Berk
Abstract:
The purpose of this study is to examine the efficiency and productivity of the health care sector in Turkey based on four years of cross-sectional health care data. Efficiency measures are calculated by a nonparametric approach known as Data Envelopment Analysis (DEA), and productivity is measured by the Malmquist index. The research shows how the DEA-based Malmquist productivity index can be used to appraise the technology and productivity changes realized in Turkish hospitals located all across the country.
Keywords: data envelopment analysis, efficiency, health care, Malmquist index
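An input-oriented CCR DEA efficiency score, the building block of the Malmquist index, can be sketched as a small linear program (assuming SciPy's `linprog` is available; the three-hospital data set is a toy example, not the Turkish data):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, unit):
    """Input-oriented CCR DEA efficiency of one decision-making unit:
    minimize theta s.t. a nonnegative composite of peers uses at most
    theta times the unit's inputs and produces at least its outputs.
    inputs/outputs are (n_dmu, n_dim) arrays."""
    n = inputs.shape[0]
    c = np.zeros(n + 1)            # decision vector: [theta, lambda_1..n]
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(inputs.shape[1]):       # composite input <= theta * x0
        A_ub.append(np.r_[-inputs[unit, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(outputs.shape[1]):      # composite output >= y0
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[unit, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Three hospitals: inputs (staff, beds) -> output (patients treated)
X = np.array([[10.0, 20.0], [20.0, 40.0], [10.0, 20.0]])
Y = np.array([[100.0], [200.0], [50.0]])
eff = [ccr_efficiency(X, Y, k) for k in range(3)]
```

Solving the same program against each period's frontier and taking ratios of the resulting distance functions yields the Malmquist productivity index used in the study.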
Procedia PDF Downloads 335
24069 Small Scale Mobile Robot Auto-Parking Using Deep Learning, Image Processing, and Kinematics-Based Target Prediction
Authors: Mingxin Li, Liya Ni
Abstract:
Autonomous parking is a valuable feature applicable to many robotics applications such as tour guide robots, UV sanitizing robots, food delivery robots, and warehouse robots. With auto-parking, the robot can park at the charging zone and charge itself without human intervention. Compared to self-driving vehicles, auto-parking is more challenging for a small-scale mobile robot equipped only with a front camera, because the camera view is limited by the robot’s height and the narrow Field of View (FOV) of the inexpensive camera. In this research, auto-parking of a small-scale mobile robot with only a front camera was achieved in a four-step process. Firstly, transfer learning was performed on AlexNet, a popular pre-trained convolutional neural network (CNN). It was trained with 150 pictures of empty parking slots and 150 pictures of occupied parking slots from the view angle of a small-scale robot, with 70% of the images used for training and the remaining 30% for validation. An average success rate of 95% was achieved. Secondly, the image of the detected empty parking space was processed with edge detection, followed by computation of parametric representations of the boundary lines using the Hough transform algorithm. Thirdly, the positions of the entrance point and center of the available parking space were predicted from the robot kinematic model as the robot drove closer to the parking space, because the boundary lines disappeared partially or completely from its camera view due to the height and FOV limitations. The robot used its wheel speeds to compute the position of the parking space with respect to its changing local frame as it moved along, based on its kinematic model. Lastly, the predicted entrance point of the parking space was used as the reference for the motion control of the robot until it was replaced by the actual center once that became visible again to the robot.
The linear and angular velocities of the robot chassis center were computed based on the error between the current chassis center and the reference point. The left and right wheel speeds were then obtained using inverse kinematics and sent to the motor driver. The four subtasks described above were all successfully accomplished, with the transfer learning, image processing, and target prediction performed in MATLAB, while the motion control and image capture ran on a self-built small-scale differential-drive mobile robot. The robot employs a Raspberry Pi board, a Pi camera, an L298N dual H-bridge motor driver, a USB power module, a power bank, four wheels, and a chassis. Future research includes three areas: the integration of all four subsystems into one hardware/software platform with an upgrade to an Nvidia Jetson Nano board, which provides superior performance for deep learning and image processing; more testing and validation of the identification of available parking spaces and their boundary lines; and improvement of performance after the hardware/software integration is completed.
Keywords: autonomous parking, convolutional neural network, image processing, kinematics-based prediction, transfer learning
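The inverse-kinematics step for a differential-drive chassis reduces to two lines; the wheel radius and track width below are illustrative values, not the robot's actual dimensions:

```python
def wheel_speeds(v, omega, wheel_radius, track_width):
    """Inverse kinematics of a differential-drive robot: convert a desired
    chassis linear velocity v (m/s) and angular velocity omega (rad/s)
    into left and right wheel angular speeds (rad/s)."""
    v_left = v - omega * track_width / 2.0    # left wheel rim speed
    v_right = v + omega * track_width / 2.0   # right wheel rim speed
    return v_left / wheel_radius, v_right / wheel_radius

# Straight line: both wheels equal; turning left: right wheel spins faster
straight = wheel_speeds(0.2, 0.0, wheel_radius=0.03, track_width=0.15)
turning = wheel_speeds(0.2, 1.0, wheel_radius=0.03, track_width=0.15)
```

In the system described, v and omega would come from the controller tracking the error to the predicted entrance point, and the two returned speeds would be sent to the L298N motor driver.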
Procedia PDF Downloads 132
24068 The Relationship of Lean Management Principles with Lean Maturity Levels: Multiple Case Study in Manufacturing Companies
Authors: Alexandre D. Ferraz, Dario H. Alliprandini, Mauro Sampaio
Abstract:
Companies and other institutions are constantly seeking better organizational performance and greater competitiveness, and there are many tools, methodologies, and models for increasing performance toward this purpose. The Lean Management approach, however, seems to be the most effective at achieving a significant improvement in productivity relatively quickly. Although Lean tools are relatively easy to understand and implement in different contexts, many organizations are not able to transform themselves into 'Lean companies'. Most implementation efforts have yielded isolated benefits, failing to achieve the desired impact on the performance of the overall enterprise system. There is also a growing perception of the importance of management in Lean transformation, but few studies have empirically investigated and described 'Lean Management'. In order to understand more clearly the ideas that guide Lean Management and its influence on the maturity level of the production system, the objective of this research is to analyze the relationship between Lean Management principles and the Lean maturity level in organizations. The research also analyzes the principles of Lean Management and their relationship with 'Lean culture' and with the results obtained. The research was developed using the case study methodology: three manufacturing units of a German multinational company from the industrial automation segment, located in different countries, were studied in order to allow a better comparison between the practices and the level of maturity in the implementation. The primary source of information was a research questionnaire based on the theoretical review. The research showed that the higher the level of Lean Management principles, the higher the Lean maturity level, the Lean culture level, and the level of Lean results obtained in the organization.
The research also showed that factors such as the time since the application of Lean concepts and company size were not determinant for the level of Lean Management principles and, consequently, for the level of Lean maturity in the organization. The characteristics of the production system showed much more influence on the different evaluated aspects. The research also offers recommendations for the managers of the plants analyzed and suggestions for future research.
Keywords: lean management, lean principles, lean maturity level, lean manufacturing
Procedia PDF Downloads 143
24067 Comparison of Data Mining Models to Predict Future Bridge Conditions
Authors: Pablo Martinez, Emad Mohamed, Osama Mohsen, Yasser Mohamed
Abstract:
Highway and bridge agencies, such as the Ministry of Transportation of Ontario, use the Bridge Condition Index (BCI), defined as the weighted condition of all bridge elements, to determine the rehabilitation priorities for their bridges. Accurate forecasting of the BCI is therefore essential for planning bridge rehabilitation budgets. The large amount of data available on bridge conditions over several years makes traditional mathematical models infeasible as analysis methods. This research study investigates different classification models developed to predict the bridge condition index in the province of Ontario, Canada, based on publicly available data for 2,800 bridges over a period of more than 10 years. Data preparation is a key factor in developing acceptable classification models, even with the simplest one, the k-NN model. All models were tested, compared, and statistically validated via cross-validation and t-tests. A simple k-NN model showed reasonable results (within 0.5% relative error) when predicting the bridge condition in an upcoming year.
Keywords: asset management, bridge condition index, data mining, forecasting, infrastructure, knowledge discovery in databases, maintenance, predictive models
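A bare-bones k-NN predictor of the kind evaluated above can be sketched as follows; the features, neighbour count, and BCI values are hypothetical, chosen only to illustrate the mechanics:

```python
def knn_predict(train, query, k=3):
    """Plain k-NN regression: average the BCI of the k nearest bridges
    in feature space. `train` is a list of (feature_vector, bci) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return sum(bci for _, bci in nearest) / k

# Hypothetical features: (age in years, current BCI); target: next-year BCI
bridges = [((10, 85), 84), ((12, 83), 82), ((11, 84), 83),
           ((40, 60), 57), ((42, 58), 55), ((45, 55), 52)]
pred = knn_predict(bridges, (41, 59))
```

In practice the features would be scaled first, since raw distances are dominated by whichever attribute has the largest numeric range; that data-preparation step is exactly what the study flags as key.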
Procedia PDF Downloads 191
24066 Piql Preservation Services - A Holistic Approach to Digital Long-Term Preservation
Authors: Alexander Rych
Abstract:
Piql Preservation Services (“Piql”) is a turnkey solution designed for secure, migration-free long-term preservation of digital data, setting an open standard for long-term preservation for the future. It consists of the equipment and processes needed for writing and retrieving digital data. Exponentially growing amounts of data demand logistically effective and cost-effective processes. Digital storage media (hard disks, magnetic tape) exhibit limited lifetimes, and repetitive data migration to overcome the rapid obsolescence of hardware and software carries an accelerated risk of data loss, corruption, or even manipulation, while adding significant recurring costs for hardware and software investments. Piql stores any kind of data, in digital as well as analog form, securely for 500 years. The medium that provides this is a film reel. Using photosensitive film on a polyester base, a very stable material known for its immutability over hundreds of years, secure and cost-effective long-term preservation can be provided. The film reel itself is stored in packaging capable of protecting the optical storage medium. These components have undergone extensive testing to ensure longevity of up to 500 years. In addition to its durability, film is a true WORM (write once, read many) medium and is therefore resistant to editing or manipulation. Being able to store any form of data on the film makes Piql a superior solution for long-term preservation: paper documents, images, video, or audio sequences can all be preserved in their native file structure. In order to restore the encoded digital data, only a film scanner, a digital camera, or any appropriate optical reading device will be needed in the future. Every film reel includes an index section describing the data saved on the film.
It also contains a content section carrying metadata, enabling users in the future to rebuild software in order to read and decode the digital information.
Keywords: digital data, long-term preservation, migration-free, photosensitive film
Procedia PDF Downloads 392
24065 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs
Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili
Abstract:
OBJECTIVE/SCOPE (25-75): Caliper logging data provides critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run on wireline, which can be time-consuming, costly, and/or challenging in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs from available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform a statistical analysis using an extensive dataset to identify the important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS (75-100): Caliper logs and LWD data from eleven wells, with a total of more than 80,000 data points, were obtained and imported into data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk density, and azimuthal density. The data from the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson’s correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS (100-200): The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson’s correlation coefficients in the range of 0.39 to 0.57, which were relatively high compared to the correlation coefficients of caliper data with the other parameters.
Other parameters, such as porosity, exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data are also presented to provide further insights from the field data. NOVEL/ADDITIVE INFORMATION (25-75): This study offers a unique and novel look into the relative importance and correlation of different LWD measurements with wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling, using LWD data.
Keywords: LWD measurements, caliper log, correlations, analysis
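The Pearson screening step reduces to a one-line computation once the logs are loaded. The synthetic density and caliper series below merely illustrate the calculation; the 0.39-0.57 range reported above comes from the field data, not from this example:

```python
import numpy as np

# Synthetic placeholder logs: caliper depends (negatively) on bulk
# density plus noise, mimicking washout-prone low-density intervals.
rng = np.random.default_rng(1)
bulk_density = rng.normal(2.4, 0.1, size=500)            # g/cc
caliper = 8.5 - 2.0 * (bulk_density - 2.4) \
    + rng.normal(0, 0.15, size=500)                      # inches

# Pearson correlation coefficient between the two logs
r = np.corrcoef(bulk_density, caliper)[0, 1]
```

Running the same computation for each candidate LWD parameter against the measured caliper, well by well, reproduces the screening table the paper describes.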
Procedia PDF Downloads 121
24064 Inversion of Gravity Data for Density Reconstruction
Authors: Arka Roy, Chandra Prakash Dubey
Abstract:
Inverse problems are generally used to recover hidden information from available outside data; here, the vertical component of the gravity field is used to calculate the underlying density structure. Ill-posedness is the main obstacle for any inverse problem, so linear regularization using the Tikhonov formulation is applied, with an appropriate choice of SVD and GSVD components. For real data, the noise level must be low relative to the signal for a reliable solution. In our study, 2D and 3D synthetic models with rectangular grids are used for gravity field calculation and its corresponding inversion for density reconstruction. A fine grid is also considered in order to capture any irregular structure. Keeping the algebraic ambiguity factor in mind, the number of observation points should exceed the number of model parameters. A Picard plot is presented here for choosing the appropriate, or main controlling, eigenvalues for a regularized solution. Another important tool is the depth resolution plot (DRP), which is generally used to study how the inversion is influenced by regularization or discretization. Our study further involves the inversion of real gravity data from the Vredefort Dome, South Africa. Applying our method to these data yields a density structure in good agreement with the known formation in that region, which provides additional support for our method.
Keywords: depth resolution plot, gravity inversion, Picard plot, SVD, Tikhonov formulation
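The Tikhonov-regularized SVD solution referred to above can be sketched as follows. The exponential kernel, noise level, and damping parameter are illustrative choices, not the study's actual setup:

```python
import numpy as np

def tikhonov_solve(G, d, alpha):
    """Tikhonov-regularized least squares via the SVD: the small singular
    values that amplify noise in an ill-posed inversion are damped by the
    filter s / (s^2 + alpha^2)."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    filt = s / (s ** 2 + alpha ** 2)   # regularized inverse singular values
    return Vt.T @ (filt * (U.T @ d))

# Toy 1D smoothing kernel and a compact dense body as the true model
rng = np.random.default_rng(0)
n = 20
G = np.array([[np.exp(-0.3 * abs(i - j)) for j in range(n)]
              for i in range(n)])
m_true = np.zeros(n)
m_true[8:12] = 1.0
d = G @ m_true + rng.normal(0.0, 0.01, size=n)   # noisy observed field
m_est = tikhonov_solve(G, d, alpha=0.05)
err = np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true)
```

Plotting |uᵢᵀd| against the singular values sᵢ from this same SVD gives the Picard plot the abstract mentions, which guides the choice of the damping parameter.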
Procedia PDF Downloads 212
24063 DeepOmics: Deep Learning for Understanding Genome Functioning and the Underlying Genetic Causes of Disease
Authors: Vishnu Pratap Singh Kirar, Madhuri Saxena
Abstract:
Advancement in sequence data generation technologies is churning out voluminous omics data and posing a massive challenge for annotating biological functional features. With so much data available, the use of machine learning methods and tools to make novel inferences has become obvious. Machine learning methods have been successfully applied to many disciplines, including computational biology and bioinformatics, and researchers in computational biology are interested in developing novel machine learning frameworks to classify the huge amounts of biological data. In this proposal, we plan to employ novel machine learning approaches to aid the understanding of how apparently innocuous mutations (in intergenic DNA and at synonymous sites) cause diseases. We are also interested in discovering novel functional sites in the genome, mutations in which can affect a phenotype of interest.
Keywords: genome-wide association studies (GWAS), next generation sequencing (NGS), deep learning, omics
Procedia PDF Downloads 97
24062 An Efficient Data Mining Technique for Online Stores
Authors: Mohammed Al-Shalabi, Alaa Obeidat
Abstract:
In any food store, some items expire or are destroyed because demand for them is infrequent, so a system is needed that can help the decision maker create an offer on such items, improving their demand by bundling them with a frequent item at a reduced price to avoid losses. The system generates hundreds or thousands of patterns (offers) for each low-demand item, then uses association rules (support, confidence) to find the interesting patterns, i.e., the best offer, achieving the lowest losses. In this paper, we propose a data mining method for determining the best offer by merging data mining techniques with an e-commerce strategy. The task is to build a model to predict the best offer; the goal is to maximize the profits of a store and avoid the loss of products. The idea in this paper is the use of association rules in marketing in combination with e-commerce.
Keywords: data mining, association rules, confidence, online stores
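The support and confidence measures driving the offer selection can be computed directly; the baskets below are toy data, with `saffron` standing in for a slow-moving item:

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support and confidence for the association rule
    antecedent -> consequent over a list of transactions."""
    a, c = set(antecedent), set(consequent)
    n = len(transactions)
    n_a = sum(1 for t in transactions if a <= set(t))
    n_ac = sum(1 for t in transactions if (a | c) <= set(t))
    support = n_ac / n                       # freq of the joint itemset
    confidence = n_ac / n_a if n_a else 0.0  # P(consequent | antecedent)
    return support, confidence

baskets = [["bread", "milk"], ["bread", "milk", "saffron"],
           ["milk"], ["bread", "milk"], ["bread", "saffron"]]
# How often does a bread purchase pull the slow-moving saffron along?
support, confidence = rule_metrics(baskets, ["bread"], ["saffron"])
```

Ranking candidate bundles (frequent item, slow item) by these two metrics is one plausible way to pick the "best offer" the method searches for.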
Procedia PDF Downloads 41024061 Elemental Graph Data Model: A Semantic and Topological Representation of Building Elements
Authors: Yasmeen A. S. Essawy, Khaled Nassar
Abstract:
With the rapid increase in complexity in the building industry, professionals in the A/E/C industry were forced to adopt Building Information Modeling (BIM) in order to enhance communication between the different project stakeholders throughout the project life cycle and create a semantic object-oriented building model that can support geometric-topological analysis of building elements during design and construction. This paper presents a model that extracts topological relationships and geometrical properties of building elements from an existing fully designed BIM and maps this information into a directed acyclic Elemental Graph Data Model (EGDM). The model incorporates BIM-based search algorithms for automatic deduction of geometrical data and topological relationships for each building element type. Using graph search algorithms, such as Depth First Search (DFS) and topological sorting, all possible construction sequences can be generated and compared against production and construction rules to generate an optimized construction sequence and its associated schedule. The model is implemented on a C# platform.Keywords: building information modeling (BIM), elemental graph data model (EGDM), geometric and topological data models, graph theory
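The DFS/topological-sorting step that enumerates candidate construction sequences can be sketched as follows. The element names and precedence edges here are hypothetical, and the actual model is implemented in C#; this Python sketch only illustrates the graph-search idea:

```python
def all_topological_orders(nodes, edges):
    """Enumerate every topological ordering of a DAG by DFS backtracking.

    edges: pairs (u, v) meaning element u must be built before element v.
    """
    succ = {n: [] for n in nodes}
    indegree = {n: 0 for n in nodes}
    for u, v in edges:
        succ[u].append(v)
        indegree[v] += 1

    results, order, placed = [], [], set()

    def backtrack():
        if len(order) == len(nodes):
            results.append(list(order))
            return
        for n in nodes:
            if n not in placed and indegree[n] == 0:
                placed.add(n)
                order.append(n)
                for m in succ[n]:
                    indegree[m] -= 1
                backtrack()
                # undo the choice and try the next candidate
                for m in succ[n]:
                    indegree[m] += 1
                order.pop()
                placed.remove(n)

    backtrack()
    return results

# Hypothetical precedence: foundation before columns and slab, both before walls.
elements = ["foundation", "columns", "slab", "walls"]
precedence = [("foundation", "columns"), ("foundation", "slab"),
              ("columns", "walls"), ("slab", "walls")]
sequences = all_topological_orders(elements, precedence)
```

Each returned sequence would then be scored against production and construction rules to pick the optimized one.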
Procedia PDF Downloads 38224060 Wireless Sensor Network for Forest Fire Detection and Localization
Authors: Tarek Dandashi
Abstract:
WSNs may provide a fast and reliable solution for the early detection of environmental events such as forest fires, which is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. We present a WSN, built with TinyOS and nesC, that captures and transmits a variety of sensor information with controlled sources, data rates, and durations, and that records and displays activity traces. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging readings in the reported data, which alters the usual data distribution. Essentially, SD is a metric on the Cumulative Distribution Function (CDF). SD is designed to be invariant to day-to-day changes in temperature, changes due to the surrounding environment, and normal changes in weather, which preserve data locality. Evaluation shows that SD sensitivity is quadratic in the increase in sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations under some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss false negatives and false positives and their impact on decision reliability.Keywords: forest fire, WSN, wireless sensor network, algorithm
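The abstract does not give SD's exact formula, so as one plausible reading of "a metric on the CDF", here is a Kolmogorov-Smirnov-style sketch with made-up temperature readings:

```python
import bisect

def empirical_cdf(sample):
    """Return the empirical CDF of a sample as a callable."""
    xs, n = sorted(sample), len(sample)
    return lambda x: bisect.bisect_right(xs, x) / n

def similarity_distance(reference, current):
    """Max vertical gap between two empirical CDFs (Kolmogorov-Smirnov style)."""
    f, g = empirical_cdf(reference), empirical_cdf(current)
    return max(abs(f(x) - g(x)) for x in set(reference) | set(current))

# Made-up temperature readings: normal conditions vs. a window during ignition.
normal = [21, 22, 22, 23, 21, 22]
heated = [24, 26, 27, 25, 26, 28]
alarm_score = similarity_distance(normal, heated)
```

A node-level detector would raise an alarm when the score exceeds a calibrated threshold, which is one way to stay invariant to slow, locality-preserving drifts while reacting to the abrupt distribution shift a fire causes.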
Procedia PDF Downloads 26224059 A Feasibility Study of Crowdsourcing Data Collection for Facility Maintenance Management
Authors: Mohamed Bin Alhaj, Hexu Liu, Mohammed Sulaiman, Osama Abudayyeh
Abstract:
An effective facility maintenance management (FMM) system plays a crucial role in improving the quality of services and maintaining the facility in good condition. Current FMM relies heavily on the quality of the data collection function of FMM systems, at times resulting in inefficient FMM decision-making. New technology-based crowdsourcing offers great potential to improve current FMM practices, especially in terms of the timeliness and quality of data. This research aims to investigate the feasibility of using new technology-driven crowdsourcing for FMM and to highlight its opportunities and challenges. A survey was carried out to understand the human, data, system, geospatial, and automation characteristics of crowdsourcing for an educational campus FMM via social networks. The survey results were analyzed to reveal the challenges of, and recommendations for, the implementation of crowdsourcing for FMM. This research contributes to the body of knowledge by synthesizing the challenges and opportunities of using crowdsourcing for facility maintenance and by providing a road map for applying crowdsourcing technology in FMM. In future work, a conceptual framework will be proposed to support data-driven FMM using social networks.Keywords: crowdsourcing, facility maintenance management, social networks
Procedia PDF Downloads 17324058 Challenges and Opportunities: One Stop Processing for the Automation of Indonesian Large-Scale Topographic Base Map Using Airborne LiDAR Data
Authors: Elyta Widyaningrum
Abstract:
LiDAR data acquisition has been recognized as one of the fastest solutions for providing the basis data for topographic base mapping in Indonesia. The challenge of accelerating the provision of large-scale topographic base maps as a basis for development planning creates an opportunity to implement an automated scheme in the map production process. One-stop processing will also help accelerate map provision, especially conformance with the Indonesian fundamental spatial data catalog derived from ISO 19110 and geospatial database integration. Thus, automated LiDAR classification, DTM generation, and feature extraction will be conducted in one GIS-software environment to form all layers of the topographic base maps. The quality of the automated topographic base map will be assessed and analyzed based on its completeness, correctness, contiguity, consistency, and possible customization.Keywords: automation, GIS environment, LiDAR processing, map quality
Procedia PDF Downloads 36824057 Mixtures of Length-Biased Weibull Distributions for Loss Severity Modelling
Authors: Taehan Bae
Abstract:
In this paper, a class of length-biased Weibull mixtures is presented to model loss severity data. The proposed model generalizes the Erlang mixtures with a common scale parameter, and it shares many important modelling features with the Erlang mixtures, such as the flexibility to fit various data distribution shapes and weak denseness in the class of positive continuous distributions. We show that the asymptotic tail estimate of the length-biased Weibull mixture is Weibull-type, which makes the model effective for fitting loss severity data with heavy-tailed observations. A method of statistical estimation is discussed, with applications to real catastrophic loss data sets.Keywords: Erlang mixture, length-biased distribution, transformed gamma distribution, asymptotic tail estimate, EM algorithm, expectation-maximization algorithm
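For readers unfamiliar with length biasing, one standard formulation (an assumption here; the paper's exact parameterization may differ) takes a base density $f$ with finite mean $\mu$ and reweights it by size:

```latex
% Length-biased version of a positive density f with finite mean \mu = E[X]:
f_{LB}(x) = \frac{x\, f(x)}{\mu}, \qquad x > 0.

% With Weibull components f_j of shape k_j and a common scale \lambda,
% component means E_j[X] = \lambda\,\Gamma(1 + 1/k_j), and mixture weights
% \pi_j with \sum_j \pi_j = 1, the mixture density becomes
f_{\mathrm{mix}}(x) = \sum_j \pi_j\, \frac{x\, f_j(x)}{\lambda\,\Gamma(1 + 1/k_j)}.
```

The weights and component parameters can then be fitted with the expectation-maximization (EM) algorithm named in the keywords.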
Procedia PDF Downloads 22424056 Planning the Journey of Unifying Medical Record Numbers in Five Facilities and the Expected Challenges: Case Study in Saudi Arabia
Authors: N. Al Khashan, H. Al Shammari, W. Al Bahli
Abstract:
Patients who are eligible to receive treatment at the National Guard Health Affairs (NGHA), Saudi Arabia will typically have four medical record numbers (MRNs), one in each of the geographical areas. More hospitals and primary healthcare facilities in other geographical areas will launch soon, which means more MRNs. When patients hold four MRNs, major drawbacks in quality of care follow, such as creating new medical files in different regions for relocated patients and using a referral system among regions. Consequently, access to a patient's medical record from other regions and the interoperability of health information between the four hospitals' information systems are challenging. Thus, there is a need to unify medical records among these five facilities. As part of the effort to increase the quality of care, a new Hospital Information System (HIS) was implemented in all NGHA facilities by the end of 2016. NGHA's plan is aligned with the Saudi Arabian National Transformation Program 2020, whereby 70% of the citizens and residents of Saudi Arabia would have a unified medical record number that enables transactions between multiple Electronic Medical Record (EMR) vendors. The aim of the study is to explore the plan, challenges, and barriers of unifying the four MRNs into one Enterprise Patient Identifier (EPI) in NGHA hospitals by December 2018. A descriptive study methodology was used. A journey map and a project plan were created for the project team to follow to ensure a smooth implementation of the EPI. They include the following: 1) an approved project charter, 2) a project management plan, 3) a change management plan, and 4) project milestone dates. Currently, the HIS uses the regional MRN. Therefore, the HIS and all integrated healthcare systems in all regions will need modification to move from the MRN to the EPI without interfering with patient care.
For now, NGHA has successfully implemented an EPI linked to the four MRNs, which operate in the back end of the systems' database.Keywords: consumer health, health informatics, hospital information system, universal medical record number
Procedia PDF Downloads 19624055 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, an Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and the chaotic map is then used to spread the watermark signal in the middle band of the DCT coefficients of the cover image. The chaotic map can serve as a pseudo-random generator for digital data hiding, increasing security and robustness. Performance evaluation of the robustness and imperceptibility of the proposed algorithm has been carried out using bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) values for different watermark and cover images, such as the Lena, Girl, and Tank images, and different gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as noise addition, low-pass filtering, and cropping, compared to other existing algorithms using DCT coefficients. Moreover, recovering watermarks with the proposed algorithm does not require the original cover image.Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform
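The Arnold cat map scrambling step can be sketched directly. This toy version operates on a tiny pixel matrix rather than a real watermark image, and the iteration count is illustrative:

```python
def arnold_cat_map(image):
    """One iteration of the Arnold cat map on a square N x N pixel array."""
    n = len(image)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            # linear map (x, y) -> (x + y, x + 2y) taken modulo N
            out[(x + y) % n][(x + 2 * y) % n] = image[x][y]
    return out

def scramble(image, iterations):
    for _ in range(iterations):
        image = arnold_cat_map(image)
    return image

# The map is periodic: after enough iterations the image returns to itself,
# which is how the receiver can descramble the watermark.
watermark = [[1, 2], [3, 4]]
scrambled = scramble(watermark, 1)
```

In the scheme described above, the scrambled watermark (not this toy matrix) is what gets spread into the mid-band DCT coefficients.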
Procedia PDF Downloads 51524054 An Empirical Investigation of Big Data Analytics: The Financial Performance of Users versus Vendors
Authors: Evisa Mitrou, Nicholas Tsitsianis, Supriya Shinde
Abstract:
In the age of digitisation and globalisation, businesses have shifted online and are investing in big data analytics (BDA) to respond to changing market conditions and sustain their performance. Our study shifts the focus from the adoption of BDA to the impact of BDA on financial performance. We explore the financial performance of both BDA-vendors (business-to-business) and BDA-clients (business-to-customer). We distinguish between the five BDA-technologies (big-data-as-a-service (BDaaS), descriptive, diagnostic, predictive, and prescriptive analytics) and discuss them individually. Further, we use four perspectives (internal business process, learning and growth, customer, and finance) and discuss the significance of how each of the five BDA-technologies affects the performance measures of these four perspectives. We also present the analysis of employee engagement, average turnover, average net income, and average net assets for BDA-clients and BDA-vendors. Our study also explores the effect of the COVID-19 pandemic on business continuity for both BDA-vendors and BDA-clients.Keywords: BDA-clients, BDA-vendors, big data analytics, financial performance
Procedia PDF Downloads 12424053 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data
Authors: Saeid Gharechelou, Ryutaro Tateishi
Abstract:
Earthquakes are inevitable catastrophic natural disasters. Damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the Kathmandu valley, which was hit hard by the 2015 Nepal earthquake. Although many existing studies are based on optical data, or suggest the combined use of optical and SAR data for improved accuracy, cloud-free optical images are not assured when they are urgently needed. This research therefore specializes in developing a SAR-based technique targeting rapid and accurate geospatial reporting; considering the limited time available in a post-disaster situation, it offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. The InSAR coherence of pre-seismic, co-seismic, and post-seismic pairs was used to detect change in the damaged area. In addition, ground truth data collected in the field were applied to the optical data in a random forest classification to detect the damaged area, and were used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the integrated optical-SAR data; however, since cloud-free images are not assured when urgently needed for an earthquake event, further research on improving SAR-based damage detection is suggested. It is expected that quick reporting of the post-disaster damage situation, quantified by rapid earthquake assessment, will assist in channelling rescue and emergency operations and in informing the public about the scale of the damage.Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid damage monitoring, 2015-Nepal earthquake
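The InSAR coherence used for change detection is conventionally the normalized cross-correlation magnitude of two co-registered SLC windows. A minimal sketch with made-up complex pixel values (not Sentinel-1A data):

```python
import math

def coherence(s1, s2):
    """InSAR coherence magnitude of two co-registered SLC pixel windows.

    Returns a value in [0, 1]; low coherence between the pre- and
    co-seismic pair flags surface change such as building damage.
    """
    num = abs(sum(a * b.conjugate() for a, b in zip(s1, s2)))
    den = math.sqrt(sum(abs(a) ** 2 for a in s1) *
                    sum(abs(b) ** 2 for b in s2))
    return num / den

# Identical windows are fully coherent; phase-scrambled (damaged) ones are not.
stable = [1 + 1j, 2 - 1j, 0.5 + 0.2j]
changed = [1 + 0j, -1 + 0j, 1 + 0j]
high = coherence(stable, stable)
low = coherence(stable, changed)
```

A damage map is then obtained by thresholding the drop in coherence from the pre-seismic pair to the co-seismic pair, pixel window by pixel window.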
Procedia PDF Downloads 17224052 Scheduling Nodes Activity and Data Communication for Target Tracking in Wireless Sensor Networks
Authors: AmirHossein Mohajerzadeh, Mohammad Alishahi, Saeed Aslishahi, Mohsen Zabihi
Abstract:
In this paper, we consider sensor nodes with the capability of measuring bearings (the relative angle to the target). We use geometric methods to select a set of observer nodes responsible for collecting data from the target. Considering the characteristics of target tracking applications, it is clear that significant numbers of sensor nodes are usually inactive. Therefore, in order to minimize total network energy consumption, a set of sensor nodes, called sentinels, is periodically selected for monitoring, controlling the environment, and transmitting data through the network; the other nodes remain inactive. Furthermore, the proposed algorithm provides joint scheduling and routing to transmit data between network nodes and the fusion center (FC), which not only provides an efficient way to estimate the target position but also enables efficient target tracking. Performance evaluation confirms the superiority of the proposed algorithm.Keywords: coverage, routing, scheduling, target tracking, wireless sensor networks
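The bearings-only measurement model can be illustrated by triangulating a target from two observers. The geometry below (positions and angles) is hypothetical and is not the paper's estimator, just the basic fix that bearing measurements make possible:

```python
import math

def locate_from_bearings(p1, bearing1, p2, bearing2):
    """Triangulate a target from two observer positions and measured bearings.

    Bearings are angles in radians from the positive x-axis; the fix is the
    intersection of the two bearing rays, found by solving a 2x2 linear system
    p1 + t*d1 = p2 + s*d2 for the ray parameters t and s.
    """
    (x1, y1), (x2, y2) = p1, p2
    d1x, d1y = math.cos(bearing1), math.sin(bearing1)
    d2x, d2y = math.cos(bearing2), math.sin(bearing2)
    det = d2x * d1y - d1x * d2y
    if abs(det) < 1e-12:
        raise ValueError("parallel bearings: no unique fix")
    t = (d2x * (y2 - y1) - d2y * (x2 - x1)) / det
    return (x1 + t * d1x, y1 + t * d1y)

# Hypothetical geometry: observers at (0, 0) and (2, 0) sight the same target
# at 45 and 135 degrees respectively; the rays cross at (1, 1).
fix = locate_from_bearings((0.0, 0.0), math.pi / 4, (2.0, 0.0), 3 * math.pi / 4)
```

With more than two observers, a least-squares version of the same idea improves the estimate, which is one reason observer-set selection matters.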
Procedia PDF Downloads 37824051 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Real-estate value estimation is difficult for laymen and is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Building on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for deriving machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyzes and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
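The study built its model in RapidMiner Studio; as a language-level illustration of "predicting by reference to previous examples" with weighted factors, here is a k-nearest-neighbor sketch. The feature names, weights, and prices are invented, not the study's data:

```python
def predict_price(query, examples, k=3):
    """Predict a price from the k most similar past sales (a k-NN sketch).

    Features: (floor_area_m2, floor_level, building_age_years).
    Each example is (features, price). Features are weighted to reflect
    their influence on price, in the spirit of weighted-value analysis.
    """
    weights = (1.0, 0.5, 2.0)  # hypothetical influence weights per feature

    def distance(a, b):
        return sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b)) ** 0.5

    nearest = sorted(examples, key=lambda ex: distance(query, ex[0]))[:k]
    return sum(price for _, price in nearest) / k

# Hypothetical past sales: (features, price in million KRW)
sales = [
    ((84, 10, 5), 900),
    ((84, 3, 5), 850),
    ((59, 10, 20), 600),
    ((115, 15, 2), 1300),
]
estimate = predict_price((84, 8, 5), sales, k=2)
```

The verification step described above corresponds to holding out some sales, predicting them, and comparing predictions against the actual price increases.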
Procedia PDF Downloads 37424050 Blockchain Technology Security Evaluation: Voting System Based on Blockchain
Authors: Omid Amini
Abstract:
Nowadays, technology plays the most important role in people's lives: people use technology to share data and to communicate with each other. The challenge is the security of this data. As more people turn to technology, more data is generated, and more hackers try to steal or infiltrate it. In addition, data under the control of a central authority carries the risk of information being lost or changed, which can create widespread anxiety for people in different communities. In this paper, we investigate blockchain technology, which can guarantee information security and eliminate the challenge of central-authority access to information. People today suffer from the shortcomings of current voting systems: the lack of transparency in voting is a big problem for society and government in most countries, but blockchain technology can be the best alternative to previous voting methods because it removes the most important challenge in voting. According to the results, this research can be a good starting point for getting acquainted with this new technology, especially its security aspects, and with how a blockchain-based voting system could be used in practice. This research concludes that the use of blockchain technology can solve the major security problem and lead to a secure and transparent election.Keywords: blockchain, technology, security, information, voting system, transparency
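The tamper-evidence property that makes blockchain attractive for voting can be shown with a minimal hash chain. This sketch is illustrative only and omits consensus, signatures, and voter anonymity:

```python
import hashlib
import json

def make_block(votes, prev_hash):
    """Create a block whose hash commits to its votes and its predecessor."""
    body = json.dumps({"votes": votes, "prev": prev_hash}, sort_keys=True)
    return {"votes": votes, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash; any tampered vote breaks all later links."""
    for i, block in enumerate(chain):
        body = json.dumps({"votes": block["votes"], "prev": block["prev"]},
                          sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["ballot-001: candidate A"], prev_hash="0" * 64)
chain = [genesis, make_block(["ballot-002: candidate B"], genesis["hash"])]
```

Because each block's hash covers the previous block's hash, altering an earlier recorded vote invalidates every subsequent block, which is the transparency guarantee the abstract appeals to.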
Procedia PDF Downloads 13224049 Stakeholder Perception in the Role of Short-term Accommodations on the Place Brand and Real Estate Development of Urban Areas: A Case Study of Malate, Manila
Authors: Virgilio Angelo Gelera Gener
Abstract:
This study investigates the role of short-term accommodations in the place brand and real estate development of urban areas. It aims to capture the perceptions of the general public, real estate developers, and city- and barangay-level local government units (LGUs) on how these lodgings affect the place brand and land value of a community. It likewise attempts to identify the personal and institutional variables having the greatest influence on said perceptions, in order to provide a better understanding of these establishments and their relevance within urban localities. Drawing on several sources, Malate, Manila was identified as the ideal study area of the thesis. This prompted the employment of mixed methods research as the study's fundamental data gathering and analytical tool. A survey of 350 locals was conducted, asking questions that would answer the aforementioned queries. Thereafter, a Pearson Chi-square test and Multinomial Logistic Regression (MLR) were utilized to determine the variables affecting their perceptions. There were also Focus Group Discussions (FGDs) with the three (3) most populated Malate barangays, as well as Key Informant Interviews (KIIs) with selected city officials and fifteen (15) real estate company representatives. The survey results showed that although a 1992 Department of Tourism (DOT) Circular regards short-term accommodations as lodgings mainly for travelers, most people actually use them for private/intimate moments. Accordingly, the survey further revealed that short-term accommodations carry a negative place brand among the respondents, though respondents also believe they remain among society's most important economic players. Statistics from the Pearson Chi-square test, in turn, indicate that fourteen (14) out of seventeen (17) variables exhibit great influence on respondents' perceptions.
MLR findings, meanwhile, show that being born in Malate and being part of a family household were the most significant variables, regardless of socio-economic level and monthly household income. From the city officials, it was revealed that said lodgings are actually the second-highest earners in the City's lodging industry. It was further stated that the zoning ordinance treats short-term accommodations like any other lodging enterprise, so it is perfectly legal for these establishments to situate themselves near residential areas and/or institutional structures. A sit-down with the barangays, on the other hand, recognized the economic benefits of short-term accommodations but likewise admitted that they contribute a negative place brand to the community. Lastly, real estate developers are amenable to having their projects built near short-term accommodations, as they hold no negative views of them; they explained that their project sites have always been motivated by suitability, liability, and marketability factors only. Overall, these findings merit a recalibration of the zoning ordinance and the DOT Circular, as well as the imposition of regulations on these establishments' sexually suggestive roadside advertisements. Once the relevant measures are refined for proper implementation, they can also pave the way for spatial interventions (like visual buffer corridors) that better address the needs of locals, private groups, and government.Keywords: estate planning, place brand, real estate development, short-term accommodations
Procedia PDF Downloads 165