Search results for: real world driving data

31659 The Impact of Behavioral Factors on the Decision Making of Real Estate Investor of Pakistan

Authors: Khalid Bashir, Hammad Zahid

Abstract:

Most investors consider economic and financial information to be the most important input when making investment decisions. However, over the past two decades, behavioral aspects and behavioral biases have gained an important place in the decision-making process of investors, and this study is built on that premise. The purpose of this study is to examine the impact of behavioral factors on the decision-making of individual real estate investors in Pakistan. Important behavioral factors such as overconfidence, anchoring, gambler's fallacy, home bias, loss aversion, regret aversion, mental accounting, herding and representativeness are used in this study to find their impact on the psychology of individual investors. The targeted population is real estate investors in Pakistan, and a sample of 650 investors was selected on the basis of a convenience sampling technique. The data were collected through a questionnaire with a response rate of 46.15%. Descriptive statistical techniques and SEM are used to analyze the data using statistical software. The results revealed that some behavioral factors have a significant impact on the decision-making of investors. Among all the behavioral biases, overconfidence, anchoring, gambler's fallacy, loss aversion and representativeness have a significant positive impact on the decision-making of the individual investor, while the remaining biases, namely home bias, regret aversion, mental accounting and herding, have less impact on the decision-making process of an individual.

Keywords: behavioral finance, anchoring, gambler’s fallacy, loss aversion

Procedia PDF Downloads 51
31658 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering

Authors: K. Umbleja, M. Ichino

Abstract:

Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as the datasets in need of analysis become ever larger. One type of symbolic data is the histogram, which makes it possible to store huge amounts of information in a single variable with a high level of granularity. Other types of symbolic data can also be described as histograms, which makes the histogram a very important and general symbolic data type: a method developed for histograms can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method that allows two histogram-valued variables to be compared and therefore a dissimilarity between two histograms to be found. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows histograms with different numbers of bins and different bin widths (so-called general histograms) to be compared. The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, together with the concept of cluster compactness to measure the quality of clustering. The method is then validated by application to real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results for general histograms without loss of detail or the need to transform the data.
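As a rough illustration of the general-histogram comparison described above, the sketch below aligns two histograms with different bin edges onto a common grid via their cumulative distributions and computes a simple dissimilarity. It is not the Ichino-Yaguchi measure itself, and the bin edges and helper names are hypothetical.

```python
import numpy as np

def histogram_dissimilarity(edges_a, counts_a, edges_b, counts_b, grid_size=200):
    """Compare two histograms with different bins by aligning their CDFs
    on a common grid (a simplified stand-in for a general-histogram measure)."""
    def cdf_on(edges, counts, grid):
        probs = np.asarray(counts, float) / np.sum(counts)
        cdf = np.concatenate([[0.0], np.cumsum(probs)])
        # piecewise-linear CDF within bins
        return np.interp(grid, np.asarray(edges, float), cdf)

    lo = min(edges_a[0], edges_b[0])
    hi = max(edges_a[-1], edges_b[-1])
    grid = np.linspace(lo, hi, grid_size)
    cdf_a = cdf_on(edges_a, counts_a, grid)
    cdf_b = cdf_on(edges_b, counts_b, grid)
    # mean absolute CDF difference (a Wasserstein-like dissimilarity)
    return np.trapz(np.abs(cdf_a - cdf_b), grid) / (hi - lo)

# Example: two "general histograms" with different bin counts and widths
d = histogram_dissimilarity([0, 2, 5, 10], [3, 5, 2], [0, 1, 4, 6, 10], [1, 4, 3, 2])
print(round(d, 4))
```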

Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis

Procedia PDF Downloads 146
31657 Real-Time Kinetic Analysis of Labor-Intensive Repetitive Tasks Using Depth-Sensing Camera

Authors: Sudip Subedi, Nipesh Pradhananga

Abstract:

Musculoskeletal disorders (MSDs) are common in construction workers. MSDs include lower back injuries, knee injuries, spinal injuries, and joint injuries, among others. Since most construction tasks are still manual, construction workers often need to perform repetitive, labor-intensive tasks, and they need to stay in the same or an awkward posture for an extended time while performing such tasks. This induces significant stress on the joints and spine, increasing the risk of developing MSDs. Manual monitoring of such tasks is virtually impossible with the handful of safety managers on a construction site. This paper proposes a methodology for performing kinetic analysis of working postures during such tasks in real time. The skeletons of different workers will be tracked using a depth-sensing camera while performing the task to create training data for identifying the best posture. For this, the kinetic analysis will be performed using a human musculoskeletal model in an open-source software system (OpenSim) to visualize the stress induced at essential joints. The "safe posture" inducing the lowest stress on essential joints will be computed for the different actions involved in the task. The identified "safe posture" will serve as a basis for real-time monitoring and identification of awkward and unsafe postural behaviors of construction workers. In addition, a temporal simulation will be carried out to find the associated long-term effects of repetitive exposure to the observed postures. This will help to create awareness in workers about potential future health hazards and encourage them to work safely. Furthermore, the collected individual data can then be used to provide need-based personalized training to construction workers.
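As a minimal sketch of the posture-tracking step described above, the function below computes a joint angle (for example, at the knee) from three 3D keypoints of a tracked skeleton. The keypoint names and values are assumptions, and the full OpenSim-based kinetic analysis is not reproduced here.

```python
import numpy as np

def joint_angle(p_proximal, p_joint, p_distal):
    """Angle (degrees) at p_joint formed by the two adjacent skeleton segments,
    e.g. hip-knee-ankle keypoints from a depth-sensing camera."""
    a = np.asarray(p_proximal, float) - np.asarray(p_joint, float)
    b = np.asarray(p_distal, float) - np.asarray(p_joint, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical keypoints (metres) for one frame: a deep knee flexion
hip, knee, ankle = [0.0, 1.0, 0.0], [0.1, 0.55, 0.1], [0.0, 0.1, 0.0]
angle = joint_angle(hip, knee, ankle)
print(f"knee angle: {angle:.1f} deg")  # flag frames outside a 'safe posture' range
```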

Keywords: construction workers’ safety, depth sensing camera, human body kinetics, musculoskeletal disorders, real time monitoring, repetitive labor-intensive tasks

Procedia PDF Downloads 111
31656 Triplex Detection of Pistacia vera, Arachis hypogaea and Pisum sativum in Processed Food Products Using Probe Based PCR

Authors: Ergün Şakalar, Şeyma Özçirak Ergün, Emrah Yalazi̇, Emine Altinkaya, Cengiz Ataşoğlu

Abstract:

In recent years, food allergies, which cause serious health problems, have affected public health around the world. Foodstuffs which contain allergens are either intentionally used as ingredients or are present as contaminants in food products. The prevalence of clinical allergy to peanuts and nuts is estimated at about 0.4%-1.1% of the adult population, with allergy to pistachio representing 7% of the cases of tree nuts causing allergic reactions. In order to protect public health and enforce the legislation, methods for the sensitive analysis of pistachio and peanut contents in food are required. Pea, pistachio and peanut are used together to reduce the cost of producing foods such as baklava and snack foods. DNA technology-based methods in food analysis are well-established and well-rounded tools for species differentiation and allergen detection. In particular, the probe-based TaqMan real-time PCR assay can amplify target DNA with efficiency, specificity, and sensitivity. In this study, pistachio, peanut and pea were finely ground, and three separate series of triplet mixtures containing 0.1, 1, 10, 100, 1000, 10,000 and 100,000 mg kg-1 of each sample were prepared for each series, to a final weight of 100 g. DNA from reference samples and industrial products was successfully extracted with the GIDAGEN® Multi-Fast DNA Isolation Kit. TaqMan probes were designed for triplex determination of the ITS, Ara h 3 and pea lectin genes, which are specific regions for the identification of pistachio, peanut and pea, respectively. Quantitative real-time PCR detected pistachio, peanut and pea in these mixtures down to the lowest investigated levels of 0.1, 0.1 and 1 mg kg-1, respectively. The methods reported here are also capable of detecting as little as 0.001% peanut DNA, 0.000001% pistachio DNA and 0.000001% pea DNA. We conclude that the quantitative triplex real-time PCR method developed in this study can be applied to detect traces of pistachio, peanut and pea, three allergens at once, in commercial food products.
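A common way to turn TaqMan Cq values into quantities is a log-linear standard curve; the short sketch below, with entirely hypothetical Cq values and dilution levels, illustrates that calculation and is not taken from the study itself.

```python
import numpy as np

# Hypothetical standard curve: known DNA amounts (mg kg-1) and measured Cq values
known_amounts = np.array([0.1, 1, 10, 100, 1000, 10000, 100000], dtype=float)
cq_standards = np.array([36.5, 33.2, 29.9, 26.5, 23.1, 19.8, 16.4])

# Fit Cq = slope * log10(amount) + intercept
slope, intercept = np.polyfit(np.log10(known_amounts), cq_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0  # amplification efficiency from the slope

def quantify(cq_sample):
    """Invert the standard curve to estimate the amount in an unknown sample."""
    return 10 ** ((cq_sample - intercept) / slope)

print(f"efficiency ~ {efficiency:.2%}, unknown at Cq 31.0 ~ {quantify(31.0):.2f} mg/kg")
```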

Keywords: allergens, DNA, real-time PCR, TaqMan probe

Procedia PDF Downloads 236
31655 Re-Thinking Design/Build Curriculum in a Virtual World

Authors: Bruce Wrightsman

Abstract:

Traditionally, in architectural education, we develop studio projects with learning agendas that try to minimize conflict and reveal clear design objectives. Knowledge is gleaned only tacitly through confronting the reciprocity of site and form, space and light, structure and envelope. This institutional reality can limit student learning relative to the latent learning opportunities they will have to confront later in practice. One intent of academic design-build projects is to address the learning opportunities that can be discovered in the messy grey areas of design. In this immersive experience, students confront the limitations of classroom learning and are exposed to challenges that demand collaborative practice. As a result, design-build has been widely adopted in an attempt to address perceived deficiencies in design education vis-à-vis the integration of building technology and construction. Hands-on learning is not a new topic: John Dewey posits a debate between static and active learning in his book Democracy and Education and espouses the idea that individuals should become participants and not mere observers of what happens around them. Advocates of academic design-build programs suggest a direct link to Dewey's speculation: these experiences provide irreplaceable life lessons, namely that real-world decisions have real-life consequences. The goal of the paper is not to confirm or refute the legitimacy and efficacy of online virtual learning. Rather, the paper aims to foster a deeper, honest discourse on the meaning of 'making' in architectural education and to present projects that confronted the burdens of a global pandemic and developed unique teaching strategies that challenged design thinking as an observational and constructive effort, in order to expand design students' making skills and foster student agency.

Keywords: design/build, making, remote teaching, architectural curriculum

Procedia PDF Downloads 61
31654 Helping the Development of Public Policies with Knowledge of Criminal Data

Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno

Abstract:

The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as associative algorithms and the extraction of decision-tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area has been defined as the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. Given this, a set of tools will be validated, including the use of a database and tools for visualizing the results. The main deliverables, in terms of products and articles, are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field.
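As a small, hedged illustration of the kind of descriptive technique mentioned above, the sketch below trains a shallow decision tree on a synthetic table of incident records and prints its rules; the feature names and data are invented for the example and do not come from the São Paulo records.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical incident records: hour of day, district index, weekend flag
X = np.column_stack([
    rng.integers(0, 24, 500),   # hour
    rng.integers(0, 10, 500),   # district
    rng.integers(0, 2, 500),    # weekend
])
# Synthetic label: a "recurring pattern" more likely late at night on weekends
y = ((X[:, 0] >= 22) & (X[:, 2] == 1)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["hour", "district", "weekend"]))
```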

Keywords: social data analysis, criminal records, computational techniques, data mining, big data

Procedia PDF Downloads 63
31653 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act

Authors: Maria Jędrzejczak, Patryk Pieniążek

Abstract:

The European legal reality is on the eve of significant change. In European Union law, there is talk of a "fourth industrial revolution", driven by massive data resources linked to powerful algorithms and powerful computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which has prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious and most widely held at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years. It would be impossible for artificial intelligence to function without processing large amounts of data (both personal and non-personal). The main driving force behind the current development of artificial intelligence is advances in computing, but also the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology allows for an increase in the speed and efficiency of the actions taken, but also creates security risks of an unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence requires analysis in terms of its impact on the regulation of personal data protection. It is necessary to determine the mutual relationship between these regulations and which areas of personal data protection regulation are particularly important for processing personal data in artificial intelligence systems. The adopted axis of consideration is a preliminary assessment of two issues: 1) which principles of data protection should be applied when processing personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems should be regulated. The need to change the regulations regarding the rights and obligations of data subjects and entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions regarding the assignment of liability for a breach of personal data protection processed in artificial intelligence systems. The research process concerns the identification of areas in the field of personal data protection that are particularly important, and may require re-regulation, due to the introduction of the proposed legal regulation regarding artificial intelligence. The main question the authors want to answer is how European Union regulation of data protection breaches in artificial intelligence systems is shaping up. The answer will include examples to illustrate the practical implications of these legal regulations.

Keywords: data protection law, personal data, AI law, personal data breach

Procedia PDF Downloads 42
31652 Adaptive Nonparametric Approach for Guaranteed Real-Time Detection of Targeted Signals in Multichannel Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

An adaptive nonparametric method is proposed for stable real-time detection of seismoacoustic sources in multichannel C-OTDR systems with a significant number of channels. This method guarantees given upper bounds on the probabilities of Type I and Type II errors. The properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented in this paper.
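The abstract does not give the detector itself, so the sketch below shows only a generic nonparametric change-point idea in the same spirit: a CUSUM statistic built from the signs of new samples relative to a reference median, with a threshold calibrated for a target false-alarm (Type I) level. All names, parameters and the calibration procedure are assumptions.

```python
import numpy as np

def sign_cusum(reference, stream, threshold, drift=0.1):
    """Sign-based CUSUM: accumulate evidence that new samples exceed the
    reference median and flag a change when the statistic crosses threshold."""
    med = np.median(reference)
    s = 0.0
    for t, x in enumerate(stream):
        s = max(0.0, s + np.sign(x - med) - drift)
        if s > threshold:
            return t  # index of detected change
    return None

def calibrate_threshold(reference, prob_false_alarm=0.01, horizon=500, trials=2000):
    """Monte-Carlo threshold so that P(false alarm within horizon) <= target."""
    rng = np.random.default_rng(1)
    maxima = []
    for _ in range(trials):
        noise = rng.choice(reference, size=horizon, replace=True)
        med, s, peak = np.median(reference), 0.0, 0.0
        for x in noise:
            s = max(0.0, s + np.sign(x - med) - 0.1)
            peak = max(peak, s)
        maxima.append(peak)
    return float(np.quantile(maxima, 1.0 - prob_false_alarm))

rng = np.random.default_rng(0)
ref = rng.normal(0, 1, 1000)                       # quiet channel
stream = np.concatenate([rng.normal(0, 1, 300),    # still quiet
                         rng.normal(1.5, 1, 200)]) # a source appears
thr = calibrate_threshold(ref)
print("detected at sample:", sign_cusum(ref, stream, thr))
```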

Keywords: guaranteed detection, multichannel monitoring systems, change point, interval estimation, adaptive detection

Procedia PDF Downloads 431
31651 Globalization and Women's Social Identity in Iran: A Case Study of Educated Women in the 'World City' of Yazd

Authors: Mohammad Tefagh

Abstract:

The process of globalization has transformed many social and cultural phenomena and has brought the world into a new era and arena. This phenomenon has introduced new methods, ideas, and identity interactions to human beings and has caused great changes in individual and social identity. Women have also been affected by globalization. Globalization has made the presence of women more and more effective and has caused changes in their identity and in its dimensions. The purpose of this study is to investigate the impact of the globalization of culture on changes in the social identity of educated women in the world city of Yazd. The study discusses identity change and identity reconstruction due to globalization. The method of this study is qualitative, and the research data were obtained through in-depth interviews with 15 educated Yazdi women at the Ph.D. level. The method of data analysis is thematic analysis. The findings show that educated Yazdi women have changed their identity due to new communication processes and globalization, including faster, easier, and cheaper communication with other women around the world, both near and far. Women's social identity has also changed in the face of elements of globalization in various dimensions, such as national, gender, religious, and group identities. The analysis of the interviews revealed confronting elements such as the use of new cultural goods and communication technologies, membership in social networks, and increasing awareness of environmental change.

Keywords: globalization, social identity, educated women, Yazd

Procedia PDF Downloads 313
31650 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of bodies during impact has been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force to be proportional to the deformation of the bodies under contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of using the FEM approach is the possibility of applying plastic deformation to the model according to the material definition: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
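For reference, the sketch below evaluates the standard Johnson-Cook flow-stress expression mentioned in the abstract; the numerical parameters are illustrative placeholders rather than values calibrated in the study.

```python
import numpy as np

def johnson_cook_stress(eps_p, eps_rate, temp,
                        A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                        eps_rate_ref=1.0, temp_ref=293.0, temp_melt=1356.0):
    """Johnson-Cook flow stress (Pa):
    sigma = (A + B*eps_p**n) * (1 + C*ln(eps_rate/eps_rate_ref)) * (1 - T*^m)."""
    t_star = np.clip((temp - temp_ref) / (temp_melt - temp_ref), 0.0, 1.0)
    strain_term = A + B * eps_p ** n
    rate_term = 1.0 + C * np.log(np.maximum(eps_rate / eps_rate_ref, 1e-12))
    thermal_term = 1.0 - t_star ** m
    return strain_term * rate_term * thermal_term

# Flow stress at 5% plastic strain, a moderate strain rate and room temperature
print(f"{johnson_cook_stress(0.05, 100.0, 300.0) / 1e6:.1f} MPa")
```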

Keywords: collision, impact models, finite element method, Hertz Theory

Procedia PDF Downloads 157
31649 Online Public Transport Safety Awareness System

Authors: Danny Mwangi, Collins Oduor Ondiek

Abstract:

Mass mobility is one of the most important characteristics of every industrialized civilization. People must move about in order to put food on the table; as a result, movement is an essential part of human life, and people must travel from one place to another. Over time, technological advancements have changed the ways in which people move, and public transit is one of these modes of transportation. When it comes to reducing safety-related risks in the public transport system, awareness is crucial, and this applies equally to public transportation in Kenya. A system that keeps users updated with real-time traffic information on the route they are on, and that also gives them the ability to rate drivers after a trip, could go a long way toward improving safety on Kenyan roads. The proposed system is intended to reduce occurrences of reckless driving and to give matatu drivers the sense that they are accountable to someone, along with the incentive to be better drivers who follow the law and treat passenger safety as a priority. The research was conducted, and the findings show that 95.2% of respondents were not satisfied with the current safety measures in the Kenyan public transport sector. This means that the chances of this system being accepted in the market are high, because it addresses a key issue. 98.8% of the respondents were of the opinion that implementation of the proposed system would significantly increase safety measures in the public transport sector. During the research, it was clear that the main challenge 77.1% of the respondents face when using public transport is that there is no way to monitor driver safety performance, and 68.7% of the respondents believed that the widespread use of unroadworthy public transit vehicles contributed to the lack of safety when using public transport. 77.1% of the respondents expect the benefit of creating a sense of accountability for drivers, and 74.7% expect the benefit of increased passenger safety. 63.9% believe that, with the implementation of the system, there will be the benefit of monitoring driver performance. This shows that, with the implementation of the proposed system, it will be possible to make a lot of progress in making Kenyan roads safer for public transit users. According to the findings, it is recommended that this proposed public transportation safety awareness system be implemented, as it will address matatu passengers' safety concerns while also encouraging matatu drivers to drive more carefully. As a result, it is a project with a chance of becoming viable, marketable, and feasible.

Keywords: public safety, public transportation, accountable driving, safe transportation

Procedia PDF Downloads 82
31648 Unified Coordinate System Approach for Swarm Search Algorithms in Global Information Deficit Environments

Authors: Rohit Dey, Sailendra Karra

Abstract:

This paper aims at solving the problem of multi-target search in a Global Positioning System (GPS) denied environment using swarm robots with limited sensing and communication abilities. Typically, existing swarm-based search algorithms rely on the presence of a global coordinate system (vis-à-vis GPS) that is shared by the entire swarm, which, in turn, limits their application in real-world scenarios. This can be attributed to the fact that robots in a swarm need to share information among themselves regarding their locations and the signals from targets to decide their future course of action, but this information is only meaningful when they all share the same coordinate frame. The paper addresses this very issue by eliminating any dependency of the search algorithm on a predetermined global coordinate frame, through the unification of the relative coordinate frames of individual robots when they are within communication range, therefore making the system more robust in real scenarios. Our algorithm assumes that all the robots in the swarm are equipped with range and bearing sensors and have limited sensing range and communication abilities. Initially, every robot maintains its own relative coordinate frame and follows a Lévy-walk random exploration until it comes into range with other robots. When two or more robots are within communication range, they share sensor information and their locations with respect to their coordinate frames, based on which we unify their coordinate frames. They can then share information about the areas that have already been explored, information about the surroundings, and target signals from their locations to make decisions about their future movement based on the search algorithm. During the process of exploration, there can be several small groups of robots, each with its own coordinate system, but eventually all the robots are expected to be under one global coordinate frame in which they can communicate information on the exploration area following swarm search techniques. Using the proposed method, swarm-based search algorithms can work in a real-world scenario without GPS and without any initial information about the size and shape of the environment. Initial simulation results show that, running our modified Particle Swarm Optimization (PSO) without global information, we can still achieve results comparable to basic PSO working with GPS. In the full paper, we plan to compare different strategies for unifying the coordinate system and to implement them on other bio-inspired algorithms, to work in GPS-denied environments.
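A common way to merge two relative frames, once robots can observe a few common landmarks (or each other), is a 2D rigid alignment; the sketch below estimates the rotation and translation between two frames from corresponding points using an SVD-based (Kabsch-style) fit. It is offered as a generic illustration under that assumption, not as the unification rule used in the paper.

```python
import numpy as np

def align_frames_2d(points_a, points_b):
    """Find rotation R and translation t such that R @ a + t ~= b for
    corresponding landmark observations expressed in two robot frames."""
    A = np.asarray(points_a, float)
    B = np.asarray(points_b, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                                # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # keep a proper rotation
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Hypothetical landmarks seen by robot A, and the same landmarks in robot B's frame
landmarks_a = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0]])
theta = np.deg2rad(40.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
landmarks_b = landmarks_a @ R_true.T + np.array([5.0, -1.0])

R, t = align_frames_2d(landmarks_a, landmarks_b)
print(np.round(R, 3), np.round(t, 3))  # recovers the 40-degree rotation and offset
```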

Keywords: bio-inspired search algorithms, decentralized control, GPS denied environment, swarm robotics, target searching, unifying coordinate systems

Procedia PDF Downloads 119
31647 Speedup Breadth-First Search by Graph Ordering

Authors: Qiuyi Lyu, Bin Gong

Abstract:

Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes' visit frequency: nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this problem is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overhead. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods while the graph ordering overhead is only about 1/15.
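To make the locality idea concrete, here is a small, hedged sketch: nodes are relabelled so that high-degree nodes (a simple stand-in for the paper's visit-frequency model) receive contiguous low IDs, and a standard BFS then runs over the reordered adjacency. The ordering heuristic is ours, not the authors'.

```python
from collections import deque

def reorder_by_degree(adj):
    """Relabel nodes so higher-degree nodes get smaller, contiguous IDs
    (a crude proxy for frequency-based ordering to improve locality)."""
    order = sorted(adj, key=lambda v: -len(adj[v]))
    new_id = {v: i for i, v in enumerate(order)}
    return {new_id[v]: sorted(new_id[u] for u in adj[v]) for v in adj}, new_id

def bfs(adj, source):
    """Standard BFS returning the visit order of reachable nodes."""
    seen, queue, order = {source}, deque([source]), []
    while queue:
        v = queue.popleft()
        order.append(v)
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return order

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2], 4: [2]}
reordered, mapping = reorder_by_degree(graph)
print(bfs(reordered, mapping[0]))
```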

Keywords: breadth-first search, BFS, graph ordering, graph algorithm

Procedia PDF Downloads 119
31646 Analysis of Business Intelligence Tools in Healthcare

Authors: Avishkar Gawade, Omkar Bansode, Ketan Bhambure, Bhargav Deore

Abstract:

In recent years, a wide range of business intelligence (BI) technologies have been applied in different areas in order to support the decision-making process. BI enables the extraction of knowledge from data stores. BI tools are usually used in the public health field for financial and administrative purposes. BI uses a dashboard in the presentation stage to deliver information to end users. In this paper, we analyze some open-source BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. A pervasive BI platform was developed using a real case in order to prove the tools' viability. The analysis of the various BI tools is done with the help of several parameters, such as data security, data integration, data quality, reporting and analytics, performance, scalability and cost effectiveness.

Keywords: CDSS, EHR, business intelligence, tools

Procedia PDF Downloads 122
31645 Row Detection and Graph-Based Localization in Tree Nurseries Using a 3D LiDAR

Authors: Ionut Vintu, Stefan Laible, Ruth Schulz

Abstract:

Agricultural robotics has been developing steadily over recent years, with the goals of reducing and even eliminating the pesticides used in crops and of increasing productivity by taking over human labor. The majority of crops are arranged in rows. The first step towards autonomous robots capable of driving in fields and performing crop-handling tasks is for the robots to robustly detect the rows of plants. Recent work on autonomous driving between plant rows offers large robotic platforms equipped with various expensive sensors as a solution to this problem. These platforms need to be driven over the rows of plants, and this approach lacks flexibility and scalability when it comes to the height of the plants or the distance between rows. This paper instead proposes an algorithm that makes use of cheaper sensors and is more adaptable. The main application is in tree nurseries, where plant height can range from a few centimeters to a few meters and trees are often removed, leading to gaps within the plant rows. The core idea is to combine row detection algorithms with graph-based localization methods as they are used in SLAM. Nodes in the graph represent the estimated poses of the robot, and the edges embed constraints between these poses or between the robot and certain landmarks. This setup aims to improve individual plant detection and to handle exceptions, such as row gaps, which are otherwise falsely detected as the end of a row. Four methods were developed for detecting row structures in the fields, all using a point cloud acquired with a 3D LiDAR as input. Comparing field coverage and the number of damaged plants, the method that uses a local map around the robot proved to perform best, with 68% covered rows and 25% damaged plants. This method is further used and combined with a graph-based localization algorithm, which uses the local map features to estimate the robot's position within the greater field. Testing the upgraded algorithm in a variety of simulated fields shows that the additional information obtained from localization provides a boost in performance over methods that rely purely on perception to navigate. The final algorithm achieved a row coverage of 80% with 27% damaged plants. Future work will focus on achieving a perfect score of 100% covered rows and 0% damaged plants. The main challenges the algorithm needs to overcome are fields where the plants are too small to be detected and fields where it is hard to distinguish between individual plants when they overlap. The method was also tested on a real robot in a small field with artificial plants. The tests were performed using a small robot platform equipped with wheel encoders, an IMU and an FX10 3D LiDAR. Over ten runs, the system achieved 100% coverage and 0% damaged plants. The framework built within the scope of this work can be further used to integrate data from additional sensors, with the goal of achieving even better results.
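As an illustration of the row-detection step, the sketch below fits a dominant line to 2D points (for example, a LiDAR point cloud projected onto the ground plane) with a simple RANSAC loop; the thresholds and synthetic data are assumptions, and this is not the authors' detector.

```python
import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=0.05, seed=0):
    """Fit a 2D line to noisy points (e.g. a plant row in a projected point cloud)."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        normal = np.array([-d[1], d[0]]) / norm
        dist = np.abs((points - p) @ normal)          # point-to-line distances
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refine with least squares on the inliers: y = a*x + b
    x, y = points[best_inliers, 0], points[best_inliers, 1]
    a, b = np.polyfit(x, y, 1)
    return a, b, best_inliers

rng = np.random.default_rng(1)
xs = rng.uniform(0, 10, 120)
row = np.column_stack([xs, 0.5 * xs + 1.0 + rng.normal(0, 0.02, xs.size)])
clutter = rng.uniform([0, 0], [10, 10], size=(40, 2))
a, b, inliers = ransac_line(np.vstack([row, clutter]))
print(f"row direction: y = {a:.2f}x + {b:.2f}, inliers: {inliers.sum()}")
```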

Keywords: 3D LiDAR, agricultural robots, graph-based localization, row detection

Procedia PDF Downloads 122
31644 Numerical Investigation of Turbulent Inflow Strategy in Wind Energy Applications

Authors: Arijit Saha, Hassan Kassem, Leo Hoening

Abstract:

Ongoing climate change demands the increasing use of renewable energies. Wind energy plays an important role in this context, since it can be applied almost everywhere in the world. To reduce the costs of wind turbines and to make them more competitive, simulations are very important, since experiments are often too costly, if possible at all. A wind turbine in a vast open area experiences the turbulence generated by the atmosphere, so it was of utmost interest for this research to generate this turbulence in the computational simulation domain through various inlet turbulence generation methods, such as precursor cyclic and Kaimal Spectrum Exponential Coherence (KSEC). To be able to validate computational fluid dynamics simulations of wind turbines against experimental data, it is crucial to set up the conditions in the simulation as closely to reality as possible. This work therefore aims at investigating the turbulent inflow strategy and boundary conditions of KSEC and providing a comparative analysis alongside the precursor cyclic method for Large Eddy Simulation within the context of wind energy applications. For the generation of the turbulent box through the KSEC method, constrained data were first collected from an auxiliary channel flow, and further processing was performed with the open-source tool PyconTurb, whereas for the precursor cyclic method, the data from the auxiliary channel alone were sufficient. The functionality of these methods was studied through various statistical properties, such as variance and turbulence intensity, with respect to different bulk Reynolds numbers, and a conclusion was drawn on the feasibility of the KSEC method. Furthermore, it was found necessary to verify the obtained data against a DNS case setup to establish its applicability to real-field CFD simulation.
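For context, a Kaimal-type one-sided velocity spectrum of the IEC form is often used to synthesize such inflow turbulence; the short sketch below evaluates that form and the resulting turbulence intensity, with the specific parameter values being illustrative assumptions rather than the study's settings.

```python
import numpy as np

def kaimal_spectrum(f, sigma, length_scale, mean_speed):
    """IEC-style Kaimal spectrum S(f) for one velocity component:
    S(f) = 4 sigma^2 (L/U) / (1 + 6 f L / U)^(5/3)."""
    fl = f * length_scale / mean_speed
    return 4.0 * sigma**2 * (length_scale / mean_speed) / (1.0 + 6.0 * fl) ** (5.0 / 3.0)

U = 10.0          # mean hub-height wind speed, m/s (assumed)
sigma_u = 1.4     # streamwise standard deviation, m/s (assumed)
L_u = 340.2       # integral length scale, m (a typical IEC value)

freqs = np.linspace(1e-3, 5.0, 5000)
S = kaimal_spectrum(freqs, sigma_u, L_u, U)
variance = np.trapz(S, freqs)                 # integrating S(f) recovers ~sigma^2
print(f"TI = {sigma_u / U:.1%}, integrated variance ~ {variance:.2f} m^2/s^2")
```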

Keywords: Inlet Turbulence Generation, CFD, precursor cyclic, KSEC, large Eddy simulation, PyconTurb

Procedia PDF Downloads 75
31643 Remote Sensing of Aerated Flows at Large Dams: Proof of Concept

Authors: Ahmed El Naggar, Homyan Saleh

Abstract:

Dams are crucial for flood control, water supply, and the creation of hydroelectric power. Every dam has a water conveyance system, such as a spillway, providing the safe discharge of catastrophic floods when necessary. Spillway design has historically been investigated in laboratory research owing to the absence of suitable full-scale flow monitoring equipment and to safety concerns. Prototype measurements of aerated flows are urgently needed to quantify projected scale effects and to provide missing validation data for design guidelines and numerical simulations. In this work, an image-based investigation of free-surface flows on a tiered spillway was undertaken at laboratory scale (fixed camera installation) and at prototype scale (drone footage). The drone videos were obtained through citizen science. The analyses permitted the remote measurement of the free-surface aeration inception point, air-water surface velocities and their fluctuations, and the residual energy at the downstream end of the chute. The prototype observations offered full-scale proof of concept, while the laboratory results were efficiently validated against invasive phase-detection probe data. This paper stresses the efficacy of image-based analyses at prototype spillways. It highlights how citizen science data may help researchers better understand real-world air-water flow dynamics, and it offers a framework for a small collection of long-missing prototype data.
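Air-water surface velocities can be estimated from consecutive video frames with dense optical flow; the sketch below uses OpenCV's Farnebäck method on two grayscale frames and converts pixel displacements to velocities, where the frame rate and ground sampling distance are placeholder assumptions, not values from the study.

```python
import cv2
import numpy as np

def surface_velocity(frame_prev, frame_next, fps=60.0, metres_per_pixel=0.01):
    """Dense Farneback optical flow between two grayscale frames,
    converted from pixel displacement per frame to m/s."""
    flow = cv2.calcOpticalFlowFarneback(
        frame_prev, frame_next, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    speed_px = np.linalg.norm(flow, axis=2)          # pixels per frame
    return speed_px * fps * metres_per_pixel         # metres per second

# Hypothetical frames (e.g. extracted from drone footage with cv2.VideoCapture)
prev = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
curr = np.roll(prev, 3, axis=1)                      # synthetic 3-pixel shift
v = surface_velocity(prev, curr)
print(f"median surface velocity: {np.median(v):.2f} m/s")
```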

Keywords: remote sensing, aerated flows, large dams, proof of concept, dam spillways, air-water flows, prototype operation, inception point, optical flow, turbulence, residual energy

Procedia PDF Downloads 70
31642 An Epidemiological Analysis of the Occurrence of Bovine Brucellosis and Adopted Control Measures in South Africa during the Period 2014 to 2019

Authors: Emily Simango, T. Chitura

Abstract:

Background: Bovine brucellosis is among the most neglected zoonotic diseases in developing countries, where it is endemic and a growing challenge to public health. The development of cost-effective control measures for the disease can only be informed by knowledge of the disease epidemiology and the ability to define its risk profiles. The aim of the study was to document the trend of bovine brucellosis and the control measures adopted following reported cases in South Africa during the period 2014 to 2019. Methods: Data on confirmed cases of bovine brucellosis were retrieved from the website of the World Organisation for Animal Health (WOAH). Data were analysed using the Statistical Package for Social Sciences (IBM SPSS, 2022) version 29.0. Descriptive analysis (frequencies and percentages) and analysis of variance (ANOVA) were utilized, with statistical significance set at p<0.05. Results: The data retrieved in our study revealed an overall average bovine brucellosis prevalence of 8.48. There were statistically significant differences in bovine brucellosis prevalence across the provinces for the years 2016 and 2019 (p≤0.05), with the Eastern Cape Province having the highest prevalence in both instances. Documented control measures for the disease were limited to the killing and disposal of disease cases as well as the vaccination of susceptible animals. Conclusion: Bovine brucellosis is real in South Africa, with risk profiles differing across the provinces. Information on brucellosis control measures in South Africa, as reported to the WOAH, is not comprehensive.

Keywords: zoonotic, endemic, Eastern Cape province, vaccination

Procedia PDF Downloads 46
31641 Analysis of Municipal Solid Waste Management in Nigeria

Authors: Anisa Gumel

Abstract:

This study examines the present condition of solid waste management in Nigeria. The author explores the challenges and opportunities affecting municipal solid waste management in Nigeria and determines the most profound challenges by analysing the interdependence and interrelationships among identified variables. In this study, multiple stakeholders, including 15 waste management professionals interviewed online, were utilised to identify the difficulties and opportunities affecting municipal solid waste in Nigeria. The interviews were transcribed and coded using NVivo to produce pertinent variables. An online survey of Nigerian internet and social media users was conducted to validate statements made by the experts on the identified variables. In addition, a panel of five experts participated in a focus group discussion to discover the most influential factors affecting municipal solid waste management in Nigeria by analysing the interrelationships as well as the driving and dependence power of the variables. The results show significant factors affecting municipal solid waste management in Nigeria, including inadequate funding, lack of knowledge, and absence of legislation, as well as behavioural, financial, technological, and legal concerns grouped into five categories. Some claims stated by the experts in the interviews are supported by the survey data, while others are not. In addition, the focus group reveals patterns, correlations, and driving forces among the variables analysed. This study will provide decision-makers with a roadmap for resolving important waste management concerns in Nigeria and for managing scarce resources effectively. It will also help non-governmental organisations combat malaria in Nigeria and other underdeveloped nations. In addition, the work contributes to the literature for future scholars to consult.

Keywords: municipal solid waste, stakeholders, public, experts

Procedia PDF Downloads 60
31640 Understanding Learning Styles of Hong Kong Tertiary Students for Engineering Education

Authors: K. M. Wong

Abstract:

Engineering education is crucial to technological innovation and advancement worldwide by generating young talents who are able to integrate scientific principles and design practical solutions for real-world problems. Graduates of engineering curriculums are expected to demonstrate an extensive set of learning outcomes as required in international accreditation agreements for engineering academic qualifications, such as the Washington Accord and the Sydney Accord. On the other hand, students have different learning preferences of receiving, processing and internalizing knowledge and skills. If the learning environment is advantageous to the learning styles of the students, there is a higher chance that the students can achieve the intended learning outcomes. With proper identification of the learning styles of the students, corresponding teaching strategies can then be developed for more effective learning. This research was an investigation of learning styles of tertiary students studying higher diploma programmes in Hong Kong. Data from over 200 students in engineering programmes were collected and analysed to identify the learning characteristics of students. A small-scale longitudinal study was then started to gather academic results of the students throughout their two-year engineering studies. Preliminary results suggested that the sample students were reflective, sensing, visual, and sequential learners. Observations from the analysed data not only provided valuable information for teachers to design more effective teaching strategies, but also provided data for further analysis with the students’ academic results. The results generated from the longitudinal study shed light on areas of improvement for more effective engineering curriculum design for better teaching and learning.

Keywords: learning styles, learning characteristics, engineering education, vocational education, Hong Kong

Procedia PDF Downloads 251
31639 Filtering Intrusion Detection Alarms Using Ant Clustering Approach

Authors: Ghodhbani Salah, Jemili Farah

Abstract:

With the growth of cyber attacks, information security has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered to be the last line of defense for securing a network and play a very important role in detecting a large number of attacks. However, the main problem with today's most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms that are produced by an IDS, and to increase detection accuracy. Our data mining technique is an unsupervised clustering method based on a hybrid ant algorithm. This algorithm discovers clusters of intruders' behavior without prior knowledge of the possible number of classes; we then apply the K-means algorithm to improve the convergence of the ant clustering. Experimental results on a real dataset show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.
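As a simplified stand-in for the hybrid ant-plus-K-means stage, the sketch below clusters numeric alert feature vectors with K-means and applies a crude size-based triage; the features, thresholds and data are invented for illustration, and the ant-clustering step is not implemented.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical alert features: duration (s), bytes transferred, distinct ports touched
benign_like = rng.normal([1.0, 500.0, 2.0], [0.3, 100.0, 0.5], size=(300, 3))
suspicious = rng.normal([30.0, 50000.0, 40.0], [5.0, 8000.0, 5.0], size=(15, 3))
alerts = np.vstack([benign_like, suspicious])

X = StandardScaler().fit_transform(alerts)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Heuristic filter (illustrative only): very large clusters of near-identical alerts
# are treated as probable noise, small outlying clusters are sent for analyst review.
for c in np.unique(labels):
    size = int(np.sum(labels == c))
    print(f"cluster {c}: {size} alerts -> {'review' if size < 50 else 'likely noise'}")
```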

Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms

Procedia PDF Downloads 390
31638 Investigating the Vehicle-Bicyclists Conflicts using LIDAR Sensor Technology at Signalized Intersections

Authors: Alireza Ansariyar, Mansoureh Jeihani

Abstract:

Light Detection and Ranging (LiDAR) sensors are capable of recording traffic data, including the number of passing vehicles and bicyclists, the speeds of vehicles and bicyclists, and the number of conflicts between the two road user groups. In order to collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results highlighted that 122 conflicts were recorded over a 10-month interval from May 2022 to February 2023. Using an innovative image-processing algorithm, a new safety Measure of Effectiveness (MOE) was proposed to recognize the critical zones for bicyclists entering each zone. Considering the trajectories of the conflicts, the results of the analysis demonstrated that conflicts in the northern approach (zone N) are more frequent and more severe. Additionally, sunny weather is more likely to be associated with severe vehicle-bike conflicts.
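Conflict severity in such studies is often expressed through post-encroachment time (PET), which the keywords reference; the sketch below computes PET from the times at which two road users occupy a common conflict point, with all timestamps and the threshold being hypothetical.

```python
def post_encroachment_time(first_user_exit_t, second_user_arrival_t):
    """PET = time gap between the first road user leaving the conflict point
    and the second road user arriving at it (smaller PET = more severe)."""
    return second_user_arrival_t - first_user_exit_t

# Hypothetical LiDAR-derived event times (seconds) at one conflict point
vehicle_exit = 12.40      # vehicle clears the conflict zone
bicycle_arrival = 13.15   # bicyclist reaches the same zone

pet = post_encroachment_time(vehicle_exit, bicycle_arrival)
severity_threshold = 1.5  # an assumed PET threshold, in seconds
print(f"PET = {pet:.2f} s -> {'severe conflict' if pet < severity_threshold else 'mild'}")
```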

Keywords: LiDAR sensor, post encroachment time threshold (PET), vehicle-bike conflicts, a measure of effectiveness (MOE), weather condition

Procedia PDF Downloads 206
31637 An Agent-Based Model of Innovation Diffusion Using Heterogeneous Social Interaction and Preference

Authors: Jang kyun Cho, Jeong-dong Lee

Abstract:

The advent of the Internet, mobile communications, and social network services has stimulated social interactions among consumers, allowing people to affect one another's innovation adoptions by exchanging information more frequently and more quickly. Previous diffusion models, such as the Bass model, however, face limitations in reflecting such recent phenomena in society. These models are weak in their ability to model interactions between agents; they model aggregate-level behaviors only. The agent-based model, which is an alternative to the aggregate model, is good for individual-level modeling, but it has not yet been grounded in an economic perspective on social interactions. This study assumes the presence of social utility from other consumers in the adoption of innovation and investigates the effect of individual interactions on innovation diffusion by developing a new model called the interaction-based diffusion model. By comparing this model with previous diffusion models, the study also examines how the proposed model explains innovation diffusion from the perspective of economics. In addition, the study recommends the use of a small-world network topology instead of cellular automata to describe innovation diffusion. This study develops a model based on individual preference and heterogeneous social interactions using a utility specification, which is expandable and thus able to encompass various issues in diffusion research, such as reservation price. Furthermore, the study proposes a new framework to forecast aggregate-level market demand from individual-level modeling. The model also exhibits a good fit to real market data. It is expected that the study will contribute to our understanding of the innovation diffusion process through its microeconomic theoretical approach.
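To show what agent-based diffusion on a small-world topology can look like, here is a compact sketch in which each agent adopts once the sum of an individual preference and a social-utility term from adopting neighbours exceeds a threshold; the utility form, parameters and network size are assumptions, not the paper's specification.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.watts_strogatz_graph(n=500, k=6, p=0.1, seed=0)   # small-world network

preference = rng.uniform(0.0, 1.0, 500)   # individual taste for the innovation
adopted = rng.random(500) < 0.02          # a few early adopters
social_weight, threshold = 0.6, 0.9

history = [int(adopted.sum())]
for _ in range(30):                        # simulate 30 periods
    new_state = adopted.copy()
    for i in G.nodes:
        if not adopted[i]:
            nbrs = list(G.neighbors(i))
            social_utility = social_weight * np.mean(adopted[nbrs])
            if preference[i] + social_utility > threshold:
                new_state[i] = True
    adopted = new_state
    history.append(int(adopted.sum()))

print(history)  # cumulative adopters per period (an S-shaped diffusion curve)
```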

Keywords: innovation diffusion, agent based model, small-world network, demand forecasting

Procedia PDF Downloads 322
31636 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes is quite important in order to maintain good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and on product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The qualitative method used is effective in representing fault patterns of process data. In addition, it is not overly sensitive to measurement noise, so that reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods for detection and identification was tested with different simulation data. It was shown that the use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel to detect the occurrence of process abnormalities and identify their assignable causes on an online or real-time basis.
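The abstract does not name the specific technique, so the sketch below shows a standard linear baseline for multivariate process monitoring: a PCA model with a Hotelling T² statistic and an empirical control limit. All data and limits are synthetic, and the paper's nonlinear method is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Normal operating data: 5 correlated process variables
A = rng.normal(size=(5, 5))
normal = rng.normal(size=(500, 5)) @ A

pca = PCA(n_components=3).fit(normal)
scores = pca.transform(normal)
t2_normal = np.sum(scores**2 / pca.explained_variance_, axis=1)  # Hotelling T^2
limit = np.quantile(t2_normal, 0.99)                             # empirical 99% limit

def monitor(sample):
    """Return the T^2 value for a new measurement and whether it exceeds the limit."""
    s = pca.transform(sample.reshape(1, -1))[0]
    t2 = float(np.sum(s**2 / pca.explained_variance_))
    return t2, t2 > limit

faulty = normal[0] + np.array([0.0, 8.0, 0.0, -6.0, 0.0])  # simulated abnormality
print(monitor(normal[1]), monitor(faulty))
```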

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 220
31635 Exponential Value and Learning Effects in VR-Cutting-Vegetable Training

Authors: Jon-Chao Hong, Tsai-Ru Fan, Shih-Min Hsu

Abstract:

Virtual reality (VR) can engage mirror neurons that help learners transfer virtual skills to a real environment in skill training, and most studies have confirmed the positive effect of applying it in many domains. However, few studies have focused on the experiential values of participants from a gender perspective. To address this issue, the present study used a VR program named kitchen assistant training, focusing on cutting vegetables, and invited 400 students to practice for 20 minutes. Useful data from 367 participants were subjected to statistical analysis. The comparison of averages indicated a gender difference: female participants perceived higher learning effectiveness than male participants. As expected, the VR cutting-vegetables program can be used for pre-training before real vegetable cutting.

Keywords: exponential value, facilitate learning, gender difference, virtual reality

Procedia PDF Downloads 78
31634 Virtual Reality Technology for Employee Training in High-Risk Industries: Benefits and Advancements

Authors: Yeganeh Jabbari, Sepideh Khalatabad

Abstract:

This study explores the development of virtual reality (VR) technology for training applications. Specifically, the potential benefits of VR technology for employee training and its ability to simulate real-world scenarios in a safe and controlled environment are highlighted, along with the associated cost and time savings. The adoption of VR technology in high-risk industrial organizations, such as the oil and gas industry, is discussed, with a focus on its ability to improve worker performance. Additionally, the use of VR technology in activities such as simulation and data visualization in the oil and gas industry is explored, leading to enhanced safety measures and collaboration between teams. The integration of advanced technologies such as robotics is mentioned as a way to further promote efficiency and sustainability. The study also notes that the digital transformation of the oil and gas industry is revolutionizing operations and promoting safety, efficiency, and sustainability through the use of VR technology.

Keywords: virtual reality training, virtual reality benefits, high-risk industries, digital transformation

Procedia PDF Downloads 71
31633 Construction Innovation: Support for 3D Printing House

Authors: Andrea Palazzo, Daniel Macek, Veronika Malinova

Abstract:

Contour processing is the new technology challenge for architects and construction companies. The many advantages it promises make it one of the most interesting solutions for construction in terms of automation of building processes. The technology for 3D printing houses offers many application possibilities, from low-cost construction, to being considered by NASA for visionary projects as a good solution for building settlements on other planets. Another very important point is that clients, as architects, will no longer have many limits in design concerning ideas and creativity. The prices for real estate are constantly increasing and the lack of availability of construction materials as well as the speculation that has been created around it in 2021 is bringing prices to such a level that in the future real estate developers risk not being able to find customers for these ultra-expensive homes. Hence, this paper starts with the introduction of 3D printing, which now has the potential to gain an important position in the market, becoming a valid alternative to the classic construction process. This technology is not only beneficial from an economic point of view but it is also a great opportunity to have an impact on the environment by reducing CO2 emissions. Further on in the article we will also understand if, after the COP 26 (2021 United Nations Climate Change Conference), world governments could also push towards building technologies that reduce the waste materials that are needed to be disposed of and at the same time reduce emissions with the contribution of governmental funds. This paper will give us insight on the multiple benefits of 3D printing and emphasise the importance of finding new solutions for materials that can be used by the printer. Therefore, based on the type of material, it will be possible to understand the compatibility with current regulations and how the authorities will be inclined to support this technology. This will help to enable the rise and development of this technology in Europe and in the rest of the world on actual housing projects and not only on prototypes.

Keywords: additive manufacturing, contour crafting, development, new regulation, printing material

Procedia PDF Downloads 176
31632 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 46
31631 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology

Authors: A. Anastasiou, K. S. Tingay

Abstract:

Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data), but with the proliferation of so many national and international initiatives, it is becoming increasingly difficult for research teams to locate real world datasets that are most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which the institution exists, as well as a category of the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, consisting of 9,885 distinct affiliations with correspondence in GRID. Of these articles, 760 were from EU countries, and 390 of these were healthcare institutes. One affiliation was excluded as being a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Networks both within the EU and internationally show institutional collaborations, which may suggest a willingness to share data for research purposes. Geographical mapping can ensure that data has broad population coverage. Collaborations with industry or government may exclude healthcare institutes that may have embargos or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important both for ensuring the validity of results, and economy of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could assist in improving knowledge of, and access to, data sources. As our method has not yet specified if these healthcare institutes are holding data, or merely publishing on that topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest.
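A minimal sketch of the affiliation-filtering step described above, assuming the PubMed affiliations have already been linked to GRID and loaded into a table with hypothetical column names; the bibInsight platform itself is not reproduced.

```python
import pandas as pd

EU_COUNTRIES = {"Germany", "France", "Spain", "Italy", "Sweden", "Ireland"}  # truncated for brevity

# Hypothetical linked records: one row per (article, affiliation) pair after GRID matching
records = pd.DataFrame({
    "pmid":        ["111", "111", "222", "333"],
    "institution": ["Uni A Hospital", "Vet Clinic B", "Institute C", "Clinic D"],
    "grid_type":   ["Healthcare", "Healthcare", "Academic", "Healthcare"],
    "country":     ["Germany", "France", "Spain", "Norway"],
})

eu_healthcare = records[
    records["grid_type"].eq("Healthcare") & records["country"].isin(EU_COUNTRIES)
]
# An article is retained if at least one author affiliation is an EU healthcare institute
valid_pmids = eu_healthcare["pmid"].unique()
per_country = eu_healthcare.groupby("country")["institution"].nunique()
print(valid_pmids, per_country, sep="\n")
```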

Keywords: data reuse, data discovery, data linkage, journal articles, text mining

Procedia PDF Downloads 101
31630 Optimal Scheduling of Load and Operational Strategy of a Load Aggregator to Maximize Profit with PEVs

Authors: Md. Shafiullah, Ali T. Al-Awami

Abstract:

This project proposes the optimal scheduling of the imported power of a load aggregator, with the utilization of EVs, to maximize its profit. With the increase of renewable energy resources, the electricity price in a competitive market becomes more uncertain; on the other hand, with the penetration of renewable distributed generators in the distribution network, the predicted load of a load aggregator also becomes uncertain in real time. Though there are uncertainties in both load and price, the use of EV storage capacity can make the operation of the load aggregator flexible. The LA submits its offer to the day-ahead market based on predicted loads and the optimized use of its EVs to maximize its profit, and in real-time operation it uses its energy storage capacity in such a way that it can maximize its profit. In this project, the load aggregator's profit-maximization problem is formulated, and the optimization problem is solved with the help of CVX. As the forecasted loads differ from the actual load in real-time operation, the mismatches are settled in the real-time balancing market. Simulation results compare the profit of a load aggregator with a hypothetical group of 1000 EVs and without EVs.
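The abstract formulates the problem in CVX (MATLAB); as a loose, simplified analogue, the sketch below poses a one-day profit maximization with an aggregate EV battery in CVXPY, where the prices, load, battery limits and perfect-foresight assumption are all illustrative simplifications rather than the paper's model.

```python
import numpy as np
import cvxpy as cp

T = 24
rng = np.random.default_rng(0)
price = 40 + 20 * np.sin(np.linspace(0, 2 * np.pi, T)) + rng.normal(0, 2, T)  # $/MWh
load = 5 + 2 * np.sin(np.linspace(0, 2 * np.pi, T) - 1.0)                     # MWh per hour
retail_rate = 60.0                                                             # $/MWh billed to customers

cap, p_max, soc0 = 10.0, 3.0, 5.0       # aggregate EV storage: MWh, MW, initial state

charge = cp.Variable(T, nonneg=True)     # energy drawn into EV batteries
discharge = cp.Variable(T, nonneg=True)  # energy released back to serve load
soc = cp.Variable(T + 1)

imports = load + charge - discharge      # power bought from the day-ahead market
profit = retail_rate * load.sum() - price @ imports

constraints = [soc[0] == soc0, soc[1:] == soc[:-1] + charge - discharge,
               soc >= 0, soc <= cap, charge <= p_max, discharge <= p_max,
               imports >= 0, soc[T] >= soc0]

prob = cp.Problem(cp.Maximize(profit), constraints)
prob.solve()
print(f"daily profit: ${prob.value:,.0f}")
```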

Keywords: CVX, electricity market, load aggregator, load and price uncertainties, profit maximization, real time balancing operation

Procedia PDF Downloads 398