Search results for: traffic measurement and modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7523

6683 Reliability and Validity for Measurement of Body Composition: A Field Method

Authors: Ahmad Hashim, Zarizi Ab Rahman

Abstract:

Field measurement of body composition relies on several popular instruments to estimate the percentage of body fat, among them the Body Mass Index, Bio Impedance Analysis, and the Skinfold Test. All three instruments are low-cost, require little technical skill, are mobile, save time, and are suitable for use in large populations. Because all three can estimate the percentage of body fat, it is important to identify which is the most appropriate and most reliable. Hence, this study was conducted to determine the reliability and convergent validity of the instruments. A total of 40 male and female students aged between 13 and 14 years participated in this study. The study found that the test-retest Pearson correlation coefficients of reliability for the three instruments are very high, r = .99. Inter-class reliability is also high, with r = .99 for Body Mass Index and Bio Impedance Analysis and r = .96 for the Skinfold Test. Intra-class reliability coefficients for the three instruments are likewise high: r = .99 for Body Mass Index, r = .97 for Bio Impedance Analysis, and r = .90 for the Skinfold Test. However, the Standard Error of Measurement values indicate that the Body Mass Index is the most appropriate instrument, with a mean value of .000672, compared with the other instruments. The findings show that the Body Mass Index is the most accurate and reliable instrument for estimating body fat percentage in the population studied.
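
The reliability statistics reported above (test-retest Pearson r and the Standard Error of Measurement) can be reproduced in a few lines. The sketch below is illustrative only: the arrays are hypothetical test-retest readings, not the study's data, and SEM is computed with the common formula SEM = SD * sqrt(1 - r).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical test-retest body-fat estimates (%) for one instrument; not the study's data.
trial_1 = np.array([18.2, 22.5, 15.9, 27.3, 20.1, 24.8, 19.4, 21.7])
trial_2 = np.array([18.0, 22.9, 16.1, 27.0, 20.4, 24.5, 19.6, 21.5])

r, _ = pearsonr(trial_1, trial_2)          # test-retest reliability coefficient
sd = trial_1.std(ddof=1)                   # sample standard deviation of the first trial
sem = sd * np.sqrt(1.0 - r)                # Standard Error of Measurement

print(f"test-retest r = {r:.3f}, SEM = {sem:.4f}")
```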

Keywords: reliability, validity, body mass index, bio impedance analysis and skinfold test

Procedia PDF Downloads 338
6682 Advancements in Mathematical Modeling and Optimization for Control, Signal Processing, and Energy Systems

Authors: Zahid Ullah, Atlas Khan

Abstract:

This abstract focuses on the advancements in mathematical modeling and optimization techniques that play a crucial role in enhancing the efficiency, reliability, and performance of control, signal processing, and energy systems. In this era of rapidly evolving technology, mathematical modeling and optimization offer powerful tools to tackle the complex challenges faced by these systems. This abstract presents the latest research and developments in mathematical methodologies, encompassing areas such as control theory, system identification, signal processing algorithms, and energy optimization. It highlights the interdisciplinary nature of mathematical modeling and optimization, showcasing their applications in a wide range of domains, including power systems, communication networks, industrial automation, and renewable energy. It explores key mathematical techniques, such as linear and nonlinear programming, convex optimization, stochastic modeling, and numerical algorithms, that enable the design, analysis, and optimization of complex control and signal processing systems. Furthermore, the abstract emphasizes the importance of addressing real-world challenges in control, signal processing, and energy systems through innovative mathematical approaches. It discusses the integration of mathematical models with data-driven approaches, machine learning, and artificial intelligence to enhance system performance, adaptability, and decision-making capabilities. The abstract also underscores the significance of bridging the gap between theoretical advancements and practical applications, recognizing the need for practical implementation of mathematical models and optimization algorithms in real-world systems, with consideration of factors such as scalability, computational efficiency, and robustness. In summary, this abstract showcases the advancements in mathematical modeling and optimization techniques for control, signal processing, and energy systems. It highlights the interdisciplinary nature of these techniques, their applications across various domains, and their potential to address real-world challenges, and it emphasizes the importance of practical implementation and integration with emerging technologies to drive innovation and improve the performance of control, signal processing, and energy systems.

Keywords: mathematical modeling, optimization, control systems, signal processing, energy systems, interdisciplinary applications, system identification, numerical algorithms

Procedia PDF Downloads 113
6681 Scale Effects on the Wake Airflow of a Heavy Truck

Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière

Abstract:

Air quality in urban areas is deteriorated by pollution, mainly due to the constant increase of the traffic of different types of ground vehicles. In particular, particulate matter pollution with important concentrations in urban areas can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations to improve air quality in urban areas. To analyze the effects of turbulence on particulate pollutants dispersion, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of a full- and reduced-scale heavy truck. The Reynolds Average Navier-Stokes (RANS) approach with the Reynolds Stress Model (RSM)as the turbulence model closure was used. The simulations highlight the apparition of a large vortex coming from the under trailer. This vortex belongs to the recirculation region, located in the near-wake of the heavy truck. These vortical structures are expected to have a strong influence on particle dynamics that are emitted by the truck.
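
For reference, the incompressible RANS momentum equation that underlies such simulations reads, in standard index notation:

```latex
\frac{\partial \bar{u}_i}{\partial t} + \bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
  = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
    + \frac{\partial}{\partial x_j}\left( \nu \frac{\partial \bar{u}_i}{\partial x_j}
    - \overline{u_i' u_j'} \right)
```

The Reynolds Stress Model closes this system by solving a transport equation for each component of the Reynolds stress tensor (the last term) instead of relying on an eddy-viscosity hypothesis, which makes it better suited to the strongly anisotropic, swirling flow in the truck's near wake.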

Keywords: CFD, heavy truck, recirculation region, reduced scale

Procedia PDF Downloads 219
6680 Regeneration Study on the Athens City Center: Transformation of the Historical Triangle to “Low Pollution and Restricted Vehicle Traffic Zone”

Authors: Chondrogianni Dimitra, Yorgos J. Stephanedes

Abstract:

The impact of the economic crisis, coupled with the aging of the city's old core, is reflected in central Athens. Public and private users, including residents, employees, and visitors, desire the quality upgrading of abandoned buildings and public spaces through environmental improvement and sustainable mobility, and the promotion of the international metropolitan character of the city. In this study, a strategy for reshaping the character and function of the historic Athenian triangle is proposed, aiming at its economic, environmental, and social sustainable development through feasible, meaningful, and non-landscaping solutions of low cost and high positive impact. Sustainable mobility is the main principle in re-planning the study area, and transforming it into a “Low Pollution and Limited Vehicle Traffic Zone” is the main strategy. The proposed measures include the development of pedestrian mobility networks by expanding the pedestrian roads and limited-traffic routes, of bicycle networks based on the approved Metropolitan Bicycle Route of Athens, of public transportation networks with new lines of electric mini-buses, and of new regulations for vehicle mobility in the historic triangle. In addition, complementary actions are proposed regarding the provision of Wi-Fi on fixed-track media, the development of applications that facilitate combined travel and provide real-time data, the integration of micromobility (roller skates, Segway, hoverboard) and its enhancement as a flexible means of personal mobility, and the development of car-sharing, ride-sharing, and dynamic carpooling initiatives.

Keywords: regeneration plans, sustainable mobility, environmental upgrading, Athens historical triangle

Procedia PDF Downloads 168
6679 Reconstruction of Wujiaochang Plaza: A Potential Avenue Towards Sustainability

Authors: Caiwei Chen, Jianhao Li, Jiasong Zhu

Abstract:

The reform and opening-up stimulated an economic and technological take-off in China while resulting in massive urbanization and motorization. The Wujiaochang area was designated a secondary business district in Shanghai to meet the growing demand, with the reconstruction of Wujiaochang Plaza in 2005 being a milestone of this intended urban renewal. Wujiaochang is now an economically dynamic area that provides much larger traffic and transit capacity. However, this rebuilding has completely changed the face of the district. It is, therefore, appropriate to evaluate its impact on neighborhoods and communities while assessing the overall sustainability of such an operation. In this study, via an online questionnaire survey among local residents and daily visitors, we assess the perceptions and the estimated impact of Wujiaochang Plaza's reconstruction. We then compare these results with the 62 answers from local residents to a questionnaire collected on paper. The analysis of our data, along with observation and other forms of information, such as map analysis and online applications (Dianping), demonstrates a major improvement in economic sustainability but also significant losses in environmental sustainability, especially in terms of active transportation. From the social viewpoint, local residents' opinions tend to be rather positive, especially regarding traffic safety and access to consumption, despite the lack of connectivity and the radical changes induced by Wujiaochang's massive transformation. In general, our investigation exposes the overall positive outcomes of the Wujiaochang Plaza reconstruction but also unveils major drawbacks, especially in terms of soft mobility and traffic fluidity. We believe that our approach could be of great help for future major urban interventions, as such municipal regeneration schemes are widely implemented in Chinese cities and yet still need to be thoroughly assessed in terms of sustainability.

Keywords: China's reform and opening-up, economic revitalization, neighborhood identity, sustainability assessment, urban renewal

Procedia PDF Downloads 242
6678 Comparison of Intraocular Pressure Measurement Prior to and Following Full Intracorneal Ring Implantation in Patients with Keratoconus by Three Different Instruments

Authors: Seyed Aliasghar Mosavi, Mostafa Naderi, Khosrow Jadidi, Amir Hashem Mohammadi

Abstract:

Purpose: To study the measurement of intraocular pressure (IOP) before and after implantation of an intrastromal corneal ring (MyoRing) in patients with keratoconus. Setting: Baqiyatallah University of Medical Sciences, Tehran, Iran. Methods: We compared the IOP of 13 eyes that underwent MyoRing implantation, prior to and six months after the operation, using Goldmann applanation (as the gold standard), Icare, and Corvis ST (uncorrected, corrected, and corrected with corneal biomechanics). Results: Prior to surgery, Icare and Corvis ST (corrected with corneal biomechanics) overestimated the IOP, whereas uncorrected Corvis ST measurements underestimated it. After surgery, Icare and Corvis ST (corrected with corneal biomechanics) again overestimated the IOP, while uncorrected Corvis ST measurements underestimated it. Conclusion: Consistent intraocular pressure measurements in eyes with a MyoRing in keratoconus can be obtained with the Goldmann applanation tonometer as the gold-standard measurement. We were not able to obtain consistent results when measuring the IOP with Icare and Corvis ST before and after surgery.
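
The over- and underestimation reported above corresponds to the mean difference (bias) of each device relative to Goldmann applanation. A minimal sketch of that comparison, using hypothetical paired readings rather than the study's data:

```python
import numpy as np

# Hypothetical paired IOP readings (mmHg) for 13 eyes; not the study's data.
goldmann = np.array([14, 15, 13, 16, 14, 15, 17, 13, 14, 16, 15, 14, 15], dtype=float)
icare    = np.array([16, 17, 14, 18, 15, 17, 19, 14, 16, 18, 16, 15, 17], dtype=float)

bias = (icare - goldmann).mean()           # positive bias -> overestimation vs. the gold standard
sd_diff = (icare - goldmann).std(ddof=1)   # spread of the paired differences
print(f"mean difference = {bias:+.2f} mmHg (SD {sd_diff:.2f})")
```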

Keywords: intraocular pressure, MyoRing, Keratoconus, Goldmann applanation, Icare, Corvis ST

Procedia PDF Downloads 245
6677 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement

Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao

Abstract:

Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select appropriate features. However, that kind of feature analysis treats features independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, resulting in unreliable results. Therefore, the focus of this paper is on providing a redundant-feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to effectively extract the texture features of machined images. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension and maintains the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively solve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
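
The fusion-and-regression step described above can be sketched with standard tooling. The snippet below assumes a precomputed feature matrix X (e.g., statistical and GLCM texture features per machined image) and measured roughness values y; scikit-learn's PCA and SVR stand in for the paper's PCA and SVM models, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))        # hypothetical: 12 texture features per image
y = rng.uniform(0.4, 3.2, size=60)   # hypothetical: measured roughness Ra (um)

# Standardize, fuse the correlated features into a few principal components, then regress.
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
print("predicted Ra for the first image:", model.predict(X[:1])[0])
```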

Keywords: feature analysis, machine vision, PCA, surface roughness, SVM

Procedia PDF Downloads 213
6676 Vulnerable Paths Assessment for Distributed Denial of Service Attacks in a Cloud Computing Environment

Authors: Manas Tripathi, Arunabha Mukhopadhyay

Abstract:

In a cloud computing environment, cloud servers may sometimes crash after receiving a huge number of requests, and cloud services may stop, which can cause huge losses to the users of those services. This situation is called a Denial of Service (DoS) attack. In a Distributed Denial of Service (DDoS) attack, an attacker targets multiple network paths by compromising various vulnerable systems (zombies) and floods the victim with a huge amount of traffic through these zombies. There are many solutions to mitigate this challenge, but most methods allow the attack traffic to arrive at the Cloud Service Provider (CSP) and only then take mitigation actions. In this paper, we instead focus on a preventive mechanism to deal with these attacks. We analyze the network topology and find the most vulnerable paths beforehand, without waiting for the traffic to arrive at the CSP, using Dijkstra's algorithm and Yen's k-shortest-path algorithm. Finally, the risk of these paths can be assessed by multiplying their attack probabilities by the potential loss.
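
The path-ranking idea can be illustrated with networkx, whose shortest_simple_paths generator enumerates loopless paths in order of increasing cost (a Yen-style procedure). The topology, per-path attack probabilities, and loss figure below are hypothetical, not the paper's data.

```python
from itertools import islice
import networkx as nx

# Hypothetical topology: edge weights model traversal cost toward the CSP.
G = nx.Graph()
G.add_weighted_edges_from([("A", "B", 1), ("B", "CSP", 1), ("A", "C", 2),
                           ("C", "CSP", 1), ("B", "C", 1)])

k = 3
paths = list(islice(nx.shortest_simple_paths(G, "A", "CSP", weight="weight"), k))

potential_loss = 100_000          # hypothetical monetary loss if the CSP is flooded
for i, path in enumerate(paths, 1):
    p_attack = 0.3 / i            # hypothetical: shorter paths assumed more likely to carry attacks
    print(f"path {i}: {path}, risk = {p_attack * potential_loss:.0f}")
```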

Keywords: cloud computing, DDoS, Dijkstra, Yen’s k-shortest path, network security

Procedia PDF Downloads 278
6675 Biosignal Measurement System Based on Ultra-Wide Band Human Body Communication

Authors: Jonghoon Kim, Gilwon Yoon

Abstract:

A wrist-band type biosignal measurement system and its data transfer through human body communication (HBC) were investigated. An HBC method based on ultra-wide band pulses, instead of frequency or amplitude modulation, was studied and implemented, since it made the system very compact and better suited for personal or mobile health monitoring. Our system measured the photo-plethysmogram (PPG), and the measured PPG signals were transmitted through a finger to a monitoring PC. The device was compact and consumed little power. HBC provides very strong security since it does not use a wireless network. Furthermore, the biosignal monitoring system is handy because it does not need wired connections.

Keywords: biosignal, human body communication, mobile health, PPG, ultra-wide band

Procedia PDF Downloads 476
6674 Development of a Force-Sensing Toothbrush for Gum Recession Measurement Using Programmable Automation Controller

Authors: Sorayya Kazemi, Hamed Kharrati, Mehdi Abedinpour Fallah

Abstract:

This paper presents the design and implementation of a novel electric pressure-sensitive toothbrush capable of measuring the forces applied to the head of the brush. The developed device is used for gum recession measurement. In particular, the percentage of gum recession is measured by a Programmable Automation Controller (PAC), while the brushing forces are measured by a Force Sensing Resistor (FSR) sensor. These forces are analog inputs to the PAC. Based on the forces applied during the patient's brushing and the patient's percentage of gum recession, the dentist sets a standard force range. The instrument alarms when the patient applies a force over the set range.
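
The alarm logic amounts to a range check on the force derived from the FSR channel. The sketch below is purely illustrative: read_fsr_voltage() is a hypothetical analog-input call, and the linear voltage-to-force calibration is an assumption, not the device's actual characteristic.

```python
def read_fsr_voltage() -> float:
    """Hypothetical analog read from the PAC input channel (volts)."""
    return 1.8  # placeholder value

def voltage_to_force(v: float, gain: float = 5.0) -> float:
    """Assumed linear calibration: force (N) = gain * voltage."""
    return gain * v

FORCE_LIMIT_N = 8.0  # hypothetical limit set by the dentist for this patient

force = voltage_to_force(read_fsr_voltage())
if force > FORCE_LIMIT_N:
    print(f"ALARM: brushing force {force:.1f} N exceeds the set limit of {FORCE_LIMIT_N} N")
```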

Keywords: gum recession, force sensing resistor, controller, toothbrush

Procedia PDF Downloads 497
6673 Development of a Performance Measurement Model for Hospitals Using Multi-Criteria Decision Making (MCDM) Techniques: A Case Study of Three South Australian Major Public Hospitals

Authors: Mohammad Safaeipour, Yousef Amer

Abstract:

This study focuses on developing a conceptual model that offers a systematic and integrated method to weigh the related measures, evaluate the competence of hospitals, and rank the selected hospitals, taking into account the stakeholders' key performance indicators (KPIs). The Analytical Hierarchy Process (AHP) approach is used to weigh the dimensions and related sub-components. The weights and performance scores are then combined using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) to rank the selected hospitals. The results of this study provide interesting insight into the necessity of implementing process improvement in the hospital that received the lowest ranking score.
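
The combination step can be sketched as a plain TOPSIS computation: AHP-derived weights scale a normalized decision matrix, and each hospital is ranked by its closeness to the ideal solution. The matrix and weights below are hypothetical, and all criteria are assumed to be benefit criteria.

```python
import numpy as np

# Hypothetical decision matrix: 3 hospitals x 4 benefit criteria (higher is better).
X = np.array([[0.80, 0.65, 0.90, 0.70],
              [0.75, 0.85, 0.60, 0.80],
              [0.60, 0.70, 0.75, 0.65]])
w = np.array([0.4, 0.3, 0.2, 0.1])        # hypothetical AHP weights (sum to 1)

V = X / np.linalg.norm(X, axis=0) * w     # vector-normalize each criterion, then apply weights
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)

d_best = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_worst / (d_best + d_worst)  # higher closeness -> better rank

for i in np.argsort(-closeness):
    print(f"hospital {i + 1}: closeness = {closeness[i]:.3f}")
```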

Keywords: performance measurement system, PMS, hospitals, AHP, TOPSIS

Procedia PDF Downloads 374
6672 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have sought to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less attention has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from data for 2014, 2015, and 2016. Our findings reveal significant differences in the models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.
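
A comparison of models of differing complexity like the one described can be set up with cross-validation. The snippet below uses a synthetic feature matrix and scikit-learn's logistic regression and random forest as stand-ins for simpler and more complex classifiers; it is not the study's actual pipeline or data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for answer features (e.g., length, code presence, answerer reputation).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)

models = {
    "logistic regression (simple)": LogisticRegression(max_iter=1000),
    "random forest (complex)": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in models.items():
    scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
    print(f"{name}: mean F1 = {scores.mean():.3f}")
```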

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 132
6671 A Systematic Categorization of Arguments against the Vision Zero Goal: A Literature Review

Authors: Henok Girma Abebe

Abstract:

Vision Zero is a long-term goal of preventing all road traffic fatalities and serious injuries, first adopted in Sweden in 1997. It is based on the assumption that death and serious injury in the road system are morally unacceptable. In order to approach this end, Vision Zero has put in place strategies that are radically different from traditional safety work. Vision Zero, for instance, promoted the adoption of the best available technology to improve safety and placed the ultimate responsibility for traffic safety on system designers. Despite Vision Zero's moral appeal and its expansion to different safety areas and parts of the world, important philosophical concerns related to its adoption and implementation remain to be addressed. Moreover, the Vision Zero goal has been criticized on different grounds. The aim of this paper is to identify and systematically categorize the criticisms that have been put forward against Vision Zero. The findings of the paper are based solely on a critical analysis of secondary sources, and the snowball method is employed to identify the relevant philosophical and empirical literature. Two general categories of criticisms of the Vision Zero goal are identified. The first category consists of criticisms that target the setting of Vision Zero as a ‘goal’ and some of the basic assumptions upon which the goal is based. Among others, the goal of achieving zero fatalities and serious injuries, together with Vision Zero's lexicographic prioritization of safety, has been criticized as unrealistic. The second category consists of criticisms that target the strategies put in place to achieve the goal of zero fatalities and serious injuries. For instance, Vision Zero's ascription of responsibility for road safety and its rejection of cost-benefit analysis in the formulation and adoption of safety measures have both been criticized as counterproductive. Into this category also falls the criticism that Vision Zero safety measures tend to be too paternalistic. Significant improvements have been recorded in road safety work since the adoption of Vision Zero; however, for Vision Zero to succeed even further, it is important that the issues and criticisms of a philosophical nature associated with it are identified and critically dealt with.

Keywords: criticisms, systems approach, traffic safety, vision zero

Procedia PDF Downloads 303
6670 Assessment of Solid Insulating Material Using Partial Discharge Characteristics

Authors: Qasim Khan, Furkan Ahmad, Asfar A. Khan, M. Saad Alam, Faiz Ahmad

Abstract:

In this paper, partial discharge analysis is performed on cavities artificially created in insulation. The setup is in accordance with the CIGRE Method II. Circular samples were created from Perspex sheet in different configurations with a varying number of cavities. Assessment of insulation health can be performed by partial discharge measurement, as this has been found to be an important means of condition monitoring. The experiments are carried out using the MPD 540, a modern partial discharge measurement system. By analyzing the PD activity obtained for various voids/cavities, it is observed that the PD voltages vary with the cavity's diameter and depth, and even with their ratio. This can be employed for the scrutiny of insulation systems.

Keywords: partial discharges, condition monitoring, insulation defects, degradation and corrosion, PMMA

Procedia PDF Downloads 518
6669 Wheel Diameter and Width Influence in Variability of Brake Data Measurement at Ministry of Transport Facilities

Authors: Carolina Senabre, Sergio Valero, Emilio Velasco

Abstract:

The brake systems of vehicles are tested periodically by a “brake tester” at Ministry of Transport (MOT) stations. This tester measures the braking effectiveness of the vehicle, a parameter established by the International Committee of Vehicle Inspection (CITA). In this paper, we present an investigation of the influence of tire size on the measurement of brake force on three MOT brake testers. We performed an analysis of the vehicle braking capacity test at MOT stations and analyzed the influence of varying wheel diameter and width on the brake measurements. Thereby, the MOT brake tester as a verification system for a vehicle has been evaluated.

Keywords: brake tester, ministry of transport facilities, wheel diameter, efficiency

Procedia PDF Downloads 376
6668 Modeling Continuous Flow in a Curved Channel Using Smoothed Particle Hydrodynamics

Authors: Indri Mahadiraka Rumamby, R. R. Dwinanti Rika Marthanty, Jessica Sjah

Abstract:

Smoothed particle hydrodynamics (SPH) was originally created to simulate nonaxisymmetric phenomena in astrophysics. However, the method still has several shortcomings, namely the high computational cost required to model values at high resolution and problems with boundary conditions. The difficulty in modeling boundary conditions arises because the SPH method suffers from particle deficiency when the integral of the kernel function is truncated by a boundary. This research aims to determine whether SPH modeling with a focus on boundary-layer interactions and continuous flow can produce quantifiably accurate values at low computational cost. This research combines algorithms and code in a main meandering-river program, a continuous-flow algorithm, and a solid-fluid algorithm, with the aim of obtaining quantitatively accurate results for solid-fluid interactions with continuous flow in a meandering channel using the SPH method. The study uses the Fortran programming language to implement the SPH numerical method; the model takes the form of a U-shaped meandering open channel in 3D, in which the channel walls are soil particles, and uses a continuous flow with a limited number of particles.
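
The core SPH approximation behind the model is the kernel-weighted sum over neighboring particles:

```latex
A(\mathbf{r}_i) \approx \sum_{j} \frac{m_j}{\rho_j}\, A_j \, W\!\left(\lVert \mathbf{r}_i - \mathbf{r}_j \rVert, h\right)
```

Here m_j, rho_j, and A_j are the mass, density, and field value carried by particle j, W is the smoothing kernel, and h is the smoothing length. The boundary-condition difficulty mentioned above appears directly in this sum: near a wall, part of the kernel support lies outside the fluid, so the sum runs over too few particles and the interpolation is biased.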

Keywords: smoothed particle hydrodynamics, computational fluid dynamics, numerical simulation, fluid mechanics

Procedia PDF Downloads 133
6667 Forecasting the Etching Behavior of Silica Sand Using the Design of Experiments Method

Authors: Kefaifi Aissa, Sahraoui Tahar, Kheloufi Abdelkrim, Anas Sabiha, Hannane Farouk

Abstract:

The aim of this study is to show how the Design of Experiments (DOE) method can be used as a practical approach for modeling the etching behavior of silica sand during its primary leaching step. In the present work, we studied the effect of etching on particle size during a primary leaching step applied to Algerian silica sand with hydrofluoric acid (HF) at 20% and 30% for 4 and 8 hours. A new purity of the sand is therefore obtained depending on the leaching time. This study was extended by a numerical approach using the design-of-experiments method, which shows the influence of each parameter and the interactions between them in the process and confirms the experimental results. The resulting model is a predictive one, implemented in software. Based on the parameters measured experimentally within the model, the use of the DOE method makes it possible to predict the model's response at untested settings and to obtain the optimized response without carrying out the experimental measurement.
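
With two factors at two levels each (HF concentration: 20%/30%; leaching time: 4 h/8 h), the DOE model reduces to a 2x2 factorial fit with an interaction term, y = b0 + b1*x1 + b2*x2 + b12*x1*x2 on coded levels (-1/+1). The sketch below illustrates that fit; the response values are hypothetical, not the measured purities.

```python
import numpy as np

# Coded factor levels: x1 = HF concentration (-1: 20%, +1: 30%), x2 = time (-1: 4 h, +1: 8 h).
x1 = np.array([-1, +1, -1, +1])
x2 = np.array([-1, -1, +1, +1])
y = np.array([97.2, 98.1, 98.0, 99.3])   # hypothetical responses (e.g., sand purity in %)

# Design matrix: intercept, the two main effects, and the interaction.
X = np.column_stack([np.ones(4), x1, x2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"b0={b[0]:.2f}, b1(HF)={b[1]:.2f}, b2(time)={b[2]:.2f}, b12={b[3]:.2f}")

# The fitted model can then predict the response at an untested setting, e.g. 30% HF for 6 h:
print("prediction:", b @ np.array([1.0, +1.0, 0.0, 0.0]))
```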

Keywords: acid leaching, design of experiments method (DOE), silica purity, silica etching

Procedia PDF Downloads 286
6666 The Use of Unmanned Aerial System (UAS) in Improving the Measurement System on the Example of Textile Heaps

Authors: Arkadiusz Zurek

Abstract:

The potential of using drones is visible in many areas of logistics, especially for monitoring and controlling many processes. The technologies implemented in the last decade open new possibilities for companies in tasks they had not even considered before, such as warehouse inventories. Unmanned aerial vehicles are no longer seen as a revolutionary tool for Industry 4.0, but rather as tools in the daily work of factories and logistics operators. The research problem is to develop a method for measuring, by drone, the weight of goods in a selected link of the clothing supply chain. The purpose of this article is to analyze the causes of errors in traditional measurements and then to identify adverse events related to the use of drones for the inventory of a heap of textiles intended for production purposes. On this basis, it will be possible to develop guidelines for eliminating the causes of these events in the drone-based measurement process. In a real environment, work was carried out to determine the volume and weight of the textiles, including, among other steps, weighing a textile sample to determine the average density of the assortment, establishing a local geodetic network, terrestrial laser scanning, and a photogrammetric flight using an unmanned aerial vehicle. As a result of the analysis of the measurement data obtained at the facility, the volume and weight of the assortment and the accuracy of their determination were established. This article presents how such heaps are currently measured, what adverse events occur, and describes the photogrammetric techniques of this type performed so far by external drones for the inventory of wind farms or construction projects, and compares them with the measurement of the aforementioned textile heap inside a large-format facility.
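
The quantity of interest reduces to mass = average density x photogrammetric volume, and the relative error combines the two measurement uncertainties. A minimal sketch with hypothetical numbers, not the facility's measurements:

```python
import math

density = 250.0          # hypothetical average heap density (kg/m^3), from weighing a sample
volume = 120.0           # hypothetical heap volume from the photogrammetric model (m^3)
rel_err_density = 0.03   # hypothetical relative uncertainties of the two measurements
rel_err_volume = 0.05

mass = density * volume
rel_err_mass = math.sqrt(rel_err_density**2 + rel_err_volume**2)  # first-order error propagation
print(f"estimated mass: {mass:.0f} kg +/- {mass * rel_err_mass:.0f} kg")
```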

Keywords: drones, unmanned aerial system, UAS, indoor system, security, process automation, cost optimization, photogrammetry, risk elimination, industry 4.0

Procedia PDF Downloads 87
6665 Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

Current systems' complexity has reached a degree that requires addressing conception and design issues while taking into account environmental, operational, social, legal, and financial aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost, and time invested in the modeling phase of complex systems emphasize the need for a paradigm, a framework, and an environment to handle the complexity of the system model. For that, it is necessary to understand the expectations of the human user of the model and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle the model complexity.

Keywords: higraph-based, formalism, system engineering paradigm, modeling requirements, graph-based transformations

Procedia PDF Downloads 403
6664 Hybrid Anomaly Detection Using Decision Tree and Support Vector Machine

Authors: Elham Serkani, Hossein Gharaee Garakani, Naser Mohammadzadeh, Elaheh Vaezpour

Abstract:

Intrusion detection systems (IDSs) are the main components of network security. These systems analyze network events for intrusion detection. An IDS is designed by training on normal traffic data or attack data, and machine learning methods are among the best ways to design IDSs. In the method presented in this article, the pruning algorithm of the C5.0 decision tree is used to reduce the features of the traffic data, and the IDS is trained with the least squares support vector machine algorithm (LS-SVM). The remaining features are then ranked according to the predictor importance criterion, and the least important features are eliminated in order. The features remaining at this stage, which produce the highest accuracy in the LS-SVM, are selected as the final features. Compared to other similar articles that have examined selected features in a least squares support vector machine model, the features obtained perform better in terms of accuracy, true positive rate, and false positive rate. The results are tested on the UNSW-NB15 dataset.
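
The two-stage idea (tree-based feature ranking followed by an SVM trained on the surviving features) can be sketched with scikit-learn; its CART decision tree and standard SVC stand in for C5.0 and LS-SVM, and the data are synthetic rather than UNSW-NB15 records.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for labeled network-flow features.
X, y = make_classification(n_samples=3000, n_features=30, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1: rank features by decision-tree importance and keep the top k.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
top_k = np.argsort(tree.feature_importances_)[::-1][:10]

# Stage 2: train the SVM on the reduced feature set only.
svm = SVC(kernel="rbf", C=1.0).fit(X_tr[:, top_k], y_tr)
print("accuracy on held-out flows:", svm.score(X_te[:, top_k], y_te))
```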

Keywords: decision tree, feature selection, intrusion detection system, support vector machine

Procedia PDF Downloads 266
6663 Distributed Multi-Agent Based Approach on Intelligent Transportation Network

Authors: Xiao Yihong, Yu Kexin, Burra Venkata Durga Kumar

Abstract:

With the accelerating process of urbanization, the problem of urban road congestion is becoming more and more serious. Intelligent transportation systems combining distributed computing and artificial intelligence have become a research hotspot. As the core development direction of intelligent transportation systems, the Cooperative Intelligent Transportation System (C-ITS) integrates advanced information technology and communication methods and realizes the integration of humans, vehicles, roadside infrastructure, and other elements through a multi-agent distributed system. By analyzing the system architecture and technical characteristics of C-ITS, this report proposes a distributed multi-agent C-ITS. The system consists of a Roadside Sub-system, a Vehicle Sub-system, and a Personal Sub-system. At the same time, we explore the scalability of the C-ITS and propose incorporating local rewards into the centralized-training, decentralized-execution paradigm, hoping to add a scalable value decomposition method. In addition, we also suggest introducing blockchain to improve the safety of the traffic-information transmission process. The system is expected to improve vehicle capacity and traffic safety.

Keywords: distributed system, artificial intelligence, multi-agent, cooperative intelligent transportation system

Procedia PDF Downloads 215
6662 Hypergraph for System of Systems Modeling

Authors: Haffaf Hafid

Abstract:

Hypergraphs, after being used to model the structural organization of Systems of Systems (SoS) at the macroscopic level, have recently been generalized as a powerful representation at different stages of complex system modeling. In this paper, we first describe different applications of hypergraph theory and, step by step, introduce multilevel modeling of SoS by integrating Constraint Programming Languages (CSP) dealing with engineering system reconfiguration strategies. As an application, we present an A.C.T Terminal controlled by a set of Intelligent Automated Vehicles.
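
A hypergraph of an SoS can be held in a very small data structure: each hyperedge is simply a set of components, and the bipartite (incidence) view mentioned in the keywords falls out directly. The sketch below is a generic illustration with hypothetical terminal components, not the paper's formalism.

```python
from collections import defaultdict

# Hyperedges map a shared function/resource to the set of components it couples (hypothetical SoS).
hyperedges = {
    "power_bus":       {"crane", "agv_1", "agv_2"},
    "traffic_control": {"agv_1", "agv_2", "gate"},
    "scheduling":      {"crane", "gate"},
}

# Incidence (bipartite) view: component -> hyperedges it belongs to.
incidence = defaultdict(set)
for edge, nodes in hyperedges.items():
    for node in nodes:
        incidence[node].add(edge)

# A simple reconfiguration question: which hyperedges are affected if a component fails?
failed = "agv_1"
print(f"{failed} participates in:", sorted(incidence[failed]))
```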

Keywords: hypergraph model, structural analysis, bipartite graph, monitoring, system of systems, reconfiguration analysis, hypernetwork

Procedia PDF Downloads 489
6661 Incorporating Multiple Supervised Learning Algorithms for Effective Intrusion Detection

Authors: Umar Albalawi, Sang C. Suh, Jinoh Kim

Abstract:

As the Internet continues to expand its usage with an enormous number of applications, cyber-threats have increased significantly. Thus, accurate detection of malicious traffic in a timely manner is a critical security concern in today's Internet. One approach to intrusion detection is to use machine learning (ML) techniques. Several methods based on ML algorithms have been introduced over the past years, but they are largely limited in terms of detection accuracy and/or the time and space complexity required to run. In this work, we present a novel method for intrusion detection that incorporates a set of supervised learning algorithms. The proposed technique provides high accuracy and outperforms existing techniques that simply utilize a single learning method. In addition, our technique relies on partial flow information (rather than full information) for detection; thus, it is lightweight and desirable for online operation, with the property of early identification. Using the publicly available mid-Atlantic CCDC intrusion dataset, we show that our proposed technique yields a detection rate of over 99% with a very low false alarm rate (0.4%).
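
Incorporating several supervised learners can be as simple as a soft-voting ensemble over per-flow features. The snippet below is a generic scikit-learn sketch on synthetic data, not the authors' exact combination scheme or the CCDC dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for partial-flow features (e.g., statistics from the first packets of a flow).
X, y = make_classification(n_samples=4000, n_features=15, n_informative=6, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=100, random_state=1)),
                ("nb", GaussianNB())],
    voting="soft",  # average predicted probabilities across the learners
)
ensemble.fit(X_tr, y_tr)
print("detection accuracy on held-out flows:", ensemble.score(X_te, y_te))
```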

Keywords: intrusion detection, supervised learning, traffic classification, computer networks

Procedia PDF Downloads 353
6660 A Novel Algorithm for Parsing IFC Models

Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai

Abstract:

Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided building modeling that architects, engineers, and contractors use to create and view two- and three-dimensional models. The AEC industry also uses Building Information Modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim for interoperability in exchanging information throughout the project lifecycle among various disciplines. The methods developed in previous studies require either an IFC schema or an MVD and software applications, such as an IFC model server or a BIM authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
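
Because an IFC instance file is plain STEP text in which every data line has the form #id = ENTITY(attributes);, a partial model can be pulled out with a small schema-free parser that follows #n references from a chosen root entity. The sketch below only illustrates that idea on a tiny inline example; it is not the paper's algorithm, and real IFC attribute strings need more careful handling.

```python
import re

IFC_SAMPLE = """
#1=IFCPROJECT('2hq...',$,'Demo',$,$,$,$,(#5),#9);
#5=IFCGEOMETRICREPRESENTATIONCONTEXT($,'Model',3,1.0E-5,#9,$);
#9=IFCAXIS2PLACEMENT3D(#12,$,$);
#12=IFCCARTESIANPOINT((0.,0.,0.));
"""

LINE = re.compile(r"#(\d+)\s*=\s*([A-Za-z0-9_]+)\s*\((.*)\);")
REF = re.compile(r"#(\d+)")

# Index every instance line by its numeric id.
instances = {int(m[1]): (m[2], m[3]) for m in LINE.finditer(IFC_SAMPLE)}

def extract_partial(root_id):
    """Collect the root entity and everything it (transitively) references."""
    keep, stack = set(), [root_id]
    while stack:
        iid = stack.pop()
        if iid in keep or iid not in instances:
            continue
        keep.add(iid)
        stack.extend(int(r) for r in REF.findall(instances[iid][1]))
    return {i: instances[i] for i in sorted(keep)}

for iid, (entity, _) in extract_partial(1).items():
    print(f"#{iid} = {entity}")
```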

Keywords: BIM, CAD, IFC, MVD

Procedia PDF Downloads 300
6659 Using Google Distance Matrix Application Programming Interface to Reveal and Handle Urban Road Congestion Hot Spots: A Case Study from Budapest

Authors: Peter Baji

Abstract:

In recent years, a growing body of literature has emphasized the increasingly negative impacts of urban road congestion on the everyday life of citizens. Although the public sector responds in different ways to decrease traffic congestion in urban regions, the most effective public intervention is congestion charging. Because travel is an economic asset, its consumption can be controlled effectively by extra taxes or prices, but this demand-side intervention is often unpopular. Measuring traffic flows with different methods has a long history in the transport sciences, but until recently there was not sufficient data for evaluating road traffic flow patterns at the scale of the entire road system of a larger urban area. European cities (e.g., London, Stockholm, Milan) in which congestion charges have already been introduced designated a particular zone in their downtown for paying, but this protects only the users and inhabitants of the CBD (Central Business District) area. Through the use of Google Maps data as a resource for revealing urban road traffic flow patterns, this paper aims to provide a solution for a fairer and smarter congestion pricing method in cities. The case study area of the research contains three bordering districts of Budapest which are linked by one main road. The first district (5th) is the original downtown that is affected by the congestion charge plans of the city. The second district (13th) lies in the transition zone and has recently been transformed into a new CBD containing the biggest office zone in Budapest. The third district (4th) is a mainly residential area on the outskirts of the city. The raw data of the research were collected with the help of Google's Distance Matrix API (Application Programming Interface), which provides estimated future traffic data via travel times between freely fixed coordinate pairs. From the difference between free-flow and congested travel time data, the daily congestion patterns and hot spots are detectable on all measured roads within the area. The results suggest that the distribution of congestion peak times and hot spots is uneven in the examined area; however, there are frequently congested areas that lie outside the downtown, and their inhabitants also need some protection. The conclusion of this case study is that cities can develop a real-time and place-based congestion charge system that forces car users to avoid frequently congested roads by changing their routes or travel modes. This would be a fairer solution for decreasing the negative environmental effects of urban road transportation instead of protecting only a very limited downtown area.
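
The congestion indicator used in the study, the gap between free-flow and congested travel time, can be computed from a single Distance Matrix request per origin-destination pair. The sketch below assumes a valid API key and that the response includes both the duration and duration_in_traffic fields, which the API returns when a departure_time is supplied; the coordinates are arbitrary examples, not the study's measurement points.

```python
import requests

API_KEY = "YOUR_API_KEY"  # assumed; requests are billed against a Google Cloud project
URL = "https://maps.googleapis.com/maps/api/distancematrix/json"

params = {
    "origins": "47.5636,19.0947",        # example coordinate pair in Budapest
    "destinations": "47.5098,19.0557",
    "mode": "driving",
    "departure_time": "now",             # needed for duration_in_traffic to be returned
    "key": API_KEY,
}
element = requests.get(URL, params=params, timeout=10).json()["rows"][0]["elements"][0]

free_flow = element["duration"]["value"]             # seconds, without live traffic
congested = element["duration_in_traffic"]["value"]  # seconds, with current traffic
print(f"congestion delay: {congested - free_flow} s ({congested / free_flow:.2f}x free flow)")
```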

Keywords: Budapest, congestion charge, distance matrix API, application programming interface, pilot study

Procedia PDF Downloads 200
6658 A Practical and Efficient Evaluation Function for 3D Model Based Vehicle Matching

Authors: Yuan Zheng

Abstract:

3D model-based vehicle matching provides a new way to perform vehicle recognition, localization, and tracking. Its key is to construct an evaluation function, also called a fitness function, to measure the degree of vehicle matching. Existing fitness functions often perform poorly when clutter and occlusion are present in traffic scenarios. In this paper, we present a practical and efficient fitness function. Unlike existing evaluation functions, the proposed fitness function studies the vehicle matching problem from both local and global perspectives, exploiting pixel gradient information as well as silhouette information. In view of the discrepancy between the 3D vehicle model and the real vehicle, a weighting strategy is introduced to treat the fitting of the model's wireframes differently. Additionally, a normalization operation for the model's projection is performed to improve the accuracy of the matching. Experimental results on real traffic videos reveal that the proposed fitness function is efficient and robust to cluttered backgrounds and partial occlusion.
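
One simple way to combine the two cues is to score each candidate pose by (a) the weighted average image-gradient magnitude sampled along the projected wireframe segments and (b) the overlap between the projected model silhouette and the detected foreground mask. The numpy sketch below only illustrates that structure; it is not the authors' exact fitness function, and the weights, normalization, and test data are hypothetical.

```python
import numpy as np

def fitness(gradient_mag, wire_points, wire_weights, model_mask, fg_mask, alpha=0.5):
    """gradient_mag: HxW gradient-magnitude image; wire_points: list of (N, 2) integer (x, y)
    pixel arrays, one per wireframe segment; wire_weights: per-segment weights;
    model_mask/fg_mask: boolean HxW silhouettes."""
    # Local term: weighted mean gradient magnitude along the projected wireframes.
    seg_scores = [gradient_mag[p[:, 1], p[:, 0]].mean() for p in wire_points]
    local = np.average(seg_scores, weights=wire_weights) / (gradient_mag.max() + 1e-9)

    # Global term: intersection-over-union of model and foreground silhouettes.
    inter = np.logical_and(model_mask, fg_mask).sum()
    union = np.logical_or(model_mask, fg_mask).sum() + 1e-9
    return alpha * local + (1.0 - alpha) * inter / union

# Tiny synthetic example: a vertical intensity edge and two identical square silhouettes.
img = np.zeros((50, 50)); img[:, 25:] = 1.0
gy, gx = np.gradient(img)
grad = np.hypot(gx, gy)
pts = [np.stack([np.full(50, 25), np.arange(50)], axis=1)]   # sample points on the edge (x, y)
mask = np.zeros((50, 50), bool); mask[10:40, 10:40] = True
print("fitness:", fitness(grad, pts, [1.0], mask, mask))
```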

Keywords: 3D-2D matching, fitness function, 3D vehicle model, local image gradient, silhouette information

Procedia PDF Downloads 399
6657 In-Plane Shear Tests of Prefabricated Masonry Panel System with Two-Component Polyurethane Adhesive

Authors: Ekkehard Fehling, Paul Capewell

Abstract:

In recent years, the importance of masonry glued with polyurethane adhesive has increased. In 2021, the Institute of Structural Engineering of the University of Kassel was commissioned to carry out quasi-static in-plane shear tests on prefabricated brick masonry panel systems with a two-component polyurethane (2K PUR) adhesive in order to investigate their load-bearing behavior during earthquakes. In addition to the usual measurement of deformations using displacement transducers, all tests were documented with an optical measuring system (“GOM”), which was used to determine the surface strains and deformations of the test walls. To compare the results with conventional mortar walls, additional reference tests were carried out on specimens with thin-bed mortar joints. This article summarizes the results of the test program and compares the load-bearing behavior of masonry bonded with polyurethane adhesive and with thin-bed mortar in order to enable realistic non-linear modeling.

Keywords: masonry, shear tests, in-plane, polyurethane adhesive

Procedia PDF Downloads 72
6656 Hominin Niche in the Times of Climate Change

Authors: Emilia Hunt, Sally C. Reynolds, Fiona Coward, Fabio Parracho Silva, Philip Hopley

Abstract:

Ecological niche modeling is widely used in conservation studies, but its application to extinct hominin species is a relatively new approach. Being able to understand which ecological niches were occupied by respective hominin species provides a new perspective on the influences on evolutionary processes. Niche separation or overlap can tell us more about the specific requirements of a species within a given timeframe. Many of the ancestral species lived through enormous climate changes: glacial and interglacial periods and changes in rainfall leading to desertification or flooding of regions, and they displayed impressive levels of adaptation necessary for their survival. This paper reviews niche modeling methodologies and their application to hominin studies. Traditional conservation methods might not be directly applicable to extinct species and are not directly transferable to hominins. The hominin niche also includes aspects of technology, use of fire, and extended communication, which are not traditionally used in building conservation models. Future perspectives on how to improve niche modeling for extinct hominin species will be discussed.

Keywords: hominin niche, climate change, evolution, adaptation, ecological niche modelling

Procedia PDF Downloads 191
6655 National Assessment for Schools in Saudi Arabia: Score Reliability and Plausible Values

Authors: Dimiter M. Dimitrov, Abdullah Sadaawi

Abstract:

The National Assessment for Schools (NAFS) in Saudi Arabia consists of standardized tests in Mathematics, Reading, and Science for school grade levels 3, 6, and 9. One main goal is to classify students into four categories of NAFS performance (minimal, basic, proficient, and advanced) by school and for the entire national sample. The NAFS scoring and equating are performed on a bounded scale (the D-scale, ranging from 0 to 1) in the framework of the recently developed “D-scoring method of measurement.” The specificity of the NAFS measurement framework and the complexity of the data presented both challenges and opportunities for (a) the estimation of score reliability for schools, (b) setting cut-scores for the classification of students into categories of performance, and (c) generating plausible values for distributions of student performance on the D-scale. The estimation of score reliability at the school level was performed in the framework of generalizability theory (GT), with students “nested” within schools and test items “nested” within test forms. The GT design was executed via multilevel modeling syntax in R. Cut-scores (on the D-scale) for the classification of students into performance categories were derived via a recently developed method of standard setting referred to as the “Response Vector for Mastery” (RVM) method. For each school, the classification of students into categories of NAFS performance was based on distributions of plausible values for the students' scores on the NAFS tests by grade level (3, 6, and 9) and subject (Mathematics, Reading, and Science). Plausible values (on the D-scale) for each individual student were generated via random selection from a statistical logit-normal distribution with parameters derived from the student's D-score and its conditional standard error, SE(D). All procedures related to D-scoring, equating, generating plausible values, and classification of students into performance levels were executed via a computer program in R developed for the purpose of NAFS data analysis.
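
One plausible reading of the plausible-value step is sampling on the logit scale and transforming back to the bounded D-scale. The sketch below assumes mu = logit(D) and approximates the logit-scale standard deviation from SE(D) with the delta method; these specific parameter mappings are illustrative assumptions, not the NAFS program's exact procedure (which is implemented in R).

```python
import numpy as np

def plausible_values(d_score, se_d, n_draws=5, rng=None):
    """Draw plausible values on the (0, 1) D-scale from a logit-normal distribution."""
    rng = np.random.default_rng(rng)
    mu = np.log(d_score / (1.0 - d_score))       # logit of the point estimate
    sigma = se_d / (d_score * (1.0 - d_score))   # delta-method SD on the logit scale (assumed)
    z = rng.normal(mu, sigma, size=n_draws)
    return 1.0 / (1.0 + np.exp(-z))              # back-transform to the bounded D-scale

print(plausible_values(d_score=0.62, se_d=0.04, n_draws=5, rng=1))
```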

Keywords: large-scale assessment, reliability, generalizability theory, plausible values

Procedia PDF Downloads 21
6654 Evaluation of Urban Transportation Systems: Comparing and Selecting the Most Efficient Transportation Solutions

Authors: E. Azizi Asiyabar

Abstract:

The phenomenon of migration to larger cities has brought about a range of consequences, including increased travel demand and the necessity for smooth traffic flow to expedite transportation. Regrettably, insufficient urban transportation infrastructure has given rise to various issues, including air pollution, heightened fuel consumption, and wasted time. To address traffic-related problems and the economic, social, and environmental challenges that ensue, a well-equipped, efficient, fast, cost-effective, and high-capacity transportation system is imperative, with a focus on reliability. This study undertakes a comprehensive examination of rail transportation systems and subsequently compares their advantages and limitations. The findings of this investigation reveal that hybrid monorails exhibit lower maintenance requirements and associated costs when compared to other types of monorails, standard trains, and urban light rail systems. Given their favorable attributes in terms of pollution reduction, increased transportation speed, and enhanced quality of service, hybrid monorails emerge as a highly recommended and suitable option.

Keywords: comparing, most efficient, selecting, urban transportation

Procedia PDF Downloads 84