Search results for: time workflow network
18262 Combination between Intrusion Systems and Honeypots
Authors: Majed Sanan, Mohammad Rammal, Wassim Rammal
Abstract:
Today, security is a major concern. Intrusion Detection and Prevention Systems and honeypots can be used to moderate attacks. Many researchers have proposed using multiple IDSs (Intrusion Detection Systems) from time to time. Some of these combine the features of two or more IDSs and are called Hybrid Intrusion Detection Systems. Most researchers combine the features of signature-based and anomaly-based detection methodologies. For a signature-based IDS, if an attacker attacks slowly and in an organized way, the attack may go undetected through the IDS, as signatures include factors based on the duration of events and the attacker's actions do not match them. Sometimes, for an unknown attack, there is no updated signature, or an attacker attacks while the database is being updated. Thus, signature-based IDSs fail to detect unknown attacks. Anomaly-based IDSs suffer from many false-positive readings. So there is a need to hybridize these IDSs so that they can overcome each other's shortcomings. In this paper, we propose a new approach to the IDS (Intrusion Detection System) which is more efficient than the traditional IDS. The IDS is based on honeypot technology and the anomaly-based detection methodology. We designed the architecture for the IDS in Packet Tracer and then implemented it in real time. We discuss the experimental results: both the honeypot and the anomaly-based IDS have some shortcomings, but if we hybridize these two technologies, the newly proposed Hybrid Intrusion Detection System (HIDS) is capable of overcoming these shortcomings with much enhanced performance. In this paper, we present a modified Hybrid Intrusion Detection System (HIDS) that combines the positive features of two different detection methodologies - honeypot methodology and anomaly-based intrusion detection methodology. In the experiment, we ran both intrusion detection systems individually first and then together and recorded the data from time to time. From the data, we can conclude that the resulting IDS is much better at detecting intrusions than the existing IDSs.
Keywords: security, intrusion detection, intrusion prevention, honeypot, anomaly-based detection, signature-based detection, cloud computing, KFSensor
Procedia PDF Downloads 383
18261 A Time-Varying and Non-Stationary Convolution Spectral Mixture Kernel for Gaussian Process
Authors: Kai Chen, Shuguang Cui, Feng Yin
Abstract:
Gaussian process (GP) with a spectral mixture (SM) kernel demonstrates flexible non-parametric Bayesian learning ability in modeling unknown functions. In this work, a novel time-varying and non-stationary convolution spectral mixture (TN-CSM) kernel with significantly enhanced interpretability through process convolution is introduced. A way of decomposing the SM component into an auto-convolution of a base SM component and parameterizing it to be input-dependent is outlined. Performing a convolution between two base SM components then yields a novel structure of non-stationary SM component with much better generalized expression and interpretation. The TN-CSM retains compatibility with the stationary SM kernel in terms of kernel form and spectral basis, aspects ignored or confused by previous non-stationary kernels. On synthetic and real-world datasets, experiments show the time-varying characteristics of the hyper-parameters in TN-CSM and compare the learning performance of TN-CSM with popular and representative non-stationary GPs.
Keywords: Gaussian process, spectral mixture, non-stationary, convolution
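For context, the stationary spectral mixture kernel that the TN-CSM described above extends can be written in a few lines. The snippet below is a minimal NumPy sketch of GP regression with an SM kernel, not the authors' TN-CSM; the component weights, means, scales, and noise level are assumed values.

```python
import numpy as np

def sm_kernel(x1, x2, weights, means, scales):
    """Stationary spectral mixture kernel:
    k(tau) = sum_q w_q * exp(-2*pi^2*tau^2*v_q) * cos(2*pi*tau*mu_q)."""
    tau = x1[:, None] - x2[None, :]              # pairwise input differences
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, scales):
        k += w * np.exp(-2 * np.pi**2 * tau**2 * v) * np.cos(2 * np.pi * tau * mu)
    return k

# Toy GP regression with two spectral components (all hyper-parameter values assumed)
x_train = np.linspace(0, 10, 50)
y_train = np.sin(2 * np.pi * 0.4 * x_train) + 0.1 * np.random.randn(50)
K = sm_kernel(x_train, x_train, weights=[1.0, 0.5], means=[0.4, 1.2], scales=[0.01, 0.02])
alpha = np.linalg.solve(K + 1e-2 * np.eye(50), y_train)      # (K + sigma^2 I)^-1 y
x_test = np.linspace(0, 12, 120)
y_pred = sm_kernel(x_test, x_train, [1.0, 0.5], [0.4, 1.2], [0.01, 0.02]) @ alpha
print(y_pred[:5])
```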
Procedia PDF Downloads 196
18260 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques
Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet
Abstract:
5G and 6G technology offers enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in 5G/6G architecture. In this paper, we propose the integration of free-space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, free-space optical communication (FSO) has been gaining popularity as an effective alternative technology given the limited availability of radio frequency (RF) spectrum. FSO is attractive due to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to satisfy the full potential of 5G/6G technology, offering speeds of 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. Therefore, we propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple fiber Bragg grating (FBG) sensors using only the FBG spectrum obtained from a real experiment.
Keywords: optical sensor, artificial intelligence, Internet of Things, free-space optics
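To make the last step concrete, the sketch below trains a small neural network regressor to recover the peak wavelengths of two FBG sensors from a measured spectrum. It is a minimal stand-in for the deep learning model described above: the spectra are synthetic Gaussian reflection peaks, and the wavelength ranges and network size are assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wl = np.linspace(1540, 1560, 200)                      # wavelength grid in nm (assumed)

def fbg_spectrum(peaks, width=0.3):
    """Synthetic reflection spectrum: one Gaussian peak per FBG sensor."""
    return sum(np.exp(-0.5 * ((wl - p) / width) ** 2) for p in peaks)

# Two FBG sensors whose peak wavelengths shift (e.g., with strain)
peaks = rng.uniform([1545, 1552], [1548, 1555], size=(2000, 2))
X = np.array([fbg_spectrum(p) + 0.01 * rng.standard_normal(wl.size) for p in peaks])
X_tr, X_te, y_tr, y_te = train_test_split(X, peaks, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("mean abs peak-wavelength error (nm):", np.abs(model.predict(X_te) - y_te).mean())
```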
Procedia PDF Downloads 63
18259 Using a Card Game as a Tool for Developing a Design
Authors: Matthias Haenisch, Katharina Hermann, Marc Godau, Verena Weidner
Abstract:
Over the past two decades, international music education has been characterized by a growing interest in informal learning for formal contexts and a "compositional turn" that has moved from closed to open forms of composing. This change occurs under social and technological conditions that permeate 21st-century musical practices. This forms the background of Musical Communities in the (Post)Digital Age (MusCoDA), a four-year joint research project of the University of Erfurt (UE) and the University of Education Karlsruhe (PHK), funded by the German Federal Ministry of Education and Research (BMBF). Both explore songwriting processes as an example of collective creativity in (post)digital communities, one in formal and the other in informal learning contexts. Collective songwriting will be studied from a network perspective that will allow us to view boundaries between online and offline as well as formal, informal, and hybrid contexts as permeable and to reconstruct musical learning practices. By comparing these songwriting processes, possibilities for a pedagogical-didactic interweaving of different educational worlds are highlighted. Therefore, the subproject of the University of Erfurt investigates school music lessons with the help of interviews, videography, and network maps by analyzing new digital pedagogical and didactic possibilities. In the first step, the international literature on songwriting in the music classroom was examined for design development. The analysis focused on the question of which methods and practices are circulating in the current literature. Results from this stage of the project form the basis for the first instructional design, which will help teachers plan regular music classes and subsequently reconstruct musical learning practices under these conditions. In analyzing the literature, we noticed certain structural methods and concepts that recur, such as the Building Blocks method and the pre-structuring of the songwriting process. From these findings, we developed a deck of cards that both captures the current state of research and serves as a method for design development. With this deck of cards, both teachers and students can plan their individual songwriting lessons by independently selecting and arranging topic, structure, and action cards. In terms of science communication, music educators' interactions with the card game provide us with essential insights for developing the first design. The overall goal of MusCoDA is to develop an empirical model of collective musical creativity and learning and an instructional design for teaching music in the postdigital age.
Keywords: card game, collective songwriting, community of practice, network, postdigital
Procedia PDF Downloads 64
18258 Analysis of Silicon Controlled Rectifier-Based Electrostatic Discharge Protection Circuits with Electrical Characteristics for the 5V Power Clamp
Authors: Jun-Geol Park, Kyoung-Il Do, Min-Ju Kwon, Kyung-Hyun Park, Yong-Seo Koo
Abstract:
This paper analyzes SCR (Silicon Controlled Rectifier)-based ESD (Electrostatic Discharge) protection circuits with respect to their turn-on time characteristics. The structures are the LVTSCR (Low Voltage Triggered SCR), the ZTSCR (Zener Triggered SCR), and the PTSCR (P-Substrate Triggered SCR). The three structures are intended for the 5V power clamp. In general, structures with a low trigger voltage can have faster turn-on characteristics than other structures. All three ESD protection circuits achieve a low trigger voltage: the LVTSCR by using an N+ bridge region, the ZTSCR by using a Zener diode structure, and the PTSCR by increasing the trigger current. The turn-on time comparison was simulated with the Synopsys TCAD simulator. According to the simulation results, the LVTSCR has a turn-on time of 2.8 ns, the ZTSCR 2.1 ns, and the PTSCR 2.4 ns. The HBM simulation results, however, show that the PTSCR is the more robust structure, reaching 430 K under the HBM 8 kV standard, compared with 450 K for the LVTSCR and 495 K for the ZTSCR. Therefore, the PTSCR is the most effective ESD protection circuit for the 5V power clamp.
Keywords: ESD, SCR, turn-on time, trigger voltage, power clamp
Procedia PDF Downloads 348
18257 Mapping the Pain Trajectory of Breast Cancer Survivors: Results from a Retrospective Chart Review
Authors: Wilfred Elliam
Abstract:
Background: Pain is a prevalent and debilitating symptom among breast cancer patients, impacting their quality of life and overall well-being. The experience of pain in this population is multifaceted, influenced by a combination of disease-related factors, treatment side effects, and individual characteristics. Despite advancements in cancer treatment and pain management, many breast cancer patients continue to suffer from chronic pain, which can persist long after the completion of treatment. Understanding the progression of pain in breast cancer patients over time and identifying its correlates is crucial for effective pain management and supportive care strategies. The purpose of this research is to understand the patterns and progression of pain experienced by breast cancer survivors over time. Methods: Data were collected from breast cancer patients at Hartford Hospital at four time points: baseline, 3, 6, and 12 weeks. Key variables measured include pain, body mass index (BMI), fatigue, musculoskeletal pain, sleep disturbance, and demographic variables (age, employment status, cancer stage, and ethnicity). Binomial generalized linear mixed models were used to examine changes in pain and symptoms over time. Results: A total of 100 breast cancer patients aged 18 years and older were included in the analysis. We found that the effects of time on pain (p = 0.024), musculoskeletal pain (p < 0.001), fatigue (p < 0.001), and sleep disturbance (p = 0.013) were statistically significant, indicating pain progression in breast cancer patients. Patients using aromatase inhibitors had worse fatigue (p < 0.05) and musculoskeletal pain (p < 0.001) compared to patients on tamoxifen. Patients who were obese (p < 0.001) or overweight (p < 0.001) were more likely to report pain compared to patients with normal weight. Conclusion: This study revealed the complex interplay between factors such as time, pain, and sleep disturbance in breast cancer patients. Specifically, pain, musculoskeletal pain, sleep disturbance, and fatigue exhibited significant changes across the measured time points, indicating a dynamic pain progression in these patients. The findings provide a foundation for future research and targeted interventions aimed at improving pain outcomes in breast cancer patients.
Keywords: breast cancer, chronic pain, pain management, quality of life
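As a rough illustration of the repeated-measures modeling described above, the sketch below fits a binomial model to synthetic longitudinal pain data using generalized estimating equations (GEE) with an exchangeable within-patient correlation. This is a simplified stand-in for the study's binomial generalized linear mixed models; the visit schedule, covariates, column names, and effect sizes are all assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic longitudinal data: 100 patients, 4 visits (baseline, 3, 6, 12 weeks) -- all values assumed
rng = np.random.default_rng(1)
n, visits = 100, [0, 3, 6, 12]
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n), len(visits)),
    "week": np.tile(visits, n),
    "bmi": np.repeat(rng.normal(27, 4, n), len(visits)),
})
logit = -2 + 0.08 * df["week"] + 0.05 * (df["bmi"] - 25)
df["pain"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))     # binary pain report per visit

# Binomial GEE with exchangeable within-patient correlation (stand-in for the GLMM)
model = smf.gee("pain ~ week + bmi", groups="patient_id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```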
Procedia PDF Downloads 31
18256 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition
Authors: Aisultan Shoiynbek, Darkhan Kuanyshbay, Paulo Menezes, Akbayan Bekarystankyzy, Assylbek Mukhametzhanov, Temirlan Shoiynbek
Abstract:
Speech emotion recognition (SER) has received increasing research interest in recent years. It is common practice to utilize emotional speech collected under controlled conditions, recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues with that approach: the emotions are not natural, meaning that machines are learning to recognize fake emotions; the emotions are very limited in quantity and poor in speaking variety; there is some language dependency in SER; and consequently, each time researchers want to start working on SER, they need to find a good emotional database in their language. This paper proposes an approach to creating an automatic tool for speech emotion extraction based on facial emotion recognition and describes the sequence of actions involved in the proposed approach. One of the first objectives in this sequence is the speech detection issue. The paper provides a detailed description of the speech detection model based on a fully connected deep neural network for Kazakh and Russian. Although high speech detection results were obtained for Kazakh and Russian, the described process is suitable for any language. To investigate the working capacity of the developed model, an analysis of speech detection and extraction from real tasks has been performed.
Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset
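A minimal sketch of the frame-level speech/non-speech classifier idea described above is shown below. It trains a small fully connected network on MFCC-style feature vectors; the features here are synthetic stand-ins (in practice they would be MFCCs extracted from the Kazakh and Russian recordings), and the layer sizes are assumptions rather than the authors' architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Assumed setup: each audio frame is a 13-dimensional MFCC vector,
# labeled 1 for speech and 0 for non-speech.
rng = np.random.default_rng(0)
speech = rng.normal(loc=1.0, scale=1.0, size=(3000, 13))
nonspeech = rng.normal(loc=-1.0, scale=1.0, size=(3000, 13))
X = np.vstack([speech, nonspeech])
y = np.array([1] * 3000 + [0] * 3000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Fully connected network: two hidden layers with ReLU activations
clf = MLPClassifier(hidden_layer_sizes=(128, 64), activation="relu",
                    max_iter=300, random_state=0)
clf.fit(X_tr, y_tr)
print("frame-level speech detection accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```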
Procedia PDF Downloads 26
18255 Increase Productivity by Using Work Measurement Technique
Authors: Mohammed Al Awadh
Abstract:
In order for businesses to take advantage of the opportunities for expanded production and trade that have arisen as a result of globalization and increased levels of competition, productivity growth is required. The number of available resources is decreasing with each passing day, which results in an ever-increasing demand. In response to this, there will be an increased demand placed on firms to improve the efficiency with which they utilise their resources. As scientific methods, work and time study techniques have been employed in all manufacturing and service industries to raise the efficiency of use of the factors of production; these approaches focus on work and time. The goal of this research is to improve the productivity of a manufacturing industry's production system by looking at ways to measure work. The work cycles were broken down into more manageable and quantifiable elements, and these elements were noted down on the observation sheet. The operation has been properly analysed in order to identify value-added and non-value-added components, and observations have been recorded for each of the different trials.
Keywords: time study, work measurement, work study, efficiency
Procedia PDF Downloads 69
18254 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing
Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais
Abstract:
Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of the RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) in order to extract their needs and group them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the most expected approximate query results on summarized graphs compared to the source ones.
Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query
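The second approach above relies on a closeness-style centrality to decide which parts of the graph to keep. The sketch below shows the idea on a toy RDF-style graph using NetworkX's standard closeness centrality; the triples are invented for illustration, and the paper's extended centrality measure would replace the stock one.

```python
import networkx as nx

# Toy RDF-style graph: nodes are resources, edges stand for triples (subject -> object).
# The triples are illustrative assumptions, not data from the paper.
triples = [
    ("station:42", "obs:temp1"), ("station:42", "obs:hum1"),
    ("obs:temp1", "unit:celsius"), ("obs:hum1", "unit:percent"),
    ("station:42", "geo:zoneA"), ("geo:zoneA", "city:Dakar"),
]
G = nx.DiGraph(triples)

# Closeness centrality as a proxy for the "most informative" nodes to retain in the summary
centrality = nx.closeness_centrality(G)
keep = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("nodes retained in the summarized graph:", keep)
```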
Procedia PDF Downloads 203
18253 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code PLAXIS has been utilized for numerical simulation. Polynomial and neural network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. Objective functions are categorized by considering measured data with and without uncertainty in instruments and are defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro Quebec provided data sets for the measured values of the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared for the minimization problem; all these techniques take time to converge to an optimum value, but PSO provided the best convergence and soil parameters. Overall, parameter identification analysis could be effectively used for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers for assessing dam performance and dam safety.
Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS
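To illustrate the inverse-analysis loop described above, the sketch below runs a minimal particle swarm optimization that calibrates two illustrative soil parameters by minimizing a least-squares misfit between "measured" and predicted displacements. The forward model, parameter bounds, and PSO settings are assumptions; in the actual study the forward response comes from the PLAXIS model or its response-surface surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(params, loads):
    """Stand-in for the FE/surrogate model: displacement as a function of two soil parameters."""
    stiffness, exponent = params
    return loads ** exponent / stiffness

loads = np.linspace(1.0, 10.0, 12)
measured = forward_model((50.0, 1.3), loads) + rng.normal(0, 0.002, loads.size)

def objective(params):
    return np.sum((forward_model(params, loads) - measured) ** 2)   # least-squares misfit

# Minimal particle swarm optimization over the two parameters
n_particles, n_iter = 30, 200
lo, hi = np.array([10.0, 0.5]), np.array([200.0, 2.0])
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("identified parameters:", gbest)
```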
Procedia PDF Downloads 146
18252 FSO Performance under High Solar Irradiation: Case Study Qatar
Authors: Syed Jawad Hussain, Abir Touati, Farid Touati
Abstract:
Free-Space Optics (FSO) is a wireless technology that enables the optical transmission of data through the air. FSO is emerging as a promising alternative or complementary technology to fiber optic and wireless radio-frequency (RF) links due to its high bandwidth, robustness to EMI, and operation in unregulated spectrum. These systems are envisioned to be an essential part of future-generation heterogeneous communication networks. Despite the vibrant advantages of FSO technology and the variety of its applications, its widespread adoption has been hampered by rather disappointing link reliability for long-range links due to atmospheric turbulence-induced fading and sensitivity to detrimental climate conditions. Qatar, with modest cloud coverage, high concentrations of airborne dust, and high relative humidity, lies in a virtually rainless sunny belt with a typical daily average solar radiation exceeding 6 kWh/m2 and 80-90% clear skies throughout the year. The specific objective of this work is to study, for the first time in Qatar, the effect of solar irradiation on the deliverability of the FSO link. In order to analyze the transport medium, we ported an embedded Linux kernel to a Field Programmable Gate Array (FPGA) and designed a network sniffer application that runs on the FPGA. We installed new FSO terminals and configured and aligned them successively. In the reporting period, we carried out measurements and related them to weather conditions.
Keywords: free space optics, solar irradiation, field programmable gate array, FSO outage
Procedia PDF Downloads 361
18251 Neural Network Approach For Clustering Host Community: Based on Perceptions Toward Tourism, Their Satisfaction Level and Demographic Attributes in Iran (Lahijan)
Authors: Nasibeh Mohammadpour, Ali Rajabzadeh, Adel Azar, Hamid Zargham Borujeni
Abstract:
Generally, the development of various industries depends on the support of their stakeholders and beneficiaries. Some of the most important stakeholders in the tourism industry (which has become one of the most lucrative and employment-generating activities at the international level these days) are the host communities in tourist destinations, which both affect and are affected by the development of this industry. Recognizing the host community and its segments can be important for gaining its support for future decisions and policy making. In order to identify these segments, in this study, clustering of the residents has been done using tools that are designed to deal with human complexity and have the ability to model and generalize complex systems without any need for initial cluster seeds, unlike classic methods. Neural networks can help to meet these expectations. The research was planned to design a neural network-based mathematical model for clustering the host community effectively according to multiple criteria and to identify differences among segments. In order to achieve this goal, the residents' segmentation has been done by demographic characteristics, their attitude towards tourism development, their level of satisfaction, and the type of their support in this field. The applied method is self-organizing neural networks, and the results have been compared with K-means. As the results show, the Self-Organizing Map (SOM) method provides much better results when considering the cophenetic correlation and between-cluster variance coefficients. Based on these criteria, the host community is divided into five segments with unique and distinctive features, which is the best configuration (in comparison with the other modes) according to a cophenetic correlation coefficient of 0.8769 and a between-cluster variance of 0.1412.
Keywords: artificial neural network, clustering, resident, SOM, tourism
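The sketch below gives a minimal NumPy self-organizing map in the spirit of the clustering described above, mapping synthetic resident records onto a five-node grid so that each node plays the role of one segment. The feature set (attitude, satisfaction, support, age), grid shape, and training schedule are illustrative assumptions rather than the study's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic resident records: [attitude toward tourism, satisfaction, support level, age] -- assumed features
residents = rng.random((300, 4))

# Minimal Self-Organizing Map: a 5x1 grid, one node per prospective segment
grid = rng.random((5, 4))
for epoch in range(200):
    lr = 0.5 * (1 - epoch / 200)                          # decaying learning rate
    for x in rng.permutation(residents):
        bmu = np.argmin(np.linalg.norm(grid - x, axis=1))  # best-matching unit
        for j in range(grid.shape[0]):
            influence = np.exp(-abs(j - bmu))              # neighborhood on the 1-D grid
            grid[j] += lr * influence * (x - grid[j])

segments = np.array([np.argmin(np.linalg.norm(grid - x, axis=1)) for x in residents])
print("residents per segment:", np.bincount(segments, minlength=5))
```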
Procedia PDF Downloads 183
18250 Social Value of Travel Time Savings in Sub-Saharan Africa
Authors: Richard Sogah
Abstract:
The significance of transport infrastructure investments for economic growth and development has been central to the World Bank's strategy for poverty reduction. Among the conventional surface transport infrastructures, road infrastructure is significant in facilitating the movement of human capital, goods, and services. When transport projects (e.g., roads, superhighways) are implemented, they come along with some negative social values (costs), such as increased noise and air pollution for local residents living near these facilities, displaced individuals, etc. However, these projects also facilitate better utilization of the existing capital stock and generate other observable benefits that can be easily quantified. For example, the improvement or construction of roads creates employment, stimulates revenue generation (tolls), reduces vehicle operating costs and accidents, increases accessibility, expands trade, improves safety, etc. Aside from these benefits, travel time savings (TTS), which are the major economic benefit of urban and inter-urban transport projects and therefore integral to their economic assessment, are often overlooked and omitted when estimating the benefits of transport projects, especially in developing countries. The absence of current and reliable domestic travel data and the inability of models replicated from the developed world to capture the actual value of travel time savings, given the large unemployment, underemployment, and other labor-induced distortions, have contributed to the failure to assign value to travel time savings when estimating the benefits of transport schemes in developing countries. This omission of the value of travel time savings from the benefits of transport projects in developing countries poses problems for investors and stakeholders, who may accept or dismiss projects based on schemes that favor reduced vehicle operating costs and other parameters rather than those that ease congestion, increase average speed, facilitate walking and handloading, and thus save travel time. Given the complex reality of estimating the value of travel time savings and the presence of widespread informal labour activities in Sub-Saharan Africa, we construct a "nationally ranked distribution of time values" and estimate the value of travel time savings based on the area beneath the distribution. Compared with other approaches, our method captures both formal-sector workers and people who work outside the formal sector, whose changes in time allocation occur in the informal economy and household production activities. The dataset for the estimations is sourced from the World Bank, the International Labour Organization, etc.
Keywords: road infrastructure, transport projects, travel time savings, congestion, Sub-Saharan Africa
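A toy numerical sketch of the "area beneath a nationally ranked distribution of time values" idea is given below. The hourly time values for formal, informal, and household-production workers are invented for illustration, so the resulting figure carries no empirical meaning.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical hourly time values (USD/hour) for a mixed sample of formal workers,
# informal workers, and household producers -- all numbers are illustrative.
time_values = np.concatenate([
    rng.lognormal(mean=1.0, sigma=0.5, size=600),   # formal sector
    rng.lognormal(mean=0.2, sigma=0.6, size=300),   # informal sector
    rng.lognormal(mean=-0.3, sigma=0.4, size=100),  # household production
])

ranked = np.sort(time_values)[::-1]                 # nationally ranked distribution
rank_share = np.linspace(0, 1, ranked.size)         # population share on the x-axis

# Average social value of one hour saved = area beneath the ranked distribution
value_per_hour = np.trapz(ranked, rank_share)
print(f"average social value of travel time savings: {value_per_hour:.2f} USD/hour")
```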
Procedia PDF Downloads 110
18249 Multishape Task Scheduling Algorithms for Real Time Micro-Controller Based Application
Authors: Ankur Jain, W. Wilfred Godfrey
Abstract:
Embedded systems are usually microcontroller-based systems that represent a class of reliable and dependable dedicated computer systems designed for specific purposes. Microcontrollers are used in most electronic devices in an endless variety of ways. Some microcontroller-based embedded systems are required to respond to external events in the shortest possible time, and such systems are known as real-time embedded systems. In a multitasking system there is therefore a need for task scheduling, and various scheduling algorithms such as Fixed Priority Scheduling (FPS), Earliest Deadline First (EDF), Rate Monotonic (RM), and Deadline Monotonic (DM) have been researched. In this paper, various conventional algorithms, which consist of single-shape tasks, have been reviewed and analyzed, and a new multishape scheduling algorithm has been proposed, implemented, and analyzed.
Keywords: DM, EDF, embedded systems, fixed priority, microcontroller, RTOS, RM, scheduling algorithms
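As a point of reference for the conventional algorithms listed above, the sketch below simulates Earliest Deadline First over one hyperperiod of a small periodic task set; the task names, periods, and execution times are illustrative assumptions.

```python
import heapq

# Illustrative periodic task set: (name, period, execution time); deadlines equal periods.
tasks = [("sensor_read", 5, 1), ("control_loop", 10, 3), ("telemetry", 20, 4)]
hyperperiod = 20

# Earliest Deadline First: at each time unit, run the released job with the nearest deadline.
ready = []            # heap of (absolute_deadline, name, remaining_time)
schedule = []
for t in range(hyperperiod):
    for name, period, wcet in tasks:
        if t % period == 0:                      # job release at the start of each period
            heapq.heappush(ready, (t + period, name, wcet))
    if ready:
        deadline, name, remaining = heapq.heappop(ready)
        schedule.append((t, name))
        if remaining > 1:
            heapq.heappush(ready, (deadline, name, remaining - 1))
    else:
        schedule.append((t, "idle"))

print(schedule)
```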
Procedia PDF Downloads 404
18248 Rapid Identification and Diagnosis of the Pathogenic Leptospiras through Comparison among Culture, PCR and Real Time PCR Techniques from Samples of Human and Mouse Feces
Authors: S. Rostampour Yasouri, M. Ghane, M. Doudi
Abstract:
Leptospirosis is one of the most significant infectious and zoonotic diseases, with global spread. The disease causes economic losses and human fatalities in various countries, including the northern provinces of Iran. The aim of this research is to identify pathogenic leptospiras and compare rapid diagnostic techniques for them, considering the multifaceted clinical manifestations of the disease and the premature death of patients. In the spring and summer of 2020-2022, 25 fecal samples were collected from suspected leptospirosis patients and 25 fecal samples from mice residing in the rice fields and factories of Tonekabon city. Samples were prepared by centrifugation and passage through membrane filters. The culture technique used liquid and solid EMJH media during one month of incubation at 30°C, after which the media were examined microscopically. DNA extraction was conducted with an extraction kit. Diagnosis of leptospiras was performed by PCR and real-time PCR (SYBR Green) techniques using a lipL32-specific primer. Among the patients, 11 samples (44%) and 8 samples (32%) were determined to contain pathogenic Leptospira by real-time PCR and PCR, respectively. Among the mice, 9 samples (36%) and 3 samples (12%) were determined to contain pathogenic Leptospira by the mentioned techniques, respectively. Although the culture technique is considered the gold standard, it is not a fast technique, due to the slow growth of pathogenic Leptospira and the lack of colony formation of some species. Real-time PCR allowed rapid diagnosis with much higher accuracy compared to PCR, because PCR could not completely identify samples with a lower microbial load.
Keywords: culture, pathogenic leptospiras, PCR, real time PCR
Procedia PDF Downloads 85
18247 Analytical Solutions of Time Space Fractional, Advection-Dispersion and Whitham-Broer-Kaup Equations
Authors: Muhammad Danish Khan, Imran Naeem, Mudassar Imran
Abstract:
In this article, we study the time-space fractional advection-dispersion (FADE) equation and the time-space fractional Whitham-Broer-Kaup (FWBK) equation, which have a significant role in hydrology. We introduce suitable transformations to convert fractional-order derivatives into integer-order derivatives, and as a result these equations transform into partial differential equations (PDEs). Then the Lie symmetries and corresponding optimal systems of the resulting PDEs are derived. The symmetry reductions and exact invariant solutions based on the optimal systems are investigated; these constitute the exact solutions of the original fractional differential equations.
Keywords: modified Riemann-Liouville fractional derivative, Lie symmetries, optimal system, invariant solutions
Procedia PDF Downloads 431
18246 Message Authentication Scheme for Vehicular Ad-Hoc Networks under Sparse RSUs Environment
Authors: Wen Shyong Hsieh, Chih Hsueh Lin
Abstract:
In this paper, we combine the concepts of the chameleon hash function (CHF) and identity-based cryptography (IBC) to build a message authentication environment for VANETs under sparse RSUs. Based on the CHF, the TA keeps two common secrets that are embedded into all identities as evidence of mutual trust. The TA issues one original identity to every RSU and vehicle. An identity contains one public ID and one private key. The public ID, which includes three components (pseudonym, random key, and public key), is used to represent an entity and can be verified as legitimate. The private key is used to claim ownership of the public ID. Based on the concept of IBC, without any negotiation process, a CHF pairing key computed from one's own private key and the other party's public key is used for mutual trust and serves as the session key for secure communication between RSUs and vehicles. To help the vehicles with message authentication, the RSUs are assigned to respond to vehicles' temporary identity requests using two short-time secrets broadcast by the TA. To lighten the load of request handling, one day is divided into M time slots. In every time slot, the TA broadcasts two short-time secrets to all valid RSUs for that slot. Any RSU can respond to temporary identity requests from legitimate vehicles. With the collected announcements of public IDs from neighboring vehicles, a vehicle can set up its neighbor set, which includes each neighbor's temporary public ID and the temporary CHF pairing key that can be derived from its own private key and the neighbor's public key and will be used for message authentication or secure communication without the help of an RSU.
Keywords: Internet of Vehicles (IOV), Vehicular Ad-hoc Networks (VANETs), Chameleon Hash Function (CHF), message authentication
Procedia PDF Downloads 391
18245 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks
Authors: Afnan Al-Romi, Iman Al-Momani
Abstract:
Applying Software Engineering (SE) processes is both of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, associated with this are risks, such as system vulnerabilities and security threats. The probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open environment that may be unattended, in addition to the resource constraints in terms of processing, storage, and power, places such networks under stringent limitations such as lifetime (i.e., period of operation) and security. The importance of WSN applications, which can be found in many military and civilian domains, has drawn the attention of many researchers to their security. To address this important issue and overcome one of the main challenges of WSNs, security solution systems have been developed by researchers; those solutions are software-based network Intrusion Detection Systems (IDSs). However, it has been observed that the developed IDSs are neither secure enough nor accurate enough to detect all malicious attack behaviours. Thus, the problem is the lack of coverage of all malicious behaviours in proposed IDSs, leading to unpleasant results, such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption in WSNs caused by the IDS. In other words, not all requirements are implemented and then traced; moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks in current IDSs are due to researchers and developers not following structured software development processes when developing IDSs. Consequently, this results in inadequate requirement management, processes, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a real subject that will expand as technology evolves and spreads into industrial applications. Therefore, this paper will study the importance of Requirement Engineering when developing IDSs. Also, it will study a set of existing IDSs and illustrate the absence of Requirement Engineering and its effect. Then conclusions are drawn in regard to applying requirement engineering to systems to deliver the required functionalities, with respect to operational constraints, within an acceptable level of performance, accuracy, and reliability.
Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN
Procedia PDF Downloads 323
18244 MB-SLAM: A SLAM Framework for Construction Monitoring
Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han
Abstract:
Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To effectively use SLAM for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM and BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Also, registering SLAM to BIM in real time can boost the accuracy of SLAM, since SLAM can use features from both images and 3D models. However, registering SLAM with the BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. This framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later refined by a real-time gradient descent-based iterative method. Two case studies are presented to validate MB-SLAM. The validation process demonstrated promising results, accurately registered SLAM to BIM, and significantly improved the SLAM's localization accuracy. In addition, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial usage, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of the SLAM by providing a rough 3D model of the environment. MB-SLAM further boosts the application toward practical usage of SLAM.
Keywords: perspective alignment, progress monitoring, SLAM, stereo matching
Procedia PDF Downloads 225
18243 An Approach for Determining and Reducing Vehicle Turnaround Time for Outbound Logistics by Using Critical Path Method
Authors: Prajakta M. Wazat, D. N. Raut
Abstract:
The study considers a fast-moving consumer goods (FMCG) beverage company in which the portion of the supply chain that deals with outbound logistics is taken up for improvement in order to reduce its logistics cost by using the critical path method (CPM). Logistics is a major portion of the supply chain for which customers are not willing to pay, as it adds cost to the product without adding value. In this study, it is necessary to ensure that products are delivered to clients at the right time while preserving high-quality standards from the beginning to the end of the supply chain. CPM is a logical sequencing method in which the most efficient route is achieved by arranging the series of events. CPM enables the identification of critical activities in order to minimize delays and interruptions by providing a feasible solution.
Keywords: FMCG, supply chain, outbound logistics, vehicle turnaround time, critical path method, cost reduction
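The sketch below illustrates the CPM computation itself: a forward and a backward pass over a small activity network give the vehicle turnaround time and the critical path. The activities, durations, and precedence relations are invented for illustration and are not the company's actual outbound-logistics process.

```python
# Illustrative outbound-logistics activities: duration (hours) and predecessors -- assumed values.
activities = {
    "gate_entry":    (1, []),
    "documentation": (2, ["gate_entry"]),
    "loading":       (4, ["gate_entry"]),
    "quality_check": (1, ["loading"]),
    "invoicing":     (1, ["documentation", "quality_check"]),
    "gate_exit":     (1, ["invoicing"]),
}

# Forward pass: earliest start/finish times (dict order is already topological here)
earliest = {}
for name, (dur, preds) in activities.items():
    start = max((earliest[p][1] for p in preds), default=0)
    earliest[name] = (start, start + dur)

project_duration = max(finish for _, finish in earliest.values())

# Backward pass: latest start/finish times
latest = {}
for name in reversed(list(activities)):
    dur = activities[name][0]
    succs = [n for n, (_, preds) in activities.items() if name in preds]
    finish = min((latest[s][0] for s in succs), default=project_duration)
    latest[name] = (finish - dur, finish)

critical_path = [n for n in activities if earliest[n][0] == latest[n][0]]
print("turnaround time (h):", project_duration, "| critical path:", critical_path)
```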
Procedia PDF Downloads 165
18242 Spatial Temporal Change of COVID-19 Vaccination Condition in the US: An Exploration Based on Space Time Cube
Authors: Yue Hao
Abstract:
COVID-19 vaccines protect not only individuals but also society as a whole. Having an understanding of the change and trend of vaccination conditions may therefore shed some light on revising and updating policies regarding large-scale public health promotions and appeals in order to lead and encourage the adoption of COVID-19 vaccines. However, vaccination status changes over time and varies from place to place, hiding patterns that were not fully explored in previous research. In our research, we took advantage of spatial-temporal analytical methods from the domain of geographic information science and captured the spatial-temporal changes in COVID-19 vaccination status in the United States during 2020 and 2021. After conducting emerging hot spot analysis on both state-level data for the US and county-level data for California, we found that: (1) at the macroscopic level, there is a continuously increasing trend in the vaccination rate in the US, but there is variance in the spatial clusters at the county level; (2) spatial hot spots and clusters with high vaccination counts over time were concentrated around the west and east coasts, in regions such as California and New York City, which are densely populated and have considerable economic strength; (3) in terms of the growing trend of daily vaccinations, Los Angeles County alone has very high counts and dramatic increases over time. We hope that our findings can provide valuable guidance for future decision-making regarding vaccination policies as well as direct new research on relevant topics.
Keywords: COVID-19 vaccine, GIS, space time cube, spatial-temporal analysis
Procedia PDF Downloads 79
18241 The Advancement of Environmental Impact Assessment for 5th Transmission Natural Gas Pipeline Project in Thailand
Authors: Penrug Pengsombut, Worawut Hamarn, Teerawuth Suwannasri, Kittiphong Songrukkiat, Kanatip Ratanachoo
Abstract:
PTT Public Company Limited, or simply PTT, has played an important role in strengthening the national energy security of the Kingdom of Thailand by transporting natural gas to customers in the power, industrial, and commercial sectors since 1981. PTT has been constructing and operating a natural gas pipeline system of over 4,500 km in network length, both onshore and offshore, laid through different area classifications, i.e., marine, forest, agricultural, rural, urban, and city areas. During the project development phase, an Environmental Impact Assessment (EIA) is conducted and submitted to the Office of Natural Resources and Environmental Policy and Planning (ONEP) for approval before project construction commences. Knowledge and experience gained from EIAs of past projects are being used to further advance the EIA study process for the new 5th Transmission Natural Gas Pipeline Project (5TP), approximately 415 kilometers in length. The preferred pipeline route is selected and justified using SMARTi map, an advanced digital one-map platform that consists of multiple layers of geographic and environmental information. Sensitive area impact focus (SAIF) is a practicable impact assessment methodology appropriate for a long-distance infrastructure project such as 5TP. An environmental modeling simulation is adopted into the SAIF methodology so that impact is quantified in all sensitive areas, whereas other areas along the pipeline rights-of-way are typically assessed through representative impacts. The resulting time and cost reduction is beneficial to the project through an early start.
Keywords: environmental impact assessment, EIA, natural gas pipeline, sensitive area impact focus, SAIF
Procedia PDF Downloads 408
18240 Improving Pneumatic Artificial Muscle Performance Using Surrogate Model: Roles of Operating Pressure and Tube Diameter
Authors: Van-Thanh Ho, Jaiyoung Ryu
Abstract:
In soft robotics, the optimization of fluid dynamics through pneumatic methods plays a pivotal role in enhancing operational efficiency and reducing energy loss. This is particularly crucial when replacing conventional techniques such as cable-driven electromechanical systems. The pneumatic model employed in this study represents a sophisticated framework designed to efficiently channel pressure from a high-pressure reservoir to various muscle locations on the robot's body. This intricate network involves a branching system of tubes. The study introduces a comprehensive pneumatic model encompassing the components of a reservoir, tubes, and Pneumatically Actuated Muscles (PAMs). The development of this model is rooted in the principles of shock tube theory. Notably, the study leverages experimental data to enhance the understanding of the interplay between the PAM structure and the surrounding fluid. This improved interactive approach involves the use of morphing motion, guided by a contraction function. The study's findings demonstrate a high degree of accuracy in predicting pressure distribution within the PAM, with the model's error relative to experimental data remaining below a threshold of 10%. Additionally, the research employs a machine learning model, specifically a surrogate model based on the Kriging method, to assess and quantify uncertainty factors related to the initial reservoir pressure and tube diameter. This comprehensive approach enhances our understanding of pneumatic soft robotics and its potential for improved operational efficiency.
Keywords: pneumatic artificial muscles, pressure drop, morphing motion, branched network, surrogate model
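The final step above, a Kriging surrogate over initial reservoir pressure and tube diameter, can be sketched as Gaussian process regression. The snippet below fits such a surrogate to a synthetic pressure-drop response and reports a predictive mean with its standard deviation as an uncertainty estimate; the parameter ranges and response function are assumptions standing in for the paper's shock-tube-based model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

rng = np.random.default_rng(0)
# Design points: initial reservoir pressure (kPa) and tube diameter (mm) -- ranges assumed
pressure = rng.uniform(200, 600, 40)
diameter = rng.uniform(2, 8, 40)
X = np.column_stack([pressure, diameter])
# Stand-in response: pressure drop along the tube network (kPa)
y = 0.05 * pressure / diameter**2 + rng.normal(0, 0.05, 40)

# Kriging surrogate = Gaussian process regression with an anisotropic RBF covariance
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[100.0, 2.0]),
                              normalize_y=True).fit(X, y)

# Predictive mean and standard deviation quantify uncertainty over the design space
mean, std = gp.predict(np.array([[400.0, 4.0]]), return_std=True)
print(f"predicted pressure drop: {mean[0]:.3f} kPa +/- {std[0]:.3f}")
```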
Procedia PDF Downloads 98
18239 Forecasting of COVID-19 Cases, Hospitalization Admissions, and Death Cases Based on Wastewater SARS-CoV-2 Surveillance Using Copula Time Series Model
Authors: Hueiwang Anna Jeng, Norou Diawara, Nancy Welch, Cynthia Jackson, Rekha Singh, Kyle Curtis, Raul Gonzalez, David Jurgens, Sasanka Adikari
Abstract:
Modeling effort is needed to predict COVID-19 trends for developing management strategies and adaptation measures. The objective of this study was to assess whether the SARS-CoV-2 viral load in wastewater could serve as a predictor for forecasting COVID-19 cases, hospitalization cases, and death cases using copula-based time series modeling. The SARS-CoV-2 RNA load in raw wastewater in Chesapeake, VA was measured using the RT-qPCR method. A Gaussian copula time series marginal regression model, incorporating an autoregressive moving average model and the copula function, served as the forecasting model. COVID-19 cases were correlated with wastewater viral load, hospitalization cases, and death cases. The forecasted trend of COVID-19 cases closely paralleled that of the reported cases, with over 90% of the forecasted COVID-19 cases falling within the 99% confidence interval of the reported cases. Wastewater SARS-CoV-2 viral load could serve as a predictor for COVID-19 cases and hospitalization cases.
Keywords: COVID-19, modeling, time series, copula function
Procedia PDF Downloads 69
18238 The Role of Information Technology in Supply Chain Management
Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao
Abstract:
This paper explains the significance of information technology tools and software packages in supply chain management (SCM) for managing the entire supply chain. Material flow, financial flow, and information flow are managed effectively and efficiently with the aid of information technology tools and packages in order to deliver the right quantity and quality of goods at the right time by using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting, demand planning, inventory control, and transportation in supply networks, and finally deals with production planning and scheduling. It achieves these objectives by streamlining business processes and integrating them within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities spanning customer, retailer, distributor, manufacturer, and supplier within the supply chain framework. Forecasting indicates the direction for planning raw materials in order to meet the production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity while utilizing the shortest possible routes to deliver the goods to the customer. Production planning and control utilize the optimal resource mix in order to meet capacity requirements planning. The above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
Keywords: supply chain management, information technology, business process, extended enterprise
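The economic order quantity mentioned above has a simple closed form, EOQ = sqrt(2DS/H). The sketch below evaluates it for illustrative demand and cost figures that are not taken from the paper.

```python
import math

# Classic economic order quantity: EOQ = sqrt(2 * D * S / H)
# D = annual demand (units), S = ordering cost per order, H = holding cost per unit per year.
# The numbers below are illustrative assumptions.
annual_demand = 12000
ordering_cost = 150.0
holding_cost = 2.5

eoq = math.sqrt(2 * annual_demand * ordering_cost / holding_cost)
orders_per_year = annual_demand / eoq
print(f"economic order quantity: {eoq:.0f} units, placed {orders_per_year:.1f} times per year")
```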
Procedia PDF Downloads 377
18237 A High Time Resolution Digital Pulse Width Modulator Based on Field Programmable Gate Array's Phase Locked Loop Megafunction
Authors: Jun Wang, Tingcun Wei
Abstract:
The digital pulse width modulator (DPWM) is the crucial building block of a digitally controlled DC-DC switching converter; it converts the digital duty-ratio signal into its analog counterpart to switch the power MOSFET transistors on or off. As the switching frequency of digitally controlled DC-DC converters increases, a DPWM with higher time resolution is required. In this paper, a 15-bit DPWM with a three-level hybrid structure is presented: the first level is composed of a 7-bit counter and a comparator, the second is a 5-bit delay line, and the third is a 3-bit digital dither. The presented DPWM is designed and implemented using the PLL megafunction of an FPGA (Field Programmable Gate Array), and the required clock frequency is 128 times the switching frequency. The simulation results show that, for a switching frequency of 2 MHz, a DPWM with a time resolution of 15 ps is achieved using a maximum clock frequency of 256 MHz. The DPWM designed in this paper is especially useful for high-frequency digitally controlled DC-DC switching converters.
Keywords: DPWM, digitally-controlled DC-DC switching converter, FPGA, PLL megafunction, time resolution
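A quick back-of-the-envelope check of how the three levels combine to reach the reported resolution is shown below. It assumes each stage simply subdivides the step of the previous one (counter step, then the 5-bit delay line, then the 3-bit dither), which reproduces the roughly 15 ps figure.

```python
# Back-of-the-envelope check of the three-level DPWM resolution, assuming each
# stage subdivides the previous stage's step (counter -> delay line -> dither).
switching_freq = 2e6                             # Hz
clock_freq = 128 * switching_freq                # 256 MHz, as stated in the abstract
counter_step = 1 / clock_freq                    # 7-bit counter step = one clock period
delay_line_step = counter_step / 2**5            # 5-bit delay line subdivides one clock period
effective_step = delay_line_step / 2**3          # 3-bit dither averages over 8 switching cycles

print(f"counter step:    {counter_step * 1e9:.3f} ns")      # ~3.906 ns
print(f"delay-line step: {delay_line_step * 1e12:.1f} ps")  # ~122.1 ps
print(f"effective step:  {effective_step * 1e12:.2f} ps")   # ~15.26 ps, matching the ~15 ps claim
```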
Procedia PDF Downloads 480
18236 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging
Authors: Suleiman Obeidat, Nabeel Mandahawi
Abstract:
In this work, the packaging process for a move is improved. The customers of this move need their household goods to be moved from their current house to the new one with minimum damage, in an organized manner, on time, and at the minimum cost. Our goal was to improve time efficiency by 10% to 20%, reduce damaged parts by 90%, and achieve an acceptable improvement in the cost of the total move process. The expected ROI was 833%. Many improvement techniques have been applied to the way the boxes are prepared, their preparation cost, packing the goods, labeling them, and moving them to a place for moving out. The DMAIC methodology is used in this work: a SIPOC diagram, a value stream map of the "As Is" process, root cause analysis, maps of the "Future State" and "Ideal State", and an improvement plan. A value of ROI = 624% is obtained, which is lower than the expected value of 833%. The work explains the improvement techniques and the deficiencies in the old process.
Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC
Procedia PDF Downloads 428
18235 Comparison of Deep Learning and Machine Learning Algorithms to Diagnose and Predict Breast Cancer
Authors: F. Ghazalnaz Sharifonnasabi, Iman Makhdoom
Abstract:
Breast cancer is a serious health concern that affects many people around the world. According to a study published in the Breast journal, the global burden of breast cancer is expected to increase significantly over the next few decades. The number of deaths from breast cancer has been increasing over the years, but the age-standardized mortality rate has decreased in some countries. It is important to be aware of the risk factors for breast cancer and to get regular check-ups to catch it early if it does occur. Machine learning techniques have been used to aid in the early detection and diagnosis of breast cancer. These techniques, which have been shown to be effective in predicting and diagnosing the disease, have become a research hotspot. In this study, we consider two deep learning approaches: the Multi-Layer Perceptron (MLP) and the Convolutional Neural Network (CNN). We also considered five machine learning algorithms, Decision Tree (C4.5), Naïve Bayes (NB), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and XGBoost (eXtreme Gradient Boosting), on the Breast Cancer Wisconsin Diagnostic dataset. We carried out the process of evaluating and comparing the classifiers, which involved selecting appropriate metrics to evaluate classifier performance and an appropriate tool to quantify this performance. The main purpose of the study is to predict and diagnose breast cancer by applying the mentioned algorithms and to discover the most effective one with respect to the confusion matrix, accuracy, and precision. The CNN outperformed all other classifiers and achieved the highest accuracy (0.982456). The work is implemented in the Anaconda environment based on the Python programming language.
Keywords: breast cancer, multi-layer perceptron, Naïve Bayesian, SVM, decision tree, convolutional neural network, XGBoost, KNN
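The classical part of the comparison described above can be sketched directly with scikit-learn on the same Breast Cancer Wisconsin Diagnostic dataset (bundled with the library). The snippet below uses default hyperparameters and an assumed 70/30 split, substitutes scikit-learn's MLP for the paper's MLP, and omits the CNN and XGBoost models, so its scores are only indicative and are not the paper's results.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, precision_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:13s} accuracy={accuracy_score(y_te, pred):.4f} "
          f"precision={precision_score(y_te, pred):.4f}")
```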
Procedia PDF Downloads 75
18234 A Hybrid Algorithm for Collaborative Transportation Planning among Carriers
Authors: Elham Jelodari Mamaghani, Christian Prins, Haoxun Chen
Abstract:
This paper concentrates on collaborative transportation planning (CTP) among multiple carriers with pickup and delivery requests and time windows. This problem is a vehicle routing problem with constraints from standard vehicle routing problems and new constraints from a real-world application. In the problem, each carrier has a finite number of vehicles, and each request is a pickup and delivery request with a time window. Moreover, each carrier has reserved requests, which must be served by itself, whereas its exchangeable requests can be outsourced to and served by other carriers. This collaboration among carriers can help them reduce total transportation costs. A mixed integer programming model is proposed for the problem. To solve the model, a hybrid algorithm that combines a Genetic Algorithm and Simulated Annealing (GASA) is proposed; this algorithm takes advantage of both GA and SA at the same time. After tuning the parameters of the algorithm with the Taguchi method, experiments are conducted and experimental results are provided for the hybrid algorithm. The results are compared with those obtained by a commercial solver. The comparison indicates that GASA significantly outperforms the commercial solver.
Keywords: centralized collaborative transportation, collaborative transportation with pickup and delivery, collaborative transportation with time windows, hybrid algorithm of GA and SA
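To give a feel for the GA/SA hybridization described above, the sketch below runs a small genetic algorithm over assignments of exchangeable requests to carriers and refines the best individual of each generation with a simulated annealing local search. The cost matrix, operators, and parameters are illustrative assumptions and ignore vehicles, routing, and time windows, so this is only a toy analogue of the paper's GASA, not its implementation.

```python
import random
import math

random.seed(0)
N_REQUESTS, N_CARRIERS = 12, 3
# Toy cost matrix: cost[r][c] = cost of carrier c serving exchangeable request r (illustrative)
cost = [[random.uniform(1, 10) for _ in range(N_CARRIERS)] for _ in range(N_REQUESTS)]

def total_cost(assignment):
    return sum(cost[r][c] for r, c in enumerate(assignment))

def mutate(assignment):
    child = assignment[:]
    child[random.randrange(N_REQUESTS)] = random.randrange(N_CARRIERS)
    return child

def anneal(assignment, temp=5.0, cooling=0.95, steps=200):
    """Simulated-annealing local search applied to the best GA individual."""
    best = current = assignment
    for _ in range(steps):
        cand = mutate(current)
        delta = total_cost(cand) - total_cost(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = cand
            if total_cost(current) < total_cost(best):
                best = current
        temp *= cooling
    return best

# Genetic algorithm: truncation selection, uniform crossover, SA refinement of the elite
population = [[random.randrange(N_CARRIERS) for _ in range(N_REQUESTS)] for _ in range(20)]
for generation in range(50):
    population.sort(key=total_cost)
    population[0] = anneal(population[0])                  # hybrid GASA step
    parents = population[:10]
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = [a[i] if random.random() < 0.5 else b[i] for i in range(N_REQUESTS)]
        children.append(mutate(child))
    population = parents + children

print("best assignment cost:", round(total_cost(min(population, key=total_cost)), 2))
```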
Procedia PDF Downloads 392
18233 IoT and Advanced Analytics Integration in Biogas Modelling
Authors: Rakesh Choudhary, Ajay Kumar, Deepak Sharma
Abstract:
The main goal of this paper is to investigate the challenges and benefits of IoT integration in biogas production. This overview explains how the inclusion of IoT can enhance biogas production efficiency: the collected data can be explored with advanced analytics, including artificial intelligence (AI) and machine learning (ML) algorithms, consequently improving bio-energy processes. To boost biogas generation efficiency, this report examines the use of IoT devices for real-time data collection on key parameters, e.g., pH, temperature, gas composition, and microbial growth. Real-time monitoring through big data has made it possible to detect diverse, complex trends in the biogas production process. Insights informed by advanced analytics can also help improve bio-energy production as well as optimize operational conditions. Moreover, IoT allows remote observation, control, and management, which decreases the manual intervention needed while increasing process effectiveness. Such a paradigm shift in incorporating IoT technologies into biogas production systems helps achieve higher productivity levels as well as better biomethane quality through proactive decision-making based on real-time monitoring, thus driving continuous performance improvement.
Keywords: internet of things, biogas, renewable energy, sustainability, anaerobic digestion, real-time monitoring, optimization
Procedia PDF Downloads 20