Search results for: intelligent automation
234 Emotional Intelligence Training: Helping Non-Native Pre-Service EFL Teachers to Overcome Speaking Anxiety: The Case of Pre-Service Teachers of English, Algeria
Authors: Khiari Nor El Houda, Hiouani Amira Sarra
Abstract:
Many EFL students with high capacities remain hidden because they suffer from speaking anxiety (SA). Most of them find public speaking very demanding: they feel unable to communicate, fear making mistakes, and fear negative evaluation or being called on. With the growing number of learners who suffer from foreign language speaking anxiety (FLSA), it is becoming increasingly difficult to ignore its harmful effects on their performance and success, especially during their first contact with pupils, as they will be teaching in the near future. Different researchers have suggested different ways to minimize the negative effects of FLSA. The present study sheds light on emotional intelligence (EI) skills training as an effective strategy not only to support public speaking success but also to help pre-service EFL teachers lessen their speaking anxiety and eventually to prepare them for their professional career. A quasi-experiment was used to examine the research hypothesis. We worked with two groups of third-year EFL students at Oum El Bouaghi University. The Foreign Language Classroom Anxiety Scale (FLCAS) and the Emotional Quotient Inventory (EQ-i) were used to collect data about the participants’ FLSA and EI levels. The analysis showed that the assumed negative correlation between EI and FLSA was statistically validated by the Pearson correlation test: the more emotionally intelligent an individual is, the less anxious s/he will be. In addition, the lack of improvement in the control group and the noteworthy improvement in the experimental group led us to conclude that EI skills training was an effective strategy for minimizing FLSA levels; we therefore confirmed our research hypothesis.
Keywords: emotional intelligence, emotional intelligence skills training, EQ-i, FLCAS, foreign language speaking anxiety, pre-service EFL teachers
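The Pearson correlation test reported in this abstract can be illustrated with a small sketch (the scores below are hypothetical, not the study's data):

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical EQ-i scores vs. FLCAS anxiety scores for six students:
eq_scores = [85, 95, 100, 110, 120, 130]
flcas_scores = [140, 132, 120, 104, 95, 82]
r = pearson_r(eq_scores, flcas_scores)
```

A value near -1 reflects the reported pattern: higher EQ-i scores go with lower FLCAS anxiety scores.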
Procedia PDF Downloads 143
233 Shape Management Method of Large Structure Based on Octree Space Partitioning
Authors: Gichun Cha, Changgil Lee, Seunghee Park
Abstract:
The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and maintenance of civil infrastructure. TLS produces a huge amount of point cloud data, and registration, extraction, and visualization require processing this massive amount of scan data. The octree can be applied to the shape management of large structures because it reduces the size of the scan data while maintaining its attributes. Octree space partitioning generates a voxel of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University, and the scanned structure is a steel-frame bridge. The TLS used was a Leica ScanStation C10/C5. The scan data were condensed by 92%, and the octree model was constructed with a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds as a basis for shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning
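The recursive octree subdivision described here can be sketched as follows. This is a minimal illustration of voxel-based point-cloud reduction; the bounds handling, depth limit, and centroid sampling are assumptions for the sketch, not the paper's implementation:

```python
def octree_downsample(points, bounds, depth, max_depth):
    """Recursively partition points into octants; at the maximum depth,
    keep one representative point (the centroid) per occupied voxel."""
    if not points:
        return []
    (x0, y0, z0), (x1, y1, z1) = bounds
    if depth == max_depth:
        n = len(points)
        return [(sum(p[0] for p in points) / n,
                 sum(p[1] for p in points) / n,
                 sum(p[2] for p in points) / n)]
    mx, my, mz = (x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2
    octants = {}
    for p in points:
        key = (p[0] >= mx, p[1] >= my, p[2] >= mz)
        octants.setdefault(key, []).append(p)
    out = []
    for (bx, by, bz), pts in octants.items():
        sub = ((mx if bx else x0, my if by else y0, mz if bz else z0),
               (x1 if bx else mx, y1 if by else my, z1 if bz else mz))
        out.extend(octree_downsample(pts, sub, depth + 1, max_depth))
    return out
```

Because only occupied voxels are subdivided, dense scan data collapses to one point per leaf voxel while the overall shape attributes are preserved, which is the compression effect the abstract reports.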
Procedia PDF Downloads 298
232 Use of Metallic and Bimetallic Nanostructures as Constituents of Active Bio-Based Films
Authors: Lina F. Ballesteros, Hafsae Lamsaf, Miguel A. Cerqueira, Lorenzo M. Pastrana, Sandra Carvalho, Jose A. Teixeira, S. Calderon V.
Abstract:
The use of bio-based packaging materials containing metallic and bimetallic nanostructures is a relatively recent technology. The food packaging industry has been investigating biological and renewable resources that can replace petroleum-based materials to reduce the environmental impact while, at the same time, introducing new functionalities through nanotechnology. Therefore, the main objective of the present work was to develop bio-based poly-lactic acid (PLA) films with zinc (Zn) and zinc-iron (Zn-Fe) nanostructures deposited by magnetron sputtering. The structural, antimicrobial, and optical properties of the films were evaluated when exposed to 60% and 96% relative humidity (RH). The morphology and elemental composition of the samples were determined by scanning (transmission) electron microscopy (SEM and STEM) and inductively coupled plasma optical emission spectroscopy (ICP-OES). The structure of the PLA was monitored before and after deposition by Fourier transform infrared spectroscopy (FTIR), and the antimicrobial and color assays were performed using the zone of inhibition (ZOI) test and a Minolta colorimeter, respectively. Finally, the films were correlated in terms of the deposition conditions, Zn or Zn-Fe concentrations, and thickness. The results revealed PLA films with different morphologies, compositions, and thicknesses of Zn or Zn-Fe nanostructures. The samples showed significant antibacterial and antifungal activity against E. coli, P. aeruginosa, P. fluorescens, S. aureus, and A. niger, and considerable changes of color and opacity at 96% RH, especially for the thinner nanostructures (150-250 nm). On the other hand, when the Fe fraction was increased, the lightness of the samples increased, as did their antimicrobial activity compared to the films with pure Zn.
Hence, these findings are relevant to the food packaging field, since intelligent and active films with multiple properties can be developed.
Keywords: biopolymers, functional properties, magnetron sputtering, Zn and Zn-Fe nanostructures
Procedia PDF Downloads 127
231 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare, and transportation markets, as there is growing emphasis on intelligent devices. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. The algorithm will be used to estimate the weight at the site, which will be compared with the actual weight at the plantation, and to raise the necessary alerts when there is a discrepancy, thus enabling better decision making. In current practice, data are collected from various locations in various forms, and it is a challenge to consolidate them to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections make it difficult to obtain timely, accurate information. To overcome these challenges, the algorithm is embedded on a portable device that also assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has a high potential for bringing embedded systems into the agriculture industry.
EWCA-BI will provide BI reports with accurate information from uncompromised data using an embedded system and will provide alerts, therefore enabling effective operation management decision-making at the site.
Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
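A minimal sketch of the weight comparison step with a discrepancy alert (the 2% tolerance and field names are assumed for illustration, not taken from the framework):

```python
def compare_weights(site_estimate_kg, weighbridge_actual_kg, tolerance_pct=2.0):
    """Flag a discrepancy when the site estimate and the weighbridge
    reading differ by more than the allowed tolerance."""
    if weighbridge_actual_kg == 0:
        raise ValueError("actual weight must be non-zero")
    deviation_pct = abs(site_estimate_kg - weighbridge_actual_kg) / weighbridge_actual_kg * 100
    return {"deviation_pct": round(deviation_pct, 2),
            "alert": deviation_pct > tolerance_pct}
```

Because the check is embedded on the portable device, the alert can be raised at the collection point even when the network link to the BI platform is temporarily down, matching the offline-capture design described above.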
Procedia PDF Downloads 291
230 Economic Development Impacts of Connected and Automated Vehicles (CAV)
Authors: Rimon Rafiah
Abstract:
This paper presents a combination of two seemingly unrelated models: one for estimating the economic development impacts of transportation investment, and another for increasing CAV penetration in order to reduce congestion. Measuring the economic development impacts of transportation investments is becoming more widely recognized around the world. Examples include the UK’s Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, various input-output models, and additional models around the world. The economic impact model here is based on WEB and rests on the following premise: investments in transportation reduce the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing the cost of workers' travel to a new workplace. This reduction in travel costs was estimated in out-of-pocket terms in a given localized area and then translated into additional employment based on regional labor supply elasticity. This additional employment was conservatively assumed to be at minimum wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax collected by the government). The CAV model is based on economic principles such as CAV usage, supply, and demand. CAVs can increase capacity by a variety of means: increased automation (Levels 1 through 4) and increased penetration and usage, which several forecasts predict will reach 50% by 2030, with possible full conversion by 2045-2050. Several countries have passed policies and/or legislation prohibiting sales of new gasoline-powered vehicles starting in 2030 or later. Supply was measured via increased capacity on given infrastructure as a function of both CAV penetration and implemented technologies.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth, and salaries. The models have policy implications and can be adapted for use in Japan as well.
Keywords: CAV, economic development, WEB, transport economics
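The WEB-style chain the abstract describes (travel-cost saving, labor supply elasticity, employment, GDP, taxation) can be sketched with illustrative numbers; every parameter value below is a hypothetical assumption, not a figure from the paper:

```python
def employment_impact(cost_saving_per_trip, trips_per_year, labor_supply_elasticity,
                      baseline_employment, avg_wage_per_year, tax_rate):
    """Sketch of the WEB-style chain: travel-cost saving -> effective
    wage change -> additional employment -> GDP -> direct taxation.
    Conservative assumption: new jobs are valued at the given wage."""
    annual_saving_per_worker = cost_saving_per_trip * trips_per_year
    wage_change_frac = annual_saving_per_worker / avg_wage_per_year
    new_jobs = baseline_employment * labor_supply_elasticity * wage_change_frac
    gdp_gain = new_jobs * avg_wage_per_year
    tax_gain = gdp_gain * tax_rate
    return {"new_jobs": round(new_jobs),
            "gdp_gain": round(gdp_gain, 2),
            "tax_gain": round(tax_gain, 2)}
```

For example, a $2 per-trip saving over 500 commuting trips a year is a 5% effective wage gain at a $20,000 wage; with an elasticity of 0.1 and 100,000 baseline workers, that implies roughly 500 additional jobs.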
Procedia PDF Downloads 77
229 Analysis of Two-Echelon Supply Chain with Perishable Items under Stochastic Demand
Authors: Saeed Poormoaied
Abstract:
Perishability and developing an intelligent control policy for perishable items are major concerns of marketing managers in a supply chain. In this study, we address a two-echelon supply chain problem for perishable items with a single vendor and a single buyer. The buyer adopts an age-based continuous review policy which takes both the stock level and the aging process of items into account. The vendor works under the warehouse framework, where its lot size is determined with respect to the batch size of the buyer. The model holds for a positive, fixed lead time for the buyer and zero lead time for the vendor. Demand follows a Poisson process, and any unmet demand is lost. We provide exact analytic expressions for the operational characteristics of the system by using the renewal reward theorem. Items have a fixed lifetime, after which they become unusable and are disposed of from the buyer's system. The age of items starts when they are unpacked and ready for consumption at the buyer; while items are held by the vendor, there is no aging process, and thus no perishing at the vendor's site. The model is developed under the centralized framework, which takes the expected profit of both vendor and buyer into consideration. The goal is to determine the optimal policy parameters under a service level constraint at the retailer's site. A sensitivity analysis is performed to investigate the effect of the key input parameters on the expected profit and order quantity in the supply chain. The efficiency of the proposed age-based policy is also evaluated through a numerical study. Our results show that when the unit perishing cost is negligible, a significant cost saving is achieved.
Keywords: two-echelon supply chain, perishable items, age-based policy, renewal reward theorem
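The buyer-side dynamics (Poisson demand, lost sales, fixed lifetime, FIFO issuing) can be approximated in a small Monte Carlo sketch. Zero replenishment lead time is assumed here for simplicity, unlike the paper's positive buyer lead time, and all parameter values are illustrative:

```python
import math
import random

def simulate_perishable(days, order_qty, reorder_level, lifetime, demand_rate, seed=0):
    """Monte Carlo sketch of a continuous-review policy for a perishable item:
    Poisson daily demand, lost sales, FIFO issuing, disposal at expiry."""
    rng = random.Random(seed)
    stock = [[order_qty, lifetime]]  # FIFO batches: [quantity, remaining life in days]
    sold = lost = perished = 0
    for _ in range(days):
        # draw Poisson(demand_rate) demand by inversion (fine for small rates)
        d, p = 0, math.exp(-demand_rate)
        u, cum = rng.random(), p
        while u > cum:
            d += 1
            p *= demand_rate / d
            cum += p
        # serve demand from the oldest batch first
        while d > 0 and stock:
            take = min(d, stock[0][0])
            stock[0][0] -= take
            sold += take
            d -= take
            if stock[0][0] == 0:
                stock.pop(0)
        lost += d  # unmet demand is lost, as in the model
        # age batches; discard those that expire today
        perished += sum(q for q, life in stock if life == 1)
        stock = [[q, life - 1] for q, life in stock if life > 1]
        # reorder as soon as on-hand inventory falls to the reorder level
        if sum(q for q, _ in stock) <= reorder_level:
            stock.append([order_qty, lifetime])
    return {"sold": sold, "lost": lost, "perished": perished}
```

A simulation of this kind is how one would numerically cross-check the exact renewal-reward expressions the paper derives, by comparing simulated fill rate and perishing counts against the analytic values for the same parameters.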
Procedia PDF Downloads 147
228 Software Development to Empowering Digital Libraries with Effortless Digital Cataloging and Access
Authors: Abdul Basit Kiani
Abstract:
The software for the digital library system is a cutting-edge solution designed to revolutionize the way libraries manage and provide access to their vast collections of digital content. This advanced software leverages the power of technology to offer a seamless and user-friendly experience for both library staff and patrons. By implementing this software, libraries can efficiently organize, store, and retrieve digital resources, including e-books, audiobooks, journals, articles, and multimedia content. Its intuitive interface allows library staff to effortlessly manage cataloging, metadata extraction, and content enrichment, ensuring accurate and comprehensive access to digital materials. For patrons, the software offers a personalized and immersive digital library experience. They can easily browse the digital catalog, search for specific items, and explore related content through intelligent recommendation algorithms. The software also facilitates seamless borrowing, lending, and preservation of digital items, enabling users to access their favorite resources anytime, anywhere, on multiple devices. With robust security features, the software ensures the protection of intellectual property rights and enforces access controls to safeguard sensitive content. Integration with external authentication systems and user management tools streamlines the library's administration processes, while advanced analytics provide valuable insights into patron behavior and content usage. Overall, this software for the digital library system empowers libraries to embrace the digital era, offering enhanced access, convenience, and discoverability of their vast collections. It paves the way for a more inclusive and engaging library experience, catering to the evolving needs of tech-savvy patrons.
Keywords: software development, empowering digital libraries, digital cataloging and access, management system
Procedia PDF Downloads 88
227 A Cross-Cultural Approach for Communication with Biological and Non-Biological Intelligences
Authors: Thomas Schalow
Abstract:
This paper posits the need to take a cross-cultural approach to communication with non-human cultures and intelligences in order to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with a discussion of how intelligence emerges. It disputes some common assumptions we maintain about consciousness, intention, and language. The paper next explores cross-cultural communication among humans, including non-sapiens species. The next argument made is that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. There is an urgent need to broaden our definition of communication and reach out to the other sentient life forms that inhabit our world. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it has proven useful, even in the absence of contact with alien life. However, CETI’s assumptions and methodology need to be revised and based on the cross-cultural approach to communication proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences using a cross-cultural communication approach. This will present a serious challenge for humanity, as we have never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other human cultures can provide us with a framework for this communication. 
The basic assumptions behind intercultural communication can be applied to the many types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will prepare us to face the challenges posed by a future dominated by artificial intelligence.
Keywords: artificial intelligence, CETI, communication, culture, language
Procedia PDF Downloads 361
226 In-vitro Metabolic Fingerprinting Using Plasmonic Chips by Laser Desorption/Ionization Mass Spectrometry
Authors: Vadanasundari Vedarethinam, Kun Qian
Abstract:
Metabolic analysis is more distal than proteomics and genomics in clinical engagement and needs rationally distinct techniques, designed materials, and devices for clinical diagnosis. Conventional approaches such as spectroscopic techniques, biochemical analyzers, and electrochemical methods have been used for metabolic diagnosis. Currently, there are four major challenges: (I) long sample pretreatment processes; (II) difficulty in the direct metabolic analysis of biosamples due to their complexity; (III) accurate detection of low-molecular-weight metabolites; and (IV) construction of diagnostic tools on material- and device-based platforms for real-case biomedical applications. The development of chips with nanomaterials is promising for addressing these critical issues. Mass spectrometry (MS) offers high sensitivity, accuracy, throughput, reproducibility, and resolution for molecular analysis. In particular, laser desorption/ionization mass spectrometry (LDI MS) combined with devices affords desirable speed for mass measurement in seconds and high sensitivity at low cost for large-scale use. We developed a plasmonic chip for clinical metabolic fingerprinting as a hot carrier in LDI MS through a series of chips with gold nanoshells on the surface, produced by controlled particle synthesis, dip-coating, and gold sputtering for mass production. We integrated the optimized chip with microarrays for laboratory automation and nanoscaled experiments, which afforded direct, high-performance metabolic fingerprinting by LDI MS using 500 nL of serum, urine, cerebrospinal fluid (CSF), or exosomes. Further, we demonstrated on-chip, direct in-vitro metabolic diagnosis of early-stage lung cancer patients using serum and exosomes without any pretreatment or purification.
To the best of our knowledge, this work initiates a bionanotechnology-based platform for advanced metabolic analysis toward large-scale diagnostic use.
Keywords: plasmonic chip, metabolic fingerprinting, LDI MS, in-vitro diagnostics
Procedia PDF Downloads 166
225 Adaptation of Smart City Concept in Africa: Localization, Relevance and Bottleneck
Authors: Adeleye Johnson Adelagunayeja
Abstract:
The concept of making cities, communities, and neighborhoods smart, intelligent, and responsive is relatively new to Africa and its urban renewal agencies. Efforts must be made by relevant agencies to begin a holistic review of the implementation of infrastructural facilities and urban renewal methodologies that revolve around the appreciation and application of artificial intelligence. The propagation of the ideals and benefits of the smart city concept is a key factor that can encourage governments of African nations, the African Union, and other regional organizations in Africa to embrace the ideology. The ability of the smart city concept to curb insecurity (armed robbery, assassination, terrorism, and civil disorder) is one major reason, among others, why African governments must speedily embrace this contemporary developmental concept whose time has come. Seamless access to information and the ability to virtually cross-pollinate ideas with people living in already established smart cities, combined with the great efficiency that the emergence of smart cities brings, are further reasons why Africa must draw up action plans that enable existing cities to metamorphose into smart cities. Innovation will be required for Africa to develop a smart city concept compatible with its basic patterns of livelihood, because the essence of the smart city evolution is to make life better: for people to co-exist, to be productive, and to enjoy standard infrastructural facilities. This research paper enumerates the multifaceted adaptive factors that have the potential to make the adoption of the smart city concept in Africa seamless.
It also proffers solutions to potential bottlenecks capable of undermining the execution of the smart city concept in Africa.
Keywords: smart city compatibility, innovation, African governments, evolution in Africa, Africa as a global village member, adoption of the smart city concept in Africa, localizing the smart city concept in Africa, bottlenecks to smart city development in Africa
Procedia PDF Downloads 88
224 The Relationship of Lean Management Principles with Lean Maturity Levels: Multiple Case Study in Manufacturing Companies
Authors: Alexandre D. Ferraz, Dario H. Alliprandini, Mauro Sampaio
Abstract:
Companies and other institutions constantly seek better organizational performance and greater competitiveness, and many tools, methodologies, and models exist for increasing performance. However, the Lean Management approach seems to be the most effective in achieving a significant improvement in productivity relatively quickly. Although Lean tools are relatively easy to understand and implement in different contexts, many organizations are not able to transform themselves into 'Lean companies'. Most implementation efforts have shown isolated benefits, failing to achieve the desired impact on the performance of the overall enterprise system. There is also a growing perception of the importance of management in Lean transformation, but few studies have empirically investigated and described 'Lean Management'. In order to understand more clearly the ideas that guide Lean Management and its influence on the maturity level of the production system, the objective of this research is to analyze the relationship between Lean Management principles and the Lean maturity level in organizations. The research also analyzes the principles of Lean Management and their relationship with 'Lean culture' and the results obtained. The research was developed using the case study methodology. Three manufacturing units of a German multinational company from the industrial automation segment, located in different countries, were studied in order to allow a better comparison between the practices and the maturity level of implementation. The primary source of information was a research questionnaire based on the theoretical review. The research showed that the higher the level of Lean Management principles, the higher the Lean maturity level, the Lean culture level, and the level of Lean results obtained in the organization.
The research also showed that factors such as the time since Lean concepts were first applied and company size were not determinant for the level of Lean Management principles and, consequently, for the level of Lean maturity in the organization; the characteristics of the production system showed much more influence on the different aspects evaluated. The present research also offers recommendations for the managers of the plants analyzed and suggestions for future research.
Keywords: lean management, lean principles, lean maturity level, lean manufacturing
Procedia PDF Downloads 148
223 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
An intelligent transportation system is essential to building smarter cities, and machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects of the system visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is built from real-time bus data gathered through public data portals and real-time Application Programming Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information). The prototype model was designed with a machine learning tool (RapidMiner Studio), and tests for bus delay prediction were conducted. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
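A toy stand-in for the delay prediction step is sketched below: a k-nearest-neighbor regressor over hypothetical traffic, weather, and boarding features. This is an illustration of the idea, not the RapidMiner model the research built:

```python
def knn_predict_delay(history, query, k=3):
    """Predict bus delay (minutes) as the mean delay of the k most
    similar past condition records.
    Each record: ((road_speed_kmh, rain_mm, boarding_count), delay_min)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda rec: dist(rec[0], query))[:k]
    return sum(delay for _, delay in nearest) / k

# Hypothetical historical records: slow, rainy, crowded trips ran late.
history = [((20, 5, 40), 12), ((22, 4, 35), 10), ((50, 0, 10), 1),
           ((55, 0, 12), 0), ((21, 6, 38), 11), ((52, 0, 9), 2)]
pred = knn_predict_delay(history, (21, 5, 39))
```

With real-time API feeds, the same query pattern would be re-evaluated continuously, which is what allows a flexible headway instead of a fixed timetable.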
Procedia PDF Downloads 263
222 Save Lives: The Application of Geolocation-Awareness Service in Iranian Pre-hospital EMS Information Management System
Authors: Somayeh Abedian, Pirhossein Kolivand, Hamid Reza Lornejad, Amin Karampour, Ebrahim Keshavarz Safari
Abstract:
For emergency and relief service providers such as pre-hospital emergency services, quick arrival at the scene of an accident or any EMS mission is one of the most important requirements of effective service delivery. Response time (the interval between the time of the call and the time of arrival on scene) is a critical factor in determining the quality of pre-hospital Emergency Medical Services (EMS). This is especially important for heart attack, stroke, or accident patients. Location-based e-services can be broadly defined as any service that provides information pertinent to the current location of an active mobile handset, or the precise address of a landline phone call, within a specific time window, regardless of the underlying delivery technology used to convey the information. According to research, one of the effective methods of meeting this goal is determining the location of the caller through the cooperation of the country's landline and mobile phone operators. Follow-up by the Communications Regulatory Authority (CRA) resulted in the receipt of two separate secured electronic web services. Thus, to ensure personal privacy, a secure technical architecture was required for launching the services in the pre-hospital EMS information management system. In addition, to speed medics' arrival at the patient's bedside, rescue vehicles should make use of an intelligent transportation system to estimate road traffic using a GPS-based mobile navigation system independent of the Internet. This paper seeks to illustrate the architecture of the practical national model used by the Iranian EMS organization.
Keywords: response time, geographic location inquiry service (GLIS), location-based service (LBS), emergency medical services information system (EMSIS)
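Once the caller's coordinates are obtained from the operators' location services, picking the closest rescue unit can be sketched with a great-circle distance. This is a deliberate simplification: the system described above estimates road traffic rather than straight-line distance, and the unit names and coordinates below are hypothetical:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearest_unit(caller, units):
    """Pick the dispatch unit closest to the caller's located position.
    units: list of (unit_id, lat, lon); caller: (lat, lon)."""
    return min(units, key=lambda u: haversine_km(caller[0], caller[1], u[1], u[2]))
```

In a production dispatcher, the distance term would be replaced by an estimated travel time from the navigation system, since the closest unit by air is not always the fastest by road.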
Procedia PDF Downloads 173
221 Improving Fingerprinting-Based Localization System Using Generative AI
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight conditions, multipath, and weather, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. It also employs a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and significantly reduces radio map construction costs compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
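For context, the classical fingerprinting baseline that such generative schemes improve on, weighted k-nearest-neighbor matching over a received-signal-strength (RSS) radio map, can be sketched as follows (all coordinates and RSS values are hypothetical):

```python
def wknn_locate(radio_map, observed_rss, k=3):
    """Weighted k-nearest-neighbor positioning over an RSS radio map.
    radio_map: list of ((x, y), [rss_per_access_point]).
    Returns the weighted centroid of the k closest fingerprints in
    signal space, weighting by inverse RSS distance."""
    def rss_dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    ranked = sorted(radio_map, key=lambda fp: rss_dist(fp[1], observed_rss))[:k]
    weights = [1.0 / (rss_dist(fp[1], observed_rss) + 1e-6) for fp in ranked]
    wsum = sum(weights)
    x = sum(w * fp[0][0] for w, fp in zip(weights, ranked)) / wsum
    y = sum(w * fp[0][1] for w, fp in zip(weights, ranked)) / wsum
    return (x, y)
```

The cost of this baseline is the dense site survey needed to fill `radio_map`; generating synthetic fingerprints with a GAN, as the paper proposes, is what cuts that surveying workload.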
Procedia PDF Downloads 63
220 Discussion on the Impact and Improvement Strategy of Bike Sharing on Urban Space
Authors: Bingying Liu, Dandong Ge, Xinlan Zhang, Haoyang Liang
Abstract:
Over the past two years, a new generation of dockless ('no-pile') bike sharing, represented by Ofo, Mobike and HelloBike, has sprung up in various cities in China and spread rapidly to countries such as Britain, Japan, the United States and Singapore. As a new green public transportation mode, bike sharing can bring a series of benefits to urban space. This paper first analyzes the specific impact of bike sharing on urban space in China. Based on market research and data analysis, it finds that bike sharing can improve the quality of urban space in three respects: expanding the radius of public transportation service and filling service blind spots, alleviating urban traffic congestion, and enhancing the vitality of urban space. On the other hand, due to the immature market and imperfect regulation, bike sharing has gradually revealed difficulties such as parking chaos, malicious damage, safety problems, and imbalance between supply and demand. The paper then investigates the characteristics, business models, and operating mechanisms of shared bikes in the current Chinese market. Finally, in order to make bike sharing serve urban construction better, this paper puts forward specific countermeasures on four fronts. In terms of market operations, it is necessary to establish a public-private partnership model and set up a unified bike-sharing integrated management platform. At the technical level, the paper proposes developing an intelligent parking system to regulate parking. At the policy level, establishing a bike-sharing assessment mechanism would strengthen supervision. As for urban planning, sharing data and redesigning slow-traffic roadways would benefit transportation and spatial planning.
Keywords: bike sharing, impact analysis, improvement strategy, urban space
Procedia PDF Downloads 173219 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurate
Authors: Susan Diamond
Abstract:
Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training of large neural network models is very resource intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment takes months, from acquiring hardware to setting it up with the right firmware and software installed and configured. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, so it is slow to react to dynamic change in the artificial intelligence industry. Watson Deep Learning as a service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable and fault-tolerant manner. It supports a wide range of deep learning frameworks such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skillset required to design, train, and use deep learning models. Deep Learning as a service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.Keywords: deep learning, machine learning, cognitive computing, model training
Procedia PDF Downloads 211218 Telemedicine App Powered by AI
Authors: Cotran Mabeya
Abstract:
This paper focuses on an artificially intelligent telemedicine application that aims to enrich access to health care services, especially for those who live in remote and underserved areas. The app combines advanced AI technologies, such as symptom checkers and virtual consultations, with health data integration for efficient and user-friendly remote health support, with main features including AI-based diagnostics, real-time health monitoring through wearables, and an intuitive interface. The telemedicine application seeks to address several healthcare problems, such as limited access in remote areas, high costs, lengthy wait times for certain services, and difficulty in getting second opinions. By making remote consultation easier, the application removes geographic and financial barriers to accessing affordable and timely medical care. In addition, by centralizing patient records and communication between healthcare providers, it allows continuity of care by making transitions in treatment easier. The study employed a mixed-design approach incorporating both quantitative and qualitative designs to evaluate the socio-economic impacts of artificial intelligence and telemedicine on patients in Nairobi County. Adults made up the target population, while informants and respondents were categorized into patients, healthcare providers, and specialists in law, IT, and AI. Stratified and simple random sampling techniques were used to ensure diverse and inclusive representation and to enhance accuracy and triangulation in the data collected. Moreover, the study provides several recommendations, which include regularly updating AI symptom checkers to maintain accuracy, improving data security through encryption and multi-factor authentication, and integrating real-time health data from body wearables for personal healthcare.Keywords: artificial intelligence, virtual consultations, user-friendly, remote areas
Procedia PDF Downloads 11217 Reinforcement Learning For Agile CNC Manufacturing: Optimizing Configurations And Sequencing
Authors: Huan Ting Liao
Abstract:
In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning
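As a rough illustration of the Q-learning scheduling idea this abstract describes, the sketch below assigns jobs to machines to minimise makespan on a toy instance; the state encoding, reward, hyperparameters, and problem size are all illustrative assumptions, not the authors' actual formulation.

```python
import random

def q_learning_schedule(jobs, n_machines, episodes=500, alpha=0.1,
                        gamma=0.9, epsilon=0.2, seed=0):
    """Toy Q-learning for assigning jobs to machines to minimise makespan.

    State: (job index, tuple of machine loads); action: machine index for
    the next job; reward: negative growth of the makespan. All settings
    here are illustrative, not the paper's.
    """
    rng = random.Random(seed)
    Q = {}
    q = lambda s, a: Q.get((s, a), 0.0)
    best = None
    for _ in range(episodes):
        loads = [0] * n_machines
        for t, dur in enumerate(jobs):
            s = (t, tuple(loads))
            if rng.random() < epsilon:          # explore
                a = rng.randrange(n_machines)
            else:                               # exploit current estimates
                a = max(range(n_machines), key=lambda m: q(s, m))
            old_mk = max(loads)
            loads[a] += dur
            r = -(max(loads) - old_mk)          # penalise makespan growth
            s2 = (t + 1, tuple(loads))
            nxt = (max(q(s2, m) for m in range(n_machines))
                   if t + 1 < len(jobs) else 0.0)
            Q[(s, a)] = q(s, a) + alpha * (r + gamma * nxt - q(s, a))
        mk = max(loads)
        if best is None or mk < best:
            best = mk
    return best

# Three machines, six jobs totalling 27 units: the lower bound is 9.
best_makespan = q_learning_schedule([2, 3, 4, 5, 6, 7], 3)
```

Because the reward penalises any growth in makespan, the greedy policy gradually prefers the least-loaded machine, which is the balancing behaviour the abstract attributes to the Q-learning scheduler.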
Procedia PDF Downloads 30216 Development of an Optimised, Automated Multidimensional Model for Supply Chains
Authors: Safaa H. Sindi, Michael Roe
Abstract:
This project divides supply chain (SC) models into seven Eras, according to the evolution of the market’s needs over time. The five earliest Eras describe the emergence of supply chains, while the last two Eras are to be created. Research objectives: The aim is to generate the two latest Eras, with their respective models, focusing on consumable goods. Era Six contains the Optimal Multidimensional Matrix (OMM), which incorporates most characteristics of the SC and allocates them into four quarters (Agile, Lean, Leagile, and Basic SC). This will help companies, especially small and medium-sized enterprises (SMEs), plan their optimal SC route. Era Seven creates an Automated Multidimensional Model (AMM), which upgrades the matrix of Era Six, as it accounts for all the supply chain factors (i.e., offshoring, sourcing, risk) in an interactive system with heuristic learning that helps larger companies and industries select the best SC model for their market. Methodologies: The data collection is based on a Fuzzy-Delphi study that analyses statements using fuzzy logic. The first round of the Delphi study will contain statements (fuzzy rules) about the matrix of Era Six. The second round of Delphi contains the feedback given from the first round, and so on. Preliminary findings: Both models are applicable. The Matrix of Era Six reduces the complexity of choosing the best SC model for SMEs by helping them identify the best strategy of Basic SC, Lean, Agile, or Leagile SC that is tailored to their needs. The interactive heuristic learning in the AMM of Era Seven will help mitigate error and aid large companies in identifying and re-strategizing the best SC model and distribution system for their market and commodity, hence increasing efficiency. Potential contributions to the literature: The problematic issue facing many companies is deciding which SC model or strategy to incorporate, given the many models and definitions developed over the years.
This research simplifies this by putting most definitions in a template and most models in the Matrix of Era Six. This research is original, as the division of SC into Eras, the Matrix of Era Six (OMM) with Fuzzy-Delphi, and heuristic learning in the AMM of Era Seven provide a synergy of tools that were not combined before in the area of SC. Additionally, the OMM of Era Six is unique, as it combines most characteristics of the SC, which is an original concept in itself.Keywords: Leagile, automation, heuristic learning, supply chain models
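A Fuzzy-Delphi round of the kind this abstract describes can be sketched as screening each statement by the defuzzified average of expert ratings expressed as triangular fuzzy numbers; the 0-1 rating scale, centroid defuzzification, and acceptance threshold below are illustrative assumptions, not the study's actual settings.

```python
def fuzzy_delphi_consensus(ratings, threshold=0.7):
    """Screen one Fuzzy-Delphi statement by its defuzzified expert ratings.

    Each rating is a triangular fuzzy number (l, m, u) on an assumed
    0-1 scale; the statement is retained for the next round when the
    mean defuzzified score reaches an assumed threshold.
    """
    # Centroid defuzzification of each triangular fuzzy number.
    scores = [(l + m + u) / 3 for l, m, u in ratings]
    avg = sum(scores) / len(scores)
    return avg, avg >= threshold

# Two experts rate a statement about the Era Six matrix.
avg, keep = fuzzy_delphi_consensus([(0.5, 0.7, 0.9), (0.7, 0.9, 1.0)])
```

Statements rejected here would be reworded or dropped before the second Delphi round, which is the feedback loop the abstract describes.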
Procedia PDF Downloads 396215 Challenges in Implementing the Inculcation of Noble Values During Teaching by Primary School Teachers in Peninsular Malaysia
Authors: Mohamad Khairi Haji Othman, Mohd Zailani Mohd Yusoff, Rozalina Khalid
Abstract:
The inculcation of noble values in teaching and learning is very important, especially to build students with good character and values. Therefore, the purpose of this research is to identify the challenges of implementing the inculcation of noble values in teaching in primary schools. This study was conducted at four schools in the North Zone of Peninsular Malaysia. The study used a qualitative approach in the form of case studies. The qualitative approach aims at gaining meaning and a deep understanding of the phenomenon studied from the perspectives of the study participants and is not intended to make generalizations. The sample in this study consists of eight teachers who teach in four types of schools that were chosen purposively. Semi-structured interviews were used as the method of data collection. The constant comparative method was used to analyze the primary data collected. The study found that the main challenges faced by teachers were student problems and class control, which made it difficult for teachers to inculcate noble values in teaching. In addition, language is a challenge, as students find it difficult to understand. Similarly, peers are also a challenge, because students are more easily influenced by friends than by teachers' instructions. The last challenge was the increasingly widespread influence of technology and electronic mass media. The findings suggest that teachers need to innovate in order to assist the school in inculcating religious and moral education in students. The school, through guidance and counseling teachers, can also plan activities that are appropriate to students' present condition. 
Through this study, teachers and the school should work together to develop the values of students in line with the needs of the National Education Philosophy that wishes to produce intelligent, emotional, spiritual, intellectual and social human capital.Keywords: challenges, implementation, inculcation, noble values
Procedia PDF Downloads 190214 Undergraduates' Development of Interpersonal and Cooperative Competence in Service-Learning
Authors: Huixuan Xu
Abstract:
The present study set out to investigate the extent to which, and how, service-learning fostered interpersonal competence and cooperative orientation in a sample of 138 Hong Kong undergraduates. Interpersonal competence is present when an individual shows empathy with others, provides intelligent advice to others, and has practical judgment. Cooperative orientation reflects individuals’ willingness to work with others to achieve common goals. A quality service-learning programme may exhibit the features of provision of meaningful service, close links to the curriculum, continuous reflection, youth voice, and diversity. Mixed methods were employed in the present study. A pre-post-test survey was administered to capture individual undergraduates’ development of interpersonal competence and cooperative orientation over a period of four months. The respondents’ evaluation of the service-learning elements was administered in the post-test survey. Focus groups were conducted after the end of the service-learning to further explore how certain service-learning elements promoted individual undergraduates’ development of interpersonal competence and cooperative orientation. Three main findings are reported from the study. (1) The scores for interpersonal competence increased significantly from the pre-test to the post-test, while the change in cooperative orientation was not significant. (2) Cooperative orientation and interpersonal competence each correlated positively with overall course quality, which suggests that the more a service-learning course complied with quality practice, the more competent students became in interpersonal competence and cooperative orientation. 
(3) The following service-learning elements showed higher impacts: (a) direct contact with service recipients, which engaged students in practicing interpersonal skills; (b) individual participants’ being exposed to a situation that required communication and dialogue with people from diverse backgrounds with different views; (c) experiencing interpersonal conflicts among team members and having the conflicts solved; (d) students’ taking a leading role in a project-based service. The present study provides compelling evidence about what elements in a service-learning program may foster undergraduates’ development of cooperative orientation and interpersonal competence. Implications for the design of service-learning programmes are provided.Keywords: undergraduates, interpersonal competence, cooperation orientation, service-learning
Procedia PDF Downloads 258213 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massive amounts of off-line and streaming data, in both structured and unstructured formats, on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage more than a few months of transactional data history. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers’ call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers’ behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a subscriber base of more than 50 million, with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber.Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
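The two transformation steps this abstract names, stitching all interrogations of a call and aggregating call records per subscriber, can be sketched as follows; the record fields (`call_id`, `subscriber`, `duration`) are assumed for illustration and are not the provider's actual CDR schema.

```python
from collections import defaultdict

def stitch_and_aggregate(records):
    """Stitch per-call interrogation records, then aggregate per subscriber.

    Mirrors the two transformation steps named in the abstract. Field
    names are illustrative assumptions, not a real CDR layout.
    """
    # Stitch: group all interrogation records belonging to one call.
    calls = defaultdict(list)
    for r in records:
        calls[r["call_id"]].append(r)

    # Aggregate: one summary row per subscriber across their calls.
    per_sub = defaultdict(lambda: {"calls": 0, "duration": 0})
    for parts in calls.values():
        sub = parts[0]["subscriber"]
        per_sub[sub]["calls"] += 1
        per_sub[sub]["duration"] += sum(p["duration"] for p in parts)
    return dict(per_sub)

sample = [
    {"call_id": 1, "subscriber": "A", "duration": 30},
    {"call_id": 1, "subscriber": "A", "duration": 15},  # second interrogation of call 1
    {"call_id": 2, "subscriber": "A", "duration": 60},
    {"call_id": 3, "subscriber": "B", "duration": 10},
]
summary = stitch_and_aggregate(sample)
```

In the described system these steps run on a Big Data stack over binary-decoded call detail records rather than in-memory Python, but the grouping logic is the same.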
Procedia PDF Downloads 179212 Digital Transformation in Education: Artificial Intelligence Awareness of Preschool Teachers
Authors: Cansu Bozer, Saadet İrem Turgut
Abstract:
Artificial intelligence (AI) has become one of the most important technologies of the digital age and is transforming many sectors, including education. The advantages offered by AI, such as automation, personalised learning, and data analytics, create new opportunities for both teachers and students in education systems. Preschool education plays a fundamental role in the cognitive, social, and emotional development of children. In this period, the foundations of children's creative thinking, problem-solving, and critical thinking skills are laid. Educational technologies, especially artificial intelligence-based applications, are thought to contribute to the development of these skills. For example, artificial intelligence-supported digital learning tools can support learning processes by offering activities that can be customised according to the individual needs of each child. However, the successful use of artificial intelligence-based applications in preschool education can be realised under the guidance of teachers who have the right knowledge about this technology. Therefore, it is of great importance to measure preschool teachers' awareness levels of artificial intelligence and to understand which variables affect this awareness. The aim of this study is to measure preschool teachers' awareness levels of artificial intelligence and to determine which factors are related to this awareness. In line with this purpose, teachers' level of knowledge about artificial intelligence, their thoughts about the role of artificial intelligence in education, and their attitudes towards artificial intelligence will be evaluated. The study will be conducted with 100 teachers working in Turkey using a descriptive survey model. In this context, ‘Artificial Intelligence Awareness Level Scale for Teachers’ developed by Ferikoğlu and Akgün (2022) will be used. The collected data will be analysed using SPSS (Statistical Package for the Social Sciences) software. 
Descriptive statistics (frequency, percentage, mean, standard deviation) and relationship analyses (correlation and regression analyses) will be used in data analysis. As a result of the study, the level of artificial intelligence awareness of preschool teachers will be determined, and the factors affecting this awareness will be identified. The findings obtained will contribute to the determination of studies that can be done to increase artificial intelligence awareness in preschool education.Keywords: education, child development, artificial intelligence, preschool teachers
Procedia PDF Downloads 25211 Multi-Biometric Personal Identification System Based on Hybrid Intelligence Method
Authors: Laheeb M. Ibrahim, Ibrahim A. Salih
Abstract:
Biometrics is a technology that has been widely used in many official and commercial identification applications. The increased concerns about security in recent decades have resulted in more attention being given to biometric-based verification techniques. Here, a novel fusion approach combining palmprint and dental traits is suggested. These traits have been employed in a range of biometric applications and can identify a person both postmortem (PM) and antemortem (AM). Besides improving accuracy, the fusion of biometrics has several advantages, such as deterring spoofing activities and reducing enrolment failure. In this paper, a unimodal biometric system was first built for each of the two traits (palmprint and dental), applying for classification an artificial neural network and a hybrid technique that combines swarm intelligence and a neural network; an attempt was then made to combine the palmprint and dental biometrics. Principally, the fusion of palmprint and dental biometrics and their potential application have been explored as biometric identifiers. To address this issue, investigations were carried out on the relative performance of several statistical data fusion techniques for integrating the information in both unimodal and multimodal biometrics. The results of the multimodal approach were also compared with each of the two single-trait authentication approaches. This paper studies the feature and decision fusion levels in multimodal biometrics. To determine the accuracy in terms of genuine acceptance rate (GAR), parallel decision-fusion rules (AND, OR, majority voting) were used. The backpropagation method used for classification yielded GARs of (92%, 99%, 97%) respectively, while the hybrid technique for classification yielded (95%, 99%, 98%) respectively. 
For feature-level fusion of the multibiometric system, the same classification methods were used; the results were (98%, 99%) respectively, while different feature-level methods yielded a GAR of 98%.Keywords: back propagation neural network BP ANN, multibiometric system, parallel system decision-fusion, particle swarm optimization (PSO)
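The parallel decision-fusion rules this abstract evaluates (AND, OR, majority voting) can be sketched as follows; this is a minimal illustration of the rules themselves, not the authors' implementation.

```python
def fuse_decisions(decisions, rule):
    """Parallel decision-level fusion for a multibiometric system.

    `decisions` is a list of per-modality accept/reject booleans (e.g.
    one from the palmprint matcher, one from the dental matcher); the
    rules mirror the AND, OR, and majority-voting schemes evaluated in
    the abstract.
    """
    if rule == "AND":            # accept only if every matcher accepts
        return all(decisions)
    if rule == "OR":             # accept if any matcher accepts
        return any(decisions)
    if rule == "MAJORITY":       # accept if more than half accept
        return sum(decisions) * 2 > len(decisions)
    raise ValueError(f"unknown rule: {rule}")
```

AND fusion trades genuine acceptance for a lower false-acceptance rate, OR fusion does the reverse, and majority voting sits in between, which is why the three rules yield different GAR figures.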
Procedia PDF Downloads 535210 Clustering and Modelling Electricity Conductors from 3D Point Clouds in Complex Real-World Environments
Authors: Rahul Paul, Peter Mctaggart, Luke Skinner
Abstract:
Maintaining public safety and network reliability are the core objectives of all electricity distributors globally. For many electricity distributors, managing vegetation clearances from their above-ground assets (poles and conductors) is the most important and costly risk mitigation control employed to meet these objectives. Light Detection and Ranging (LiDAR) is widely used by utilities as a cost-effective method to inspect their spatially-distributed assets at scale, often captured using high-powered LiDAR scanners attached to fixed-wing or rotary aircraft. The resulting 3D point cloud model is used by these utilities to perform engineering-grade measurements that guide the prioritisation of vegetation cutting programs. Advances in computer vision and machine-learning approaches are increasingly applied to increase automation and reduce inspection costs and time; however, real-world LiDAR capture variables (e.g., aircraft speed and height) create complexity, noise, and missing data, reducing the effectiveness of these approaches. This paper proposes a method for identifying each conductor from LiDAR data via clustering methods that can precisely reconstruct conductors in complex real-world configurations in the presence of high levels of noise. It proposes 3D catenary models for individual clusters, fitted to the captured LiDAR data points using a least-squares method. An iterative learning process is used to identify potential conductor models between pole pairs. The proposed method identifies the optimum parameters of the catenary function and then fits the LiDAR points to reconstruct the conductors.Keywords: point cloud, LiDAR data, machine learning, computer vision, catenary curve, vegetation management, utility industry
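The per-cluster catenary fit this abstract describes can be sketched in 2D as a least-squares search for the catenary parameter; taking the lowest sample as the vertex and fitting by grid search are simplifying assumptions here, whereas a production fitter would solve for all parameters (and the 3D span plane) jointly.

```python
import math

def fit_catenary(points, c_range=(1.0, 200.0), steps=400):
    """Fit z = z0 + c*(cosh((x - x0)/c) - 1) to (x, z) samples of one conductor.

    The vertex (x0, z0) is approximated by the lowest sample and the
    catenary parameter c is found by a least-squares grid search; both
    are illustrative simplifications of a full nonlinear fit.
    """
    x0, z0 = min(points, key=lambda p: p[1])   # lowest point ~ vertex
    lo, hi = c_range
    best_c, best_sse = lo, float("inf")
    for i in range(steps):
        c = lo + (hi - lo) * i / (steps - 1)
        sse = sum((z0 + c * (math.cosh((x - x0) / c) - 1) - z) ** 2
                  for x, z in points)
        if sse < best_sse:
            best_c, best_sse = c, sse
    return best_c, x0, z0

# Noise-free samples of a known catenary with c = 50: the fit recovers c.
true_c = 50.0
pts = [(x, true_c * (math.cosh(x / true_c) - 1)) for x in range(-20, 21, 2)]
c_fit, x0, z0 = fit_catenary(pts)
```

On real LiDAR clusters the same residual minimisation runs over noisy 3D points, which is where the clustering and iterative model-identification steps of the paper come in.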
Procedia PDF Downloads 103209 Legal Personality and Responsibility of Robots
Authors: Mehrnoosh Abouzari, Shahrokh Sahraei
Abstract:
The arrival of artificial intelligence and smart robots in the modern world puts them in charge of precise and risky tasks, so performing human activities with robots creates criminal or civil responsibility for their acts or behavior. The practical usage of smart robots has placed them in a unique situation: when naturalization happens, smart robots are identified as members of society. Several legal situations would follow from adopting these new smart citizens. The first situation concerns the legal responsibility of robots. Recognizing the naturalization of robots involves some basic rights, so human rights such as employment, property, housing, and the use of energy may be extended to robots. How would these rights be exercised in society, and if problems arise with these rights, how would civil responsibility and punishment apply? May we consider robots as part of the population and count them in social programs? The second issue concerns the criminal responsibility of robots performing important activities instead of humans, which is the aim of inventing robots that handle work with AI technology; the problem arises when accidents are caused by robots in charge of important activities such as the army, surgery, transport, and judgment. Moreover, recognizing an independent identity for robots in the legal world, through registered ID cards, naturalization, and civil rights, creates the same rights and obligations as for humans. So civil responsibility is unavoidable, and if a robot commits a crime, it would have criminal responsibility and would have to be punished. The basic components of criminal responsibility may change in such a situation. For example, if criminal responsibility for humans is bound to sanity, maturity, and voluntariness, for robots it would be bound to being intelligent, well programmed, not being hacked, and so on. It is therefore irrational to punish robots with imprisonment, execution, and other punishments of the human body. 
We may instead devise digital punishments, such as changing or repairing programs, exchanging some parts of the robot's body, or decommissioning it completely. Finally, the responsibility of the smart robot's creators, programmers, supervisors, the organization that employed the robot, and the government that permitted the use of the robot in important bases and activities will be analyzed and investigated in this article.Keywords: robot, artificial intelligence, personality, responsibility
Procedia PDF Downloads 150208 Trip Reduction in Turbo Machinery
Authors: Pranay Mathur, Carlo Michelassi, Simi Karatha, Gilda Pedoto
Abstract:
Industrial plant uptime is of topmost importance for reliable, profitable and sustainable operation. Trips and failed starts have a major impact on plant reliability, and all plant operators focus their efforts on minimising them. The performance of these CTQs is measured with two metrics: MTBT (mean time between trips) and SR (starting reliability). These metrics help to identify top failure modes and the units that need more effort to improve plant reliability. The Baker Hughes trip reduction program is structured to reduce these unwanted trips: 1. Real-time machine operational parameters available remotely, capturing the signature of a malfunction including related boundary conditions. 2. Real-time alerting based on analytics, available remotely. 3. Remote access to trip logs and alarms from the control system to identify the cause of events. 4. Continuous support to field engineers by remotely connecting them with subject matter experts. 5. Live tracking of key CTQs. 6. Benchmarking against the fleet. 7. Breaking down the cause of failure to component level. 8. Investigating top contributors and identifying design and operational root causes. 9. Implementing corrective and preventive actions. 10. Assessing the effectiveness of implemented solutions using reliability growth models. 11. Developing analytics for predictive maintenance. With this approach, the Baker Hughes team is able to support customers in achieving their reliability key performance indicators for monitored units, with huge cost savings for plant operators. This presentation explains this approach while providing successful case studies, in particular where 12 LNG and pipeline operators with about 140 gas compression line-ups have adopted these techniques, significantly reducing the number of trips and improving MTBT.Keywords: reliability, availability, sustainability, digital infrastructure, weibull, effectiveness, automation, trips, fail start
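The two CTQ metrics this abstract tracks, MTBT and SR, can be sketched as simple ratios; the units (operating hours, counted trips and start attempts) are assumed for illustration, and the actual fleet analytics described are far richer.

```python
def fleet_metrics(operating_hours, trips, start_attempts, failed_starts):
    """Compute the two CTQ metrics named in the abstract.

    MTBT = operating hours per trip; SR = fraction of start attempts
    that succeed. A hedged sketch of the metric definitions only, under
    assumed units.
    """
    mtbt = operating_hours / trips if trips else float("inf")
    sr = ((start_attempts - failed_starts) / start_attempts
          if start_attempts else 1.0)
    return mtbt, sr

# A unit with 8760 h of operation, 4 trips, 50 starts, 2 failed starts.
mtbt, sr = fleet_metrics(8760, 4, 50, 2)   # mtbt = 2190.0 h, sr = 0.96
```

Tracking these per unit and benchmarking against the fleet is what lets the program single out the units that need the most reliability effort.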
Procedia PDF Downloads 80207 Development of a Robot Assisted Centrifugal Casting Machine for Manufacturing Multi-Layer Journal Bearing and High-Tech Machine Components
Authors: Mohammad Syed Ali Molla, Mohammed Azim, Mohammad Esharuzzaman
Abstract:
A centrifugal casting machine is used in manufacturing special machine components, like the multi-layer journal bearings used in all internal combustion engines, steam and gas turbines, and aircraft turbo-engines, where isotropic properties and high precision are desired. Moreover, this machine can be used in manufacturing thin-wall, high-tech machine components like the cylinder liners and piston rings of IC engines and other machine parts like sleeves and bushes. Heavy-duty machine components like railway wheels can also be prepared by centrifugal casting. Many technological developments are required in the casting process for the production of well-cast machine bodies and machine parts. Defects like blowholes, surface roughness, and chilled surfaces are usually found in sand-cast machine parts, but these can be removed by a centrifugal casting machine using a rotating metallic die. Moreover, die rotation, its temperature control, and good pouring practice can contribute to the quality of casting, because the soundness of a casting depends in large part upon how the metal enters the mold or die and solidifies. Poor pouring practice leads to a variety of casting defects such as temperature loss, low-quality casting, excessive turbulence, and over-pouring. Besides this, the handling of molten metal is very unsafe and dangerous for workers. In order to get rid of all these problems, the need for an automatic pouring device arises. In this research work, a robot-assisted pouring device and a centrifugal casting machine were designed, developed, constructed, and tested experimentally, and found to work satisfactorily. The robot-assisted pouring device is further modified and developed for use in the actual metal casting process. 
Many settings and tests are required to control the system, and ultimately it can be used in the automation of centrifugal casting machines to produce high-tech machine parts with the desired precision.Keywords: bearing, centrifugal casting, cylinder liners, robot
Procedia PDF Downloads 418206 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed SRCLoc method can significantly improve positioning performance and reduce radio map construction costs.Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
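Downstream of the radio-map construction this abstract describes, the online matching step of a fingerprinting system is commonly a weighted k-nearest-neighbour search over the radio map; the sketch below shows only that step, with made-up reference points and RSS values, and does not reproduce the S-DCGAN or t-SNE components.

```python
import math

def knn_locate(radio_map, rss, k=3):
    """Weighted k-NN fingerprint matching against a radio map.

    `radio_map` maps (x, y) reference points to RSS vectors. This is a
    generic sketch of the online matching step of fingerprinting-based
    localization, not the paper's generative pipeline; all values are
    made up.
    """
    def dist(v, w):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))

    # k reference points whose stored fingerprints best match the query.
    nearest = sorted(radio_map.items(), key=lambda kv: dist(kv[1], rss))[:k]
    weights = [1.0 / (dist(v, rss) + 1e-9) for _, v in nearest]
    wsum = sum(weights)
    x = sum(w * p[0] for (p, _), w in zip(nearest, weights)) / wsum
    y = sum(w * p[1] for (p, _), w in zip(nearest, weights)) / wsum
    return x, y

rmap = {(0, 0): [-40, -70], (0, 10): [-55, -60],
        (10, 0): [-60, -80], (10, 10): [-70, -55]}
est = knn_locate(rmap, [-41, -69], k=3)   # near the (0, 0) reference point
```

The denser and less noisy the radio map, the better this step performs, which is why the paper's GAN-based map augmentation reduces both survey cost and positioning error.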
Procedia PDF Downloads 76
205 A Distributed Mobile Agent-Based Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study examines an artificial neural network approach based on the multilayer perceptron (MLP) for classifying and clustering vulnerabilities in mobile ad hoc networks. A mobile ad hoc network (MANET) is an autonomous system of intelligent internetworking mobile nodes, connected via wireless links, that can sense their environment. Security is the foremost concern in MANETs because such self-configuring networks are easy to penetrate. One powerful technique for inspecting network packets is the intrusion detection system (IDS); in this article, we show the effectiveness of artificial neural networks, used as a machine learning method together with a stochastic feature-selection approach (information gain), in classifying malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; audit data is collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behavior of the framework. Whenever there is a deviation from ordinary behavior, the monitoring agent treats the event as an attack. We demonstrate a signature-based IDS approach in a MANET by implementing the backpropagation algorithm over an ensemble-based traffic table (TT), so that signatures of malicious or undesirable activities can be reliably predicted and efficiently identified; by tuning the parameters of the backpropagation algorithm, the experiments achieve a detection rate of up to 98.6%. The empirical results also include performance metrics, displayed with Xgraph, such as packet delivery ratio (PDR), throughput (TP), and average delay (AD).
Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)
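The classification step described above can be sketched as a tiny multilayer perceptron trained by backpropagation to separate normal from malicious traffic. The two input features (normalized packet rate and packet drop ratio), the toy samples, and the network size below are illustrative assumptions, not the paper's ensemble-based traffic table or its actual feature set.

```python
import math, random

random.seed(0)

# Toy audit data: [normalized packet rate, packet drop ratio] -> label
# (1 = malicious flooding behavior, 0 = normal traffic). Values invented.
data = [
    ([0.10, 0.05], 0), ([0.20, 0.10], 0), ([0.30, 0.08], 0),
    ([0.90, 0.80], 1), ([0.80, 0.70], 1), ([0.95, 0.90], 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One hidden layer of 3 units; the last weight of each row is a bias.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
w_out = [random.uniform(-1, 1) for _ in range(4)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    out = sigmoid(sum(w * v for w, v in zip(w_out, h + [1.0])))
    return h, out

# Backpropagation: gradient descent on squared error, learning rate 0.5.
for _ in range(3000):
    for x, y in data:
        h, out = forward(x)
        d_out = (out - y) * out * (1 - out)
        for j in range(3):
            # Propagate the error back through the (pre-update) output weights.
            d_h = d_out * w_out[j] * h[j] * (1 - h[j])
            w_hidden[j][0] -= 0.5 * d_h * x[0]
            w_hidden[j][1] -= 0.5 * d_h * x[1]
            w_hidden[j][2] -= 0.5 * d_h
        for j in range(3):
            w_out[j] -= 0.5 * d_out * h[j]
        w_out[3] -= 0.5 * d_out

print(forward([0.15, 0.07])[1])  # low score: classified as normal traffic
print(forward([0.85, 0.75])[1])  # high score: classified as an attack
```

A deployed IDS would feed far richer features (selected, as the abstract notes, by information gain) and many more samples into such a network; the sketch only shows the backpropagation mechanics.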
Procedia PDF Downloads 196