Search results for: information warfare techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16307

15197 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling

Authors: Vibha Devi, Shabina Khanam

Abstract:

Hemp (Cannabis sativa) oil possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in a ratio of 3:1, a rare and highly desired ratio that enhances the quality of hemp oil. These components support cell development and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various other disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various parameter conditions: temperature (40 - 80) °C, pressure (200 - 350) bar, flow rate (5 - 15) g/min, particle size (0.430 - 1.015) mm and amount of co-solvent (0 - 10) % of solvent flow rate, arranged through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets by resampling from the original dataset and analyze these to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement. For jackknife resampling, the sample size is therefore 31 (one observation eliminated), repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample. For bootstrap resampling, the sample size is 32, repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, variation coefficient and standard error of the mean.
For the ω-6 linoleic acid concentration, the mean value was approximately 58.5 for both resampling methods, i.e., the average (central value) of the sample means of all data points. Similarly, for the ω-3 α-linolenic acid concentration, the mean was observed as 22.5 through both resamplings. Variance expresses the spread of the data about its mean; a greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66 %) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2 %). Further, the low standard deviation (approx. 1 %), low standard error of the mean (< 0.8) and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All the estimated values of the variation coefficient, standard deviation and standard error of the mean are found within the 95 % confidence interval.
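The two resampling schemes described above can be sketched in plain Python; the toy concentrations below are hypothetical stand-ins for the study's 32 CCD observations.

```python
import random
import statistics

def jackknife_means(sample):
    """Jackknife: N resamples of size N-1, each omitting one observation."""
    n = len(sample)
    return [statistics.mean(sample[:i] + sample[i + 1:]) for i in range(n)]

def bootstrap_means(sample, n_resamples=100, seed=0):
    """Bootstrap: resample with replacement, keeping the original size N."""
    rng = random.Random(seed)
    return [statistics.mean(rng.choices(sample, k=len(sample)))
            for _ in range(n_resamples)]

def summarize(means):
    """Mean, standard deviation and standard error of the resampled means."""
    sd = statistics.stdev(means)
    return {"mean": statistics.mean(means),
            "sd": sd,
            "sem": sd / len(means) ** 0.5}

concentrations = [58.0, 59.1, 57.6, 60.2, 58.4, 58.9]  # hypothetical ω-6 %
jk = summarize(jackknife_means(concentrations))
bs = summarize(bootstrap_means(concentrations))
```

Note that the mean of the jackknife leave-one-out means equals the sample mean exactly, while the bootstrap mean converges to it as the number of resamples grows.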

Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation

Procedia PDF Downloads 137
15196 A Survey on Genetic Algorithm for Intrusion Detection System

Authors: Prikhil Agrawal, N. Priyanka

Abstract:

With millions of users joining the Internet day by day, it is essential to maintain highly reliable and secure data communication between corporations. Although there are various traditional security techniques such as antivirus software, password protection, data encryption, biometrics and firewalls, network security remains a main issue in many leading companies. Intrusion detection systems (IDSs) have therefore become an essential security component, as they can detect various network attacks and respond quickly to such occurrences. IDSs are used to detect unauthorized access to a computer system. This paper describes various intrusion detection techniques using a genetic algorithm (GA) approach. The intrusion detection problem has become a challenging task owing to the proliferation of miscellaneous computer networks with various vulnerabilities. The damage caused to organizations by malicious intrusions can thus be mitigated, and even deterred, by using this powerful tool.
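The GA loop the survey refers to (selection, crossover, mutation) can be sketched as follows. The fitness function here is a toy: agreement with a fixed bitstring standing in for an attack signature; real IDS work would score candidate detection rules against labeled network traffic.

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60, seed=1):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]

    def pick():
        a, b = rng.sample(pop, 2)               # size-2 tournament
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)  # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < 0.02) for bit in child]  # mutate
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness: agreement of a candidate rule with a known attack signature.
signature = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
best = evolve(lambda g: sum(a == b for a, b in zip(g, signature)))
```

The population sizes, mutation rate and generation count are illustrative defaults, not values from any surveyed paper.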

Keywords: genetic algorithm (GA), intrusion detection system (IDS), dataset, network security

Procedia PDF Downloads 292
15195 Understanding Sixteen Basic Desires and Modern Approaches to Agile Team Motivation: Case Study

Authors: Anna Suvorova

Abstract:

Classical motivation theories hold that there are two kinds of motivation, intrinsic and extrinsic. Leaders look for effective motivation techniques, but frequently external influences do not work or, even worse, reduce team productivity. We see only the tip of the iceberg: human behavior. Beneath the surface, however, are factors that directly affect our behavior: desires. Believing that employees need to be motivated, companies design motivation systems based on the principle "do it and get a reward". As a matter of fact, we all have basic desires. Everybody is motivated, but to different extents. Following the principle "intrinsic motivation over extrinsic rewards", we need to create an environment that supports the intrinsic motivation and potential of employees, and team rather than individual work.

Keywords: motivation profile, motivation techniques, agile HR, basic desires, agile people, human behavior, people management

Procedia PDF Downloads 109
15194 Construction Information Visualization System Using nD CAD Model

Authors: Hyeon-seoung Kim, Sang-mi Park, Sun-ju Han, Leen-seok Kang

Abstract:

Visualization of construction information using 3D and nD modeling can satisfy the visualization needs of each construction project participant. An nD CAD system is a tool in which construction information, such as construction schedule, cost and resource utilization, is simulated in 4D, 5D and 6D object formats based on a 3D object. This study developed a methodology and simulation engine for an nD CAD system for construction project management. It offers improved functions, such as built-in schedule generation, cost simulation of a changed budget and built-in resource allocation, compared with current systems. To develop an integrated nD CAD system, this study attempts an integrated method that links 5D and 6D objects based on the 4D object.

Keywords: building information modeling, visual simulation, 3D object, nD CAD, augmented reality

Procedia PDF Downloads 308
15193 Assessment of Environmental Implications of Rapid Population Growth on Land Use Dynamics: A Case Study of Eleme Local Government Area, Rivers State, Nigeria

Authors: Moses Obenade, Henry U. Okeke, Francis I. Okpiliya, Eugene J. Aniah

Abstract:

Population growth in Eleme has been rapid over the past 75 years, with attendant pressure on the natural resources of the area. Between 1937 and 2006 the population of Eleme grew from 2,528 to 190,194 and is projected to exceed 265,707 in 2016, based on an annual growth rate of 3.4%. Using the combined technologies of Geographic Information Systems (GIS), remote sensing (RS) and demographic techniques as its methodology, this paper examines the environmental implications of rapid population growth on land use dynamics in Eleme between 1986 and 2015. The study reveals that between 1986 and 2006, built-up area and farmland increased by 72.67 and 12.77% respectively, while light and thick vegetation decreased by 6.92 and 61.64% respectively. Water bodies remained fairly constant, with minimal changes. Between 2006 and 2015, a period of 9 years, built-up area further increased by 53%, gaining land area to the detriment of other land uses. With an annual growth rate of 2.32 km², built-up area is expected to increase from 18.67 km² in 2006 to 41.87 km² in 2016. The observed land use/land cover dynamics are driven by the demographic characteristics of the study area. Eleme has a total area of 138 km², out of which the Federal Government of Nigeria compulsorily acquired an estimated 59.34 km² for industrial purposes, excluding acquisitions by the Rivers State Government. It is evident from the findings of this study that the carrying capacity of the Eleme ecosystem is under threat from the current population growth and land consumption rates. Therefore, measures such as the use of appropriate technologies in farming and waste management; investment in family planning, female empowerment, maternal health and education; afforestation programs; and amendment of the Land Use Act of 1978 are recommended.
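The projected 2016 figure quoted above follows from compound growth at the stated annual rate; a quick check in Python:

```python
def project_population(p0, annual_rate, years):
    """Compound growth projection: P = P0 * (1 + r) ** t."""
    return p0 * (1 + annual_rate) ** years

# 2006 population of Eleme grown for 10 years at 3.4% per annum.
pop_2016 = project_population(190_194, 0.034, 10)   # ≈ 265,700
```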

Keywords: population growth, Eleme, land use, GIS and remote sensing

Procedia PDF Downloads 375
15192 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand

Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones

Abstract:

As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and responding to the delivery of essential items from distribution centres to affected locations is of central importance for relief operations, as the nature of disasters is uncertain, especially the casualty figures, to which the quantity of supplies is normally proportional. Thus, this study proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage, and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing the relief routes of a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modeled as a multi-objective optimization problem in which both total travelling distance and total transport resources used are minimized, while the demand-cost efficiency of each route is maximized in order to determine route priority. As the model is an NP-hard combinatorial optimization problem, the Clarke and Wright savings heuristic is proposed to solve it for near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to exercise the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
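The Clarke and Wright savings heuristic named above merges single-customer routes in descending order of the saving s(i, j) = d(0, i) + d(0, j) - d(i, j), subject to vehicle capacity. A compact sketch on Euclidean coordinates (depot at index 0; demands and capacity are hypothetical), omitting the route-reversal refinement of the full heuristic:

```python
import math

def clarke_wright(coords, demands, capacity):
    """Clarke-Wright savings heuristic for the CVRP.

    coords[0] is the depot; demands[0] must be 0.  Route reversal,
    part of the full heuristic, is omitted for brevity."""
    n = len(coords)
    dist = lambda i, j: math.dist(coords[i], coords[j])
    routes = {i: [i] for i in range(1, n)}   # route id -> customer sequence
    where = {i: i for i in range(1, n)}      # customer -> route id
    savings = sorted(((dist(0, i) + dist(0, j) - dist(i, j), i, j)
                      for i in range(1, n) for j in range(i + 1, n)),
                     reverse=True)
    for _, i, j in savings:
        ri, rj = where[i], where[j]
        if ri == rj:
            continue
        a, b = routes[ri], routes[rj]
        if not (a[-1] == i and b[0] == j):    # need i, j at mergeable ends
            if b[-1] == j and a[0] == i:
                a, b, ri, rj = b, a, rj, ri   # merge as b + a instead
            else:
                continue
        if sum(demands[k] for k in a) + sum(demands[k] for k in b) > capacity:
            continue
        routes[ri] = a + b                    # join the two routes
        del routes[rj]
        for k in routes[ri]:
            where[k] = ri
    return list(routes.values())
```

On the study's instances, distances would come from the road network rather than `math.dist`, and demands from the tsunami flooding simulation.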

Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem

Procedia PDF Downloads 244
15191 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data

Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho

Abstract:

Currently, in Seoul, users can avoid riding crowded buses thanks to the Bus Information System (BIS). The BIS reports three levels of on-board ridership (spacious, normal, and crowded). However, the system has flaws: because it reports in real time, it can provide incomplete information to the user. For example, a bus approaches a station and the BIS shows it as crowded, but many people get off at the stop where the user is waiting, so by that station the information should read normal or spacious. To fix this problem, this study predicts the bus ridership level using smart card data, providing more accurate information about the passenger ridership level on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes inspired by the human brain. Forecasting has been one of the major applications of ANNs because of the data-driven, self-adaptive nature of the algorithm. According to the results, the ANN algorithm was stable and robust with a fairly small error ratio, so the results were rational and reasonable.
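The quantity being predicted, the ridership level actually on board, can be derived from smart card boarding and alighting counts per stop. A sketch of that bookkeeping (the seat count and level thresholds are assumptions; the paper's ANN would learn the prediction rather than apply fixed thresholds):

```python
def ridership_levels(boardings, alightings, seats, thresholds=(0.7, 1.0)):
    """Running on-board load after each stop, mapped to a 3-level label."""
    load, levels = 0, []
    for on, off in zip(boardings, alightings):
        load += on - off
        ratio = load / seats
        if ratio < thresholds[0]:
            levels.append("spacious")
        elif ratio < thresholds[1]:
            levels.append("normal")
        else:
            levels.append("crowded")
    return levels

# Hypothetical counts at four successive stops of a 30-seat bus.
levels = ridership_levels([20, 15, 2, 0], [0, 3, 12, 18], seats=30)
```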

Keywords: smartcard data, ANN, bus, ridership

Procedia PDF Downloads 162
15190 Leachate Discharges: Review Treatment Techniques

Authors: Abdelkader Anouzla, Soukaina Bouaouda, Roukaya Bouyakhsass, Salah Souabi, Abdeslam Taleb

Abstract:

During storage, and under the combined action of rainwater and natural fermentation, these wastes produce over 800,000 m³ of landfill leachate. With population growth and changing global economic activities, the amount of waste generated constantly increases, producing ever larger volumes of leachate. Leachate, when it infiltrates the soil, can negatively impact soil, surface water, groundwater, and the environment and human life generally. Because of its high pollutant load, leachate must be treated before being released into the environment. This article reviews the different leachate treatment techniques as of September 2022. Different methods can be used for this purpose, including biological, physico-chemical, and membrane processes. Young leachate is biodegradable; in contrast, biological processes lose their effectiveness as leachate ages, since aged leachates are characterized by high ammonia nitrogen concentrations that inhibit biological activity. Most physico-chemical treatments serve as pre-treatment or post-treatment to complement conventional treatment processes or to remove specific contaminants. After the introduction, the different types of pollutants present in leachates and their impacts are described, followed by a discussion highlighting the advantages and disadvantages of the various treatments, whether biological, physico-chemical, or membrane-based. This work concludes that, owing to their simplicity and reasonable cost compared with other treatment procedures, biological treatments offer the most suitable alternative for limiting the effects of the pollutants in landfill leachates.

Keywords: landfill leachate, landfill pollution, impact, wastewater

Procedia PDF Downloads 87
15189 Context-Aware Recommender System Using Collaborative Filtering, Content-Based Algorithm and Fuzzy Rules

Authors: Xochilt Ramirez-Garcia, Mario Garcia-Valdez

Abstract:

Contextual recommendations are implemented in recommender systems to improve user satisfaction: the recommender system makes accurate and suitable recommendations for a particular situation, reaching personalized recommendations. The context provides information relevant to the recommender system and is used as a filter for the selection of items relevant to the user. This paper presents a context-aware recommender system that uses techniques based on collaborative filtering and content-based algorithms, as well as fuzzy rules, to recommend items within the context. The dataset used to test the system is from TripAdvisor. The accuracy of the recommendations was evaluated with the Mean Absolute Error.
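The evaluation metric named above, Mean Absolute Error, averages the absolute difference between predicted and actual ratings; a minimal sketch with hypothetical ratings:

```python
def mean_absolute_error(actual, predicted):
    """MAE = (1/n) * sum(|actual_i - predicted_i|)."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical actual vs. predicted ratings on a 1-5 scale.
mae = mean_absolute_error([4.0, 3.5, 5.0], [3.5, 4.0, 4.5])
```

Lower MAE means predicted ratings track the user's actual ratings more closely.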

Keywords: algorithms, collaborative filtering, intelligent systems, fuzzy logic, recommender systems

Procedia PDF Downloads 418
15188 Information and Communication Technology in Architectural Education: The Challenges

Authors: Oluropo Stephen Ilesanmi, Oluwole Ayodele Alejo

Abstract:

Architectural education, beyond training students to become architects, instils in them an appreciation of the responsibilities relating to public health, safety, and welfare. Architecture is no longer a personal philosophical or aesthetic pursuit by individuals; rather, it has to consider the everyday needs of people and use technology to give a liveable environment. In the present age, architectural education must grapple with the recent integration of technology, in particular the facilities offered by information and communication technology. Electronic technologies have moved architecture from the drawing board to cyberspace. The world is now a global village in which new information and methods are easily and quickly available to people at different poles of the globe. It is the position of this paper that, in order to remain relevant amid the ever-competing forces within the building industry, architectural education must show the impetus to continue to draw from technological advancements associated with the use of computers.

Keywords: architecture, education, communication, information, technology

Procedia PDF Downloads 205
15187 Spatial Behavioral Model-Based Dynamic Data-Driven Diagram Information Model

Authors: Chiung-Hui Chen

Abstract:

Diagrams and drawings are important means of communicating and reproducing architectural design. Owing to the development of information and communication technology, professional thinking in architecture and interior design is also changing rapidly. Diagrams always play a very important role in the design process. Based on diagram theories, this study observes and records the interactions between people and objects, objects and space, and space and time in a modern nuclear family. It constructs a method for diagrams to systematically and visually describe the space plan of a modern nuclear family toward an intelligent design, assisting designers in retrieving information and checking or reviewing event patterns of past and present.

Keywords: digital diagram, information model, context aware, data analysis

Procedia PDF Downloads 329
15186 Access to Climate Change Information Through the Implementation of the Paris Agreement

Authors: Ana Cristina A. P. Carvalho, Solange Teles Da Silva

Abstract:

In April 2016, 174 countries signed the Paris Agreement, a multilateral agreement on climate change which deals with greenhouse gas emissions, mitigation, adaptation, finance, access to information, and transparency, among other subjects related to the environment. Since then, Parties shall cooperate in taking measures, as appropriate, to enhance climate change education, training, public awareness, public participation and public access to information, recognizing the importance of these steps with respect to enhancing actions under the Agreement. This paper analyzes the consequences of this new rule for the implementation of the Agreement, collecting data from Brazilian and Canadian legislation in order to identify whether these countries have rules complying with the Treaty, the steps that have already been taken, and whether they could serve as examples for other countries. The analysis takes into consideration the different kinds of climate change information, the means of transparency, the reliability of the data, and how the information is disseminated. The methodology comprises comparative legal research based on both the Paris Agreement and the domestic laws of Brazil and Canada, as well as on doctrine and court decisions. The findings can contribute to the implementation of the Paris Agreement through compliance with the Treaty at countries' domestic and policy levels.

Keywords: climate change information, domestic legislation, Paris Agreement, public policy

Procedia PDF Downloads 305
15185 Effect of Anion Variation on the CO2 Capture Performance of Pyridinium Containing Poly(ionic liquid)s

Authors: Sonia Zulfiqar, Daniele Mantione, Muhammad Ilyas Sarwar, Alexander Rothenberger, David Mecerreyes

Abstract:

Climate change due to the escalating carbon dioxide concentration in the atmosphere is an issue of paramount importance that needs immediate attention. CO2 capture and sequestration (CCS) is a promising route to mitigate climate change, and adsorption is the most widely recognized technology owing to possible energy savings relative to conventional absorption techniques. In this contribution, the potential of a new family of solid sorbents for CO2 capture and separation is presented. Novel pyridinium-containing poly(ionic liquid)s (PILs) were synthesized with varying anions, i.e., bis(trifluoromethylsulfonyl)imide and hexafluorophosphate. The resulting polymers were characterized using NMR, XRD, TGA, BET surface area and microscopic techniques. Furthermore, CO2 adsorption measurements at two different temperatures were carried out and revealed the great potential of these PILs as CO2 scavengers.

Keywords: climate change, CO2 capture, poly(ionic liquid)s, CO2/N2 selectivity

Procedia PDF Downloads 369
15184 Lecture Video Indexing and Retrieval Using Topic Keywords

Authors: B. J. Sandesh, Saurabha Jirgi, S. Vidya, Prakash Eljer, Gowri Srinivasa

Abstract:

In this paper, we propose a framework to help users search for and retrieve the portions of a lecture video that interest them. This is achieved by temporally segmenting and indexing the lecture video using topic keywords. We use the transcribed text from the video, together with documents relevant to the video topic extracted from the web, for this purpose. The keywords for indexing are found by applying non-negative matrix factorization (NMF) topic modeling to the web documents. Our proposed technique first creates indices on the transcribed documents using the topic keywords, and these are mapped to the video to find the start and end times of the portions of the video for a particular topic. This time information is stored in the index table along with the topic keyword, which is then used to retrieve the specific portions of the video for a user's query.
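The indexing step, mapping topic keywords found in timestamped transcript segments to start and end times in the video, can be sketched as follows. The keyword list here is given directly rather than extracted by NMF, and the segments are invented for illustration:

```python
def build_index(segments, topic_keywords):
    """Index timestamped transcript segments by topic keyword.

    segments: list of (start_sec, end_sec, text) tuples.
    Returns keyword -> (earliest start, latest end) over matching segments."""
    index = {}
    for start, end, text in segments:
        words = set(text.lower().split())
        for kw in topic_keywords:
            if kw in words:
                s, e = index.get(kw, (start, end))
                index[kw] = (min(s, start), max(e, end))
    return index

# Hypothetical transcript segments of a lecture video.
segments = [
    (0, 60, "introduction to matrix factorization"),
    (60, 150, "non-negative matrix factorization and topics"),
    (150, 240, "applications of topic models"),
]
index = build_index(segments, ["factorization", "topic"])
```

A query for a topic keyword can then seek directly to the stored time span instead of scanning the whole video.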

Keywords: video indexing and retrieval, lecture videos, content based video search, multimodal indexing

Procedia PDF Downloads 246
15183 Analysis of Buddhist Rock Carvings in Diamer Basha Dam Reservoir Area, Gilgit-Baltistan, Pakistan

Authors: Abdul Ghani Khan

Abstract:

This paper focuses on the Buddhist rock carvings in the Diamer-Basha reservoir area, Gilgit-Baltistan, which is perhaps the largest rock art province of the world. The study region has thousands of rock carvings, particularly stupa carvings, engraved by artists, devotees, pilgrims, or merchants who left their marks in the landscape or worked for the propagation of Buddhism. The Pak-German Archaeological Mission prepared, documented, and published extensive catalogues of these carvings. To date, however, very little systematic or statistically driven analysis has been undertaken toward an in-depth understanding of the Buddhist rock carving tradition of the study region. This paper examines stupa carvings and their constituent parts from five selected sites, namely Oshibat, Shing Nala, Gichi Nala, Dadam Das, and Chilas Bridge. The statistical analyses and classification of the stupa carvings and their chronological contexts were carried out with the help of modern software tools such as STATA, FileMaker Pro, and MapSource. The study found that the tradition of stupa carving on the rock surfaces of the five selected sites continued for around 900 years, from the 1st century BCE to the 8th century CE. There is variation within the chronological settings of each of the selected sites, possibly shaped by their use within particular landscapes, whether political (for example, changes in political administration or warfare) or geographical (for example, the shifting of routes). The long persistence of the stupa carving tradition at these specific locations also indicates their central position on trade and communication routes, possibly linked with religious ideologies in their particular times. Analysis of the different architectural elements of the stupa carvings in the study area shows that this tradition had structural similarities and differences in temporal and spatial contexts.

Keywords: rock carvings, stupa, stupa carvings, Buddhism, Pak-German archaeological mission

Procedia PDF Downloads 219
15182 Parallels Between Indian Art Music and Western Art Music: The Suppression of the Notion of the 'Melody'

Authors: Kedarnath Awati

Abstract:

Some parallels between Indian art music and Western art music, such as the identity of the basic heptatonic scale structure, are quite obvious and need no further discussion. Other parallels are far less obvious, and it is one of these that the author is interested in. Specifically, the author would like to make a serious claim that in both types of music there is an unspoken dependence on melody. It is true that the techniques the two systems use for elaboration are very different: Western music uses harmony, counterpoint, orchestration and motivic variation, while the Indian systems, both the Hindustani and the Carnatic traditions, use the technique of raagdaari. The reason this point is barely spoken about is that, both in the West and in India, artists tend to think of melody as something elementary, or as something 'given'. Indian musicians would much rather dwell upon this or that meend or taan or other technical device, while the West thinks that melody is passé and would rather discuss the merits and demerits of spectralism and perhaps serialism. The author explores this theme further in the paper.

Keywords: Indian art music, Western art music, melody, raagdaari, motivic variation

Procedia PDF Downloads 60
15181 Comparison of Different Data Acquisition Techniques for Shape Optimization Problems

Authors: Attila Vámosi, Tamás Mankovits, Dávid Huri, Imre Kocsis, Tamás Szabó

Abstract:

Non-linear FEM calculations are indispensable when important technical information, such as the operating performance of a rubber component, is desired. Rubber bumpers built into air-spring structures may undergo large deformations under load, which in itself shows non-linear behavior. The changing contact range between the parts and the incompressibility of the rubber increase this non-linear behavior further. The material characterization of an elastomeric component is also a demanding engineering task. The shape optimization problem of rubber parts has led to the study of FEM-based calculation processes, a type of problem posed and investigated by several authors. In this paper the time demands of certain calculation methods are studied and possibilities for time reduction are presented.

Keywords: rubber bumper, data acquisition, finite element analysis, support vector regression

Procedia PDF Downloads 469
15180 Development of Residual Power Series Methods for Efficient Solutions of Stiff Differential Equations

Authors: Gebreegziabher Hailu

Abstract:

This paper presents the development of residual power series methods aimed at efficiently solving stiff differential equations, which pose significant challenges in numerical analysis due to their rapid changes in solution behavior. The RPSM is a numerical approach that generates polynomial-based approximate solutions without the need for linearization, discretization, or perturbation techniques, making it straightforward to implement and less prone to computational errors. We introduce an approach that utilizes power series expansions combined with residual minimization techniques to enhance convergence and stability. By analyzing the theoretical foundations of stiffness, we delve into the formulation of the residual power series method, detailing how it effectively captures the dynamics of stiff systems while maintaining computational efficiency. Numerical experiments demonstrate the method's superiority in terms of accuracy and computational cost when compared to traditional methods like implicit Runge-Kutta or multistep techniques. We also explore adaptive strategies within our framework to automatically adjust parameters based on the stiffness characteristics of the problem at hand. Ultimately, our findings contribute to the broader toolkit for tackling stiff differential equations, offering a robust alternative that promises to streamline computational workflows in various applied mathematics and engineering contexts.
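For a linear test problem y' = -λy, y(0) = y₀, forcing the residual of the truncated power series to vanish order by order yields the recurrence c_{k+1} = -λ c_k / (k + 1), reproducing the Taylor series of y₀·exp(-λt). A sketch of that simplest case (the stiff systems treated in the paper require the method's full residual-minimization machinery):

```python
import math

def rps_coefficients(lam, y0, order):
    """Series coefficients c_k for y' = -lam * y, y(0) = y0.

    Setting the residual of the truncated series to zero order by
    order gives the recurrence c_{k+1} = -lam * c_k / (k + 1)."""
    c = [y0]
    for k in range(order):
        c.append(-lam * c[-1] / (k + 1))
    return c

def evaluate(c, t):
    """Evaluate the truncated power series at t."""
    return sum(ck * t ** k for k, ck in enumerate(c))

c = rps_coefficients(lam=5.0, y0=1.0, order=20)
approx = evaluate(c, 0.5)          # exact solution is exp(-2.5)
```

For larger λt the truncated series needs more terms or a stepwise restart, which is one motivation for the adaptive strategies the paper explores.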

Keywords: residual power series methods, stiff differential equations, numerical approach, Runge-Kutta methods

Procedia PDF Downloads 9
15179 A Deep Learning Approach to Subsection Identification in Electronic Health Records

Authors: Nitin Shravan, Sudarsun Santhiappan, B. Sivaselvan

Abstract:

Subsection identification, in the context of Electronic Health Records (EHRs), means identifying the sections that matter for downstream tasks such as auto-coding. In this work, we classify the text present in EHRs according to its information content, using machine learning and deep learning techniques. We first describe the problem briefly and formulate it as a text classification problem. We then discuss methods from the literature and try two approaches: traditional feature-extraction-based machine learning methods and deep learning methods. Through experiments on a private dataset, we establish that the deep learning methods perform better than the feature-extraction-based machine learning models.

Keywords: deep learning, machine learning, semantic clinical classification, subsection identification, text classification

Procedia PDF Downloads 209
15178 Holistic Simulation-Based Impact Analysis Framework for Sustainable Manufacturing

Authors: Mijoh A. Gbededo, Kapila Liyanage, Sabuj Mallik

Abstract:

The emerging approaches to sustainable manufacturing are considered to be solution-oriented, with the aim of addressing environmental, economic and social issues holistically. However, the analysis of the interdependencies among the three sustainability dimensions has not been fully captured in the literature. In a recent review of approaches to sustainable manufacturing, two categories of techniques were identified: 1) sustainable product development (SPD) and 2) sustainability performance assessment (SPA). The challenges of these approaches relate not only to arguments about, and misconceptions of, the relationships between the techniques and sustainable development, but also to their inability to capture and integrate the three sustainability dimensions. This requires a clear definition of some of the approaches and a roadmap to the development of a holistic approach that supports sustainability decision-making. In this context, eco-innovation, social impact assessment, and life cycle sustainability analysis play an important role. This paper deploys an integrative approach that enables the amalgamation of sustainable manufacturing approaches and the theories of reciprocity and motivation into a holistic simulation-based impact analysis framework. The findings of this research have the potential to guide sustainability analysts in capturing aspects of the three sustainability dimensions in an analytical model. Additionally, the research findings presented can aid the construction of a holistic simulation model of sustainable manufacturing and support effective decision-making.

Keywords: life cycle sustainability analysis, sustainable manufacturing, sustainability performance assessment, sustainable product development

Procedia PDF Downloads 171
15177 Forensic Analysis of Thumbnail Images in Windows 10

Authors: George Kurian, Hongmei Chi

Abstract:

Digital evidence plays a critical role in most legal investigations, and in many cases thumbnail databases reveal important information for an investigation. The probability of retrieving digital evidence from a computer or smart device has increased, even when the previous user removed data and deleted apps on those devices. With the growth of digital forensics, the ability to recover residual information from various thumbnail applications has improved. This paper focuses on investigating thumbnail information in Windows 10. Thumbnail images of interest in forensic investigations may remain intact even when the original pictures have been deleted, and it is our research goal to recover useful information from them. In this research project, we use various forensics tools to collect leftover thumbnail information from deleted videos or pictures. We examine and describe the various thumbnail sources in Windows and propose a methodology for thumbnail collection and analysis from laptops or desktops. A machine learning algorithm is adopted to help speed up the extraction of content from thumbnail pictures.

Keywords: digital forensics, forensic tools, soundness, thumbnail, machine learning, OCR

Procedia PDF Downloads 127
15176 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of his or her data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques while at the same time highlighting and providing a new technique as a solution to an existing privacy-preserving data mining method. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.
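As a concrete, hypothetical illustration of one standard privacy-preserving technique (differential privacy via the Laplace mechanism, not the specific method proposed in this paper), the following sketch perturbs a count query so that the presence or absence of any single record has only a bounded effect on the output:

```python
import math
import random

def laplace_noise(scale):
    """Draw a Laplace(0, scale) sample via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    """Differentially private count: the true count plus Laplace noise
    of scale sensitivity/epsilon (a count query has sensitivity 1)."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means stronger privacy but noisier answers; the analyst trades accuracy for the guarantee.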

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 165
15175 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Predication of Future Data

Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill

Abstract:

Health-care management systems are of great interest because they provide simple and fast management of all aspects relating to a patient, not necessarily medical. Moreover, there are more and more pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from statistics, machine learning and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted by forecasting the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences for a new product; these models provide a platform on which product developers can decide the engineering characteristics needed to satisfy consumer preferences before developing the product. Recent analysis shows that these fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model.
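The closing proposal, testing the strength of an exponential regression model against a linear one, can be sketched with ordinary least squares on log-transformed data. This is a minimal illustration of the comparison, not the authors' implementation:

```python
import math

def fit_linear(xs, ys):
    """Least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def fit_exponential(xs, ys):
    """Fit y = A*exp(B*x) by linear regression on log(y); returns (A, B)."""
    a, b = fit_linear(xs, [math.log(y) for y in ys])
    return math.exp(a), b

def sse(ys, preds):
    """Sum of squared errors, for comparing the two fitted models."""
    return sum((y - p) ** 2 for y, p in zip(ys, preds))
```

On data that actually grows exponentially, the exponential model's SSE is lower than the linear model's, which is the "strength" comparison the abstract proposes.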

Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function

Procedia PDF Downloads 274
15174 A Distributed Mobile Agent Based on Intrusion Detection System for MANET

Authors: Maad Kamal Al-Anni

Abstract:

This study concerns an algorithmic approach based on an Artificial Neural Network, specifically a Multilayer Perceptron (MLP), for the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) is a ubiquitous intelligent internetworking system with the ability to sense its environment through an autonomous set of mobile nodes connected via wireless links. Security is the most important subject in MANETs due to the easy penetration scenarios that occur in such an auto-configuring network. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as machine learning, along with a stochastic approach (information gain), to classify malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data are collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is a deviation from the ordinary behaviors, the monitoring agent considers the event an attack. We demonstrate a signature-based IDS approach in a MANET by implementing the back-propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities can be significantly predicted and efficiently figured out. By tuning the parameters of the back-propagation algorithm during the experiments, we empirically show its effectiveness, with a detection rate of up to 98.6%. The empirical results in this article also include the performance metrics, presented with Xgraph plots of measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
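A minimal sketch of the kind of back-propagation classifier described above: a one-hidden-layer perceptron with sigmoid units, trained on toy two-feature audit records. The feature encoding and labels here are hypothetical stand-ins for the paper's Traffic Table data, not its actual inputs:

```python
import math
import random

def train_mlp(data, labels, hidden=4, lr=0.5, epochs=4000, seed=1):
    """Train a one-hidden-layer perceptron by back-propagation (MSE loss);
    returns a predict(x) function giving an output in (0, 1)."""
    rng = random.Random(seed)
    n_in = len(data[0])
    # Weight rows include a trailing bias weight (input appended with 1.0).
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(hidden + 1)]
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    for _ in range(epochs):
        for x, y in zip(data, labels):
            h = [sig(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
            o = sig(sum(w * v for w, v in zip(w2, h + [1.0])))
            d_o = (o - y) * o * (1 - o)                      # output delta
            d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):                          # output layer
                w2[j] -= lr * d_o * h[j]
            w2[hidden] -= lr * d_o
            for j in range(hidden):                          # hidden layer
                for i in range(n_in):
                    w1[j][i] -= lr * d_h[j] * x[i]
                w1[j][n_in] -= lr * d_h[j]
    def predict(x):
        h = [sig(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
        return sig(sum(w * v for w, v in zip(w2, h + [1.0])))
    return predict
```

In the paper's setting, the inputs would be features selected by information gain from the audit data, and outputs above a threshold would be flagged as attacks.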

Keywords: intrusion detection system (IDS), mobile ad hoc networks (MANET), back propagation algorithm (BPA), neural networks (NN)

Procedia PDF Downloads 190
15173 Performance Analysis with the Combination of Visualization and Classification Technique for Medical Chatbot

Authors: Shajida M., Sakthiyadharshini N. P., Kamalesh S., Aswitha B.

Abstract:

Natural Language Processing (NLP) continues to play a strategic part in complaint discovery and drug discovery during the current epidemic. This abstract provides an overview of a performance analysis combining visualization and classification techniques of NLP for a medical chatbot. Sentiment analysis is an important aspect of NLP that is used to determine the emotional tone behind a piece of text. This technique has been applied to various domains, including medical chatbots. In this study, we compared the combination of a decision tree with a heatmap against Naïve Bayes with a word cloud. The performance of the chatbot was evaluated using accuracy, and the results indicate that the combination of visualization and classification techniques significantly improves the chatbot's performance.
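A minimal sketch of the Naïve Bayes half of the comparison, assuming a simple bag-of-words representation of chatbot messages. The training phrases below are hypothetical examples, not the study's dataset:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes with add-one smoothing over bag-of-words."""
    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter(labels)
        self.vocab = set()
        for text, label in zip(texts, labels):
            words = text.lower().split()
            self.word_counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, text):
        words = text.lower().split()
        total = sum(self.class_counts.values())
        best, best_lp = None, float("-inf")
        for label, count in self.class_counts.items():
            lp = math.log(count / total)            # class prior
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in words:                          # smoothed likelihoods
                lp += math.log((self.word_counts[label][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best
```

In the study's pipeline, accuracy on held-out messages would then be computed and the confusion visualized (heatmap or word cloud) for the comparison.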

Keywords: sentiment analysis, NLP, medical chatbot, decision tree, heatmap, naïve bayes, word cloud

Procedia PDF Downloads 68
15172 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances towards high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of generative design using artificial intelligence to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the intelligence learns from past designs and produces complex, optimized CAD models. Generative design takes the boundary conditions and, through iteration, comes up with multiple solutions, arriving at a sturdy design with the most optimal parameters while saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, using biomimicry, which draws on designs that nature has evolved over millions of years. The computer uses parametric models to generate newer models in an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares topology optimization, which has previously been used to generate CAD models, with generative design. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
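The selection, mutation, and crossover operations mentioned above can be sketched with a toy genetic algorithm. This is a generic illustration (tournament selection, one-point crossover, bit-flip mutation, maximizing a caller-supplied fitness function), not the paper's actual design pipeline:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=7):
    """Evolve bitstrings toward higher fitness via selection,
    crossover, and mutation; returns the best individual found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Binary tournament: pick two at random, keep the fitter one.
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]                   # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

In generative design, each bitstring would encode a candidate geometry's parameters, and the fitness function would score it against the boundary conditions.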

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 143
15171 Identification of Failures Occurring on a System on Chip Exposed to a Neutron Beam for Safety Applications

Authors: S. Thomet, S. De-Paoli, F. Ghaffari, J. M. Daveau, P. Roche, O. Romain

Abstract:

In this paper, we present a hardware module dedicated to understanding the fail reason of a System on Chip (SoC) exposed to a particle beam. The impact of Single-Event Effects (SEE) on processor-based SoCs is a concern that has grown in the past decade, particularly for terrestrial applications with increasing automotive safety requirements, as well as in the consumer and industrial domains. The SEE created by the impact of a particle on an SoC may have consequences that end in instability or crashes. Specific hardening techniques for hardware and software have been developed to make such systems more reliable. The SoC is then qualified using cosmic-ray Accelerated Soft-Error Rate (ASER) testing to ensure the Soft-Error Rate (SER) remains within mission-profile limits. Understanding where errors occur is another challenge because of the complexity of the operations performed in an SoC. Common techniques to monitor an SoC running under a beam are based on non-intrusive debug, consisting of recording the program counter and doing some consistency checking on the fly. To detect and understand SEE, we have developed a module embedded within the SoC that provides support for recording probes, hardware watchpoints, and a memory-mapped register bank dedicated to software usage. To identify CPU failure modes and the most important resources to probe, we carried out a fault-injection campaign on the RTL model of the SoC. Probes are placed on generic CPU registers and bus accesses. They highlight the propagation of errors and allow the failure modes to be identified. Typical resulting errors are bit-flips in resources creating bad addresses, illegal instructions, longer-than-expected loops, or incorrect bus accesses. Although our module is processor agnostic, it has been interfaced to a RISC-V core by probing some of the processor registers. Probes are then recorded in a ring buffer.
Associated hardware watchpoints allow some control, such as starting or stopping event recording or halting the processor. Finally, the module also provides a bank of registers where the firmware running on the SoC can log information; typical usage is recording operating system context switches. The module is connected to a dedicated debug bus and is interfaced to a remote controller via a debugger link. Thus, a remote controller can interact with the monitoring module without any intrusiveness on the SoC. Moreover, in case of CPU unresponsiveness or a system-bus stall, the recorded information can still be recovered, providing the fail reason. A preliminary version of the module has been integrated into a test chip currently being manufactured at ST in 28-nm FDSOI technology. The module has been triplicated to provide reliable information on the SoC behavior. As the primary application domain is automotive and safety, the efficiency of the module will be evaluated by exposing the test chip to a fast-neutron beam by the end of the year. In the meantime, it will be tested with alpha particles and electromagnetic fault injection (EMFI). We will report in the paper on fault-injection results as well as irradiation results.
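The fault-injection idea, flipping a single bit in a register value and classifying the resulting failure mode, can be sketched abstractly. The register width, valid address range, and classification rules below are hypothetical illustrations, not the actual parameters of the ST test chip:

```python
def inject_bit_flip(value, bit, width=32):
    """Flip one bit of a register value, modelling a single-event upset."""
    return (value ^ (1 << bit)) & ((1 << width) - 1)

def classify_fault(address, valid_range=(0x00000000, 0x0FFFFFFF)):
    """Toy failure-mode classifier: an address corrupted outside the
    mapped range would raise a bus error; inside it, the access would
    succeed silently (silent data corruption)."""
    lo, hi = valid_range
    return "bus_error" if not (lo <= address <= hi) else "silent_corruption"

def campaign(address, width=32):
    """Exhaustively inject one flip per bit position and tally outcomes,
    mimicking an RTL fault-injection campaign on an address register."""
    tally = {"bus_error": 0, "silent_corruption": 0}
    for bit in range(width):
        tally[classify_fault(inject_bit_flip(address, bit))] += 1
    return tally
```

A real RTL campaign injects into many registers and observes the processor's actual behavior, but the tallying principle is the same.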

Keywords: fault injection, SoC fail reason, SoC soft error rate, terrestrial application

Procedia PDF Downloads 228
15170 A Survey on Ambient Intelligence in Agricultural Technology

Authors: C. Angel, S. Asha

Abstract:

Despite the advances made in various new technologies, applying these technologies to agriculture still remains a formidable task, as it involves integrating diverse domains to monitor the different processes involved in agricultural management. Advances in ambient intelligence represent one of the most powerful technologies for increasing the yield of agricultural crops and mitigating the impact of water scarcity and climatic change, and for managing pests, weeds, and diseases. This paper proposes a GPS-assisted, machine-to-machine solution that combines information collected by multiple sensors for the automated management of paddy crops. To maintain the economic viability of paddy cultivation, the various techniques used in agriculture are discussed, and a novel system based on ambient intelligence is proposed. The proposed ambient-intelligence-based agricultural system offers great scope for precise farming.

Keywords: ambient intelligence, agricultural technology, smart agriculture, precise farming

Procedia PDF Downloads 602
15169 Implementation of the Outputs of Computer Simulation to Support Decision-Making Processes

Authors: Jiri Barta

Abstract:

At the present time, awareness, education, computer simulation and the protection of information systems are very serious and relevant topics. The article deals with perspectives and possibilities of incorporating emergency and natural-hazard threats into the system developed for communication among members of crisis management staffs. The Czech Hydro-Meteorological Institute, with its System of Integrated Warning Service, presents the largest usable base of information. National information systems are connected to foreign systems, especially to the flood emergency systems of neighboring countries and the systems of the European Union and international organizations of which the Czech Republic is a member. Using the outputs of particular information systems and computer simulations on a single communication interface of the information system for communication among members of crisis management staff, and establishing site interoperability in the network, will lead to time savings in decision-making processes when handling extraordinary events and crisis situations. Faster management of an extraordinary event or crisis situation will bring positive effects and minimize the impact of negative effects on the environment.

Keywords: computer simulation, communication, continuity, critical infrastructure, information systems, safety

Procedia PDF Downloads 331
15168 Merging of Results in Distributed Information Retrieval Systems

Authors: Larbi Guezouli, Imane Azzouz

Abstract:

This work is located in the domain of distributed information retrieval (DIR). A simplified view of DIR involves a multi-search over a set of collections, which forces the system to analyze the results found in these collections and merge them into a single list before sending them to the user. Our aim is to find a fusion method based on the relevance score of each result received from the collections and on the relevance of each collection's local search engine.
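A simple fusion method of the kind described, weighting each result's local score by an estimate of its collection's quality, might be sketched as follows. This is a hypothetical illustration of the general idea, not the authors' exact method:

```python
def merge_results(result_lists, collection_scores):
    """Merge per-collection ranked lists into one list.
    result_lists: {collection: [(doc_id, local_score), ...]}
    collection_scores: {collection: relevance of that collection's engine}
    The merged score is the local score weighted by the collection score,
    so results from trusted collections rise in the combined ranking."""
    merged = []
    for coll, results in result_lists.items():
        weight = collection_scores[coll]
        for doc_id, score in results:
            merged.append((doc_id, score * weight))
    merged.sort(key=lambda pair: pair[1], reverse=True)
    return merged
```

In practice the collection scores themselves come from collection selection (e.g. sample-based quality estimates), and local scores may need normalization before weighting, since different engines score on different scales.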

Keywords: information retrieval, distributed IR systems, merging results, data mining

Procedia PDF Downloads 331