Search results for: computer aided navigation
1634 Structural Design for Effective Load Balancing of the Iron Frame in Manhole Lid
Authors: Byung Il You, Ryun Oh, Gyo Woo Lee
Abstract:
A manhole is a facility that gives workers access to a sewer for cleaning and inspection, and its cover is called the manhole lid. Manhole lids are typically made of cast iron. Because of their heavy weight, installation and maintenance of cast iron lids are not easy, and electrical shock and corrosion aging can cause critical problems. Manufacturing the manhole body and lid from fiber-reinforced composite material can reduce the weight considerably compared with cast iron. Fiber reinforcement alone, however, can hardly sustain the heavy load, and a method that embeds an iron frame in the composite by double injection molding has been widely proposed. Reflecting this market situation, this study carried out the structural design of the iron frame for a composite manhole lid. Structural analysis by computer simulation was conducted to distribute the load effectively over the iron frame. In addition, manufacturing costs were assessed by comparing the weights and numbers of welding spots of the frames. Although the cross-sectional area is reduced to as little as 38% of the basic solid form, the maximum von Mises stress increases locally near the rim by at least about 7 times, and the maximum strain in the central part of the lid by about 5.5 times. The number of welding points, which is related to manufacturing cost, increased gradually with more complicated shapes. A higher arch at the center of the lid might give a better result, but considering the economics of composite fabrication, the arch height at the center of the lid was set equal to the frame thickness. Taking the number of welding points into account, the hexagonal frame was selected as the optimal shape. Acknowledgment: These are results of a study on the 'Leaders in Industry-University Cooperation' Project, supported by the Ministry of Education (MOE).
Keywords: manhole lid, iron frame, structural design, computer simulation
Procedia PDF Downloads 275
1633 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and is a very encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management and environmental services. Unfortunately, to the best of our knowledge, object detection under illumination with shadow consideration has not been well solved yet, and this problem is one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination. Because capturing conditions vary, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods that reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: image processing, illumination equalization, shadow filtering, object detection
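The paper's implementation is not published; below is a minimal sketch, assuming OpenCV, of the general pipeline the abstract describes (YCrCb conversion, illumination equalization, parameter-free thresholding, erosion/dilation, contour filtering). Otsu's method and the kernel and area values are our illustrative stand-ins for the unspecified adaptive steps.

```python
import cv2
import numpy as np

def detect_objects(bgr_image):
    # Convert to YCrCb; the chroma channels are less sensitive to
    # illumination changes than raw RGB intensities.
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)

    # Equalize the luma channel to reduce uneven illumination, then
    # segment without hand-tuned ranges (Otsu picks the threshold).
    y_eq = cv2.equalizeHist(y)
    _, mask = cv2.threshold(y_eq, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Morphological opening and closing (erosion + dilation) suppress
    # shadow speckle and fill small holes in the object regions.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Keep only sufficiently large contours as detected objects.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 500]
```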
Procedia PDF Downloads 216
1632 A Systematic Review on Lifelong Learning Programs for Community-Dwelling Older Adults
Authors: Xi Vivien Wu, Emily Neo Kim Ang, Yi Jung Tung, Wenru Wang
Abstract:
Background and Objective: The increase in life expectancy and the emphasis on self-reliance for older adults are global phenomena. As such, lifelong learning in the community is considered a viable means of promoting successful and active aging. This systematic review aims to examine various lifelong learning programs for community-dwelling older adults and to synthesize the contents and outcomes of these programs. Methods: A systematic search was conducted from July to December 2016. Two reviewers were engaged in the process to ensure the credibility of the selection process. Narrative description and analysis were applied with the support of a tabulation of key data, including study design, interventions, and outcomes. Results: Eleven articles, consisting of five randomized controlled trials and six quasi-experimental studies, were included in this review. Interventions included e-health literacy programs with the aid of computers and the Internet (n=4), computer and Internet training (n=3), physical fitness programs (n=2), a music program (n=1), and an intergenerational program (n=1). All studies used objective measurement tools to evaluate outcomes. Conclusion: The systematic review indicated that lifelong learning programs resulted in positive outcomes in terms of physical health, mental health, social behavior, social support, self-efficacy and confidence in computer usage, and increased e-health literacy efficacy. However, lifelong learning programs face challenges such as funding shortages, program cuts, and increasing costs. A comprehensive lifelong learning program could be developed to enhance the well-being of older adults at a more holistic level, and empirical research could explore its effectiveness.
Keywords: community-dwelling older adults, e-health literacy program, lifelong learning program, well-being of older adults
Procedia PDF Downloads 164
1631 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference
Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade
Abstract:
In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as for the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal a similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is to account for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory
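The upper-quantile computation described above reduces to multivariate Gaussian integrals (handled in the paper with the R package "mvtnorm"). The following is a hedged Python/SciPy sketch of the same idea, where mu and cov stand for the estimated asymptotic mean and covariance of the candidate models' GIC values; the function names and the SciPy substitution are ours, not the authors'.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import multivariate_normal

def min_gic_upper_quantile(mu, cov, level=0.95):
    # X ~ N(mu, cov) is the asymptotic joint distribution of the
    # candidate models' GIC values; we seek q with P(min X <= q) = level.
    mu, cov = np.asarray(mu, float), np.asarray(cov, float)
    d = len(mu)

    def cdf_min(q):
        # P(min X <= q) = 1 - P(X_1 > q, ..., X_d > q); the upper-orthant
        # probability equals the lower-orthant CDF of -X evaluated at -q.
        return 1.0 - multivariate_normal.cdf(-q * np.ones(d), mean=-mu, cov=cov)

    # Bracket the quantile and solve cdf_min(q) = level by root finding.
    s = np.sqrt(np.diag(cov).max())
    return brentq(lambda q: cdf_min(q) - level, mu.min() - 10 * s, mu.max() + 10 * s)
```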
Procedia PDF Downloads 89
1630 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions
Authors: Gaurangi Saxena, Ravindra Saxena
Abstract:
Lately, cloud computing has been used to enhance the ability to attain corporate goals more effectively and efficiently at lower cost. This new computing paradigm has emerged as a powerful tool for optimum utilization of resources and for gaining competitiveness through cost reduction and achieving business goals with greater flexibility. Realizing the importance of this technique, most of the well-known companies in the computer industry, such as Microsoft, IBM, Google and Apple, are spending millions of dollars researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that, by using the right middleware, a cloud computing system can execute all the programs a normal computer could run. Potentially, everything from the simplest generic word processing software to highly specialized programs customized for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments but also support interactive user-facing applications such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm: it draws on existing technologies and approaches such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they do not have it available on site. Cloud computing gives these companies the option of storing data on someone else's hardware, removing the need for physical space on the front end. Prominent service providers like Amazon, Google, SUN, IBM, Oracle and Salesforce are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases and applications. Application services could include email, office applications, finance, video, audio and data processing. By using a cloud computing system, a company can improve its customer relationship management: a CRM cloud computing system can deliver a sales team a blend of unique functionalities to improve agent/customer interactions. This paper first defines cloud computing as a tool for running business activities more effectively and efficiently at a lower cost, then distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss the application of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to its use in CRM. The study concludes that a CRM cloud computing platform helps a company track data such as orders, discounts, references, competitors and much more. By using CRM cloud computing, companies can improve their customer interactions and, by serving customers more efficiently at a lower cost, gain competitive advantage.
Keywords: cloud computing, competitive advantage, customer relationship management, grid computing
Procedia PDF Downloads 312
1629 Developing a Toolkit of Undergraduate Nursing Students' Desirable Characteristics (TNDC): An Application of Item Response Theory
Authors: Parinyaporn Thanaboonpuang, Siridej Sujiva, Shotiga Pasiphul
Abstract:
The higher education reform integrated nursing programmes into the higher education system. Learning outcomes represent one of the essential building blocks for transparency within higher education systems and qualifications. The purpose of this study is to develop a toolkit for assessing undergraduate nursing students' desirable characteristics based on the Thai Qualifications Framework for Higher Education and to test the psychometric properties of this instrument. The toolkit builds on a computer multimedia test. Three skill areas are examined: cognitive skill, responsibility and interpersonal skill, and information technology skill. The study was conducted in four phases. In Phase 1, a measurement model and the computer multimedia test were developed. In Phase 2, two rounds of focus groups were conducted to determine the content validity of the measurement model and the toolkit. In Phase 3, 1,156 senior undergraduate nursing students, recruited by multistage random sampling, were tested to establish the psychometric properties. In Phase 4, data analysis was conducted by descriptive statistics, item analysis, inter-rater reliability, exploratory factor analysis and confirmatory factor analysis. The resulting TNDC consists of 74 items across the following four domains: cognitive skill, interpersonal skill, responsibility and information technology skill. The values of Cronbach's alpha for the four domains were .781, .807, .831, and .865, respectively. The final model in confirmatory factor analysis fit the empirical data quite well. The TNDC was found to be appropriate, both theoretically and statistically. Given these results, it is recommended that the toolkit be used in future studies for nursing programs in Thailand.
Keywords: toolkit, nursing students' desirable characteristics, Thai Qualifications Framework
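Cronbach's alpha, reported above for each domain, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score), for k items. A minimal sketch (the item matrix is hypothetical, not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    # items: respondents x items matrix of scores.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses of 5 students to 4 items of one domain.
scores = [[3, 4, 3, 5], [2, 2, 3, 2], [4, 5, 5, 4], [3, 3, 4, 3], [5, 4, 5, 5]]
print(cronbach_alpha(scores))
```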
Procedia PDF Downloads 535
1628 Seismic Response of Viscoelastic Dampers for Steel Structures
Authors: Ali Khoshraftar, S. A. Hashemi
Abstract:
This paper focuses on the advantages of viscoelastic dampers (VED) used as energy-absorbing devices in buildings. The properties of VED are briefly described, and analytical studies of model structures exhibiting the structural response reduction due to these viscoelastic devices are presented. Computer simulation of the damped response of a multi-storey steel frame structure shows a significant reduction in floor displacement levels.
Keywords: dampers, seismic evaluation, steel frames, viscoelastic
Procedia PDF Downloads 484
1627 Virtual Reality Applications for Building Indoor Engineering: Circulation Way-Finding
Authors: Atefeh Omidkhah Kharashtomi, Rasoul Hedayat Nejad, Saeed Bakhtiyari
Abstract:
Circulation paths and the indoor connection network of a building play an important role both in the daily operation of the building and during evacuation in emergency situations. The legibility of the paths for navigation inside the building is deeply connected to human perceptive and cognitive systems and to the way the surrounding environment is perceived. Human perception of space relies on the sensory systems in a three-dimensional environment and is non-linear, so its representation in architectural design should not be reduced to a two-dimensional, linear issue. Today, advances in virtual reality (VR) technology have led to various applications, and architecture and building science can benefit greatly from these capabilities. Especially where the design solution requires a detailed and complete understanding of human perception of the environment and the behavioral response, attention to VR technologies should be a priority. Way-finding in the indoor circulation network is a good example of such an application: way-finding succeeds when human perception of the route and the behavioral reaction have been considered in advance and reflected in the architectural design. This paper discusses VR technology applications for improving way-finding in the indoor engineering of buildings. In a systematic review of numerous studies, four categories of VR applications for circulation way-finding were first identified: 1) data collection of key parameters, 2) comparison of the effect of each parameter in the virtual environment versus the real world (in order to improve the design), 3) comparison of experimental results across different VR devices/methods, or against building simulation results, and 4) training and planning. Since the costs of technical equipment and the knowledge required to use VR tools limit their use to selected design projects, priority buildings for the use of VR during design are introduced based on case-study analysis. The results indicate that VR technology provides opportunities for designers to solve complex building design challenges effectively and efficiently. The environmental parameters and the architecture of the circulation routes (indicators such as route configuration, topology, signs, and structural and non-structural components) and the characteristics of each (metrics such as dimensions, proportions, color, transparency and texture) are then classified for the VR way-finding experiments. Then, according to human behavior and reaction in movement-related issues, the necessity of scenario-based experiment design for using VR technology to improve the design and receive feedback from test participants is described. The parameters related to scenario design are presented in a flowchart covering test design, data determination and interpretation, recording of results, analysis, errors, validation and reporting. The design of the experiment environment is also discussed, covering equipment selection according to the scenario and the parameters under study, as well as creating the sense of illusion in terms of place illusion, plausibility and the illusion of body ownership.
Keywords: virtual reality (VR), way-finding, indoor, circulation, design
Procedia PDF Downloads 74
1626 The Meaningful Pixel and Texture: Exploring Digital Vision and Art Practice Based on Chinese Cosmotechnics
Authors: Xingdu Wang, Charlie Gere, Emma Rose, Yuxuan Zhao
Abstract:
The study introduces a fresh perspective on the digital realm through an examination of the Chinese concept of Xiang, elucidating how it can build an understanding of pixels and textures on screens as digital trigrams. This concept attempts to offer an outlook on the intersection of digital technology and the natural world, thereby contributing to discussions about the harmonious relationship between humans and technology. The study looks to the ancient Chinese theory of Xiang as a key to establishing theories and practices that respond to the problem of contemporary Chinese technics. Xiang is a Chinese method of understanding the essentials of things through appearances, which differs from the scientific method of the West. Xiang, the basis of Chinese visual art, is rooted in ancient Chinese philosophy and connected to the eight trigrams. The discussion of Xiang connects art, philosophy, and technology. The paper links the meaning of Xiang with the 'truth appearing' philosophically through an analysis of the concepts of phenomenon and noumenon and the unique Chinese way of observing. It then traces the historical interconnection between ancient painting and writing in China, emphasizing the relationship between technical craftsmanship and artistic expression. In the digital realm, the paper blurs the traditional theoretical boundaries between images and text on digital screens. Lastly, this study identifies an ensemble concept relating pixels and textures in computer vision, drawing inspiration from AI image recognition of Chinese paintings. In art practice, by presenting a fluid visual experience in the form of pixels that mimics the flow of lines in traditional calligraphy and painting, it is hoped that the viewer will be brought back to the process of the truth appearing as defined by Xiang.
Keywords: Chinese cosmotechnics, computer vision, contemporary Neo-Confucianism, texture and pixel, Xiang
Procedia PDF Downloads 64
1625 Using Audio-Visual Aids and Computer-Assisted Language Instruction to Overcome Learning Difficulties of Reading in Students of Special Needs
Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari
Abstract:
Background & aims: Reading is a receptive skill whose importance could involve abilities' variance from linguistic standard. Several evidences support the hypothesis stating that the more you read the better you write, with a different impact for speech language therapists (SLTs) who use audio-visual aids and computer-assisted language instruction (CALI) and those who do not. Methods: Here we made use of audio-visual aids and CALI for teaching reading skill to a group of 40 students of special needs of both sexes (range between 8 and 18 years old) at al-Malādh school for teaching students of special needs in Dhamar (Yemen) while another group of the same number is taught using ordinary teaching methods. Pre-and-posttests have been administered at the beginning and the end of the semester (Before and after teaching the reading course). The purpose was to understand the differences between the levels of the students of special needs to see to what extent audio-visual aids and CALI are useful for them. The two groups were taught by the same instructor under the same circumstances in the same school. Both quantitative and qualitative procedures were used to analyze the data. Results: The overall findings revealed that audio-visual aids and CALI are very useful for teaching reading to students of special needs and this can be seen in the scores of the treatment group’s subjects (7.0%, in post-test vs.2.5% in pre-test). In comparison to the scores of the second group’s subjects (where audio-visual aids and CALI were not used) (2.2% in both pre-and-posttests), the first group subjects have overcome reading tasks and this can be observed in their performance in the posttest. Compared with males, females’ performance was better (1466 scores (7.3%) vs. 1371 scores (6.8%). Qualitative and statistical analyses showed that such comprehension is absolutely due to the use of audio-visual aids and CALI and nothing else. These outcomes confirm the evidence of the significance of using audio-visual aids and CALI as effective means for teaching receptive skills in general and reading skill in particular.Keywords: reading, receptive skills, audio-visual aids, CALI, students, special needs, SLTs
Procedia PDF Downloads 49
1624 Curvature-Based Methods for Automatic Coarse and Fine Registration in Dimensional Metrology
Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani
Abstract:
Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability and completeness. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods have been established for fine registration, such as Iterative Closest Point (ICP) and its variants. For coarse registration, no conventional method has been adopted yet, despite a significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough transformation (HT) and an improved RANSAC transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce registration error using curvature parameters. A specific distance considering curvature similarity is combined with the Euclidean distance to define the distance criterion used for correspondence searching. Additionally, the objective function is improved by combining point-to-point (P-P) minimization and point-to-plane (P-Pl) minimization with automatic weights, determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated data and to real data produced by a computed tomography (CT) system. The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography
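One way to realize the curvature-augmented correspondence search described above is to append a scaled curvature channel to the point coordinates, so that a standard nearest-neighbour query minimizes a combination of Euclidean distance and curvature dissimilarity. This is a hedged sketch of that idea, not the authors' code; the weight w is a free parameter of our own.

```python
import numpy as np
from scipy.spatial import cKDTree

def curvature_aware_correspondences(src_pts, src_curv, dst_pts, dst_curv, w=1.0):
    # Augment (x, y, z) with a scaled curvature channel so the k-d tree
    # query minimizes Euclidean distance plus curvature dissimilarity.
    src_feat = np.hstack([src_pts, w * src_curv[:, None]])
    dst_feat = np.hstack([dst_pts, w * dst_curv[:, None]])
    tree = cKDTree(dst_feat)
    dist, idx = tree.query(src_feat)  # for each source point: nearest target
    return idx, dist
```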
Procedia PDF Downloads 424
1623 Surveying Apps in Dam Excavation
Authors: Ali Mohammadi
Abstract:
Whenever the ground needs to be excavated, a surveyor is required to control the work against the map. In projects such as dams and tunnels, these controls are even more important because any mistake can increase the cost. Time is also of great importance in these projects, and one way to reduce excavation time is to use techniques that reduce the surveying time. Nowadays, with mobile phones, we can design apps that perform the calculations and drawing for us on the phone. Likewise, if a device requires a computer to access its information, a suitable app can transfer that information to the mobile phone for use in the field, so there is no need to go back to the office.
Keywords: app, tunnel, excavation, dam
Procedia PDF Downloads 67
1622 Arc Plasma Thermochemical Preparation of Coal for Effective Combustion in Thermal Power Plants
Authors: Vladimir Messerle, Alexandr Ustimenko, Oleg Lavrichshev
Abstract:
This work presents a plasma technology for solid fuel ignition and combustion. Plasma activation promotes more effective and environmentally friendly ignition and combustion of low-rank coal. To realise this technology at coal-fired power plants, plasma-fuel systems (PFS) were developed. PFS improve the efficiency of power coal combustion and decrease harmful emissions. A PFS is a pulverized coal burner equipped with an arc plasma torch, which is the main element of the system. The plasma-forming gas is air, blown through the electrodes to form a plasma flame with a temperature of 5000 to 6000 K. The plasma torch power is varied from 100 to 350 kW, with a height of 0.4-0.5 m and a diameter of 0.2-0.25 m. The basis of the PFS technology is plasma thermochemical preparation of coal for burning: the pulverized coal and air mixture is heated by the arc plasma up to the temperature of coal volatiles release and partial gasification of the char carbon. In the PFS, the coal-air mixture is deficient in oxygen and carbon is oxidised mainly to carbon monoxide. As a result, at the PFS exit a highly reactive mixture of combustible gases and partially burned char particles is formed, together with products of combustion, while the temperature of the gaseous mixture is around 1300 K. Further mixing with air promotes intensive ignition and complete combustion of the prepared fuel. PFS have been tested for boiler start-up and pulverized coal flame stabilization in different countries at power boilers of 75 to 950 t/h steam productivity, equipped with different types of pulverized coal burners (direct flow, muffle and swirl burners). In the PFS tests, power coals of all ranks (lignite, bituminous, anthracite and their mixtures) were incinerated, with volatile content from 4 to 50%, ash content from 15 to 48% and heat of combustion from 1600 to 6000 kcal/kg. To show the advantages of the plasma technology over conventional coal combustion technologies, a numerical investigation of plasma ignition, gasification and thermochemical preparation of pulverized coal for incineration in an experimental furnace with a heat capacity of 3 MW was carried out. Two computer codes were used for the research. The computer simulation experiments were conducted for a low-rank bituminous coal of 44% ash content, and boiler operation was studied in the conventional combustion mode and with arc plasma activation of coal combustion. The experiments and computer simulation showed the ecological efficiency of the plasma technology: when a plasma torch operates in the regime of plasma stabilization of the pulverized coal flame, NOx emissions are halved and the amount of unburned carbon is reduced fourfold. Acknowledgement: This work was supported by the Ministry of Education and Science of the Republic of Kazakhstan and the Ministry of Education and Science of the Russian Federation (Agreement on grant No. 14.613.21.0005, project RFMEFI61314X0005).
Keywords: coal, ignition, plasma-fuel system, plasma torch, thermal power plant
Procedia PDF Downloads 278
1621 Procedure to Optimize the Performance of a Chemical Laser Using Genetic Algorithm Optimization
Authors: Mohammedi Ferhate
Abstract:
This work presents a study of the entire flow inside a facility in which the exothermic chemical reaction process in a chemical laser cavity is analyzed. We describe the principles of chemical lasers, in which flow reversal is produced by chemical reactions, and explain the device for converting chemical potential energy into laser energy; the phenomenon thus shows an explosive trend. Finally, the feasibility and effectiveness of the proposed genetic algorithm-based optimization method are demonstrated by computer simulation.
Keywords: genetic, lasers, nozzle, programming
Procedia PDF Downloads 94
1620 Awareness and Utilization of Social Network Tools among Agricultural Science Students in Colleges of Education in Ogun State, Nigeria
Authors: Adebowale Olukayode Efunnowo
Abstract:
This study was carried out to assess the awareness and utilization of Social Network Tools (SNTs) among agricultural science students in Colleges of Education in Ogun State, Nigeria. Simple random sampling was used to select 280 respondents from the study area. Descriptive statistics were used to address the objectives, while Pearson Product Moment Correlation was used to test the hypothesis. The results showed that the majority (71.8%) of the respondents were single, with a mean age of 20 years. Almost all (95.7%) of the respondents were aware of Facebook and 2go as SNTs, while 85.0% were not aware of Blackplanet, LinkedIn, MyHeritage and Bebo. Many (41.1%) of the respondents held the view that using SNTs can enhance extensive literature surveys, increase internet browsing potential, promote teaching proficiency, and provide updates on research outcomes. However, 51.4% of the respondents perceived SNT usage as meant for lecturers/adults only, while 16.1% considered it mainly used by internet fraudsters. Findings revealed that about 50.0% of the respondents browsed Facebook and 2go daily, while more than 80% used Blackplanet, MyHeritage, Skyrock, Bebo, LinkedIn and MyYearBook as the need arose. Major constraints to the awareness and utilization of SNTs were the high cost and poor quality of ICT facilities (77.1%), epileptic power supply (75.0%), inadequate telecommunication infrastructure (71.1%), low technical know-how (62.9%) and inadequate computer knowledge (61.1%). The PPMC analysis showed an inverse relationship between constraints and utilization of SNTs at p < 0.05. It can be concluded that constraints affect the efficient and effective utilization of SNTs in the study area. It is recommended that the management of Colleges of Education and agricultural institutes provide good internet connectivity, computer facilities, and an alternative power supply in order to increase the awareness and utilization of SNTs among students.
Keywords: awareness, utilization, social network tools, constraints, students
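The PPMC test reported above is the standard Pearson product-moment correlation; a minimal sketch with SciPy (the variable names and illustrative scores are ours, not the survey data):

```python
from scipy.stats import pearsonr

# Hypothetical per-respondent indices aggregated from questionnaire items:
# a constraint score and an SNT utilization score.
constraint_scores = [3.2, 4.1, 2.8, 4.5, 3.9, 2.1, 4.8, 3.5]
utilization_scores = [2.9, 1.8, 3.4, 1.5, 2.0, 3.8, 1.2, 2.6]

r, p_value = pearsonr(constraint_scores, utilization_scores)
# A negative r with p < 0.05 would reflect the inverse relationship
# between constraints and utilization reported in the study.
print(f"r = {r:.3f}, p = {p_value:.4f}")
```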
Procedia PDF Downloads 352
1619 Improving Lane Detection for Autonomous Vehicles Using Deep Transfer Learning
Authors: Richard O’Riordan, Saritha Unnikrishnan
Abstract:
Autonomous Vehicles (AVs) are incorporating an increasing number of ADAS features, including automated lane-keeping systems. In recent years, many research papers on lane detection algorithms have been published, ranging from computer vision techniques to deep learning methods. The transition from the lower levels of autonomy defined in the SAE framework to higher autonomy levels requires increasingly complex models and algorithms that must be highly reliable in their operation and functional capacities; these algorithms have no room for error when operating at high levels of autonomy. Current research details existing computer vision and deep learning algorithms, their methodologies and individual results, but also the challenges the algorithms face, the resources needed to operate them, and the shortcomings experienced when detecting lanes in certain weather and lighting conditions. This paper explores these shortcomings and implements a lane detection algorithm that could be used to improve AV lane detection systems. It uses a pre-trained LaneNet model to detect lane and non-lane pixels via binary segmentation as the base detection method, first on the existing BDD100K dataset and then on a custom dataset generated locally. The first selected road network consists of modern, well-laid roads with up-to-date infrastructure and lane markings, while the second is an older road network whose infrastructure and lane markings reflect its age. The performance of the proposed method is evaluated on the custom dataset and compared against the BDD100K dataset. In summary, this paper uses transfer learning to provide a fast and robust lane detection algorithm that can handle various road conditions and provide accurate lane detection.
Keywords: ADAS, autonomous vehicles, deep learning, LaneNet, lane detection
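LaneNet implementations vary, and the paper's training code is not given; as an illustration of the transfer-learning recipe (pre-trained backbone, new two-class head, frozen features), here is a sketch using a torchvision DeepLabV3 model as a stand-in for LaneNet. All names and hyperparameters are our assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models.segmentation import deeplabv3_resnet50

# Start from a pre-trained segmentation backbone and replace the head
# for two classes (lane / non-lane binary segmentation).
model = deeplabv3_resnet50(weights="DEFAULT")
model.classifier[4] = nn.Conv2d(256, 2, kernel_size=1)

# Freeze the backbone so only the new head is trained at first;
# this is the core of transfer learning on a small custom dataset.
for p in model.backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, masks):
    # images: (B, 3, H, W) float tensor; masks: (B, H, W) ints in {0, 1}.
    model.train()
    optimizer.zero_grad()
    logits = model(images)["out"]  # (B, 2, H, W) per-pixel class scores
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```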
Procedia PDF Downloads 104
1618 Genetically Engineered Crops: Solution for Biotic and Abiotic Stresses in Crop Production
Authors: Deepak Loura
Abstract:
Production and productivity of several crops in the country continue to be adversely affected by biotic stresses (e.g., insect pests and diseases) and abiotic stresses (e.g., water, temperature and salinity). Over-dependence on pesticides and other chemicals is economically non-viable for the resource-poor farmers of our country, and pesticides can potentially affect human and environmental safety. While traditional breeding techniques and proper management strategies continue to play a vital role in crop improvement, we need to judiciously use biotechnology approaches for the development of genetically modified crops that address critical problems in crop improvement for sustainable agriculture. Modern biotechnology can help to increase crop production, reduce farming costs, and improve food quality and the safety of the environment. Genetic engineering is a technology that allows plant breeders to produce plants with new gene combinations through the genetic transformation of crop plants for the improvement of agronomic traits. Advances in recombinant DNA technology have made it possible to transfer genes between widely divergent species to develop genetically modified or genetically engineered plants. Plant genetic engineering provides the means to harness useful genes and alleles from indigenous microorganisms to enrich the gene pool for developing genetically modified (GM) crops with inbuilt (inherent) resistance to insect pests, diseases, and abiotic stresses. Plant biotechnology has made significant contributions in the past 20 years through the development of genetically engineered or genetically modified crops with multiple benefits. A variety of traits have been introduced in genetically engineered crops, including (i) herbicide resistance, (ii) pest resistance, (iii) viral resistance, (iv) slow ripening of fruits and vegetables, (v) fungal and bacterial resistance, (vi) abiotic stress tolerance (drought, salinity, temperature, flooding, etc.), (vii) quality improvement (starch, protein, and oil), (viii) value addition (vitamins, micro- and macro-elements), (ix) pharmaceutical and therapeutic proteins, and (x) edible vaccines. Multiple genes in transgenic crops can be useful in developing durable disease resistance and a broad insect-control spectrum and could lead to potential cost savings for farmers. The development of transgenics to produce high-value pharmaceuticals and edible vaccines is also in progress, although much more research and development work is required before commercially viable products become available. In addition, marker-assisted selection (MAS) is now routinely used to enhance the speed and precision of plant breeding. Newer technologies need to be developed and deployed for enhancing and sustaining agricultural productivity, and the use of biotechnology should be optimized in conjunction with conventional technologies to achieve higher productivity with fewer resources. Therefore, genetic modification/engineering of crop plants assumes greater importance, demanding the development and adoption of newer technology for the genetic improvement of crops to increase crop productivity.
Keywords: biotechnology, plant genetic engineering, genetically modified, biotic, abiotic, disease resistance
Procedia PDF Downloads 71
1617 Impact of PV Distributed Generation on Loop Distribution Network at Saudi Electricity Company Substation in Riyadh City
Authors: Mohammed Alruwaili
Abstract:
Nowadays, renewable energy resources are playing an important role in replacing traditional energy resources such as fossil fuels by integrating solar energy with conventional energy; concerns about the environment have led to an intensive search for renewable energy sources. The rapid growth of distributed energy resources will prompt increasing interest in integrated distribution networks in the Kingdom of Saudi Arabia over the next few years, especially after the adoption of new laws and regulations in this regard. Photovoltaic energy is one of the promising renewable energy sources that has grown rapidly worldwide in the past few years and can be used to produce electrical energy through the photovoltaic process. The main objective of this research is to study the impact of PV in distribution networks based on real data and details. In this research, a site survey and computer simulation are carried out using the well-known software ETAP to simulate the electrical distribution lines together with variable inputs such as solar radiation levels, and a field study representing the prevailing conditions in Diriyah, Riyadh region, Saudi Arabia. In addition, the impact of adding distributed generation units (DGs), including solar photovoltaics (PV), to the distribution network is studied and assessed for different power capacities. The studied network contains 78 buses. The results show less power loss in the loop distribution network than under the current condition, whose network power loss is more than 69% higher. It is hoped that this research will enhance the efficiency, performance, quality and reliability of the distribution networks in Riyadh City through improved power loss and voltage profiles. Simulation results demonstrate that the applied method can illustrate the positive impact of PV in loop distribution networks.
Keywords: renewable energy, smart grid, efficiency, distribution network
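The study's 78-bus ETAP model is not public; to illustrate the loss-comparison idea with open-source tools, here is a hedged sketch using pandapower and one of its example feeders. The network, bus number and PV rating are illustrative stand-ins, not the study's data.

```python
import pandapower as pp
import pandapower.networks as pn

# Example medium-voltage feeder as a stand-in for the studied network.
net = pn.create_cigre_network_mv(with_der=False)

pp.runpp(net)                            # power flow without PV
base_loss_mw = net.res_line.pl_mw.sum()

# Add a PV plant as a static generator at an arbitrary bus.
pp.create_sgen(net, bus=5, p_mw=2.0, q_mvar=0.0, name="PV plant")

pp.runpp(net)                            # power flow with PV
pv_loss_mw = net.res_line.pl_mw.sum()

print(f"losses without PV: {base_loss_mw:.3f} MW, with PV: {pv_loss_mw:.3f} MW")
```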
Procedia PDF Downloads 140
1616 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities: Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and the contextual meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions and operational experience during the production system life cycle. Context meaning representation is highly dependent upon word sense, lexical relativity, and the semantic features of the argument. This paper proposes a method for lexical semantic analysis and context meaning representation of maintenance activities in a mass production system. Our approach constructs a straightforward lexical semantic analysis of the semantic and syntactic features of the context structure of maintenance reports, to facilitate the translation, interpretation, and conversion of human-readable interpretations into computer-readable representations with less heterogeneity and ambiguity. The methodology enables users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage, and it provides a contextualized structure yielding a generic context model that can be utilized throughout the system life cycle. First, it employs a co-occurrence-based clustering framework to recognize groups of highly frequent contextual features that correspond to a maintenance report text. Keywords are then identified for syntactic and semantic extraction analysis. The analysis exercises causality-driven logic over the keywords' senses to divulge the structural and meaning dependency relationships between the words in a context. The output is a contextualized word representation of maintenance activities that accommodates computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
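The first step above, co-occurrence-based clustering of contextual features, starts from simple pair counts over the report corpus. A minimal sketch (the window size and the sample reports are illustrative assumptions):

```python
from collections import Counter

def cooccurrence_counts(reports, window=5):
    # Count how often two terms appear within `window` tokens of each
    # other; frequent pairs seed the contextual-feature clusters.
    counts = Counter()
    for report in reports:
        tokens = report.lower().split()
        for i, tok in enumerate(tokens):
            for other in tokens[i + 1:i + 1 + window]:
                if tok != other:
                    counts[tuple(sorted((tok, other)))] += 1
    return counts

reports = [
    "riser clamp corrosion found during visual inspection",
    "corrosion repair of riser clamp completed after inspection",
]
for pair, n in cooccurrence_counts(reports).most_common(5):
    print(pair, n)
```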
Procedia PDF Downloads 105
1615 Magnetic Solid-Phase Separation of Uranium from Aqueous Solution Using High-Capacity Diethylenetriamine-Tethered Magnetic Adsorbents
Authors: Amesh P, Suneesh A S, Venkatesan K A
Abstract:
Magnetic solid-phase extraction is a relatively new method among solid-phase extraction techniques for separating metal ions from aqueous solutions such as mine water, groundwater and contaminated wastes. However, bare magnetic particles (Fe3O4) exhibit poor selectivity due to the absence of target-specific functional groups for sequestering metal ions. The selectivity of these magnetic particles can be remarkably improved by covalently tethering task-specific ligands onto the magnetic surfaces. Magnetic particles offer a number of advantages, such as quick phase separation aided by an external magnetic field. The solid adsorbent can be prepared with particle sizes ranging from a few micrometers down to nanometers, which again offers advantages such as enhanced extraction kinetics and higher extraction capacity. Conventionally, magnetite (Fe3O4) particles are prepared by the hydrolysis and co-precipitation of ferrous and ferric salts in aqueous ammonia solution. Since covalent linking of task-specific functionalities on Fe3O4 is difficult, and Fe3O4 is also susceptible to redox reactions in the presence of acid or alkali, the Fe3O4 surface must be modified by silica coating. This coating is usually carried out by the hydrolysis and condensation of tetraethyl orthosilicate over the magnetite surface to yield a thin layer of silica-coated magnetite particles. Since silica-coated magnetite particles are amenable to further surface modification, they can be reacted with task-specific functional groups to obtain functionalized magnetic particles. The surface area exhibited by such magnetic particles usually falls in the range of 50 to 150 m2.g-1, which offers the advantage of quick phase separation compared with other solid-phase extraction systems. In addition, magnetic (Fe3O4) particles covalently linked to a mesoporous silica matrix (MCM-41) and task-specific ligands offer further advantages in terms of extraction kinetics, high stability, longer reusability, and metal extraction capacity, due to the large surface area, ample porosity and greater number of functional groups per unit area of these adsorbents. In view of this, the present paper deals with the synthesis of a uranium-specific diethylenetriamine (DETA) ligand anchored on silica-coated magnetite (Fe-DETA) as well as on magnetic mesoporous silica (MCM-Fe-DETA), and studies on the extraction of uranium from aqueous solution spiked with uranium to mimic mine water or groundwater contaminated with uranium. The synthesized solid-phase adsorbents were characterized by FT-IR, Raman, TG-DTA, XRD, and SEM. The extraction behavior of uranium on the solid phase was studied under several conditions, such as the effect of pH, the initial concentration of uranium, the rate of extraction and its variation with pH and initial uranium concentration, and the effect of interfering ions like CO3^2-, Na+, Fe^2+, Ni^2+, and Cr^3+. A maximum extraction capacity of 233 mg.g-1 was obtained for Fe-DETA, and a huge capacity of 1047 mg.g-1 for MCM-Fe-DETA. The mechanism of extraction, speciation of uranium, extraction studies, reusability, and the other results obtained in the present study suggest that Fe-DETA and MCM-Fe-DETA are potential candidates for the extraction of uranium from mine water and groundwater.
Keywords: diethylenetriamine, magnetic mesoporous silica, magnetic solid-phase extraction, uranium extraction, wastewater treatment
Procedia PDF Downloads 168
1614 Parallel Multisplitting Methods for Differential Systems
Authors: Malika El Kyal, Ahmed Machmoum
Abstract:
We prove the superlinear convergence of asynchronous multisplitting methods applied to differential equations. The study is based on the technique of nested sets, which permits us to specify the kind of convergence obtained in the asynchronous mode. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait for predetermined messages to become available: we allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms in the computer science sense are particular cases of our formulation of asynchronous ones.
Keywords: parallel methods, asynchronous mode, multisplitting, ODE
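The abstract does not reproduce the iteration itself; for orientation, the classical synchronous multisplitting scheme for a linear system Ax = b, of which the asynchronous scheme studied here is a generalization, can be written as follows (a standard textbook formulation, not quoted from the paper):

```latex
% A = M_l - N_l, l = 1, ..., L, are L splittings of A, and the
% nonnegative diagonal weighting matrices E_l satisfy \sum_l E_l = I.
\[
  x^{k+1} \;=\; \sum_{l=1}^{L} E_l \, M_l^{-1}\left( N_l \, x^{k} + b \right),
  \qquad k = 0, 1, 2, \dots
\]
% Each of the L inner solves can run on its own processor; dropping the
% requirement that all processors use the same iterate x^k gives the
% asynchronous mode analyzed in the paper.
```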
Procedia PDF Downloads 526
1613 High Performance Computing and Big Data Analytics
Authors: Branci Sarra, Branci Saadia
Abstract:
Because of rapid data growth, many computer science tools have been developed to process and analyze Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data from the standpoints of transaction processing and strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on recent trends in high-performance computing architectures, especially as they relate to analytics and data mining.
Keywords: high performance computing, HPC, big data, data analysis
Procedia PDF Downloads 520
1612 Development of Knowledge Discovery Based Interactive Decision Support System on Web Platform for Maternal and Child Health System Strengthening
Authors: Partha Saha, Uttam Kumar Banerjee
Abstract:
Maternal and Child Healthcare (MCH) has always been regarded as one of the most important issues globally. Reduction of maternal and child mortality rates and increased healthcare service coverage were declared among the targets of the Millennium Development Goals up to 2015 and thereafter as an important component of the Sustainable Development Goals. Over the last decade, worldwide MCH indicators have improved but have not matched the expected levels. The progress of maternal and child mortality rates has been monitored by several researchers, each of whom has found that fewer than 26% of low- and middle-income countries (LMICs) were on track to achieve the targets prescribed by MDG4. The average worldwide annual rates of reduction of the under-five mortality rate and the maternal mortality rate were 2.2% and 1.9% as of 2011, respectively, whereas rates of at least 4.4% and 5.5% annually were needed to achieve the targets. In spite of proven healthcare interventions for both mothers and children, these could not be scaled up to the required volume due to fragmented health systems, especially in developing and under-developed countries. In this research, a knowledge discovery based interactive Decision Support System (DSS) has been developed on a web platform to assist healthcare policy makers in developing evidence-based policies. To achieve desirable results in MCH, efficient resource planning is required, and in most LMICs resources are a major constraint. Knowledge generated through this system would help healthcare managers develop strategic resource planning for combatting issues like large inequities and low coverage in MCH. The system helps healthcare managers accomplish four tasks: a) comprehending region-wise conditions of variables related to MCH services, b) identifying relationships between variables, c) segmenting regions based on variable status, and d) finding segment-wise key influential variables which have a major impact on healthcare indicators. The system development process was divided into three phases: i) identifying contemporary issues related to MCH services and policy making, ii) development of the system, and iii) verification and validation of the system. More than 90 variables under three categories, namely a) educational, social, and economic parameters, b) MCH interventions, and c) health system building blocks, have been included in this web-based DSS, and five separate modules have been developed. The first module is designed for analysing the current healthcare scenario. The second module helps healthcare managers understand correlations among variables. The third module reveals frequently occurring incidents along with different MCH interventions. The fourth module segments regions based on the three categories mentioned above, and in the fifth module, segment-wise key influential interventions are identified. India has been considered as the case study area in this research, and data from 601 districts of India have been used to inspect the effectiveness of the developed modules. The system has been developed by implementing different statistical and data mining techniques on a web platform. Aided by its interactive capability, policy makers can generate different scenarios from the system before drawing any inference.
Keywords: maternal and child healthcare, decision support systems, data mining techniques, low and middle income countries
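The paper does not name the algorithm behind its segmentation module; a common choice for this kind of region segmentation is k-means on standardized indicators. A hedged sketch with scikit-learn, with hypothetical column names and toy values:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical district-level table; the real system covers 90+ variables
# across three categories for 601 districts.
districts = pd.DataFrame({
    "female_literacy":         [62.1, 45.3, 78.9, 51.2, 69.4],
    "anc_coverage":            [71.0, 48.5, 88.2, 55.7, 74.3],  # MCH intervention
    "health_workers_per_1000": [0.8, 0.4, 1.3, 0.5, 0.9],       # system building block
})

# Standardize, then segment districts into k groups (the fourth module).
X = StandardScaler().fit_transform(districts)
districts["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(districts)
```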
Procedia PDF Downloads 258
1611 Scheduling Residential Daily Energy Consumption Using Bi-Criteria Optimization Methods
Authors: Li-hsing Shih, Tzu-hsun Yen
Abstract:
Because of the long-term commitment to net-zero carbon emissions, utility companies include more renewable energy supply, which generates electricity under time and weather restrictions. This leads to time-of-use electricity pricing that reflects the actual cost of energy supply. From an end-user point of view, better residential energy management is needed to incorporate the time-of-use prices and assist end users in scheduling their daily use of electricity. This study uses bi-criteria optimization methods to schedule daily energy consumption by minimizing the electricity cost and maximizing the comfort of end users. Unlike most previous research, this study schedules users' activities rather than household appliances, to obtain better measures of users' comfort/satisfaction; the relation between each activity and the use of different appliances can be defined by users. The comfort level is at its highest when the time and duration of an activity completely meet the user's expectation, and it decreases when they do not. A questionnaire survey was conducted to collect data for establishing regression models that describe users' comfort levels when the execution time and duration of activities differ from user expectations. Six regression models representing the comfort levels for six types of activities were established using the responses to the questionnaire survey. A computer program was developed to evaluate the electricity cost and the comfort level for each feasible schedule and then find the non-dominated schedules. The epsilon-constraint method is used to find the optimal schedule among the non-dominated schedules. A hypothetical case is presented to demonstrate the effectiveness of the proposed approach and the computer program: using the program, users can obtain the optimal schedule of daily energy consumption by inputting the intended time and duration of activities and the given time-of-use electricity prices.
Keywords: bi-criteria optimization, energy consumption, time-of-use price, scheduling
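The epsilon-constraint step can be sketched directly: keep comfort as the objective and turn cost into a constraint, cost <= epsilon, then sweep epsilon over the cost range to trace the non-dominated schedules. The candidate values below are illustrative, not the paper's case data.

```python
def epsilon_constraint_schedule(schedules, epsilon):
    # schedules: list of (cost, comfort) pairs, one per feasible daily
    # schedule as evaluated by the scheduling program.
    feasible = [s for s in schedules if s[0] <= epsilon]
    return max(feasible, key=lambda s: s[1]) if feasible else None

# Illustrative values (cost in currency units, comfort on a 0-100 scale).
candidates = [(12.4, 88.0), (9.8, 81.5), (8.1, 70.2), (14.0, 91.3)]

# Sweeping epsilon traces out the non-dominated (Pareto) schedules.
for eps in (9.0, 10.0, 13.0, 15.0):
    print(eps, epsilon_constraint_schedule(candidates, eps))
```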
Procedia PDF Downloads 60
1610 Image Based Landing Solutions for Large Passenger Aircraft
Authors: Thierry Sammour Sawaya, Heikki Deschacht
Abstract:
In commercial aircraft operations, almost half of all accidents happen during the approach or landing phases. Automatic guidance and automatic landing have proven to bring significant added safety value to this challenging phase of flight. This is why Airbus and ScioTeq have decided to work together to explore the capability of image-based landing solutions as additional landing aids, to further expand the possibility of performing automatic approach and landing on runways where the current guidance systems are either not fitted or not optimal. Current systems for automated landing often depend on radio signals provided by ground infrastructure at the airport or by satellite coverage, and these radio signals may not always be available with the integrity and performance required for safe automatic landing. Being independent of these radio signals would widen the operational possibilities and increase the number of automated landings. Airbus and ScioTeq are joining their expertise in the field of computer vision in the European programme Clean Sky 2 Large Passenger Aircraft, in which they are leading the IMBALS (IMage BAsed Landing Solutions) project. The ultimate goal of this project is to demonstrate, develop, validate and verify a certifiable automatic landing system that guides an airplane during the approach and landing phases based on an onboard camera system capturing images, enabling automatic landing independent of radio signals and without precision landing instruments. Within this project, ScioTeq is responsible for the development of the Image Processing Platform (IPP), while Airbus is responsible for defining the functional and system requirements as well as testing and integrating the developed equipment in an environment representative of a large passenger aircraft. The aim of this paper is to describe the system as well as the associated methods and tools developed for validation and verification.
Keywords: aircraft landing system, aircraft safety, autoland, avionic system, computer vision, image processing
Procedia PDF Downloads 101
1609 Protected Cultivation of Horticultural Crops: Increases Productivity per Unit of Area and Time
Authors: Deepak Loura
Abstract:
Protected cultivation, or greenhouse cultivation, is the most modern method of producing horticultural crops both qualitatively and quantitatively, and it has gained widespread acceptance in recent decades. Protected farming, commonly referred to as controlled environment agriculture (CEA), is extremely productive, land- and water-efficient, and environmentally friendly. The technology entails growing horticultural crops in a controlled environment where variables such as temperature, humidity, light, soil, water, and fertilizer are adjusted to achieve optimal output and enable a consistent supply even during the off-season. Over the past ten years, protected cultivation of high-value crops and cut flowers has demonstrated remarkable potential. More and more agricultural and horticultural crop production systems are moving to protected environments as a result of the growing demand for high-quality products in global markets. Covering the crop makes it possible to control the macro- and microenvironment, enhancing plant performance and allowing longer production periods, earlier harvests, and higher yields of better quality; the covering also protects the plants from wind, rain, and insects. Protected farming opens up hitherto unexplored opportunities in agriculture as the liberalised economy and improved agricultural technologies advance. Typically, the revenues from fruit, vegetable, and flower crops are 4 to 8 times higher than those from other crops, and this profit can be multiplied if these high-value crops are cultivated in protected environments such as greenhouses, net houses, and tunnels. Post-harvest losses of vegetables and cut flowers are extremely high (20–0%); however, sheltered growing techniques and year-round cropping can greatly reduce these losses and increase yield by 5 to 10 times. Seasonality and weather strongly affect the production of vegetables and flowers, leading to significant fluctuations in their price and quality. A significant challenge for applying current crop-production technology is balancing year-round availability of vegetables and flowers with minimal environmental impact while remaining competitive. Protected cultivation is likely to shape the future of agriculture, since population growth is steadily reducing the land available per holding, and it is a particularly profitable endeavour for small landholdings: farmers can build small greenhouses, net houses, nurseries, and low-tunnel greenhouses to increase their income. The rise in biotic and abiotic stress factors further favours protected agriculture. Because of the higher productivity levels, these technologies are opening up opportunities not only for producers with larger landholdings but also for those with smaller holdings. Protected cultivation can be thought of as a precise, forward-looking form of agriculture that covers almost all aspects of farming, though each application must be assessed for technical suitability, farmer economics, and market economics.
Keywords: protected cultivation, horticulture, greenhouse, vegetable, controlled environment agriculture
Procedia PDF Downloads 76
1608 Seashore Debris Detection System Using Deep Learning and Histogram of Gradients-Extractor Based Instance Segmentation Model
Authors: Anshika Kankane, Dongshik Kang
Abstract:
Marine debris has a significant impact on coastal environments, harming biodiversity and causing losses to the marine and ocean sectors. A functional, cost-effective, and automatic approach is needed to address this problem. Computer vision combined with a deep learning-based model is proposed to identify and categorize seven kinds of marine debris at different beach locations in Japan. This research compares state-of-the-art deep learning models with a suggested architecture that is utilized as a feature extractor for debris categorization. The proposed model detects seven categories of litter on a manually constructed debris dataset, using Mask R-CNN for instance segmentation and a shape-matching network called HOGShape; clean-up organizations can then remove the detected debris promptly, prompted by the system's warning notifications. The dataset was created by annotating images taken by a fixed KaKaXi camera with seven category labels using the CVAT annotation tool. A HOG feature extractor pre-trained with LIBSVM is used, together with multiple-template matching between the HOG maps of images and the HOG maps of templates, to improve the predicted mask images obtained from Mask R-CNN training. The system is intended to alert clean-up organizations in a timely manner through warning notifications based on live recorded beach debris data. The suggested network improves previously misclassified masks of debris objects across different illuminations, shapes, viewpoints, and occlusions with vague visibility.
Keywords: computer vision, debris, deep learning, fixed live camera images, histogram of gradients feature extractor, instance segmentation, manually annotated dataset, multiple template matching
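A minimal sketch of the HOG template-comparison idea follows: a detected region is scored against per-category template patches by the similarity of their HOG descriptors. The window size, HOG parameters, and the use of cosine similarity (instead of the paper's multiple-template matching directly on HOG maps) are assumptions for illustration.

```python
import cv2
import numpy as np

# HOG descriptor over a fixed 64x64 window (all parameters are assumptions):
# window, block, block stride, cell, number of orientation bins.
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

def hog_vector(patch_bgr):
    """Resize a candidate patch to the HOG window and return its descriptor."""
    gray = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (64, 64))
    return hog.compute(gray).ravel()

def best_template_match(patch_bgr, templates):
    """Score a detected region against per-category template patches by the
    cosine similarity of HOG descriptors; return (category, similarity)."""
    v = hog_vector(patch_bgr)
    scores = {}
    for category, tmpl_patches in templates.items():
        sims = [float(v @ (w := hog_vector(t)) /
                      (np.linalg.norm(v) * np.linalg.norm(w) + 1e-9))
                for t in tmpl_patches]   # multiple templates per category
        scores[category] = max(sims)
    return max(scores.items(), key=lambda kv: kv[1])

# Usage sketch: 'region' would be a crop produced by the Mask R-CNN stage;
# 'templates' maps the seven litter categories to lists of example patches.
# category, similarity = best_template_match(region, templates)
```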
Procedia PDF Downloads 106
1607 Reinforcement Learning For Agile CNC Manufacturing: Optimizing Configurations And Sequencing
Authors: Huan Ting Liao
Abstract:
In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.
Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning
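The following minimal tabular Q-learning sketch illustrates the approach on a toy job-shop instance: the state is the vector of next-operation indices per job, an action dispatches the next operation of one job, and the reward penalizes growth of the partial makespan. The instance, reward shaping, and hyperparameters are assumptions, not the paper's configuration.

```python
import random
from collections import defaultdict

# Toy job-shop instance: each job is an ordered list of (machine, time)
# operations. Sizes and processing times are assumed for illustration.
JOBS = [
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3), (0, 1)],
]
N_MACHINES = 3

def episode(Q, eps=0.1, alpha=0.1, gamma=0.95):
    """Run one epsilon-greedy training episode; return the makespan reached."""
    machine_free = [0] * N_MACHINES     # time each machine becomes free
    job_free = [0] * len(JOBS)          # time each job's last operation ends
    next_op = [0] * len(JOBS)           # index of each job's next operation
    while True:
        actions = [j for j in range(len(JOBS)) if next_op[j] < len(JOBS[j])]
        if not actions:
            return max(job_free)        # all operations scheduled: makespan
        state = tuple(next_op)
        if random.random() < eps:       # explore
            a = random.choice(actions)
        else:                           # exploit current Q estimates
            a = max(actions, key=lambda j: Q[(state, j)])
        m, t = JOBS[a][next_op[a]]
        start = max(machine_free[m], job_free[a])
        before = max(max(machine_free), max(job_free))
        machine_free[m] = job_free[a] = start + t
        next_op[a] += 1
        after = max(max(machine_free), max(job_free))
        reward = -(after - before)      # penalize partial-makespan growth
        nstate = tuple(next_op)
        nactions = [j for j in range(len(JOBS)) if next_op[j] < len(JOBS[j])]
        future = max((Q[(nstate, j)] for j in nactions), default=0.0)
        Q[(state, a)] += alpha * (reward + gamma * future - Q[(state, a)])

Q = defaultdict(float)
makespans = [episode(Q) for _ in range(2000)]
print("best makespan found:", min(makespans))
```

The same state-action-reward shape extends to robotic-arm transfer times and machine states by enlarging the state tuple, at the cost of a larger Q-table.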
Procedia PDF Downloads 24
1606 Ethical Issues around Online Marketing to Children
Authors: Chris Preston
Abstract:
As we devise ever more sophisticated methods of online marketing, building systems that reach into the everyday lives of consumers, we are confronted by a generation of children who face unprecedented intervention by commercial organisations in their young minds via electronic devices. Whether by computer, tablet, or phone, such children have somehow been reduced to the status of their devices, with little regard for their well-being as individuals. This discussion paper seeks to draw attention to such practices and questions the ethics of digital marketing methods.
Keywords: online marketing to children, online research of children, online targeting of children, consumer rights, ethics
Procedia PDF Downloads 392
1605 Resisting Adversarial Assaults: A Model-Agnostic Autoencoder Solution
Authors: Massimo Miccoli, Luca Marangoni, Alberto Aniello Scaringi, Alessandro Marceddu, Alessandro Amicone
Abstract:
The susceptibility of deep neural networks (DNNs) to adversarial manipulation is a recognized challenge within the computer vision domain. Adversarial examples, crafted by adding subtle yet malicious alterations to benign images, exploit this vulnerability. Various defense strategies, stemming from diverse research hypotheses, have been proposed to safeguard DNNs against such attacks. Building upon prior work, our approach utilizes autoencoder models. Autoencoders are neural networks trained to learn representations of the training data and to reconstruct inputs from these representations, typically by minimizing a reconstruction error such as the mean squared error (MSE). Our autoencoder was trained on a dataset of benign examples, learning features specific to them; consequently, when presented with significantly perturbed adversarial examples, it exhibited high reconstruction errors. The architecture of the autoencoder was tailored to the dimensions of the images under evaluation: we considered various image sizes and constructed separate models for 256x256 and 512x512 images. Moreover, the choice of computer vision model is crucial, as most adversarial attacks are designed with specific AI architectures in mind. To mitigate this, we propose replacing image-specific dimensions with a structure independent of both image dimensions and neural network models, thereby enhancing robustness. Our multi-modal autoencoder reconstructs the spectral representation of images across the red-green-blue (RGB) color channels. To validate our approach, we conducted experiments on diverse datasets and subjected them to adversarial attacks against models such as ResNet50 and ViT_L_16 from the torchvision library. The features extracted by the autoencoder were used in a classification model, resulting in an MSE (RGB) of 0.014, a classification accuracy of 97.33%, and a precision of 99%.
Keywords: adversarial attacks, malicious images detector, binary classifier, multimodal transformer autoencoder
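A minimal sketch of the reconstruction-error idea follows: an autoencoder trained only on benign images reconstructs them well, so inputs with unusually high reconstruction error can be flagged as potentially adversarial. The architecture, image size, and percentile threshold below are assumptions; the paper's multi-modal spectral autoencoder is more elaborate.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Small convolutional autoencoder; the layer sizes are assumptions, not
    the multi-modal spectral architecture described in the abstract."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 3, stride=2, padding=1, output_padding=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def reconstruction_error(model, batch):
    """Per-image MSE between each input and its reconstruction."""
    with torch.no_grad():
        recon = model(batch)
    return ((batch - recon) ** 2).flatten(1).mean(dim=1)

# In practice the autoencoder would first be trained on benign images only;
# here an untrained model stands in so the sketch runs end to end.
model = ConvAutoencoder().eval()
benign = torch.rand(16, 3, 256, 256)    # stand-in for benign calibration images
threshold = reconstruction_error(model, benign).quantile(0.99)
incoming = torch.rand(4, 3, 256, 256)   # stand-in for inputs to screen
is_adversarial = reconstruction_error(model, incoming) > threshold
print(is_adversarial)
```

Because the detector only compares inputs against their own reconstructions, it does not depend on the downstream classifier, which is what makes this family of defenses model-agnostic.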
Procedia PDF Downloads 112