Search results for: explanations for the probable causes of the errors
1039 Resisting Adversarial Assaults: A Model-Agnostic Autoencoder Solution
Authors: Massimo Miccoli, Luca Marangoni, Alberto Aniello Scaringi, Alessandro Marceddu, Alessandro Amicone
Abstract:
The susceptibility of deep neural networks (DNNs) to adversarial manipulations is a recognized challenge within the computer vision domain. Adversarial examples, crafted by adding subtle yet malicious alterations to benign images, exploit this vulnerability. Various defense strategies have been proposed to safeguard DNNs against such attacks, stemming from diverse research hypotheses. Building upon prior work, our approach involves the utilization of autoencoder models. Autoencoders, a type of neural network, are trained to learn representations of training data and reconstruct inputs from these representations, typically minimizing reconstruction errors like mean squared error (MSE). Our autoencoder was trained on a dataset of benign examples, learning features specific to them. Consequently, when presented with significantly perturbed adversarial examples, the autoencoder exhibited high reconstruction errors. The architecture of the autoencoder was tailored to the dimensions of the images under evaluation. We considered various image sizes, constructing models differently for 256x256 and 512x512 images. Moreover, the choice of the computer vision model is crucial, as most adversarial attacks are designed with specific AI structures in mind. To mitigate this, we proposed a method to replace image-specific dimensions with a structure independent of both dimensions and neural network models, thereby enhancing robustness. Our multi-modal autoencoder reconstructs the spectral representation of images across the red-green-blue (RGB) color channels. To validate our approach, we conducted experiments using diverse datasets and subjected them to adversarial attacks using models such as ResNet50 and ViT_L_16 from the torchvision library.
The autoencoder extracted features used in a classification model, resulting in an MSE (RGB) of 0.014, a classification accuracy of 97.33%, and a precision of 99%.
Keywords: adversarial attacks, malicious images detector, binary classifier, multimodal transformer autoencoder
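The detection principle this abstract describes — train an autoencoder on benign data only, then flag inputs whose reconstruction error is unusually high — can be sketched with a minimal linear autoencoder (PCA used here as a stand-in for the paper's multi-modal transformer autoencoder; all data is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)

# Benign "images": samples lying near a low-dimensional subspace.
benign = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 16))

# Fit a linear autoencoder (PCA): encoder = top-k principal directions.
mean = benign.mean(axis=0)
_, _, vt = np.linalg.svd(benign - mean, full_matrices=False)
components = vt[:2]                      # shared encoder/decoder weights

def reconstruction_mse(x):
    """Encode then decode x; return per-sample mean squared error."""
    code = (x - mean) @ components.T
    recon = code @ components + mean
    return ((x - recon) ** 2).mean(axis=1)

# Benign inputs reconstruct well; heavily perturbed ones do not.
clean_err = reconstruction_mse(benign).mean()
adversarial = benign + rng.normal(scale=2.0, size=benign.shape)
adv_err = reconstruction_mse(adversarial).mean()

threshold = reconstruction_mse(benign).max()  # simple calibration rule
print(clean_err < adv_err)  # perturbed inputs show higher error
```

The same thresholding idea carries over to a deep autoencoder: only the encoder/decoder family changes, not the decision rule.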
Procedia PDF Downloads 111
1038 A Modelling Analysis of Monetary Policy Rule
Authors: Wael Bakhit, Salma Bakhit
Abstract:
This paper employs a quarterly time series to determine the timing of structural breaks for interest rates in the USA over the last 60 years. The Chow test is used to investigate non-stationarity, where the date of the potential break is assumed to be known. Moreover, an empirical examination of the financial sector was made to check whether it is positively related to deviations from an assumed interest rate as given in a standard Taylor rule. The empirical analysis is strengthened by analysing the rule from a historical perspective and examining the effect of the central bank's interest rate setting on financial imbalances. The empirical evidence indicates that deviation in monetary policy is a potential causal factor in the build-up of financial imbalances and the subsequent crisis, where macroprudential intervention could have a beneficial effect. Thus, our findings tend to support the view that central bank policy has been a probable source of the global financial crisis of the past decade.
Keywords: Taylor rule, financial imbalances, central banks, econometrics
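For readers unfamiliar with the Chow test used here, a minimal sketch of its F-statistic for a known break date follows (synthetic quarterly data, not the paper's series):

```python
import numpy as np

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def chow_test(y, X, break_idx):
    """Chow F-statistic for a structural break at a known index."""
    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_1 = rss(y[:break_idx], X[:break_idx])
    rss_2 = rss(y[break_idx:], X[break_idx:])
    num = (rss_pooled - rss_1 - rss_2) / k
    den = (rss_1 + rss_2) / (n - 2 * k)
    return num / den

# Synthetic quarterly interest-rate series with a regime shift.
rng = np.random.default_rng(1)
t = np.arange(120, dtype=float)
X = np.column_stack([np.ones_like(t), t])
y = 5.0 + 0.01 * t + rng.normal(scale=0.1, size=120)
y[60:] -= 3.0                     # structural break at observation 60

print(chow_test(y, X, 60))        # large F => reject parameter stability
```

The statistic is compared against an F(k, n-2k) critical value; a large value rejects the hypothesis that the same coefficients hold in both subsamples.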
Procedia PDF Downloads 385
1037 Evaluation of Correct Usage, Comfort and Fit of Personal Protective Equipment in Construction Work
Authors: Anna-Lisa Osvalder, Jonas Borell
Abstract:
There are several reasons behind the use, non-use, or inadequate use of personal protective equipment (PPE) in the construction industry. Comfort and accurate size support proper use, while discomfort, misfit, and difficulty understanding how the PPE should be handled inhibit correct usage. The need to wear several pieces of protective equipment simultaneously may also create problems. The purpose of this study was to analyse the correct usage, comfort, and fit of different types of PPE used in construction work. Correct usage was analysed as guessability, i.e., human perceptions of how to don, adjust, use, and doff the equipment, and whether it is used as intended. The PPE tested individually or in combination comprised a helmet, ear protectors, goggles, respiratory masks, gloves, protective clothing, and safety harnesses. First, an analytical evaluation was performed with ECW (enhanced cognitive walkthrough) and PUEA (predictive use error analysis) to search for usability problems and use errors during handling and use. Then usability tests were conducted to evaluate guessability, comfort, and fit with 10 test subjects of different heights and body constitutions. The tests included observations during donning, five different outdoor work tasks, and doffing. The think-aloud method, short interviews, and subjective estimations were used. The analytical evaluation showed that some usability problems and use errors arise during donning and doffing, but with minor severity, mostly causing discomfort. A few use errors and usability problems arose for the safety harness, especially for novices, some of which could lead to a high risk of severe incidents. The usability tests showed that discomfort arose for all test subjects when using a combination of PPE, increasing over time. For instance, goggles together with the face mask caused pressure, chafing at the nose, and heat rash on the face. This combination also limited the field of vision.
The helmet, in combination with the goggles and ear protectors, did not fit well and caused uncomfortable pressure at the temples. No major problems were found with the individual fit of the PPE. The ear protectors, goggles, and face masks could be adjusted for different head sizes. The guessability of how to don and wear the combination of PPE was moderate, but it took some time to adjust the items for a good fit. The guessability was poor for the safety harness; few clues in the design showed how it should be donned, adjusted, or worn on the skeletal bones. Discomfort occurred when the straps were tightened too much. Not all straps could be adjusted for some body constitutions, leading to non-optimal safety. To conclude, if several types of PPE are used together, discomfort leading to pain is likely to occur over time, which can lead to misuse, non-use, or reduced performance. If people who are not regular users are to wear a safety harness correctly, the design needs to be improved for easier interpretation, correct positioning of the straps, and increased possibilities for individual adjustment. The results from this study can serve as a basis for re-design ideas for PPE, especially when items are to be used in combination.
Keywords: construction work, PPE, personal protective equipment, misuse, guessability, usability
Procedia PDF Downloads 85
1036 An Approach to Solving Some Inverse Problems for Parabolic Equations
Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova
Abstract:
Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the additional information available depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the existence of errors in the test data. To determine reservoir properties, some inverse problems for parabolic equations were investigated. An approach to solving the inverse problems based on the method of regularization is proposed.
Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties
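The regularization idea invoked for such ill-posed inverse problems can be illustrated with Tikhonov regularization on a synthetic smoothing operator (a stand-in for the parabolic forward model; the kernel, noise level and regularization weight below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ill-conditioned forward operator: a row-normalized Gaussian smoothing
# kernel, mimicking the damping behaviour of a parabolic equation.
n = 50
x_grid = np.linspace(0, 1, n)
A = np.exp(-30.0 * (x_grid[:, None] - x_grid[None, :]) ** 2)
A /= A.sum(axis=1, keepdims=True)

true = np.sin(2 * np.pi * x_grid)                   # unknown property profile
data = A @ true + rng.normal(scale=1e-3, size=n)    # noisy observations

def tikhonov(A, b, lam):
    """Solve min ||Ax - b||^2 + lam * ||x||^2 via the normal equations."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ b)

naive, *_ = np.linalg.lstsq(A, data, rcond=None)    # unregularized: noise blows up
regularized = tikhonov(A, data, lam=1e-4)

err_naive = np.linalg.norm(naive - true)
err_reg = np.linalg.norm(regularized - true)
print(err_reg < err_naive)
```

The regularization weight trades bias against noise amplification; in practice it is chosen by a discrepancy principle or cross-validation against the measured test data.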
Procedia PDF Downloads 426
1035 Proof of Concept Design and Development of a Computer-Aided Medical Evaluation of Symptoms Web App: An Expert System for Medical Diagnosis in General Practice
Authors: Ananda Perera
Abstract:
Computer-Assisted Medical Evaluation of Symptoms (CAMEOS) is a medical expert system designed to help general practitioners (GPs) make an accurate diagnosis. CAMEOS comprises a knowledge base, user input, an inference engine, a reasoning module, and an output statement. The knowledge base was developed by the author. User input is an HTML file. The physician user collects data in the consultation. Data is sent to the inference engine on the server. CAMEOS uses set theory to simulate diagnostic reasoning. The program output is a list of differential diagnoses, the most probable diagnosis, and the diagnostic reasoning.
Keywords: CDSS, computerized decision support systems, expert systems, general practice, diagnosis, diagnostic systems, primary care diagnostic system, artificial intelligence in medicine
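The set-theoretic diagnostic reasoning described can be sketched as ranking diagnoses by the intersection of the patient's findings with each disease's characteristic symptom set (the tiny knowledge base below is an invented illustration, not CAMEOS's actual content):

```python
# Toy knowledge base: each diagnosis maps to its characteristic symptom set.
KNOWLEDGE_BASE = {
    "influenza": {"fever", "cough", "myalgia", "fatigue"},
    "strep throat": {"fever", "sore throat", "swollen lymph nodes"},
    "allergic rhinitis": {"sneezing", "runny nose", "itchy eyes"},
}

def differential_diagnosis(findings):
    """Rank diagnoses by overlap between the findings and each symptom set."""
    scored = []
    for disease, symptoms in KNOWLEDGE_BASE.items():
        overlap = findings & symptoms          # set intersection
        score = len(overlap) / len(symptoms)   # fraction of the set matched
        scored.append((score, disease, sorted(overlap)))
    scored.sort(reverse=True)
    return scored

patient = {"fever", "cough", "fatigue"}
ranked = differential_diagnosis(patient)
most_probable = ranked[0][1]
print(most_probable)   # influenza matches 3 of 4 characteristic symptoms
```

The full ranked list plays the role of the differential diagnosis, the top entry the most probable diagnosis, and the matched symptoms the diagnostic reasoning.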
Procedia PDF Downloads 154
1034 Consideration of Magnetic Lines of Force as Magnets Produced by Percussion Waves
Authors: Angel Pérez Sánchez
Abstract:
Background: Considering magnetic lines of force as a vector magnetic current was introduced by convention around 1830. But this leads to a dead end in traditional physics, and quantum explanations must be invoked to explain the magnetic phenomenon. However, a study of magnetic lines as percussive waves leads to other paths capable of interpreting magnetism through traditional physics. Methodology: the building block used in the experiment is the fact that two parallel electric current cables attract each other if their currents flow in the same direction, applied at a microscopic level inside magnets. Significance: Considering magnetic lines as magnets themselves would mean a paradigm shift in the study of magnetism and open the way to providing solutions to mysteries of magnetism until now only revealed by quantum mechanics. Major findings: discovering how a magnetic field is created, reasoning out how magnetic attraction and repulsion work, understanding how magnets behave when they are split, and revealing the impossibility of a magnetic monopole. All of this is presented as if it were a symphony in which all the notes fit together perfectly to create a beautiful, smart, and simple work.
Keywords: magnetic lines of force, magnetic field, magnetic attraction and repulsion, magnet split, magnetic monopole, magnetic lines of force as magnets, magnetic lines of force as waves
Procedia PDF Downloads 88
1033 Investigating Links in Achievement and Deprivation (ILiAD): A Case Study Approach to Community Differences
Authors: Ruth Leitch, Joanne Hughes
Abstract:
This paper presents the findings of a three-year government-funded study (ILiAD) that aimed to understand the reasons for differential educational achievement within and between socially and economically deprived areas in Northern Ireland. Previous international studies have concluded that there is a positive correlation between deprivation and underachievement. Our preliminary secondary data analysis suggested that the factors involved in educational achievement within multiple deprived areas may be more complex than this, with some areas of high multiple deprivation having high levels of student attainment, whereas other less deprived areas demonstrated much lower levels of student attainment, as measured by outcomes on high stakes national tests. The study proposed that no single explanation or disparate set of explanations could easily account for the linkage between levels of deprivation and patterns of educational achievement. Using a social capital perspective that centralizes the connections within and between individuals and social networks in a community as a valuable resource for educational achievement, the ILiAD study involved a multi-level case study analysis of seven community sites in Northern Ireland, selected on the basis of religious composition (housing areas are largely segregated by religious affiliation), measures of multiple deprivation and differentials in educational achievement. The case study approach involved three (interconnecting) levels of qualitative data collection and analysis - what we have termed Micro (or community/grassroots level) understandings, Meso (or school level) explanations and Macro (or policy/structural) factors. The analysis combines a statistical mapping of factors with qualitative, in-depth data interpretation which, together, allow for deeper understandings of the dynamics and contributory factors within and between the case study sites. 
Thematic analysis of the qualitative data reveals cross-cutting factors (e.g. demographic shifts and loss of community, the place of the school in the community, parental capacity), while analytic case studies of the explanatory factors associated with each of the community sites also permit a comparative element. Issues arising from the qualitative analysis are classified either as drivers or as inhibitors of educational achievement within and between communities. Key issues emerging as inhibitors/drivers of attainment include: the legacy of the community conflict in Northern Ireland, not least in terms of inter-generational stress associated with substance abuse and mental health issues; differing discourses on notions of ‘community’ and ‘achievement’ within/between community sites; inter-agency and intra-agency levels of collaboration and joined-up working; the relationship within the home/school/community triad; and school leadership and school ethos. At this stage, the balance of these factors can be conceptualized in terms of bonding social capital (or the lack of it) within families, within schools, within each community and within agencies, and also bridging social capital between the home/school/community, between different communities and between key statutory and voluntary organisations. The presentation will outline the study rationale and its methodology, present some cross-cutting findings and use an illustrative case study of the findings from a community site to underscore the importance of attending to community differences when trying to engage in research to understand and improve educational attainment for all.
Keywords: educational achievement, multiple deprivation, community case studies, social capital
Procedia PDF Downloads 387
1032 Effects of Machining Parameters on the Surface Roughness and Vibration of the Milling Tool
Authors: Yung C. Lin, Kung D. Wu, Wei C. Shih, Jui P. Hung
Abstract:
High-speed and high-precision machining have become the most important technologies in the manufacturing industry. The surface roughness of high-precision components is regarded as an important characteristic of product quality. However, machining chatter can damage the machined surface and restrict process efficiency. Therefore, selection of appropriate cutting conditions is important to prevent the occurrence of chatter. In addition, vibration of the spindle tool also affects the surface quality, which implies that surface precision can be controlled by monitoring the vibration of the spindle tool. Based on this concept, this study aimed to investigate the influence of the machining conditions on the surface roughness and the vibration of the spindle tool. To this end, a series of machining tests was conducted on aluminum alloy. In the tests, the vibration of the spindle tool was measured using acceleration sensors. The surface roughness of the machined parts was examined using a white light interferometer. The response surface methodology (RSM) was employed to establish mathematical models for predicting surface finish and tool vibration, respectively. The correlation between the surface roughness and spindle tool vibration was also analyzed by ANOVA. According to the machining tests, machined surfaces with or without chatter were marked on the lobe diagram as verification of the machining conditions. Using multivariable regression analysis, the mathematical models for predicting the surface roughness and tool vibrations were developed based on the machining parameters: cutting depth (a), feed rate (f) and spindle speed (s). The predicted roughness agrees well with the measured roughness, with an average error of 10%. The average error between the measured tool vibrations and the predictions of the mathematical model is about 7.39%.
In addition, tool vibration under various machining conditions was found to correlate positively with surface roughness (r=0.78). In conclusion, mathematical models were successfully developed for predicting the surface roughness and the vibration level of the spindle tool under different cutting conditions, which can help select appropriate cutting parameters and monitor machining conditions to achieve high surface quality in milling operations.
Keywords: machining parameters, machining stability, regression analysis, surface roughness
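The kind of second-order response-surface model described — roughness as a quadratic function of cutting depth, feed rate and spindle speed, fitted by multivariable least squares — can be sketched on synthetic data (the coefficients and ranges below are invented, not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic milling experiments: cutting depth a (mm), feed f (mm/rev),
# spindle speed s (krpm); roughness follows an assumed quadratic surface.
n = 60
a = rng.uniform(0.2, 2.0, n)
f = rng.uniform(0.05, 0.3, n)
s = rng.uniform(4.0, 12.0, n)
roughness = (0.8 + 0.5 * a + 3.0 * f - 0.05 * s + 0.4 * a * f
             + 0.02 * a ** 2 + rng.normal(scale=0.02, size=n))

# Second-order RSM design matrix: intercept, linear, interaction, square terms.
X = np.column_stack([np.ones(n), a, f, s, a * f, a * s, f * s,
                     a ** 2, f ** 2, s ** 2])
beta, *_ = np.linalg.lstsq(X, roughness, rcond=None)
predicted = X @ beta

# Average percentage error between "measured" and predicted roughness.
ape = 100 * np.abs((roughness - predicted) / roughness).mean()
print(round(ape, 2))
```

The same design matrix serves the tool-vibration response; in RSM practice the fitted coefficients are then screened for significance (e.g. by ANOVA) before the model is used for parameter selection.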
Procedia PDF Downloads 230
1031 Optimizing Stormwater Sampling Design for Estimation of Pollutant Loads
Authors: Raja Umer Sajjad, Chang Hee Lee
Abstract:
Stormwater runoff is the leading contributor to pollution of receiving waters. In response, an efficient stormwater monitoring program is required to quantify and eventually reduce stormwater pollution. The overall goals of stormwater monitoring programs primarily include the identification of high-risk dischargers and the development of total maximum daily loads (TMDLs). The challenge in developing a better monitoring program is to reduce the variability in flux estimates due to sampling errors; the success of a monitoring program depends mainly on the accuracy of the estimates. Apart from sampling errors, manpower and budgetary constraints also influence the quality of the estimates. This study attempted to develop an optimum stormwater monitoring design considering both cost and the quality of the estimated pollutant flux. Three years of stormwater monitoring data (2012-2014) from a mixed land-use site located within the Geumhak watershed, South Korea, were evaluated. The regional climate is humid, and precipitation is usually well distributed through the year. The investigation of a large number of water quality parameters is time-consuming and resource-intensive. In order to identify a suite of easy-to-measure parameters to act as surrogates, principal component analysis (PCA) was applied. Means, standard deviations, coefficients of variation (CV) and other simple statistics were computed using the multivariate statistical analysis software SPSS 22.0. The implication of sampling time on monitoring results, the number of samples required during a storm event and the impact of seasonal first flush were also identified. Based on the observations derived from the PCA biplot and the correlation matrix, total suspended solids (TSS) was identified as a potential surrogate for turbidity, total phosphorus and heavy metals such as lead, chromium, and copper, whereas chemical oxygen demand (COD) was identified as a surrogate for organic matter.
The CVs among the monitored water quality parameters were high (ranging from 3.8 to 15.5), suggesting that using a grab sampling design to estimate mass emission rates in the study area can lead to errors due to large variability. The TSS discharge load calculation error was only 2% between two different sample size approaches, i.e., 17 samples per storm event and 6 equally distributed samples per storm event. Both seasonal first flush and event first flush phenomena were observed for most water quality parameters in the study area. Samples taken at the initial stage of a storm event generally overestimate the mass emissions; however, it was found that collecting a grab sample after the initial hour of the storm event more closely approximates the mean concentration of the event. It was concluded that site- and regional-climate-specific interventions can be made to optimize the stormwater monitoring program in order to make it more effective and economical.
Keywords: first flush, pollutant load, stormwater monitoring, surrogate parameters
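The surrogate-selection step — standardizing the parameters and reading which ones load together on the first principal component — can be sketched as follows (synthetic data constructed to mimic the reported TSS/COD grouping; all values and loadings are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200

# Synthetic storm-event samples: TSS drives turbidity and lead (as in the
# study's finding); COD tracks organic matter independently.
tss = rng.lognormal(mean=4.0, sigma=0.5, size=n)
turbidity = 0.8 * tss + rng.normal(scale=5.0, size=n)
lead = 0.002 * tss + rng.normal(scale=0.05, size=n)
cod = rng.lognormal(mean=3.0, sigma=0.4, size=n)
organic = 1.2 * cod + rng.normal(scale=3.0, size=n)

data = np.column_stack([tss, turbidity, lead, cod, organic])
names = ["TSS", "turbidity", "lead", "COD", "organic"]

# Standardize, then PCA via SVD (singular vectors of the correlation structure).
z = (data - data.mean(axis=0)) / data.std(axis=0)
_, _, vt = np.linalg.svd(z / np.sqrt(n), full_matrices=False)

# Loadings on PC1: parameters loading together are surrogate candidates.
pc1 = vt[0] if vt[0][0] > 0 else -vt[0]   # fix sign so TSS loads positively
for name, loading in zip(names, pc1):
    print(f"{name:10s} {loading:+.2f}")
```

Parameters with large loadings on the same component co-vary across events, so the cheapest of them (here TSS) can stand in for the rest; the biplot in the study is a graphical view of exactly these loadings.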
Procedia PDF Downloads 239
1030 The Comparative Effect of Neuro-Linguistic Programming (NLP), Critical Thinking and a Combination of Both On EFL Learners' Reading Comprehension
Authors: Mona Khabiri, Fahimeh Farahani
Abstract:
The present study was an attempt to investigate the comparative effect of teaching NLP, critical thinking, and a combination of NLP and critical thinking on EFL learners' reading comprehension. To fulfill the purpose of this study, a group of 82 female and male intermediate EFL learners at a language school in Iran took a piloted sample PET as a proficiency test, and 63 of them were selected as homogeneous learners and randomly assigned to three experimental groups. Within a treatment process of 10 sessions, the teacher/researcher provided the participants of each group with handouts, explanations, practice, homework, and questionnaires on techniques of NLP, critical thinking, and a combination of both. During these 10 sessions, 10 identical reading comprehension texts extracted from the multi-skill course book used by the language school were taught to the participants of each experimental group using the skills and strategies of NLP, critical thinking, and a combination of both. In the eleventh session, the participants sat for a reading posttest. The results of a one-way ANOVA showed no significant difference among the three groups in terms of reading comprehension. Justifications and implications of the findings of the study and suggestions for further research are presented.
Keywords: neuro-linguistic programming (NLP), critical thinking, reading comprehension
Procedia PDF Downloads 410
1029 Optimization of Process Parameters for Peroxidase Production by Ensifer Species
Authors: Ayodeji O. Falade, Leonard V. Mabinya, Uchechukwu U. Nwodo, Anthony I. Okoh
Abstract:
Given the high utility of peroxidase in several industrial processes, the search for novel microorganisms with enhanced peroxidase production capacity is of keen interest. This study investigated the process conditions for optimum peroxidase production by Ensifer sp., a new ligninolytic proteobacterium with peroxidase production potential. Also, some agricultural residues were valorized for peroxidase production under solid state fermentation. Peroxidase production was optimum at an initial medium pH of 7, an incubation temperature of 30 °C and an agitation speed of 100 rpm, using an alkali lignin fermentation medium supplemented with guaiacol as the most effective inducer and ammonium sulphate as the best inorganic nitrogen source. Optimum peroxidase production by Ensifer sp. was attained at 48 h with a specific productivity of 12.76 ± 1.09 U mg⁻¹. Interestingly, probable laccase production was observed with an optimum specific productivity of 12.76 ± 0.45 U mg⁻¹ at 72 h. The highest peroxidase yield was observed with sawdust as the solid substrate under solid state fermentation. In conclusion, Ensifer sp. possesses the capacity for enhanced peroxidase production that can be exploited for various biotechnological applications.
Keywords: catalase-peroxidase, enzyme production, peroxidase, polymerase chain reaction, proteobacteria
Procedia PDF Downloads 305
1028 Availability Analysis of Milling System in a Rice Milling Plant
Authors: P. C. Tewari, Parveen Kumar
Abstract:
The paper describes the availability analysis of the milling system of a rice milling plant using a probabilistic approach. The subsystems under study are special purpose machines. The availability analysis of the system is carried out to determine the effect of the failure and repair rates of each subsystem on the overall performance (i.e., steady state availability) of the system concerned. Further, on the basis of the effect of repair rates on system availability, maintenance repair priorities have been suggested. The problem is formulated using a Markov birth-death process, assuming exponential distributions for failure and repair rates. The first order differential equations associated with the transition diagram are developed using the mnemonic rule. These equations are solved using normalizing conditions and a recursive method to derive the steady state availability expression of the system. The findings of the paper are presented and discussed with the plant personnel to adopt a suitable maintenance policy to increase the productivity of the rice milling plant.
Keywords: availability modeling, Markov process, milling system, rice milling plant
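The Markov birth-death formulation described — write the generator matrix from the transition diagram, then solve the balance equations under the normalizing condition for steady-state availability — can be sketched for a two-subsystem series system (the rates below are assumed for illustration, not the plant's):

```python
import numpy as np

def steady_state(Q):
    """Solve pi @ Q = 0 together with sum(pi) = 1 (normalizing condition)."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])     # balance equations + normalization
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Two subsystems with exponential failures/repairs (rates per hour, assumed).
lam = np.array([0.01, 0.02])   # failure rates
mu = np.array([0.5, 0.4])      # repair rates

# States: (up,up), (down,up), (up,down), (down,down); the series system is
# available only in state 0. Transitions assume independent subsystems.
Q = np.array([
    [-(lam[0] + lam[1]), lam[0],            lam[1],            0.0],
    [mu[0],              -(mu[0] + lam[1]), 0.0,               lam[1]],
    [mu[1],              0.0,               -(mu[1] + lam[0]), lam[0]],
    [0.0,                mu[1],             mu[0],             -(mu[0] + mu[1])],
])

pi = steady_state(Q)
availability = pi[0]
# For independent subsystems this factorizes as prod(mu / (lam + mu)).
print(round(availability, 4), round(float(np.prod(mu / (lam + mu))), 4))
```

Raising a subsystem's repair rate and re-solving shows directly which repair priority buys the most steady-state availability, which is the basis for the maintenance priorities suggested in the abstract.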
Procedia PDF Downloads 233
1027 Identification and Characterization of Small Peptides Encoded by Small Open Reading Frames using Mass Spectrometry and Bioinformatics
Authors: Su Mon Saw, Joe Rothnagel
Abstract:
Short open reading frames (sORFs) located in the 5’UTR of mRNAs are known as uORFs. Characterization of uORF-encoded peptides (uPEPs), i.e., a subset of short open reading frame encoded peptides (sPEPs), and of their translational regulation leads to an understanding of the causes of genetic disease, proteome complexity and the development of treatments. The existence of uORF products within the cellular proteome can be detected by LC-MS/MS. The ability of a uORF to be translated into a uPEP, and successful uPEP identification, would allow characterization of uPEPs: their structures, functions, subcellular localization, evolutionary maintenance (conservation in human and other species) and abundance in cells. It is hypothesized that a subset of sORFs are translatable and that their encoded sPEPs are functional and endogenously expressed, contributing to the complexity of the eukaryotic cellular proteome. This project aimed to investigate whether sORFs encode functional peptides. Liquid chromatography-mass spectrometry (LC-MS) and bioinformatics were thus employed. Due to the probable low abundance of sPEPs and their small size, efficient peptide enrichment strategies for enriching small proteins and depleting the sub-proteome of large and abundant proteins are crucial for identifying sPEPs. Low molecular weight proteins were extracted using SDS-PAGE from Human Embryonic Kidney (HEK293) cells and by strong cation exchange chromatography (SCX) from secreted HEK293 material. Extracted proteins were digested by trypsin into peptides, which were detected by LC-MS/MS. The MS/MS data obtained were searched against Swiss-Prot using MASCOT version 2.4 to filter out known proteins, and all unmatched spectra were re-searched against the human RefSeq database. ProteinPilot v5.0.1 was used to identify sPEPs by searching against the human RefSeq, Vanderperre and Human Alternative Open Reading Frame (HaltORF) databases. Potential sPEPs were analyzed by bioinformatics.
Since SDS-PAGE could not separate proteins <20 kDa, it could not be used to identify sPEPs. All MASCOT-identified peptide fragments were parts of main open reading frames (mORFs) according to ORF Finder and blastp searches. No sPEP was detected, and the existence of sPEPs could not be confirmed in this study. The 13 sORFs previously shown by mass spectrometry to be translated in HEK293 cells were characterized by bioinformatics. The sPEPs identified in those previous studies were <100 amino acids and <15 kDa. The bioinformatics results showed that sORFs are translated into sPEPs and contribute to proteome complexity. The uPEP translated from the uORF of SLC35A4 was strongly conserved in human and mouse, while the uPEP translated from the uORF of MKKS was strongly conserved in human and Rhesus monkey. Cross-species conservation of uORFs in association with protein translation strongly suggests evolutionary maintenance of the coding sequence and indicates probable functional expression of the peptides encoded within these uORFs. Translation of sORFs was confirmed by mass spectrometry, and sPEPs were characterized with bioinformatics.
Keywords: bioinformatics, HEK293 cells, liquid chromatography-mass spectrometry, ProteinPilot, Strong Cation Exchange Chromatography, SDS-PAGE, sPEPs
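The size cut-offs reported for sPEPs (<100 amino acids, <15 kDa) translate into a simple candidate filter; the ~110 Da average-residue-mass approximation and the sequences below are illustrative assumptions, not data from the study:

```python
# Filter candidate ORF translations down to sPEP-sized products, using the
# reported cut-offs (<100 amino acids, <15 kDa). 110 Da per residue is a
# common rough average used for quick mass estimates from length alone.
AVG_RESIDUE_MASS_DA = 110.0

def approx_mass_kda(peptide):
    """Rough molecular weight estimated from chain length alone."""
    return len(peptide) * AVG_RESIDUE_MASS_DA / 1000.0

def is_spep_candidate(peptide, max_len=100, max_kda=15.0):
    return len(peptide) < max_len and approx_mass_kda(peptide) < max_kda

candidates = {
    "uORF_A": "M" + "A" * 45,     # 46 aa -> ~5.1 kDa, kept
    "mORF_B": "M" + "L" * 320,    # 321 aa -> ~35 kDa, discarded
}

kept = [name for name, seq in candidates.items() if is_spep_candidate(seq)]
print(kept)   # ['uORF_A']
```

In a real pipeline the mass would come from exact residue masses (or the MS data itself) rather than this length heuristic, but the filtering logic is the same.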
Procedia PDF Downloads 186
1026 Analysis of Methodological Issues in the Study of Digital Library Services: A Case Study of Nigeria University Systems
Authors: Abdulmumin Isah
Abstract:
Over the years, researchers have employed different approaches in the study of the usage of library services in the traditional library system. Such approaches have provided explanations of users' perception, attitude, and usage of library services. The findings of such studies, which often employed a survey research approach, have guided librarians and library stakeholders in their drive to improve library services to patrons. However, with the advent of digital library services, librarians and information science researchers have been experiencing methodological issues in the study of digital library services. While some quantitative approaches have been employed to understand the adoption and usage of digital library services, conflicting results from such studies have increased the need to employ qualitative approaches. The appropriateness of qualitative approaches has also been questioned. This study intends to review methodological approaches in the study of digital libraries and to provide a framework for the selection of an appropriate research approach, using the Nigerian university system as a case study.
Keywords: digital library, university library, methodological issues, research approaches, quantitative, qualitative, Nigeria
Procedia PDF Downloads 523
1025 Social Work Education in Gujarat: Challenges and Responses
Authors: Rajeshkumar Mahendrabhai Patel, Narendrakumar D. Vasava
Abstract:
Higher education in India requires a high degree of attention to quality. The Government of India has been making efforts to improve the quality of higher education through different means, such as need-based changes in higher education policy, accreditation of institutions of higher education, and many others. Social work education in India started back in 1936 at the Tata School of Social Sciences. Gradually the need for social work education was felt, and different institutions started imparting social work education in different regions. Due to the poor educational policy of Gujarat state (the concept of self-financed education), different universities initiated MSW programs on a self-financed basis. The present scenario of social work education in Gujarat faces numerous challenges and problems which need to be addressed consciously. The present paper examines and analyzes challenges and problems such as curriculum, staffing, quality of teaching, and the pattern of education. Probable responses to this scenario are also discussed in this paper.
Keywords: social work education, challenges, problems, responses, self-financed education in Gujarat
Procedia PDF Downloads 366
1024 Guided Information Campaigns for Counter-Terrorism: Behavioral Approach to Interventions Regarding Polarized Societal Network
Authors: Joshua Midha
Abstract:
Information campaigns and behavioral interventions have long reigned as tactics. From the Soviet-era propaganda machines to the opinion hijacks in Iran, these measures are now commonplace and are used for dissemination and disassembly. However, the use of these tools for strategic diffusion, specifically in a counter-terrorism setting, has only been explored on the surface. This paper aims to introduce a larger conceptual portion of guided information campaigns into preexisting terror cells and situations. It provides an alternative, low-risk intervention platform for future military strategy. This paper highlights a theoretical framework to lay out the foundational details and explanations for behavioral interventions, and uses a case study to highlight the possibility of implementation. It details strategies, resources, circumstances, and risk factors for intervention. It also sets an expanding foundation for offensive PsyOps and argues for the tactical diffusion of information to counter extremist sentiment. The two larger frameworks touch on the internal spread of information within terror cells and on external political sway, thus charting a larger holistic purpose for strategic operations.
Keywords: terrorism, behavioral intervention, propaganda, SNA, extremism
Procedia PDF Downloads 94
1023 Moral Dilemmas, Difficulties in the Digital Games
Authors: YuPei Chang
Abstract:
In recent years, moral judgement tasks have served as an increasingly popular plot mechanism in digital gameplay. As a moral agent, the player making choice judgments in digital games shuttles between the real world and the game world. The purpose of the research is to explore the moral difficulties produced by the interactive mechanisms of games and the moral choices of players. At the theoretical level, this research combines moral disengagement, moral foundations theory, and gameplay as an aesthetic experience. At the methodological level, it combines text analysis, the diary method, and in-depth interviews. Three research problems will be addressed in three stages. In the first stage, the project explores how moral dilemmas are represented in game mechanics. In the second stage, the project analyzes the appearance of and conflicts between moral dilemmas in game mechanics based on the five foundations of moral foundations theory. In the third stage, the project seeks to understand the choices players make when facing moral dilemmas, as well as their explanations and reflections after making those decisions.
Keywords: morality, moral disengagement, moral foundations theory, PC game, gameplay, moral dilemmas, player
Procedia PDF Downloads 78
1022 Reminiscence Therapy for Alzheimer’s Disease Restrained on Logistic Regression Based Linear Bootstrap Aggregating
Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Xianpei Li, Yanmin Yuan, Tracy Lin Huan
Abstract:
Researchers are pursuing intriguing lines of research into the inherited features of Alzheimer’s disease and possible effective therapies. In Alzheimer’s, memories are lost in reverse order: memories formed recently are more transitory than older ones. Reminiscence therapy involves the discussion of past events, experiences, and knowledge with another individual or group of people, frequently with the help of tangible prompts such as photos, household and other familiar objects from the past, music, and archive recordings. In this manuscript, the effectiveness of reminiscence therapy for Alzheimer’s disease is measured using logistic regression based linear bootstrap aggregating. Logistic regression is used to predict the observed features of the patient’s memory across various therapies. Linear bootstrap aggregating shows better stability and accuracy for reminiscence therapy in the statistical classification and regression of memories, relative to validation therapy, supportive psychotherapy, sensory integration, and simulated presence therapy.
Keywords: Alzheimer’s disease, linear bootstrap aggregating, logistic regression, reminiscence therapy
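The classifier combination the abstract names can be illustrated with a minimal, self-contained sketch: logistic regression fitted by plain gradient descent, aggregated over bootstrap resamples of the training set. All data and variable names below are illustrative stand-ins, not the study's clinical features.

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=200):
    """Fit a logistic regression with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def bagged_predict(models, xi):
    """Average the predicted probabilities of the bootstrap models."""
    probs = []
    for w, b in models:
        z = sum(wj * xj for wj, xj in zip(w, xi)) + b
        probs.append(1.0 / (1.0 + math.exp(-z)))
    return sum(probs) / len(probs)

def bagging(X, y, n_models=15, seed=0):
    """Bootstrap aggregating: fit each model on a resampled training set."""
    rng = random.Random(seed)
    models = []
    n = len(X)
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(train_logistic([X[i] for i in idx], [y[i] for i in idx]))
    return models

# Hypothetical features (e.g. therapy exposure, memory-test score);
# 1 = improvement observed, 0 = no improvement. Entirely synthetic.
X = [[1, 2], [2, 1], [1, 1], [2, 2], [6, 7], [7, 6], [6, 6], [7, 7]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
models = bagging(X, y)
print(bagged_predict(models, [1, 2]) < 0.5)  # low-exposure case -> class 0
print(bagged_predict(models, [7, 7]) > 0.5)  # high-exposure case -> class 1
```

Averaging the probabilities of many bootstrap models is what gives bagging its stability: the variance of any single fitted model is damped by the ensemble mean.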
Procedia PDF Downloads 307
1021 Understanding Post-Displacement Earnings Losses: The Role of Wealth Inequality
Authors: M. Bartal
Abstract:
A large body of empirical evidence points to sizable lifetime earnings losses associated with the displacement of tenured workers. The causes of these losses are still not well understood. Existing explanations rely heavily on human capital depreciation during non-employment spells. In this paper, a new avenue is explored. Evidence on the role of household liquidity constraints in accounting for the persistence of post-displacement earnings losses is provided based on SIPP data. Then, a directed search and matching model with endogenous human capital and wealth accumulation is introduced. The model is computationally tractable thanks to its block-recursive structure and highlights a non-trivial, yet intuitive, interaction between wealth and human capital. Constrained workers tend to accept jobs with low firm-sponsored training because the latter are (endogenously) easier to find. This new channel provides a plausible explanation for why young (highly constrained) workers suffer persistent scars after displacement. Finally, the model is calibrated on US data to show that the interplay between wealth and human capital is crucial to replicating the observed lifecycle pattern of earnings losses. JEL: E21, E24, J24, J63.
Keywords: directed search, human capital accumulation, job displacement, wealth accumulation
Procedia PDF Downloads 206
1020 Good Banks, Bad Banks, and Public Scrutiny: The Determinants of Corporate Social Responsibility in Times of Financial Volatility
Authors: A. W. Chalmers, O. M. van den Broek
Abstract:
This article examines the relationship between the global financial crisis and the corporate social responsibility activities of financial services firms. It challenges the general consensus in existing studies that firms, when faced with economic hardship, tend to jettison CSR commitments. Instead, and building on recent insights into the institutional determinants of CSR, it is argued that firms are constrained in their ability to abandon CSR by the extent to which they are subject to intense public scrutiny by regulators and the news media. This argument is tested in the context of the European sovereign debt crisis, drawing on a unique dataset of 170 firms in 15 different countries over a six-year period. Controlling for a battery of alternative explanations and comparing financial service providers to firms operating in other economic sectors, the results indicate considerable evidence supporting the main argument. Rather than abandoning CSR during times of economic hardship, financial industry firms ramp up their CSR commitments in order to manage their public image and foster public trust in light of intense public scrutiny.
Keywords: corporate social responsibility (CSR), public scrutiny, global financial crisis, financial services firms
Procedia PDF Downloads 304
1019 Structural and Ion Exchange Studies of Terpolymer Resin Derived from 4, 4'-Biphenol-4,4'-Oxydianiline-Formaldehyde
Authors: Pawan P. Kalbende, Anil B. Zade
Abstract:
A novel terpolymer resin has been synthesized by the condensation polymerization of 4,4’-biphenol and 4,4’-oxydianiline with formaldehyde in the presence of 2M hydrochloric acid as catalyst. The composition of the resin was determined on the basis of elemental analysis, and the resin was further characterized by UV-visible, infrared, and nuclear magnetic resonance spectroscopy to establish the most probable structure of the synthesized terpolymer. The newly synthesized terpolymer proved to be a selective chelating ion exchanger for certain metal ions and was studied for Fe3+, Cu2+, Ni2+, Co2+, Zn2+, Cd2+, Hg2+, and Pb2+ ions using their metal nitrate solutions. A batch equilibrium method was employed to study the selectivity of metal ion uptake, involving measurement of the distribution of a given metal ion between the terpolymer sample and a solution containing the metal ion. The study was carried out over a wide pH range, over varying shaking times, and in media of different electrolytes at different ionic strengths. The distribution ratios of the metal ions were found to increase with rising pH of the solutions. Hence, the resin can be used to recover certain metal ions from wastewater for the purpose of water purification and for the removal of iron from boiler water.
Keywords: terpolymers, ion-exchangers, distribution ratio, metal ion uptake
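The distribution ratio in such batch equilibrium studies is conventionally computed as the metal taken up per gram of resin divided by the metal remaining per millilitre of solution; a short sketch with hypothetical concentrations (not the paper's measurements):

```python
def distribution_ratio(c_initial, c_equilibrium, volume_ml, resin_mass_g):
    """Batch-equilibrium distribution ratio:
    D = (metal taken up per g of resin) / (metal left per mL of solution)."""
    uptake_per_g = (c_initial - c_equilibrium) * volume_ml / resin_mass_g
    return uptake_per_g / c_equilibrium

# Hypothetical run: 50 mL of 0.10 mmol/mL metal-ion solution shaken with
# 0.5 g of resin, leaving 0.02 mmol/mL at equilibrium.
D = distribution_ratio(0.10, 0.02, 50.0, 0.5)
print(round(D, 6))  # (0.08 * 50 / 0.5) / 0.02 -> 400.0
```

A larger D at higher pH, as reported in the abstract, simply means a greater fraction of the ion ends up on the resin under the same conditions.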
Procedia PDF Downloads 294
1018 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images
Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar
Abstract:
Diabetic retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina, called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that this method is capable of detecting hard exudates and the highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
Keywords: diabetic retinopathy, fundus, CHT, exudates, hemorrhages
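The equalize-then-threshold step of such a pipeline can be sketched in plain Python. Note that this uses simple global histogram equalization as a stand-in for CLAHE, applied to a tiny synthetic patch; it is not the authors' MATLAB implementation.

```python
def equalize(img, levels=256):
    """Global histogram equalization (a simplified stand-in for CLAHE)."""
    flat = [p for row in img for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution mapped back to the gray-level range.
    cdf, total, acc = [], len(flat), 0
    for h in hist:
        acc += h
        cdf.append(acc / total)
    return [[round(cdf[p] * (levels - 1)) for p in row] for row in img]

def threshold(img, t):
    """Binary mask of pixels brighter than t (candidate exudate pixels)."""
    return [[1 if p > t else 0 for p in row] for row in img]

# Synthetic 4x4 fundus patch: dark background with one bright "lesion".
patch = [
    [10, 12, 11, 10],
    [11, 240, 235, 12],
    [10, 238, 242, 11],
    [12, 10, 11, 10],
]
mask = threshold(equalize(patch), 200)
print(mask[1][1], mask[0][0])  # lesion pixel -> 1, background -> 0
```

In a real pipeline the optic disc would be masked out first (e.g. via the CHT), precisely because it passes the same brightness threshold as the exudates.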
Procedia PDF Downloads 271
1017 Neural Network Based Decision Trees Using Machine Learning for Alzheimer's Diagnosis
Authors: P. S. Jagadeesh Kumar, Tracy Lin Huan, S. Meenakshi Sundaram
Abstract:
Alzheimer’s disease is a prevalent ailment for which no cure or effective therapy has yet been established. With a probable explosion in the number of patients in the upcoming years, there is enormous interest in early detection of the disorder, which could conceivably lead to improved treatment outcomes. Complex changes in the brain are an observable signature of the disease, together with unique genetic markers. Machine learning, alongside deep learning and decision trees, strengthens the ability to learn characteristics from multi-dimensional data and thus simplifies the automatic classification of Alzheimer’s disease. Extensive testing was designed and carried out to train and evaluate Alzheimer’s disease classification built on machine learning methods. It was found that decision trees trained with deep neural network features produced excellent results, comparable to related pattern classification approaches.
Keywords: Alzheimer's diagnosis, decision trees, deep neural network, machine learning, pattern classification
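A minimal sketch of the decision-tree side of such a pipeline: a single-split tree (a stump) fitted by Gini impurity to feature vectors that stand in for network-learned features. The data and feature interpretations are illustrative, and the deep feature extractor itself is omitted.

```python
def gini(labels):
    """Gini impurity of a binary label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_stump(X, y):
    """Find the (feature, threshold) split minimizing weighted Gini."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            left = [yi for xi, yi in zip(X, y) if xi[f] <= t]
            right = [yi for xi, yi in zip(X, y) if xi[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                # Majority vote on each side gives the leaf predictions.
                best = (score, f, t,
                        round(sum(left) / max(len(left), 1)),
                        round(sum(right) / max(len(right), 1)))
    return best[1:]

def predict(stump, xi):
    f, t, left_label, right_label = stump
    return left_label if xi[f] <= t else right_label

# Synthetic "learned features", e.g. two embedding dimensions a trained
# network might emit for scans; 0 = control, 1 = probable Alzheimer's.
X = [[0.1, 0.9], [0.2, 0.8], [0.15, 0.85], [0.8, 0.2], [0.9, 0.1], [0.85, 0.15]]
y = [0, 0, 0, 1, 1, 1]
stump = best_stump(X, y)
print(predict(stump, [0.12, 0.88]), predict(stump, [0.88, 0.12]))
```

A full tree recurses this split search on each side; the sketch keeps one level to show where the learned features enter the classification.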
Procedia PDF Downloads 295
1016 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines
Authors: Kamyar Tolouei, Ehsan Moosavi
Abstract:
In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem involving many constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms, as researchers have become highly cognizant of the issue. Even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply a single estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most such mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risk from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to incorporate grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and simultaneously minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied, producing a more profitable and risk-aware production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid of the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange multipliers. In addition, a machine learning method, Random Forest, is applied to estimate gold grade in the mineral deposit, and the Monte Carlo method is used as the simulation method, with 20 realizations. The results show that the proposed versions considerably improve on the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-sub-gradient approaches. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework demonstrates the capability to minimize risk and to improve the expected net present value and financial profitability of the LTPSOP, and it controls geological risk more effectively than the traditional procedure because grade uncertainty is considered within the hybrid model framework.
Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization
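The risk-penalized objective described above can be sketched as a Monte Carlo evaluation of candidate schedules over equally probable grade realizations: expected discounted value minus a penalty on shortfalls from the production target. The sketch below is a toy stand-in, not the paper's stochastic integer program or its ALR-HHO solver; all prices, targets, penalties, and grades are illustrative.

```python
import random

def simulate_realizations(n_blocks, n_real, seed=1):
    """Equally probable grade realizations (stand-ins for geostatistical sims)."""
    rng = random.Random(seed)
    return [[max(0.0, rng.gauss(1.0, 0.3)) for _ in range(n_blocks)]
            for _ in range(n_real)]

def schedule_objective(schedule, realizations, price=50.0, target=2.0,
                       penalty=30.0, discount=0.1):
    """Expected discounted value minus a penalty on production-target shortfalls.

    `schedule` maps each period to the list of block indices mined in it.
    """
    values = []
    for grades in realizations:
        total = 0.0
        for period, blocks in enumerate(schedule):
            metal = sum(grades[b] for b in blocks)
            total += price * metal / (1 + discount) ** period
            total -= penalty * max(0.0, target - metal)  # shortfall risk term
        values.append(total)
    return sum(values) / len(values)

realizations = simulate_realizations(n_blocks=6, n_real=20)
# Two candidate two-period schedules over six blocks (illustrative).
balanced = [[0, 1, 2], [3, 4, 5]]
lopsided = [[0], [1, 2, 3, 4, 5]]
print(schedule_objective(balanced, realizations) >
      schedule_objective(lopsided, realizations))
```

Evaluating against the whole set of realizations, rather than one estimated orebody, is what lets the schedule trade expected value against deviation risk.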
Procedia PDF Downloads 103
1015 Theory and Practice of Wavelets in Signal Processing
Authors: Jalal Karam
Abstract:
The methods of the Fourier, Laplace, and wavelet transforms provide transfer functions and relationships between the input and output signals of linear time-invariant systems. This paper shows the equivalence among these three methods, in each case presenting an application of the appropriate transform (Fourier, Laplace, or wavelet) to the convolution theorem. In addition, it is shown that the same holds for a direct integration method. The biorthogonal wavelets Bior3.5 and Bior3.9 are examined, and the zero distributions of the polynomials of their associated filters are located. This paper also presents the significance of utilizing wavelets as effective tools in processing speech signals for common multimedia applications in general, and for recognition and compression in particular. Theoretically and practically, wavelets have proved to be effective and competitive. The practical use of the Continuous Wavelet Transform (CWT) in the processing and analysis of speech is then presented, along with explanations of how the human ear can be thought of as a natural wavelet transformer of speech. This generates a variety of approaches for applying the CWT to many paradigms for analysing speech, sound, and music. For perception, the flexibility of implementation of this transform allows the construction of numerous scales, and we include two of them. Results for speech recognition and speech compression are then included.
Keywords: continuous wavelet transform, biorthogonal wavelets, speech perception, recognition and compression
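The convolution-theorem equivalence the paper establishes can be checked numerically with a naive DFT: circular convolution in the time domain matches a pointwise product in the frequency domain. This is a minimal sketch, not the Fourier/Laplace/wavelet machinery of the paper itself.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    """Inverse discrete Fourier transform."""
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def circular_convolve(x, h):
    """Direct circular convolution: (x * h)[k] = sum_m x[m] h[(k - m) mod n]."""
    n = len(x)
    return [sum(x[m] * h[(k - m) % n] for m in range(n)) for k in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
h = [1.0, 0.0, -1.0, 0.5]
direct = circular_convolve(x, h)
# Convolution theorem: transform, multiply pointwise, transform back.
via_dft = [c.real for c in idft([a * b for a, b in zip(dft(x), dft(h))])]
print(all(abs(d - v) < 1e-9 for d, v in zip(direct, via_dft)))  # True
```

The same identity, with the appropriate transform pair, is what the paper demonstrates for the Laplace and wavelet settings.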
Procedia PDF Downloads 415
1014 Pragmatic Discourse Functions of Locative Enclitics: A Descriptive Study of Luganda Locative Enclitics
Authors: Moureen Nanteza
Abstract:
This paper examines the pragmatic inferences of locative enclitics in Luganda (JE 15). Locative enclitics are elements which cannot stand alone but are attached to a verb to make meaning. Their status is ambiguous between free word and affix, motivating their analysis as enclitics. The enclitics attach in the post-final position of their hosts. Although locative enclitics occur regularly in some Bantu languages (Luganda, Runyankore-Rukiga, Runyoro-Rutooro, Lunda, Ikizu, Fwe, Chichewa, and Kinyarwanda, among others), they have not been widely studied in the literature. The paper looks at verbal locative enclitics only, though locative enclitics also appear on other word categories in Luganda. This study is descriptive, with a qualitative approach. The data used in this study were collected by reviewing documents in Luganda, novels and plays, as well as spoken discourses. In this study, the enclitic in Luganda serves many non-locative discourse-pragmatic functions, which include showing urgency, politeness, the idea of ‘instead of’, and emphasis. It has also been observed that enclitics are widely used in urban youth language (‘Luyaaye’), but this was not the focus of the current study. The results from the study offer explanations of key areas of syntax, morphology, and pragmatics relating to the form and functions of locative enclitics and the whole system of locative marking in Luganda and other Bantu languages.
Keywords: Bantu, locative enclitics, Luganda, pragmatic inferences
Procedia PDF Downloads 144
1013 Risk Screening in Digital Insurance Distribution: Evidence and Explanations
Authors: Finbarr Murphy, Wei Xu, Xian Xu
Abstract:
The embedding of digital technologies in the global economy has attracted increasing attention from economists. Using a large and detailed dataset, this study examines the specific case where consumers have a choice between offline and digital channels in the context of insurance purchases. We find that digital channels screen consumers with lower unobserved risk. For the term life, endowment, and disease insurance products, the average risk of policies purchased through digital channels was 75%, 21%, and 31% lower, respectively, than that of policies purchased offline. As a consequence, the lower unobserved risk leads to weaker information asymmetry and higher profitability for digital channels. We highlight three mechanisms of the risk screening effect: the heterogeneous marginal influence of channel features on insurance demand, channel features directly related to risk control, and the link between the digital divide and risk. We also find that the risk screening effect comes mainly from the extensive margin, i.e., from new consumers. This paper contributes to three connected areas in the insurance context: the heterogeneous economic impacts of digital technology adoption, insurer-side risk selection, and insurance marketing.
Keywords: digital economy, information asymmetry, insurance, mobile application, risk screening
Procedia PDF Downloads 71
1012 Rolling Contact Fatigue Failure Analysis of Ball Bearing in Gear Box
Authors: Piyas Palit, Urbi Pal, Jitendra Mathur, Santanu Das
Abstract:
Bearings are important machine parts in industry. When bearings fail to meet their expected life, the consequences are increased downtime, loss of revenue, and missed deliveries. This article describes the failure of a gearbox bearing by rolling contact fatigue. The investigation consisted of visual observation, chemical analysis, characterization of microstructures using optical microscopy, and hardness testing. The present study also considers the bearing life as well as the operational condition of the bearings. Surface-initiated rolling contact fatigue, leading to a surface failure known as pitting, is a life-limiting failure mode in many modern machine elements, particularly rolling element bearings. Metallographic analysis of the crack propagation and crack morphology is also described, and indications of fatigue spalling in the ferrography test are discussed. The analysis suggests the probable reasons for this kind of in-service failure: such spalling occurs due to (1) heavier external loading conditions than designed for, or (2) the bearing exceeding its service life.
Keywords: bearing, rolling contact fatigue, bearing life
Procedia PDF Downloads 169
1011 Overview and Post Damage Analysis of Nepal Earthquake 2015
Authors: Vipin Kumar Singhal, Rohit Kumar Mittal, Pavitra Ranjan Maiti
Abstract:
Damage analysis is one of the preliminary activities to be carried out after an earthquake, so as to enhance seismic building design technologies and prevent similar types of failure during future earthquakes. This research article investigates the damage pattern and the most probable reasons for failure by observing photographs of seven major buildings that collapsed or were damaged, spread evenly over the region, during the Mw 7.8 Nepal earthquake of 2015, which was followed by more than 400 aftershocks of Mw ≥ 4, with one aftershock reaching a magnitude of Mw 7.3. Over 250,000 buildings were damaged, and more than 9,000 people were injured in this earthquake. Photographs of these buildings were collected after the earthquake, the cause of failure was estimated along with the severity of damage, and comments on the reparability of the structures were made. Based on the observations, it was concluded that the damage to reinforced concrete buildings was less than that to masonry structures. The number of buildings damaged was high near the Kathmandu region due to the high building density there. This type of damage analysis can be used as a cost-effective and quick method for damage assessment during earthquakes.
Keywords: Nepal earthquake, damage analysis, damage assessment, damage scales
Procedia PDF Downloads 373
1010 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data
Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin
Abstract:
High-resolution inline inspection (ILI) tools are used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e., individual anomalies) or as clusters (i.e., colonies of corrosion anomalies). Although ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools, due to limitations of the tools and their associated sizing algorithms and to the detection threshold of the tools (i.e., the minimum detectable feature dimension). Quantifying the measurement error in ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. The measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) has scarcely been reported in the literature and is investigated in the present study. Limitations in the ILI tool and the clustering process can sometimes cause clustering error, defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada.
Data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline
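The Type I versus Type II comparison can be sketched as simple statistics on ILI-minus-field length differences. The lengths below are invented for illustration and are not the Alberta pipeline data; they merely mimic the reported pattern of much larger scatter for clustered (Type II) anomalies.

```python
def length_errors(ili_lengths, field_lengths):
    """Per-anomaly measurement error: ILI-reported minus field-measured length."""
    return [i - f for i, f in zip(ili_lengths, field_lengths)]

def mean_and_sd(errors):
    """Sample mean and standard deviation of the errors."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, var ** 0.5

# Illustrative lengths in mm. Type I: anomalies free of clustering error;
# Type II: anomalies whose reported length was distorted by clustering.
type1_ili, type1_field = [30, 42, 55, 61], [28, 44, 53, 60]
type2_ili, type2_field = [90, 120, 70, 150], [60, 95, 100, 110]
m1, s1 = mean_and_sd(length_errors(type1_ili, type1_field))
m2, s2 = mean_and_sd(length_errors(type2_ili, type2_field))
print(s1 < s2)  # Type II length errors scatter far more than Type I
```

Separating the two error populations in this way is the precondition for the proposed data-mining classifier: a Type II anomaly flagged from ILI-reported attributes can be assigned the wider error model.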
Procedia PDF Downloads 307