Search results for: machine learning approach for neurological disorder assessment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24562

15862 Compared Psychophysiological Responses under Stress in Patients of Chronic Fatigue Syndrome and Depressive Disorder

Authors: Fu-Chien Hung, Chi‐Wen Liang

Abstract:

Background: People who suffer from chronic fatigue syndrome (CFS) frequently complain of continuous tiredness, weakness, or lack of strength without an apparent organic etiology. The prevalence of CFS is estimated at roughly 3% to 20%, yet more than 80% of cases go undiagnosed or are misdiagnosed as depression. The biopsychosocial model suggests associations among CFS, depressive syndrome, and stress. This study aimed to investigate the differences between individuals with CFS and individuals with depressive syndrome in psychophysiological responses under stress. Method: There were 23 participants in the CFS group, 14 in the depression group, and 23 in the healthy control group. All participants first completed measures of demographic data, CFS-related symptoms, daily life functioning, and depressive symptoms. The participants were then asked to perform a stressful cognitive task, during which their psychophysiological responses, including heart rate (HR), blood volume pulse (BVP), and skin conductance (SC), were measured. These indices were used to assess the reactivity and recovery rates of the autonomic nervous system. Results: The stress reactivity of the CFS and depression groups did not differ from that of the healthy control group. However, the stress recovery rate of the CFS group was worse than that of the healthy control group. Conclusion: These results suggest that CFS is a syndrome that can be independent of depressive syndrome, although depressive syndrome may include fatigue symptoms.

Keywords: chronic fatigue syndrome, depression, stress response, misdiagnosis

Procedia PDF Downloads 445
15861 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons

Authors: Ozgu Hafizoglu

Abstract:

Analogy is an essential tool of human cognition: it connects diffuse and diverse systems through attributional, deep structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) guides and creates information pattern transfer within and between domains and disciplines in science. This paper presents the CMA as an evolutionary approach to scientific research. The model addresses the challenge of deep uncertainty about the future, emphasizing the need for flexibility so that the reasoning methodology can adapt to changing conditions. In this paper, the model of analogical reasoning is built on brain cells and their fractal and operational forms within the system itself, and visualization techniques are used to show correspondences. The problem-solving process is divided into distinct phases: encoding, mapping, inference, and response. The system is related to brain activation in each of these phases, with an emphasis on better visualizing the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes of four-term analogical reasoning correspond to the fractal and operational forms of these brain cells: glia, axons, and neurons.

Keywords: analogy, analogical reasoning, cognitive model, brain and glials

Procedia PDF Downloads 171
15860 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are: (1) the Behavioral Pattern Analysis (BPA) modeling methodology, and (2) the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
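As a generic sketch of the AHP component of such a tool (not the DECISION tool itself), criterion weights can be derived from a pairwise comparison matrix via its principal eigenvector; the matrix entries below are hypothetical:

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (criterion i vs. criterion j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority weights: the principal (Perron) eigenvector, normalized.
vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))
w = np.abs(vecs[:, k].real)
w = w / w.sum()

# Consistency check: a consistency ratio (CR) below 0.1 is the
# conventional acceptance threshold in AHP.
lam_max = vals.real[k]
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)   # consistency index
cr = ci / 0.58                                   # random index for n = 3
```

The resulting weight vector would then feed an outranking method such as ELECTRE for the final alternative ranking.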

Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System

Procedia PDF Downloads 538
15859 FE Modelling of Structural Effects of Alkali-Silica Reaction in Reinforced Concrete Beams

Authors: Mehdi Habibagahi, Shami Nejadi, Ata Aminfar

Abstract:

A significant degradation factor that impacts the durability of concrete structures is the alkali-silica reaction (ASR). Engineers are frequently charged with conducting thorough safety assessments of concrete structures affected by ASR, which has a major influence on structural capacity. In most cases, the reduction in compressive strength, tensile strength, and modulus of elasticity is expressed as a function of free expansion and crack widths; predicting the effect of ASR on flexural strength is also relevant. In this paper, a nonlinear three-dimensional (3D) finite element model is proposed to describe the flexural strength degradation induced by ASR. Initial strains, initial stresses, initial cracks, and deterioration of material characteristics were all considered as ASR factors in this model. The effects of ASR on structural performance were evaluated with a focus on initial flexural stiffness, the force-deformation curve, and load-carrying capacity. Degradation of concrete mechanical properties was correlated with ASR growth using material test data obtained at Tech Lab, UTS, and implemented into the FEM for various expansions. The finite element study yielded a better understanding of the failure mechanism of the ASR-affected RC beam and of its capacity reduction as a function of ASR expansion. Furthermore, the decrease in residual mechanical properties due to ASR is reviewed and used as input data for the FEM model. Finally, the analysis techniques and a comparison between the analysis and experimental results are discussed. Verification is also provided through analyses of reinforced concrete beams whose behavior is governed by either flexural or shear mechanisms.

Keywords: alkali-silica reaction, analysis, assessment, finite element, nonlinear analysis, reinforced concrete

Procedia PDF Downloads 150
15858 An Algorithm Based on the Nonlinear Filter Generator for Speech Encryption

Authors: A. Belmeguenai, K. Mansouri, R. Djemili

Abstract:

This work presents a new algorithm based on the nonlinear filter generator for speech encryption and decryption. The proposed algorithm consists of a linear feedback shift register (LFSR) with a primitive feedback polynomial whose state is combined through a nonlinear Boolean filtering function. The purpose of this system is to construct a keystream with good statistical properties that is also easily computable on a machine with limited computing capacity. The proposed speech encryption scheme is very simple, highly efficient, and fast for both encryption and decryption. We conclude the paper by showing that the system can resist certain known attacks.
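The general construction can be sketched as follows. This is a minimal illustration of a filtered LFSR, not the authors' cipher: the register length, tap positions, and filter function are all made up for the example.

```python
def lfsr_filter_keystream(seed, fb_taps, flt_taps, flt, n):
    """Generate n keystream bits: each step outputs the nonlinear filter
    applied to selected state bits, then clocks the LFSR once."""
    state = list(seed)
    out = []
    for _ in range(n):
        out.append(flt(*(state[i] for i in flt_taps)))
        fb = 0
        for t in fb_taps:           # XOR of the feedback tap positions
            fb ^= state[t]
        state = [fb] + state[:-1]   # shift, feeding the new bit in
    return out

# Illustrative parameters (not the paper's): a 5-bit register, feedback
# taps standing in for a primitive polynomial, and filter f(a,b,c) = ab XOR c.
ks = lfsr_filter_keystream([1, 0, 1, 1, 0], fb_taps=[2, 4],
                           flt_taps=[0, 1, 4],
                           flt=lambda a, b, c: (a & b) ^ c, n=16)

speech_bits = [1, 0, 0, 1] * 4                 # stand-in for digitized speech
cipher = [s ^ k for s, k in zip(speech_bits, ks)]
plain = [c ^ k for c, k in zip(cipher, ks)]    # XOR again to decrypt
```

Because encryption and decryption are the same XOR with the same keystream, decryption recovers the original bits exactly.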

Keywords: nonlinear filter generator, stream ciphers, speech encryption, security analysis

Procedia PDF Downloads 279
15857 Technology Enriched Classroom for Intercultural Competence Building through Films

Authors: Tamara Matevosyan

Abstract:

In this globalized world, intercultural communication is becoming essential for understanding communication among people, for developing an understanding of cultures, and for appreciating the opportunities and challenges that each culture presents. Moreover, it plays an important role in developing an ideal personification for understanding different behaviors in different cultures. Native speakers assimilate sociolinguistic knowledge in natural conditions, whereas this is a great problem for language learners; in this context, feature films reveal cultural peculiarities and involve students in real communication. A key goal of language learning today is the development of intercultural competence, since communicating with someone from a different cultural background can be exciting and scary, frustrating and enlightening. Intercultural competence is important in the FL learning classroom, and feature films can serve as essential tools to develop this competence and bridge the intercultural gap that foreign students face. The current proposal attempts to reveal the correlation between a given culture and its language through feature films. To ensure qualified, well-organized, and practical classes on intercultural communication for language learners, a number of methods connected with movie watching have been implemented. All of the pre-watching, while-watching, and post-watching methods and techniques are aimed at developing students' communicative competence. The application of activities such as Climax, Role-Play, Interactive Language, and Daily Life helps to reveal and overcome mistakes of a cultural and pragmatic character. All of the above-mentioned activities are directed at the assimilation of language vocabulary with special reference to the given culture. The study delves into the essence of culture as one of the core concepts of intercultural communication. Culture is sometimes not a priority in the process of language learning, which leads to misunderstandings in real-life communication. The application of various methods and techniques with feature films aims at developing students' cultural competence and their understanding of the norms and values of individual cultures. Thus, feature film activities will enable learners to enlarge their knowledge of a particular culture and develop a fundamental insight into intercultural communication.

Keywords: climax, intercultural competence, interactive language, role-play

Procedia PDF Downloads 331
15856 Conscious Intention-based Processes Impact the Neural Activities Prior to Voluntary Action on Reinforcement Learning Schedules

Authors: Xiaosheng Chen, Jingjing Chen, Phil Reed, Dan Zhang

Abstract:

Conscious intention can be a promising entry point for grasping consciousness and orienting voluntary action. The current study adopted random ratio (RR) and yoked random interval (RI) reinforcement learning schedules, instead of the highly repeatable, single-decision-point paradigms used previously, to induce voluntary action with conscious intention that evolves from the interaction between short-range intention and long-range intention. Readiness potential (RP)-like EEG amplitude and inter-trial EEG variability decreased significantly prior to voluntary action compared to cued action, with the effect on inter-trial EEG variability featured mainly during the earlier stage of neural activity. Notably, RP-like EEG amplitudes decreased significantly prior to responses under higher RI reward rates, in which participants formed a higher plane of conscious intention. The present study suggests a possible contribution of conscious intention-based processes to neural activity from the earlier stage prior to voluntary action on reinforcement learning schedules.

Keywords: reinforcement learning schedule, voluntary action, EEG, conscious intention, readiness potential

Procedia PDF Downloads 62
15855 The Digital Transformation of Life Insurance Sales in Iran With the Emergence of Personal Financial Planning Robots; Opportunities and Challenges

Authors: Pedram Saadati, Zahra Nazari

Abstract:

Anticipating and identifying the future opportunities and challenges that industry players face from the emergence of new personal financial planning technologies, and providing practical solutions, is one of the goals of this research. For this purpose, a futures research tool based on the opinions of the main players in the insurance industry has been used. The research method had four stages: (1) a survey of the specialist life insurance salesforce to identify the variables; (2) ranking of the variables by selected experts through a researcher-made questionnaire; (3) a panel of experts aimed at understanding the mutual effects of the variables; and (4) statistical analysis of the cross-impact matrix in MICMAC software. The integrated analysis of the variables influencing the future was carried out with the structural analysis method, one of the efficient and innovative methods of futures research. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives selected by snowball sampling. In order to prioritize and identify the most important issues, all of the issues raised were sent, via a researcher-made questionnaire, to experts selected theoretically. The respondents scored the importance of 36 variables so that the opportunity and challenge variables could be prioritized. Eight of the variables identified in the first stage were removed by the selected experts, leaving 28 variables for examination in the third stage; to facilitate the examination, these were divided into six categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1). The reliability of the researcher-made questionnaire was confirmed with a Cronbach's alpha of 0.96. In the third stage, a panel of five insurance industry experts reached a consensus about the influence of the factors on each other, and the ranking of the variables was entered into the matrix. The matrix included the interrelationships of the 28 variables, which were investigated using the structural analysis method. Analysis of the matrix data with MICMAC software indicates that "correct training in the use of the software", "the weakness of insurance companies' technology in personalizing products", "adopting a customer-equipping approach", and "honesty in declaring when the customer does not need insurance" are the most important influencing challenges, while "a salesforce-equipping approach", "product personalization based on customer needs assessment", "the customer's pleasant experience of being advised by consulting robots", "business improvement of the insurance company due to the use of these tools", "increased efficiency of the issuance process", and "optimal customer purchase" were identified as the most important influencing opportunities.

Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation

Procedia PDF Downloads 33
15854 Quantification of Site Nonlinearity Based on HHT Analysis of Seismic Recordings

Authors: Ruichong Zhang

Abstract:

This study proposes a recording-based approach to characterize and quantify earthquake-induced site nonlinearity, exemplified by soil nonlinearity and/or liquefaction. As an alternative to Fourier spectral analysis (FSA), the paper introduces time-frequency analysis of earthquake ground motion recordings with the aid of the Hilbert-Huang transform (HHT) and offers justification for the HHT in addressing the nonlinear features shown in the recordings. Using the 2001 Nisqually earthquake recordings, this study shows that the proposed approach is effective in characterizing site nonlinearity and quantifying its influence on seismic ground responses.
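The Hilbert spectral analysis step of HHT can be sketched as follows. In a full HHT the recording is first split by empirical mode decomposition (EMD) into intrinsic mode functions, and each of these is analyzed this way; the chirp below is a synthetic stand-in, not a Nisqually recording.

```python
import numpy as np

# Toy signal whose frequency drifts upward over time, mimicking the
# time-varying content that HHT exposes and Fourier analysis smears.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * (1.0 + 0.2 * t) * t)     # instantaneous freq 1 + 0.4 t

# Analytic signal via the frequency-domain construction of the Hilbert
# transform (zero out negative frequencies, double positive ones).
n = len(x)
X = np.fft.fft(x)
h = np.zeros(n)
h[0] = 1.0
h[1:n // 2] = 2.0
h[n // 2] = 1.0
analytic = np.fft.ifft(X * h)

amp = np.abs(analytic)                           # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)    # instantaneous frequency, Hz
```

Plotting `inst_freq` against time gives the Hilbert spectrum of this single component; a drop in instantaneous frequency of the dominant mode during strong shaking is the kind of signature used to infer site nonlinearity.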

Keywords: site nonlinearity, site amplification, site damping, Hilbert-Huang Transform (HHT), liquefaction, 2001 Nisqually Earthquake

Procedia PDF Downloads 472
15853 Impact Assessment of Climate Change on Water Resources in the Kabul River Basin

Authors: Tayib Bromand, Keisuke Sato

Abstract:

This paper presents the current water balance and a climate change assessment for the Kabul river basin, covering the historical and future impacts of climate change on the different components of water resources and hydrology in the basin. The Kabul river basin in eastern Afghanistan was chosen because of its rapid population growth and land degradation, in order to quantify the potential influence of global climate change on its hydrodynamic characteristics. A lack of observed meteorological data was the main limitation of the present research; the few existing precipitation stations in the plain area of the Kabul basin were selected for comparison with TRMM precipitation records, and the agreement was evaluated as satisfactory based on regression and normal ratio methods. The TRMM daily precipitation and NCEP temperature data sets were therefore applied in the SWAT model to evaluate the water balance for 2008 to 2012. The middle of the twenty-first century (2064) was selected as the target period for assessing the impacts of climate change on hydrology in the Kabul river basin. For this purpose, three emission scenarios (A2, A1B, and B1) and four GCMs (MIROC 3.2 (Med), CGCM 3.1 (T47), GFDL-CM2.0, and CNRM-CM3) were selected to estimate the future initial conditions of the proposed model. The model outputs were compared and calibrated to a satisfactory coefficient of determination (R²), and the hydrodynamic characteristics and precipitation patterns were assessed. The results show that there will be significant impacts on the precipitation pattern, such as decreased snowfall in the mountainous area of the basin in winter due to an increase of 2.9 °C in mean annual temperature, as well as land degradation due to deforestation.
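The normal ratio method mentioned for checking gauge data can be sketched as follows; the station values below are invented for illustration, not Kabul basin measurements.

```python
def normal_ratio(target_annual, neighbor_annual, neighbor_storm):
    """Estimate precipitation at a station with a missing record by
    weighting each neighboring gauge's observation by the ratio of the
    target's normal annual precipitation to the neighbor's.

    target_annual:   normal annual precipitation at the target station (mm)
    neighbor_annual: normal annual precipitation at each neighbor (mm)
    neighbor_storm:  event precipitation observed at each neighbor (mm)
    """
    n = len(neighbor_annual)
    return sum(target_annual / Na * Pa
               for Na, Pa in zip(neighbor_annual, neighbor_storm)) / n

# Three hypothetical neighboring gauges:
est = normal_ratio(800.0, [700.0, 900.0, 1000.0], [20.0, 30.0, 25.0])
```

The same ratio idea underlies comparing gauge records against gridded products such as TRMM when station normals differ.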

Keywords: climate change, emission scenarios, hydrological components, Kabul river basin, SWAT model

Procedia PDF Downloads 448
15852 Looking beyond Corporate Social Responsibility to Sustainable Development: Conceptualisation and Theoretical Exploration

Authors: Mercy E. Makpor

Abstract:

The traditional idea of Corporate Social Responsibility (CSR) has gone beyond merely ensuring safe environments, caring about global warming, and securing good living standards and conditions for society at large. The paradigm has shifted toward a focus on strategic objectives and long-term value creation for both businesses and society for a realistic future. As an important approach to solving social and environmental issues, CSR has been accepted globally, yet it is expected to go beyond where it currently stands: much is expected from businesses and from governments at every level, globally and locally. This leads back to the original idea of the concept, that is, how it originated and how it has been perceived over the years; little wonder there have been many definitions of the concept without a single globally accepted one. The definition of CSR given by the European Commission is adopted for the purposes of this paper. Sustainable Development (SD), on the other hand, has been viewed in recent years as an ethical concept, explained in the UN report "Our Common Future" (also known as the Brundtland report), which summarizes the need for SD to take place in the present without compromising the future. The more recent twenty-first-century framework on sustainability, the Triple Bottom Line (TBL), has added its voice to the concepts of CSR and sustainable development. The TBL model holds that businesses should report not only on their financial performance but also on their social and environmental performance, highlighting that CSR has moved beyond a "material-impact" approach toward a "future-oriented" approach (sustainability). In this paper, the concept of CSR is revisited by exploring the various theories therein. The discourse on the concepts of sustainable development and sustainable development frameworks is also presented, showing how CSR can benefit businesses, their stakeholders, and society as a whole, not just in the present but in the future. The paper does this by exploring the importance of both concepts (CSR and SD) and concludes by making recommendations for more empirical research in the near future.

Keywords: corporate social responsibility, sustainable development, sustainability, triple bottom line model

Procedia PDF Downloads 234
15851 Considering Partially Developed Artifacts in Change Impact Analysis Implementation

Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim

Abstract:

It is important to manage changes in software to meet the evolving needs of the customer: accepting too many changes delays completion and incurs additional cost. One type of information that supports this decision comes from change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed and use the class artifact as the source of analysis. However, these assumptions are impractical for impact analysis during the software development phase, when some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.

Keywords: software development, impact analysis, traceability, static analysis

Procedia PDF Downloads 594
15850 Integration of Technology in Business Education: Emerging Voices from Business Education Classrooms in Nigeria Secondary Schools

Authors: Clinton Chidiebere Anyanwu

Abstract:

Secondary education is a vital part of a virtuous circle of economic growth within the context of a globalised knowledge economy. The teaching of Business Education entails teaching learners the essentials, rudiments, assumptions, and methods of business. Hence, this study investigated technology integration in Business Education. Drawing on the theoretical frameworks of technological pedagogical content knowledge (TPACK) and the unified theory of acceptance and use of technology (UTAUT), the study observed teachers' level of technology use in Business Education classrooms. Using a mixed-methods sequential explanatory design with probability and purposive sampling, the majority of participants were found not to be integrating technology to an acceptable level, while a small percentage were. An analysis of the UTAUT constructs attributed some of this to the lack of facilitating conditions in the teaching and learning of Business Education. The implication of these findings is that poor investment in technology integration in secondary schools in Nigeria affects pedagogical implementation and the effective teaching and learning of Business Education subjects. The study concludes that if facilitating conditions and professional development are put in place to address the shortfalls in terms of TPACK, technology integration will become a reality in secondary schools in Nigeria.

Keywords: business education, secondary education, technology integration, TPACK, UTAUT

Procedia PDF Downloads 194
15849 Radioactivity Assessment of Sediments in Negombo Lagoon Sri Lanka

Authors: H. M. N. L. Handagiripathira

Abstract:

The distributions of naturally occurring and anthropogenic radioactive materials were determined in surface sediments taken at 27 locations along the bank of Negombo Lagoon in Sri Lanka. Hydrographic parameters of the lagoon water and grain size analyses of the sediment samples were also carried out for this study. The conductivity of the adjacent water varied from 13.6 mS/cm near the southern end of the lagoon to 55.4 mS/cm near the northern end, and salinity correspondingly varied from 7.2 psu to 32.1 psu. The average pH of the water was 7.6, and the average water temperature was 28.7 °C. The grain size analysis gave the mass fractions of the samples at the sampling locations as sand (60.9%), fine sand (30.6%), and fine silt+clay (1.3%). Surface sediment samples of 1 kg wet weight each, taken from the upper 5-10 cm layer, were oven dried at 105 °C for 24 hours to a constant weight, homogenized, and sieved through a 2 mm sieve (IAEA Technical Series No. 295). The radioactivity concentrations were determined by gamma spectrometry using an ultra-low-background broad-energy high-purity Ge detector (BEGe, Model BE5030, Canberra) with Canberra's Laboratory Sourceless Calibration Software (LabSOCS) mathematical efficiency calibration approach and Geometry Composer software. The mean activity concentrations were found to be 24 ± 4, 67 ± 9, 181 ± 10, 59 ± 8, 3.5 ± 0.4, and 0.47 ± 0.08 Bq/kg for 238U, 232Th, 40K, 210Pb, 235U, and 137Cs, respectively. The mean absorbed dose rate in air, radium equivalent activity, external hazard index, annual gonadal dose equivalent, and annual effective dose equivalent were 60.8 nGy/h, 137.3 Bq/kg, 0.4, 425.3 μSv/year, and 74.6 μSv/year, respectively. The results of this study provide baseline information on the natural and artificial radioactive isotopes and on environmental pollution, together with information on radiological risk.
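The derived radiological indices reported above follow standard formulas; the sketch below uses the commonly cited UNSCEAR-style coefficients (an assumption about the authors' exact method) and approximately reproduces the reported means from the 238U, 232Th, and 40K concentrations.

```python
def absorbed_dose_rate(u238, th232, k40):
    """Absorbed gamma dose rate in air, nGy/h, from activity
    concentrations in Bq/kg (UNSCEAR 2000 coefficients)."""
    return 0.462 * u238 + 0.604 * th232 + 0.0417 * k40

def radium_equivalent(u238, th232, k40):
    """Radium equivalent activity, Bq/kg."""
    return u238 + 1.43 * th232 + 0.077 * k40

def annual_effective_dose(d_ngy_per_h, occupancy=0.2, conv=0.7):
    """Outdoor annual effective dose, microSv/y, assuming 0.7 Sv/Gy
    conversion and 20% outdoor occupancy."""
    return d_ngy_per_h * 8760 * occupancy * conv / 1000.0

D = absorbed_dose_rate(24, 67, 181)       # close to the reported 60.8 nGy/h
Ra_eq = radium_equivalent(24, 67, 181)    # close to the reported 137.3 Bq/kg
aede = annual_effective_dose(60.8)        # ~74.6 microSv/y, as reported
```

The small residual gap between these estimates and the reported values likely reflects additional nuclides (e.g. 210Pb) or slightly different coefficients in the original calculation.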

Keywords: gamma spectrometry, lagoon, radioactivity, sediments

Procedia PDF Downloads 126
15848 Neuron-Based Control Mechanisms for a Robotic Arm and Hand

Authors: Nishant Singh, Christian Huyck, Vaibhav Gandhi, Alexander Jones

Abstract:

A robotic arm and hand controlled by simulated neurons is presented. The robot makes use of a biological neuron simulator with a point neuron model. The neurons and synapses are organised to create a finite state automaton, with neural inputs from sensors and outputs to effectors, and the robot performs a simple pick-and-place task. This work is a proof-of-concept study for a longer-term approach; it is hoped that further work will lead to more effective and flexible robots and, as a further benefit, to a better understanding of human and other animal neural processing, particularly for physical motion. This is a multidisciplinary approach combining cognitive neuroscience, robotics, and psychology.

Keywords: cell assembly, force sensitive resistor, robot, spiking neuron

Procedia PDF Downloads 337
15847 Clubhouse: A Minor Rebellion against the Algorithmic Tyranny of the Majority

Authors: Vahid Asadzadeh, Amin Ataee

Abstract:

Since the advent of social media, there has been a wave of optimism among researchers and civic activists about the influence of virtual networks on the democratization process, an optimism that has gradually waned. One of the lesser-known concerns is how to increase the chance that the voices of different minorities are heard. According to the theory of media logic, the media, using their technological capabilities, act as a structure through which events and ideas are interpreted. Social media, through machine learning and the use of algorithms, has formed a kind of structure in which the voices of minorities and less popular topics are lost in the commotion of trends. In fact, the recommender systems and algorithms used in social media are designed to help promote trends and make popular content more popular, while content that belongs to minorities is constantly marginalized. As social networks gradually play a more active role in politics, the possibility of freely participating in the reproduction and reinterpretation of structures in general, and political structures in particular (as Laclau and Mouffe had in mind), can be considered a criterion for democracy in action. The point is that the media logic of virtual networks is shaped by the rule, and even the tyranny, of the majority, and this logic does not make it possible to design a self-founding and self-revolutionary model of democracy. In other words, today's social networks, though seemingly full of variety, are governed by a logic of homogeneity and do not allow for the multiplicity envisioned in immanent radical democracies (influenced by Gilles Deleuze). However, with the emergence and increasing popularity of Clubhouse as a new social medium, there seems to be a shift in the social media space: a diminishing role for algorithms and recommender systems as content delivery interfaces. As a result, in Clubhouse the voices of minorities are better heard, and the diversity of political tendencies manifests itself more fully. The purpose of this article is to show, first, how social networks serve the elimination of minorities in general, and second, to argue that the media logic of social networks must adapt to new interpretations of democracy that give more space to minorities and human rights. Finally, the article shows how Clubhouse serves these new interpretations of democracy, at least in a minimal way. To achieve these goals, the article uses a descriptive-analytical method: first, the relation between media logic and postmodern democracy is examined; then the political economy of popularity in social media and its conflict with democracy is discussed; and finally, it is explored how Clubhouse provides a new horizon for the concepts embodied in radical democracy, a horizon that more effectively serves the rights of minorities and human rights in general.

Keywords: algorithmic tyranny, Clubhouse, minority rights, radical democracy, social media

Procedia PDF Downloads 136
15846 Educational Institutional Approach for Livelihood Improvement and Sustainable Development

Authors: William Kerua

Abstract:

The PNG University of Technology (Unitech) has a mandate for teaching, research, and extension education. Given this function, the Agriculture Department established the South Pacific Institute of Sustainable Agriculture and Rural Development (SPISARD) in 2004. SPISARD serves as a vehicle for improving the farming systems practiced in selected villages by undertaking a pluralistic extension method through an 'educational institutional approach'. Unlike other models, SPISARD's educational institutional approach stresses improving whole farming systems in a holistic manner and has a two-fold focus. The first is to understand the farming communities and improve the productivity of their farming systems in a sustainable way, so as to increase income, improve nutrition and food security, and provide livelihood enhancement training. The second is to enrich the Department's curriculum through teaching, research, and extension, and by drawing inputs from the farming community. SPISARD has established a number of model villages in various provinces of Papua New Guinea (PNG), with many positive outcomes and success stories. The educational institutional approach thus binds research, extension, and training into one package, using students and academic staff to deliver development and extension to communities through the establishment of model villages. The centre coordinates the activities of the model village programs and their linkages; the key to developing the farming systems is establishing and coordinating linkages, collaboration, and partnerships both within and beyond institutions, organizations, and agencies. SPISARD follows a six-step strategy for the development of sustainable agriculture and rural development: (i) establish contact and identify model villages; (ii) develop model village resource centres for research and training; (iii) conduct baseline surveys to identify the problems and needs of the model villages; (iv) develop solution strategies; (v) implement them; and (vi) evaluate the impact of the solution programs. SPISARD envisages that the farming systems practiced will improve if the villages are made the centre of SPISARD activities; it has therefore developed a model village approach to channel rural development. Once established, the model villages become the conduit points where teaching, training, research, and technology transfer take place. This approach is different from and unique among existing ones in that the development process takes place in the farmers' environment, with immediate 'real-time' feedback based on the farmers' perspectives and satisfaction. So far, 14 model villages have been developed and 75 training sessions on 21 different topics have been conducted in 8 provinces, reaching a total of 2,832 participants of both sexes. The aim of this training is to participate directly with farmers in improving their farming systems to increase productivity and income, secure food security and nutrition, and thus improve their livelihoods.

Keywords: development, educational institutional approach, livelihood improvement, sustainable agriculture

Procedia PDF Downloads 146
15845 Single Pole-To-Earth Fault Detection and Location on the Tehran Railway System Using ICA and PSO Trained Neural Network

Authors: Masoud Safarishaal

Abstract:

Detecting the location of pole-to-earth faults is essential for the safe operation of a railroad's electrical system. This paper uses a combination of evolutionary algorithms and neural networks to increase the accuracy of single pole-to-earth fault detection and location on the Tehran railroad power supply system. The Imperialist Competitive Algorithm (ICA) and Particle Swarm Optimization (PSO) are used to train the neural network, improving the accuracy and convergence of the learning process. Owing to the system's nonlinearity, fault detection is an ideal application for the proposed method; the 600 Hz harmonic ripple method is used for fault detection in this paper. The substations were simulated considering various feeding configurations, the transformer, and typical Tehran metro parameters, including the silicon rectifiers. The data required for the network learning process were gathered from the simulation results. Because the 600 Hz component value changes with the location of a single pole-to-earth fault, the 600 Hz components are used as inputs to the neural network, with the fault location as the network's output. The simulation results show that the proposed methods can accurately predict the fault location.
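The PSO training step described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' Tehran metro simulation: the network size, swarm parameters, and the assumed monotone relationship between 600 Hz ripple magnitude and fault distance are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 600 Hz ripple magnitude vs. fault distance (km).
# In the paper these come from substation simulations; here we synthesize a
# monotone relationship purely for illustration.
x = np.linspace(0.1, 1.0, 40).reshape(-1, 1)
y = 10.0 * x**0.7 + 0.05 * rng.standard_normal((40, 1))

def nn_predict(w, x, hidden=5):
    """One-hidden-layer network; w is a flat parameter vector of length 3*hidden+1."""
    w1 = w[:hidden].reshape(1, hidden)
    b1 = w[hidden:2 * hidden]
    w2 = w[2 * hidden:3 * hidden].reshape(hidden, 1)
    b2 = w[3 * hidden]
    h = np.tanh(x @ w1 + b1)
    return h @ w2 + b2

def mse(w):
    return float(np.mean((nn_predict(w, x) - y) ** 2))

# Minimal global-best PSO over the network weights.
n_particles, dim, iters = 30, 16, 200
pos = rng.standard_normal((n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_f = np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("train MSE after PSO:", mse(gbest))
```

The ICA variant would replace the velocity update with imperialist-colony assimilation steps; the fitness function and network stay the same.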

Keywords: single pole-to-earth fault, Tehran railway, ICA, PSO, artificial neural network

Procedia PDF Downloads 103
15844 Alternative General Formula to Estimate and Test Influences of Early Diagnosis on Cancer Survival

Authors: Li Yin, Xiaoqin Wang

Abstract:

Background and purpose: Cancer diagnosis is part of a complex stochastic process in which patients' personal and social characteristics influence the choice of diagnosing methods; diagnosing methods, in turn, influence the initial assessment of the cancer stage; the initial assessment influences the choice of treating methods; and treating methods, in turn, influence cancer outcomes such as survival. To evaluate diagnosing methods, one needs to estimate and test the causal effect of a regime of cancer diagnosis and treatments. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula which expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to estimate and test these causal effects via point effects. The purpose of this work is to estimate and test causal effects under various regimes of cancer diagnosis and treatments via point effects. Challenges and solutions: The cancer stage is influenced by earlier diagnosis and in turn influences subsequent treatments. As a consequence, it is highly difficult to estimate and test the causal effects via standard parameters, that is, the conditional survival given all stationary covariates, diagnosing methods, cancer stage, prognosis factors, and treating methods. Instead of standard parameters, we use the point effects of cancer diagnosis and treatments to estimate and test causal effects under various regimes, which allows us to use familiar methods from the framework of single-point causal inference. Achievements: We have applied this method to stomach cancer survival data from a clinical study in Sweden. We have studied causal effects under various regimes, including the optimal regime of diagnosis and treatments, and the moderation of the causal effect by age and gender.
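The idea of expressing a regime's effect through point effects can be illustrated in the simplest single-point case: standardizing an outcome regression over a confounder (the g-formula). The toy simulation below, with a binary prognostic factor, a diagnosis indicator, and a survival indicator, is purely illustrative; it is not the authors' sequential formula or their Swedish data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Illustrative single-point setting (invented, not the stomach-cancer study):
# L = prognostic factor, A = early-diagnosis indicator influenced by L,
# Y = survival indicator influenced by both A and L.
L = rng.binomial(1, 0.4, n)
A = rng.binomial(1, 0.2 + 0.5 * L, n)
Y = rng.binomial(1, 0.3 + 0.2 * A + 0.3 * L, n)

def gformula_mean(a):
    """Standardized mean of Y under 'set A = a', averaged over the L-distribution."""
    total = 0.0
    for l in (0, 1):
        p_l = np.mean(L == l)
        mean_y = Y[(A == a) & (L == l)].mean()  # outcome model: stratum mean
        total += p_l * mean_y
    return total

effect = gformula_mean(1) - gformula_mean(0)
print("g-formula effect estimate:", effect)  # true effect is 0.2 by construction
```

A naive comparison of Y between A = 1 and A = 0 would be biased here, because L raises both the probability of early diagnosis and survival; standardization removes that confounding.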

Keywords: cancer diagnosis, causal effect, point effect, G-formula, sequential causal effect

Procedia PDF Downloads 181
15843 “Everything, Everywhere, All at Once” Hollywoodization and Lack of Authenticity in Today’s Mainstream Cinema

Authors: Haniyeh Parhizkar

Abstract:

When Sarris proposed the "auteur theory" in 1962, he emphasized that its utmost premise is the inner meaning and concepts of a film, and that a film is purely an art form. Today's mainstream movies are conceptually closer to what the Frankfurt School scholars regarded years ago as "reproduced" "mass culture". Hollywood remains a huge movie-making machine that sets the dominant paradigms of film throughout the world, and cinema is far from art. Although there are still movies, directors, and audiences who favor art cinema over Hollywood and mainstream movies, it is an almost undeniable fact that, for the most part, people's perception of movies is widely influenced by their American depiction and Hollywood's legacy of mass culture. With the rise of Hollywood studios as the forerunners of the movie industry, and with cinema largely dependent on economics rather than artistic values, this distinctive role of cinema has diminished and been replaced by a global standard. The blockbuster 2022 film 'Everything, Everywhere, All at Once' is now the most-awarded movie of all time, winning seven Oscars at the 95th Academy Awards. Despite its mainly Asian cast, the movie was produced by an American company and is heavily influenced by Hollywood's dominant themes of superheroes, fantasy, action, and adventure. The New Yorker film critic Richard Brody called the movie "a pitch for a Marvel" and critiqued it for being "universalized" and "empty of history and culture". Critics at Variety likewise pinpointed the movie's similarities to Marvel, particularly the multiverse storylines that manifest traces of the American legacy. As these critics argue, 'Everything, Everywhere, All at Once' might appear a unique and authentic film at first glance, but it can be argued that it is yet another version of a Marvel movie.
While the movie's universal acclaim was regarded as recognition and acknowledgment of its Asian cast, the issue that arises is this: when Hollywood influences and American themes are so pervasive in the film, is the movie industry honoring another culture, or is it yet another celebration of Hollywood's dominant paradigm? This essay takes a critical approach to Hollywood's dominance and mass-produced culture, which has deprived non-American movies of authenticity and constantly reproduces the same formula of success.

Keywords: hollywoodization, universalization, blockbuster, dominant paradigm, marvel, authenticity, diversity

Procedia PDF Downloads 72
15842 Students’ Level of Knowledge Construction and Pattern of Social Interaction in an Online Forum

Authors: K. Durairaj, I. N. Umar

Abstract:

The asynchronous discussion forum is one of the most widely used activities in learning management system environments. An online forum allows participants to interact and construct knowledge, and can be used to complement face-to-face sessions in blended learning courses. However, the extent to which students perceive the benefits of the forum remains to be seen. Through content and social network analyses, instructors can gauge the students' engagement and level of knowledge construction. Thus, this study aims to analyze the students' level of knowledge construction and their participation level in online discussion. It also investigates the relationship between the level of knowledge construction and social interaction patterns. The sample involves 23 students undertaking a master's course in a public university in Malaysia. The asynchronous discussion forum was conducted for three weeks as part of the course requirements. The findings indicate that the level of knowledge construction is quite low. The density value of 0.11 also indicates that overall communication among the participants in the forum is low. This study reveals strong and significant correlations between SNA measures (in-degree centrality, out-degree centrality) and the level of knowledge construction. Thus, allocating the active students to different groups would help interactive discussion take place. Finally, based upon the findings, some recommendations to increase students' level of knowledge construction, as well as directions for further research, are proposed.
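The density and degree-centrality measures used above can be computed directly from a forum's reply log. The sketch below uses a made-up six-edge reply network among five hypothetical students, not the study's data.

```python
# Reply edges: (poster, person replied to), as one might export from a forum log.
# These five students and six replies are invented for illustration.
edges = [("s1", "s2"), ("s2", "s1"), ("s3", "s1"), ("s1", "s4"),
         ("s4", "s2"), ("s5", "s1")]
students = sorted({u for e in edges for u in e})
n = len(students)

# Density of a directed network: observed ties / possible ties n*(n-1).
density = len(set(edges)) / (n * (n - 1))

# Normalized in-/out-degree centrality: ties divided by the n-1 possible ties.
out_deg = {s: sum(1 for u, _ in edges if u == s) / (n - 1) for s in students}
in_deg = {s: sum(1 for _, v in edges if v == s) / (n - 1) for s in students}

print("density:", round(density, 2))
print("most-replied-to participant:", max(in_deg, key=in_deg.get))
```

A density of 0.11, as reported in the study, would mean only 11% of all possible directed reply ties actually occurred; the toy network above is denser (0.3) simply because it is small.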

Keywords: asynchronous discussion forums, content analysis, knowledge construction, social network analysis

Procedia PDF Downloads 358
15841 Corporate Governance Reforms in a Developing Economy: Making a Case for Upstream and Downstream Interventions

Authors: Franklin Nakpodia, Femi Olan

Abstract:

A blend of internal factors (firm performance, internal stakeholders) and external pressures (globalisation, technology, corporate scandals) have intensified calls for corporate governance reforms. While several countries and their governments have responded to these calls, the effect of such reforms on corporate governance systems across countries remains mixed. In particular, the literature reports that the effectiveness of corporate governance interventions in many developing economies is limited. Relying on the corporate governance system in Africa’s largest economy (Nigeria), this research addresses two issues. First, this study explores why previous corporate governance reforms have failed and second, the article investigates what reforms could improve corporate governance practices in the country. In addressing the above objectives, this study adopts a qualitative approach that permits data collection via semi-structured interviews with 21 corporate executives. The data supports the articulation of two sequential levels of reforms (i.e., the upstream and downstream reforms). The upstream reforms focus on two crucial but often overlooked areas that undermine reform effectiveness, i.e., the extent of government commitment and an enabling environment. The downstream reforms combine awareness and regulatory elements to proffer a path to robust corporate governance in the country. Furthermore, findings from this study stress the need to consider the use of a bottom-up approach to corporate governance practice and policymaking in place of the dominant top-down strategy.

Keywords: bottom-up approach, corporate governance, reforms, regulation

Procedia PDF Downloads 184
15840 Genotoxicity of 4-Nonylphenol (4NP) on Oreochromis spilurus Fish

Authors: M. M. Alsharif

Abstract:

The compound 4-nonylphenol is widely used as an ingredient in detergents, paints, insecticides, and many other products. The presence of this compound is known to elicit estrogenic responses in mammals, birds, and fish, and it is classified as a pollutant because it disrupts the endocrine glands. Previous studies have shown that this compound is present in the water and sediments of the Red Sea coast at Jeddah, near the outlets of processed drainage water and near the discharge site of paper factory residues. This study therefore aimed to evaluate the cytogenetic aberrations caused by 4-nonylphenol by exposing tilapia fish to aqueous solutions of the compound at 0, 15, and 30 micrograms/liter for one month. Samples of gills and liver were collected to assess micronuclei and nuclear abnormalities and to measure the DNA and RNA content of the treated fish. The results indicate a significant increase in the number of micronuclei in fish exposed to these concentrations compared to the control group. Exposure to 4-nonylphenol also increased the amounts of both DNA and RNA compared to the control group. There is a positive correlation between the dose of the compound and the cytogenetic effects induced in tilapia fish in Jeddah (i.e., a dose-dependent effect). Therefore, the micronucleus test and DNA and RNA contents can be considered indices of cumulative exposure, and they appear to be sensitive measures for evaluating the genotoxic effects of 4-nonylphenol on fish.

Keywords: genotoxic, 4-nonylphenol, micronuclei, fish, DNA, RNA

Procedia PDF Downloads 291
15839 A Rapid and Cost-Effective Approach to Manufacturing Modeling Platform for Fused Deposition Modeling

Authors: Chil-Chyuan Kuo, Chen-Hsuan Tsai

Abstract:

This study presents a cost-effective approach for rapidly fabricating the modeling platforms used in a fused deposition modeling system. A small batch of about 20 modeling platforms can be produced economically through a silicone rubber mold using vacuum casting, without resorting to plastic injection molding. The air venting system is crucial when fabricating modeling platforms by vacuum casting. After sandblasting, the fabricated platforms can be used for building rapid prototyping models. This study offers industrial value because the approach is both time- and cost-effective.

Keywords: vacuum casting, fused deposition modeling, modeling platform, sandblasting, surface roughness

Procedia PDF Downloads 368
15838 Exploring the Use of Adverbs in Two Young Learners' Written Corpora

Authors: Chrysanthi S. Tiliakou, Katerina T. Frantzi

Abstract:

Writing has always been considered one of the most demanding skills for learners of English as a Foreign Language, as well as for native speakers. Novice foreign language writers must handle a limited range of vocabulary to produce writing tasks at the lower levels. Adverbs are a part of speech that is not used extensively in the early stages of English as a Foreign Language writing. An additional problem with learning new adverbs is that, besides learning their meanings, learners are expected to acquire the proper placement of adverbs in a sentence. The use of adverbs is important, as they add "expressive richness to one's message". By exploring the patterns of adverb use, researchers and educators can identify the types of adverbs that are more taxing for young learners, or that puzzle novice English as a Foreign Language writers with their placement, and focus on teaching them. To this end, this study examines the use of adverbs in two written corpora of young learners of English at A1-A2 levels and determines the types of adverbs used, their frequencies, problems in their use, and whether there is any differentiation between levels. The AntConc concordancing tool was used for the Greek learner corpus, and the Corpuscle concordancing tool for the Norwegian corpus. The research found that the normalized frequencies of the adverbs used in the A1-A2 level Greek learner corpus were similar to the frequencies of the same adverbs in the Norwegian learner corpus.
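Normalized frequencies of the kind compared above are computed as occurrences per fixed number of tokens, so that corpora of different sizes can be compared. The sketch below uses two invented toy texts and a small closed adverb list, standing in for the POS-tagged learner corpora and concordancers used in the study.

```python
from collections import Counter

# Toy learner texts (invented); a closed adverb list stands in for POS tagging.
corpus_a = "i really like my dog . he is very very fast".split()
corpus_b = "she always sings . she sings well and often".split()
adverbs = {"really", "very", "always", "well", "often", "never"}

def normalized_freq(tokens, per=1000):
    """Adverb occurrences per `per` tokens, making different-sized corpora comparable."""
    counts = Counter(t for t in tokens if t in adverbs)
    return {w: c * per / len(tokens) for w, c in counts.items()}

print(normalized_freq(corpus_a))
print(normalized_freq(corpus_b))
```

Raw counts alone would mislead here: a corpus twice as large will show roughly twice as many adverbs even if learners use them at the same rate, which is why both corpora in the study are compared on normalized frequencies.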

Keywords: learner corpora, young learners, writing, use of adverbs

Procedia PDF Downloads 80
15837 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification

Authors: Bing Li, Zhi Li, Yilong Yang

Abstract:

Web service classification can improve the quality of service discovery and management in service repositories and is widely used to locate the services developers desire. Although traditional classification methods based on supervised learning models can perform classification tasks, they require developers to tag web services manually, and the quality of these tags may not be sufficient to establish an accurate classifier for service classification. With the number of web services doubling, manual tagging has become unrealistic. In recent years, the attention mechanism has made remarkable progress in deep learning, and its huge potential has been fully demonstrated in various fields. This paper presents a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to neighborhood nodes without complicated matrix operations or reliance on understanding the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network, and it can classify web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
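The neighborhood-weighting idea can be sketched as a single attention head over a tiny graph; a multi-head version runs several such heads in parallel and concatenates their outputs. The graph, features, and scoring function below are invented for illustration and are not the authors' MHAGCN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny hypothetical service graph: 4 services, 8-dim features, adjacency with
# self-loops (1 where services are linked, e.g. by shared tags or co-invocation).
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=float)
X = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 8)) / np.sqrt(8)   # shared linear transform
a = rng.standard_normal(16)                    # attention scoring vector

def attention_head(X, A, W, a):
    """One head: score each edge, softmax over neighbours, aggregate features."""
    H = X @ W
    n = H.shape[0]
    scores = np.full((n, n), -np.inf)           # -inf masks non-neighbours
    for i in range(n):
        for j in range(n):
            if A[i, j]:                         # only neighbours get a weight
                scores[i, j] = np.tanh(a @ np.concatenate([H[i], H[j]]))
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)    # each row sums to 1 over neighbours
    return alpha @ H

out = attention_head(X, A, W, a)
print(out.shape)
```

Because each node attends only to its own neighbours, the learned weights alpha can be read off per edge, which is one source of the interpretability the abstract mentions.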

Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery

Procedia PDF Downloads 123
15836 Trajectory Tracking of a 2-Link Mobile Manipulator Using Sliding Mode Control Method

Authors: Abolfazl Mohammadijoo

Abstract:

In this paper, we investigate a sliding mode control approach for trajectory tracking of a two-link manipulator mounted on a wheeled mobile robot base. The main challenge of this work is the dynamic interaction between the mobile base and the manipulator, which makes trajectory tracking more difficult than for n-link manipulators with a fixed base. Another challenging part of this work is avoiding the chattering phenomenon of sliding mode control, which can seriously damage actuators in real industrial applications. The results show the effectiveness of the sliding mode control approach in tracking the desired trajectory.
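The standard boundary-layer remedy for chattering can be illustrated on a one-degree-of-freedom plant. The plant, gains, and disturbance below are invented for the example and are far simpler than the paper's coupled mobile-base/manipulator dynamics; the disturbance term merely stands in for the unmodelled base-arm interaction.

```python
import numpy as np

# Sketch: plant x'' = u + d with bounded disturbance d, tracking x_des = sin(t).
dt, steps = 1e-3, 5000
lam, K, phi = 5.0, 10.0, 0.05   # surface slope, switching gain, boundary layer

def sat(s):
    """Boundary-layer saturation in place of sign(s) to suppress chattering."""
    return np.clip(s / phi, -1.0, 1.0)

x, v = 0.5, 0.0                 # deliberately start off the desired trajectory
for k in range(steps):
    t = k * dt
    xd, vd, ad = np.sin(t), np.cos(t), -np.sin(t)   # desired pos/vel/accel
    e, edot = x - xd, v - vd
    s = edot + lam * e          # sliding surface s = e' + lam*e
    u = ad - lam * edot - K * sat(s)                # equivalent + switching term
    d = 0.3 * np.sin(5 * t)     # bounded unmodelled disturbance, |d| < K
    v += (u + d) * dt           # explicit Euler integration of x'' = u + d
    x += v * dt

print("final tracking error:", abs(x - np.sin(steps * dt)))
```

With sign(s) in place of sat(s), u would switch at every time step inside the boundary layer; the saturation trades a small steady-state error (of order phi/lam) for a smooth control signal, which is the usual chattering-avoidance compromise.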

Keywords: mobile manipulator, sliding mode control, dynamic interaction, mobile robotics

Procedia PDF Downloads 173
15835 A Cross-Cultural Approach for Communication with Biological and Non-Biological Intelligences

Authors: Thomas Schalow

Abstract:

This paper posits the need to take a cross-cultural approach to communication with non-human cultures and intelligences in order to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with a discussion of how intelligence emerges. It disputes some common assumptions we maintain about consciousness, intention, and language. The paper next explores cross-cultural communication among humans, including non-sapiens species. The next argument made is that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. There is an urgent need to broaden our definition of communication and reach out to the other sentient life forms that inhabit our world. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it has proven useful, even in the absence of contact with alien life. However, CETI’s assumptions and methodology need to be revised and based on the cross-cultural approach to communication proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences using a cross-cultural communication approach. This will present a serious challenge for humanity, as we have never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other human cultures can provide us with a framework for this communication. 
The basic assumptions behind intercultural communication can be applied to the many types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will prepare us to face the challenges posed by a future dominated by artificial intelligence.

Keywords: artificial intelligence, CETI, communication, culture, language

Procedia PDF Downloads 341
15834 Diagnostics and Explanation of the Current Status of the 40-Year Railway Viaduct

Authors: Jakub Zembrzuski, Bartosz Sobczyk, Mikołaj Miśkiewicz

Abstract:

Besides designing new structures, engineers all over the world must face another task: the maintenance, repair, and assessment of the technical condition of existing bridges. To solve the more complex issues, it is necessary to be familiar with the theory of the finite element method and to have access to software that provides the tools needed to create sometimes highly advanced numerical models. The paper includes a brief assessment of the technical condition of the viaduct, a description of the in situ non-destructive testing carried out, and the FEM models created for global and local analysis. In situ testing was performed using strain gauges and displacement sensors. Numerical models were created using various software packages and numerical modeling techniques. Particularly noteworthy is the method of modeling the riveted joints of the crossbeam of the viaduct. It is a simplified method that uses only basic numerical tools, such as beam and shell finite elements, constraints, and simplified boundary conditions (fixed supports and symmetry). The results of the numerical analyses are presented and discussed, and it is clearly explained why the structure did not fail despite the complete failure of the deck plate weld. A further research problem that was solved was determining the cause of the rapid increase in values on the stress diagram in the transverse cross-section. These problems were solved using only the aforementioned simplified method of modeling riveted joints, which demonstrates that such problems can be solved without access to sophisticated software for advanced nonlinear analysis. Moreover, the results obtained are of great importance for assessing the behavior of bridge structures with an orthotropic deck plate.

Keywords: bridge, diagnostics, FEM simulations, failure, NDT, in situ testing

Procedia PDF Downloads 58
15833 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hours. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly, few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hour exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hour exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening the final exams to 2.0 hours was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in the lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.

Keywords: exam length, psychometric criteria, synthetic experimental designs, test length

Procedia PDF Downloads 260