Search results for: intergroup recognition

332 Microfacies Analysis and Paleoenvironmental Trends of the Paleocene Farrud and Mabruk Reservoirs, Concession 11, West Sirte Basin, Libya

Authors: Nisreen Agha

Abstract:

Investigation of representative core samples under the petrological microscope reveals common petrographic and mineralogical characteristics with distinct faunal assemblages, allowing the establishment of microfacies associations and the deduction of paleoenvironmental trends for the Paleocene Farrud and Mabruk rock units. The early and post-diagenetic processes are recognized, particularly dolomitization and micritization, as well as dissolution and the precipitation of sparry drusy calcite as a neomorphism process affecting the reservoir rocks. The microfacies trends detected from the investigation of 46 core samples from the Farrud Member (Lower Paleocene), representing six wells (QQQ1-11, GG1-11, LLL1-11, RRR1-11, RRR40-11, and RRR45-11), indicate that deposition started within the realm of shallow supratidal and intertidal subenvironments, followed by deeper shelf-bay environments, with maximum sea level during an inner-shelf stage dominated by fossiliferous bioclastic packstone. The microfacies associations determined in 8 core samples from two wells, LLL1 and RRR40, representing the Mabruk Member (Upper Paleocene), indicate paleoenvironmental trends marked by sea level fluctuations, with relatively open marine shelf-bay conditions interrupted by short-lived shallow intertidal and supratidal warm coastal sedimentation. As a result, dolostone, evaporitic dismicrites, and gypsiferous dolostone of supratidal character were deposited. They reflect rapid oscillation of sea level, marked by a drop and a landward shift of shoreline deposition dominated by supratidal gypsiferous dolostone, with abundant ferruginous material occurring as clouds that stain many parts of the dolomite and surround the micritized fossils. This situation ended the deposition of the Farrud Member in most of the studied wells. On the other hand, the facies in the northern part of the Concession 11 field indicate deposition in a deeper marine setting than the southern facies.

Keywords: Farrud and Mabruk members, Paleocene, microfacies associations, diagenesis, sea level oscillation, depositional environments

Procedia PDF Downloads 76
331 Affective Transparency in Compound Word Processing

Authors: Jordan Gallant

Abstract:

In the compound word processing literature, much attention has been paid to the relationship between a compound’s denotational meaning and that of its whole-word morphological constituents, which is referred to as ‘semantic transparency’. However, the parallel relationship between a compound’s connotation and that of its constituents has not been addressed at all. For instance, while a compound like ‘painkiller’ might be semantically transparent, it is not ‘affectively transparent’. That is, both constituents have primarily negative connotations, while the whole compound has a positive one. This paper investigates the role of affective transparency in compound processing using two methodologies commonly employed in this field: a lexical decision task and a typing task. The critical stimuli were 112 English bi-constituent compounds that differed in terms of the affective transparency of their constituents. Of these, 36 stimuli contained constituents with connotations similar to the compound (e.g., ‘dreamland’), 36 contained constituents with more positive connotations (e.g., ‘bedpan’), and 36 contained constituents with more negative connotations (e.g., ‘painkiller’). The connotations of whole-word constituents and compounds were operationalized via valence ratings taken from an off-line ratings database. In Experiment 1, compound stimuli and matched non-word controls were presented visually to participants, who were asked to indicate whether each was a real word in English. Response times and accuracy were recorded. In Experiment 2, participants typed compound stimuli presented to them visually. Individual keystroke response times and typing accuracy were recorded. The results of both experiments provided positive evidence that compound processing is influenced by affective transparency. In Experiment 1, compounds in which both constituents had more negative connotations than the compound itself were responded to significantly more slowly than compounds in which the constituents had similar or more positive connotations. Typed responses from Experiment 2 showed that inter-keystroke intervals at the morphological constituent boundary were significantly longer when the connotation of the head constituent was either more positive or more negative than that of the compound. The interpretation of this finding is discussed in the context of previous compound typing research. Taken together, these findings suggest that affective transparency plays a role in the recognition, storage, and production of English compound words. This study provides a promising first step in a new direction for research on compound words.

Keywords: compound processing, semantic transparency, typed production, valence

Procedia PDF Downloads 127
330 “It Isn’t a State Problem”: The Minas Conga Mine Controversy and Exemplifying the Need for Binding International Obligations on Corporate Actors

Authors: Cindy Woods

Abstract:

After years of implacable neoliberal globalization, multinational corporations have moved from the periphery to the center of the international legal agenda. Human rights advocates have long called for greater corporate accountability in the international arena. The creation of the Global Compact in 2000, while aimed at fostering greater corporate respect for human rights, did not silence these calls. After multiple unsuccessful attempts to adopt a set of norms relating to the human rights responsibilities of transnational corporations, the United Nations succeeded in 2008 with the Guiding Principles on Business and Human Rights (Guiding Principles). The Guiding Principles, praised by some within the international human rights community for their recognition of an individual corporate responsibility to respect human rights, have not escaped their share of criticism. Many view the Guiding Principles to be toothless, failing to directly impose obligations upon corporations, and call for binding international obligations on corporate entities. After decades of attempting to promulgate human rights obligations for multinational corporations, the existing legal frameworks in place fall short of protecting individuals from the human rights abuses of multinational corporations. The Global Compact and Guiding Principles are proof of the United Nations’ unwillingness to impose international legal obligations on corporate actors. In June 2014, the Human Rights Council adopted a resolution to draft international legally binding human rights norms for business entities; however, key players in the international arena have already announced they will not cooperate with such efforts. This Note, through an overview of the existing corporate accountability frameworks and a study of Newmont Mining’s Minas Conga project in Peru, argues that binding international human rights obligations on corporations are necessary to fully protect human rights. Where states refuse to or simply cannot uphold their duty to protect individuals from transnational businesses’ human rights transgressions, there must exist mechanisms to pursue justice directly against the multinational corporation.

Keywords: business and human rights, Latin America, international treaty on business and human rights, mining, human rights

Procedia PDF Downloads 499
329 Co-produced Databank of Tailored Messages to Support Engagement with Digital Health Interventions

Authors: Menna Brown, Tania Domun

Abstract:

Digital health interventions are effective across a wide array of health conditions spanning physical health, lifestyle behaviour change, and mental health and wellbeing; furthermore, they are rapidly increasing in volume both in the academic literature and in society, as commercial apps continue to proliferate in the digital health market. However, adherence and engagement with digital health interventions remain problematic. Technology-based personalised and tailored reminder strategies can support engagement with digital health interventions. Interventions which support individuals’ mental health and wellbeing are of critical importance in the wake of the COVID-19 pandemic. Student and young people’s mental health has been negatively affected, and digital resources continue to offer a cost-effective means to address wellbeing at a population level. The aim was to develop a databank of digital, co-produced tailored messages to support engagement with a range of digital health interventions, including those focused on mental health and wellbeing and on lifestyle behaviour change. A qualitative research design was used. Participants discussed their views of health and wellbeing, and of engagement and adherence with digital health interventions, focused around a 12-week wellbeing intervention, via a series of focus group discussions. They worked together to co-create content following a participatory design approach. Three focus group discussions were facilitated with (n=15) undergraduate students at one Welsh university to provide an empirically derived, co-produced databank of (n=145) tailored messages. Messages were explored and categorised thematically, and the following ten themes emerged: Autonomy, Recognition, Guidance, Community, Acceptance, Responsibility, Encouragement, Compassion, Impact and Ease. The findings provide empirically derived, co-produced tailored messages. These have been made available for use via ‘ACTivate your wellbeing’, a digital, automated, 12-week health and wellbeing intervention programme based on acceptance and commitment therapy (ACT). The purpose is to support future research evaluating the impact of thematically categorised tailored messages on engagement and adherence with digital health interventions.

Keywords: digital health, engagement, wellbeing, participatory design, positive psychology, co-production

Procedia PDF Downloads 121
328 Simulation of Elastic Bodies through Discrete Element Method, Coupled with a Nested Overlapping Grid Fluid Flow Solver

Authors: Paolo Sassi, Jorge Freiria, Gabriel Usera

Abstract:

In this work, a finite volume fluid flow solver is coupled with a discrete element method module for the simulation of the dynamics of free and elastic bodies interacting with the fluid and with each other. The open source fluid flow solver, caffa3d.MBRi, includes the capability to work with nested overlapping grids in order to easily refine the grid in the region where the bodies are moving. To do so, it is necessary to implement a recognition function able to identify the specific mesh block in which each body is moving. The set of overlapping finer grids can be displaced along with the set of bodies being simulated. The interaction between the bodies and the fluid is computed through a two-way coupling. The velocity field of the fluid is first interpolated to determine the drag force on each object. After solving the objects' displacements, subject to the elastic bonding among them, the force is applied back onto the fluid through a Gaussian smoothing over the cells near the position of each object. The fishnet is represented as lumped masses connected by elastic lines. The internal forces are derived from the elasticity of these lines, and the external forces are due to drag, gravity, buoyancy and the load acting on each element of the system. When solving the system of ordinary differential equations that represents the motion of the elastic and flexible bodies, it was found that the fourth-order Runge-Kutta solver is the best tool in terms of performance, but it requires a finer grid than the fluid solver to make the system converge, which demands greater computing power. The coupled solver is demonstrated by simulating the interaction between the fluid, an elastic fishnet and a set of free bodies being captured by the net as they are dragged by the fluid. The deformation of the net, as well as the wake produced in the fluid stream, are well captured by the method, without requiring the fluid solver mesh to adapt to the evolving geometry. Application of the same strategy to the simulation of elastic structures subject to the action of wind is also possible with the method presented, and one such application is currently under development.
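
As an illustration of the lumped-mass representation and fourth-order Runge-Kutta integration described above, a minimal sketch follows; the chain layout, parameter values and the placeholder fluid_velocity() function are assumptions for illustration and are not taken from caffa3d.MBRi.

```python
import numpy as np

# Minimal sketch: lumped masses connected by elastic lines (springs),
# advanced with a classical fourth-order Runge-Kutta step.
# All parameters below are illustrative, not taken from the actual solver.
n = 5                                  # number of lumped masses
m = 0.1                                # mass of each node [kg]
k = 50.0                               # elastic line stiffness [N/m]
rest = 0.2                             # rest length between neighbours [m]
c_drag = 0.5                           # crude drag coefficient
g = np.array([0.0, 0.0, -9.81])        # gravity

def fluid_velocity(x):
    # Placeholder for the interpolated fluid velocity at position x.
    return np.array([1.0, 0.0, 0.0])

def accelerations(pos, vel):
    acc = np.tile(g, (n, 1))
    # Elastic forces between neighbouring nodes (a simple chain).
    for i in range(n - 1):
        d = pos[i + 1] - pos[i]
        L = np.linalg.norm(d)
        f = k * (L - rest) * d / L
        acc[i] += f / m
        acc[i + 1] -= f / m
    # Drag towards the local fluid velocity.
    for i in range(n):
        acc[i] += c_drag * (fluid_velocity(pos[i]) - vel[i]) / m
    return acc

def rk4_step(pos, vel, dt):
    def deriv(p, v):
        return v, accelerations(p, v)
    k1p, k1v = deriv(pos, vel)
    k2p, k2v = deriv(pos + 0.5 * dt * k1p, vel + 0.5 * dt * k1v)
    k3p, k3v = deriv(pos + 0.5 * dt * k2p, vel + 0.5 * dt * k2v)
    k4p, k4v = deriv(pos + dt * k3p, vel + dt * k3v)
    return (pos + dt / 6.0 * (k1p + 2 * k2p + 2 * k3p + k4p),
            vel + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v))

pos = np.array([[i * rest, 0.0, 0.0] for i in range(n)])
vel = np.zeros((n, 3))
for _ in range(1000):
    pos, vel = rk4_step(pos, vel, dt=1e-4)
```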

Keywords: computational fluid dynamics, discrete element method, fishnets, nested overlapping grids

Procedia PDF Downloads 416
327 Exploring the Rhinoceros Beetles of a Tropical Forest of Eastern Himalayas

Authors: Subhankar Kumar Sarkar

Abstract:

Beetles of the subfamily Dynastinae, under the family Scarabaeidae of the insect order Coleoptera, are popularly known as ‘rhinoceros beetles’ because of the characteristic horn borne by the males on their head. These horns are used in mating battles against other males and have evolved as a result of phenotypic plasticity. Scarabaeidae is the largest of all families under Coleoptera and is composed of 11 subfamilies, of which the subfamily Dynastinae is represented by approximately 300 species. Some of these beetles have been reported to cause considerable damage to agriculture and forestry both in their larval and adult stages, while many of them are beneficial as they pollinate plants and recycle plant materials. The Eastern Himalayas are regarded as one of the 35 biodiversity hotspot zones of the world and one of the four of India, as exhibited by their rich and megadiverse tropical forests. However, our knowledge of the faunal diversity of these forests is very limited, particularly for the insect fauna. One such tropical forest of the Eastern Himalayas is the Buxa Tiger Reserve, located between latitudes 26°30′ and 26°55′ North and longitudes 89°20′ and 89°35′ East in India, occupying an area of about 759.26 square kilometers. It is against this background that an attempt has been made to explore the insect fauna of the forest. Insect sampling was carried out in each beat and range of the Buxa Tiger Reserve in all three seasons, viz., Premonsoon, Monsoon, and Postmonsoon. Sample collections were done by sweep nets, hand picking and pitfall traps. A UV light trap was used to collect nocturnal insects. Morphological examinations of the collected samples were carried out with stereozoom binocular microscopes (Zeiss SV6 and SV11), and specimens were identified to species level with the aid of relevant literature. The survey of the insect fauna of the forest resulted in the recognition of 76 scarab species, of which 8 belong to the subfamily dealt with herein. Each of the 8 species represents a separate genus. The forest is dominated by Xylotrupes gideon (Linnaeus), which is represented by the highest number of individuals. The recorded taxa show about 12% endemism and are mainly Oriental in distribution. Premonsoon is the most favorable season for their occurrence and activity, followed by Monsoon and Postmonsoon.

Keywords: Dynastinae, Scarabaeidae, diversity, Buxa Tiger Reserve

Procedia PDF Downloads 189
326 A Comprehensive Framework for Fraud Prevention and Customer Feedback Classification in E-Commerce

Authors: Samhita Mummadi, Sree Divya Nagalli, Harshini Vemuri, Saketh Charan Nakka, Sumesh K. J.

Abstract:

One of the most significant challenges people face in today’s digital era is an alarming increase in fraudulent activities on online platforms. The appeal of online shopping, with no long queues in shopping malls, a wide variety of products, and home delivery of goods, has paved the way for a rapid increase in large online shopping platforms. This has also had a major impact on the increase in fraudulent activities, as the growth of online shopping and transactions has opened opportunities for fraudulent users. For instance, consider a store that orders thousands of products all at once; the massive number of items purchased and the corresponding transactions turn out to be fraudulent, leading to a huge loss for the seller. Scenarios like these underscore the urgent need to introduce machine learning approaches to combat fraud in online shopping. By leveraging robust algorithms, namely KNN, decision trees, and random forests, which are highly effective in generating accurate results, this research endeavors to discern patterns indicative of fraudulent behavior within transactional data. The primary motive and main focus is to introduce a comprehensive solution to this problem in order to empower e-commerce administrators in timely fraud detection and prevention. In addition, sentiment analysis is harnessed in the model so that the e-commerce administrator can respond to customers’ and consumers’ concerns, feedback, and comments, allowing the administrator to improve the user experience. The ultimate objective of this study is to harden online shopping platforms against fraud and ensure a safer shopping experience. The model reported in this paper achieves an accuracy of 84%. The findings and observations noted during this work lay the groundwork for future advancements in the development of more resilient and adaptive fraud detection systems, which will become crucial as technologies continue to evolve.
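
A minimal sketch of the kind of supervised pipeline described above is given below, using a random forest with class weighting on synthetic transactional features; the feature names, the synthetic data and the use of class_weight="balanced" are illustrative assumptions, not details reported by the paper.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical transactional features; real feature engineering would differ.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "order_amount": rng.gamma(2.0, 50.0, n),
    "items_per_order": rng.poisson(3, n),
    "account_age_days": rng.integers(1, 2000, n),
    "orders_last_24h": rng.poisson(1, n),
})
# Imbalanced label: only a small fraction of orders are fraudulent.
df["is_fraud"] = (rng.random(n) < 0.05).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="is_fraud"), df["is_fraud"],
    test_size=0.3, stratify=df["is_fraud"], random_state=0)

# class_weight="balanced" is one common way to handle the class imbalance.
clf = RandomForestClassifier(n_estimators=200,
                             class_weight="balanced",
                             random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```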

Keywords: behavior analysis, feature selection, fraudulent pattern recognition, imbalanced classification, transactional anomalies

Procedia PDF Downloads 26
325 Deploying a Transformative Learning Model in Technological University Dublin to Assess Transversal Skills

Authors: Sandra Thompson, Paul Dervan

Abstract:

Ireland’s first Technological University (TU Dublin) was established on 1st January 2019, and its creation is an exciting new milestone in Irish higher education. TU Dublin is now Ireland’s biggest university, supporting 29,000 students across three campuses with 3,500 staff. The University aspires to create work-ready graduates who are socially responsible, open-minded global thinkers, ambitious to change the world for the better. As graduates, they will be enterprising and daring in all their endeavors, ready to play their part in transforming the future. Feedback from Irish employers and students, coupled with evidence from other authoritative sources such as the World Economic Forum, points to a need for greater focus on the development of students’ employability skills as they prepare for today’s work environment. Moreover, with an increased focus on Universal Design for Learning (UDL) and inclusiveness, there is recognition that students are more than a numeric grade value. Robust grading systems have been developed to track a student’s performance in discipline knowledge, but there is little or no global consensus on a definition of transversal skills or on a unified framework to assess them. The education and industry sectors often assess only one or two skills, and some are developing their own frameworks to capture the learner’s achievement in this area. Technological University Dublin (TU Dublin) has discovered and implemented a framework that allows students to develop, assess and record their transversal skills using transformative learning theory. The model implemented is an adaptation of the Student Transformative Learning Record (STLR), which originated in the University of Central Oklahoma (UCO). The purpose of this paper, therefore, is to examine the views of students, staff and employers in the context of deploying a transformative learning model within the University to assess transversal skills. It examines the initial impact the transformative learning model is having socially, personally and on the University as an organization. Crucially, it also identifies lessons learned from the deployment in order to assist other universities and higher education institutes that may be considering a focused adoption of transformative learning to meet the challenge of preparing students for today’s work environment.

Keywords: assessing transversal skills, higher education, transformative learning, students

Procedia PDF Downloads 128
324 “Everything, Everywhere, All at Once” Hollywoodization and Lack of Authenticity in Today’s Mainstream Cinema

Authors: Haniyeh Parhizkar

Abstract:

When Sarris proposed the "auteur theory" in 1962, he emphasized that its utmost premise is the inner meanings and concepts of a film and that a film is purely an art form. Today's mainstream movies are conceptually closer to what the Frankfurt School scholars regarded years ago as "reproduced" and "mass culture". Hollywood continues to be a huge movie-making machine that sets the dominant paradigms of films throughout the world, and cinema is far from art. Although there are still movies, directors, and audiences who favor art cinema over Hollywood and mainstream movies, it is an almost undeniable fact that, for the most part, people's perception of movies is widely influenced by their American depiction and Hollywood's legacy of mass culture. With the rise of Hollywood studios as the forerunners of the movie industry, and with cinema largely dependent on economics rather than artistic values, this distinctive role of cinema has diminished and been replaced by a global standard. The blockbuster 2022 film 'Everything, Everywhere, All at Once' is now the most-awarded movie of all time, winning seven Oscars at the 95th Academy Awards. Despite its mainly Asian cast, the movie was produced by an American company and is heavily influenced by Hollywood's dominant themes of superheroes, fantasy, action, and adventure. The New Yorker film critic Richard Brody called the movie "a pitch for a Marvel" and critiqued the film for being "universalized" and "empty of history and culture". Other critics at Variety pinpointed the movie's similarities to Marvel, particularly in its multi-universe storyline, which manifests traces of the American legacy. As argued by these critics, 'Everything, Everywhere, All at Once' might appear a unique and authentic film at first glance, but it can be argued that it is yet another version of a Marvel movie. While the movie's universal acclaim was regarded as recognition and acknowledgment of its Asian cast, the question that arises is whether, when Hollywood influences and American themes are so strong in the film, the movie industry is honoring another culture or merely celebrating Hollywood's dominant paradigm once again. This essay employs a critical approach to Hollywood's dominance and mass-produced culture, which has deprived non-American movies of authenticity and constantly reproduces the same formula of success.

Keywords: hollywoodization, universalization, blockbuster, dominant paradigm, marvel, authenticity, diversity

Procedia PDF Downloads 88
323 A Semantic and Concise Structure to Represent Human Actions

Authors: Tobias Strübing, Fatemeh Ziaeetabar

Abstract:

Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which considers touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information on static (e.g., top, bottom) and dynamic spatial relations (e.g., moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a large set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which ultimately involve two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a joint activity representation structure. For this purpose, we need a statistical analysis to modify the current eSEC by summarizing it while preserving its features, introducing a new version called the Enhanced eSEC, or e2SEC. This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row in a statistical way, to see whether a particular row can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the unity of the predefined manipulation actions. By performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on interactions between humans and robots. It also creates a comprehensive platform for integration with body limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
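
The row-importance check described above can be illustrated with a small sketch: given one descriptor matrix per manipulation, test whether dropping a row keeps all manipulations pairwise distinguishable. The 30x10 random integer matrices below are placeholders for real eSEC descriptors.

```python
import numpy as np

# Hypothetical eSEC-like descriptors: one 30-row matrix per manipulation,
# each entry encoding a (static or dynamic) spatial relation as an integer.
rng = np.random.default_rng(1)
actions = {name: rng.integers(0, 5, size=(30, 10))
           for name in ["push", "cut", "pour", "stir"]}

def all_distinguishable(descriptors, keep_rows):
    """True if every pair of reduced matrices still differs."""
    names = list(descriptors)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            a = descriptors[names[i]][keep_rows]
            b = descriptors[names[j]][keep_rows]
            if np.array_equal(a, b):
                return False
    return True

# Greedily test each row: if dropping it keeps all actions distinct,
# it is a candidate for removal in an e2SEC-style reduction.
removable = []
rows = list(range(30))
for r in range(30):
    trial = [x for x in rows if x != r and x not in removable]
    if all_distinguishable(actions, trial):
        removable.append(r)
print("candidate removable rows:", removable)
```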

Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis

Procedia PDF Downloads 126
322 DNA Nano Wires: A Charge Transfer Approach

Authors: S. Behnia, S. Fathizadeh, A. Akhshani

Abstract:

In recent decades, DNA has attracted increasing interest for potential technological applications not directly related to its coding for functional proteins, i.e., the expression of genetic information. One of the most interesting applications of DNA is related to the construction of nanostructures of high complexity and the design of functional nanostructures in nanoelectronic devices, nanosensors and nanocircuits. In this field, DNA is of fundamental interest to the development of DNA-based molecular technologies, as it possesses ideal structural and molecular recognition properties for use in self-assembling nanodevices with a definite molecular architecture. Also, the robust, one-dimensional, flexible structure of DNA can be used to design electronic devices, serving as a wire, transistor switch, or rectifier depending on its electronic properties. In order to understand the mechanism of charge transport along DNA sequences, numerous studies have been carried out. In this regard, the conductivity properties of the DNA molecule can be investigated in a simple but chemically specific approach that is intimately related to the Su-Schrieffer-Heeger (SSH) model. In the SSH model, the dependence of the non-diagonal matrix elements on intersite displacements is considered. In this approach, the coupling between the charge and the lattice deformation is along the helix. This model is a tight-binding linear nanoscale chain originally established to describe conductivity phenomena in doped polyacetylene. It is based on the assumption of a classical harmonic interaction between sites, which is linearly coupled to a tight-binding Hamiltonian. In this work, the Hamiltonian and the corresponding equations of motion are nonlinear and highly sensitive to initial conditions. We have therefore moved toward nonlinear dynamics and phase space analysis. Nonlinear dynamics and chaos theory, regardless of any approximation, could open new horizons for understanding the conductivity mechanism in DNA. For a detailed study, we have examined the current flowing in DNA and investigated the characteristic I-V diagram. As a result, it is shown that there are (quasi-)ohmic regions in the I-V diagram. On the other hand, regions with negative differential resistance (NDR) are detectable in the diagram.
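
For reference, a schematic SSH-type Hamiltonian with the non-diagonal (hopping) term depending linearly on intersite displacements, coupled to a classical harmonic lattice, can be written as below; the notation is the standard one and is assumed here rather than taken from the paper.

```latex
H \;=\; \sum_{n} \varepsilon_{n}\, c_{n}^{\dagger} c_{n}
\;-\; \sum_{n} \big[\, t_{0} - \alpha\,(u_{n+1} - u_{n}) \big]
      \left( c_{n}^{\dagger} c_{n+1} + c_{n+1}^{\dagger} c_{n} \right)
\;+\; \sum_{n} \left( \frac{p_{n}^{2}}{2M} + \frac{K}{2}\,(u_{n+1} - u_{n})^{2} \right)
```

Here c_n† and c_n create and annihilate a charge at site n, u_n and p_n are the lattice displacement and momentum, t_0 is the bare hopping amplitude, alpha is the charge-lattice coupling, and K is the harmonic spring constant between neighbouring sites.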

Keywords: DNA conductivity, Landauer resistance, negative differential resistance, chaos theory, mean Lyapunov exponent

Procedia PDF Downloads 425
321 Bank Liquidity Creation in a Dual Banking System: An Empirical Investigation

Authors: Lianne M. Q. Lee, Mohammed Sharaf Shaiban

Abstract:

The importance of bank liquidity management took center stage as policy makers promoted a more resilient global banking system after the market turmoil of 2007. The growing recognition of Islamic banks’ function of intermediating funds in the economy warrants an investigation of their balance sheet structure, which is distinct from that of their conventional counterparts. Given that asymmetric risk transformation is inevitable, Islamic banks need to identify the liquidity risk within their distinctive balance sheet structure. Thus, there is a strong need to quantify and assess the liquidity position to ensure the proper functioning of a financial institution. It is vital to measure bank liquidity because liquid banks face less liquidity risk. We examine this issue using two alternative quantitative measures of liquidity creation, “cat fat” and “cat nonfat”, constructed by Berger and Bouwman (2009). “Cat fat” measures all on-balance-sheet items as well as off-balance-sheet items, whilst the latter measures only on-balance-sheet items. Liquidity creation is measured over the period 2007-2014 in 14 countries where Islamic and conventional commercial banks coexist, and also separately by bank size class, as empirical studies have shown that liquidity creation varies by bank size. An interesting and important finding is that all size classes of Islamic banks have, on average, increased their creation of aggregate liquidity in real dollar terms over the years for both liquidity creation measures, especially large banks, indicating that Islamic banks actually generate more liquidity for the economy than their conventional counterparts, including from off-balance-sheet items. The liquidity creation from off-balance-sheet items by conventional banks may have been affected by the global financial crisis, when derivatives markets were severely hit. The results also suggest that Islamic banks have higher volumes of assets and deposits and that borrowing and bond issuance are lower in Islamic banks than in conventional banks because most such products are interest-based. As Islamic banks appear to create more liquidity than conventional banks under both measures, this indicates that the development of Islamic banking has been significant over the decades since its inception. This finding is encouraging as, despite Islamic banking’s overall size, it represents growth opportunities for these countries.
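
For readers unfamiliar with the Berger and Bouwman (2009) measures, the weighting scheme behind "cat nonfat" can be sketched as below; the classification of balance sheet items into liquid, semiliquid and illiquid is illustrative only, and the "cat fat" variant adds off-balance-sheet items with the same weights.

```python
# Sketch of the Berger-Bouwman "cat nonfat" liquidity creation measure.
# Weights: +0.5 for illiquid assets and liquid liabilities,
#           0.0 for semiliquid items,
#          -0.5 for liquid assets and illiquid liabilities (incl. equity).
# The item classification below is illustrative only.
balance_sheet = {
    # assets
    "commercial_loans":      {"side": "asset",     "class": "illiquid",   "value": 600.0},
    "residential_mortgages": {"side": "asset",     "class": "semiliquid", "value": 250.0},
    "cash_and_securities":   {"side": "asset",     "class": "liquid",     "value": 150.0},
    # liabilities and equity
    "transaction_deposits":  {"side": "liability", "class": "liquid",     "value": 500.0},
    "time_deposits":         {"side": "liability", "class": "semiliquid", "value": 300.0},
    "subordinated_debt":     {"side": "liability", "class": "illiquid",   "value": 100.0},
    "equity":                {"side": "liability", "class": "illiquid",   "value": 100.0},
}

WEIGHTS = {
    ("asset", "illiquid"): 0.5, ("asset", "semiliquid"): 0.0, ("asset", "liquid"): -0.5,
    ("liability", "liquid"): 0.5, ("liability", "semiliquid"): 0.0, ("liability", "illiquid"): -0.5,
}

liquidity_created = sum(WEIGHTS[(item["side"], item["class"])] * item["value"]
                        for item in balance_sheet.values())
print(f"cat nonfat liquidity creation: {liquidity_created:.1f}")
```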

Keywords: financial institution, liquidity creation, liquidity risk, policy and regulation

Procedia PDF Downloads 349
320 Turn-Taking and Leading Roles in Early Cognition: Interaction of Social Cognition and Language in Development

Authors: Zsuzsanna Schnell, Francesca Ervas

Abstract:

Background: Our study aims to clarify how language fosters further cognitive development and how we eventually arrive at the complex, human-specific skill of pragmatic competence, and to reveal what levels of mentalization and theory of mind are in place before language. Method: Our experimental pragmatic investigation maps the interaction of mentalization and pragmatic competence. We map the different levels of mentalization that empower different levels of pragmatic meaning construction and evaluate the results with statistical analysis (Mann-Whitney and ANOVA). Analyzing the comprehension of literal and non-compositional (figurative) utterances, we apply linguistic trials, among them metaphor, irony, irony with surface cue, humor, and the recognition of maxim infringements, in neurotypical (NT) preschoolers with a coherent and comparative methodology. Results: The findings reveal the relationship and direction of interaction between language and theory of mind. On the one hand, social-cognitive skills enhance, facilitate and provide a basis for language acquisition; in return, linguistic structures (DeVilliers 2000, 2007) provide a framework for the further development of mentalizing skills. Conclusions: The findings confirm that this scaffolding becomes a mutually supportive system in which language and social cognition develop in interaction. Certain stages in ToM development serve as precursors to understanding grammatically complex sentences, such as embedded phrases, which mirror embedded mental states; this, in turn, facilitates the development of pragmatic competence, that is, the social use of language, integrating social, cognitive, linguistic and psychological factors in discourse. Future implications: Our investigation functions as a differential-diagnostic measure, with typically developing results serving as a baseline in further empirical research for atypical cases. This enables the study of populations where language and ToM development is disturbed, reveals how language and ToM are acquired and interact, and gives insight into how this relates to clinical symptoms. This, in turn, can reveal the causal link to the syndrome at hand, which can set directions for therapeutic development and training.

Keywords: theory of mind, language development, mentalization, language philosophy, experimental pragmatics

Procedia PDF Downloads 29
319 Linguistic Competencies of Students with Hearing Impairment

Authors: Munawar Malik, Muntaha Ahmad, Khalil Ullah Khan

Abstract:

Linguistic abilities in students with hearing impairment remain a concern for educationists. Emerging technological support and provisions in the recent era are claimed to have addressed the situation and to have contributed significantly to linguistic repertoire. Within a descriptive and quantitative paradigm, the purpose of this research was to assess the linguistic competencies of students with hearing impairment in the English language. The goals were further broken down to identify the level of reading ability in the subject population. The population comprised students with HI studying at higher secondary level in Lahore. Simple random sampling was used to choose a sample of fifty students. A purposive curriculum-based assessment was designed in line with the accelerated learning program of the Punjab Government to assess linguistic competence in the sample. In addition, an Informal Reading Inventory (IRI) corresponding to reading levels was developed by the researchers, duly validated and piloted before final use. Descriptive and inferential statistics were utilized to reach the findings. Spearman’s correlation was used to find the relationship between degree of hearing loss, grade level, gender and type of amplification device. An independent sample t-test was used to compare means among groups. Major findings of the study revealed that students with hearing impairment exhibit significant deviation from the mean scores when compared in terms of grade, severity and amplification device. The study revealed that these students with HI have not yet attained an independent level of reading for their grades, as the majority fall at the frustration level of word recognition and passage comprehension. The poorer performance can be attributed to lower linguistic competence, as shown by the frustration levels of reading, writing and comprehension. The correlation analysis did reflect improved performance grade-wise; however, scores only corresponded to the frustration level, and the independent level was never achieved. The reported achievements at the instructional level in the subject population may further linguistic skills if practiced purposively.

Keywords: linguistic competence, hearing impairment, reading levels, educationist

Procedia PDF Downloads 67
318 Gestalt in Music and Brain: A Non-Linear Chaos Based Study with Detrended/Adaptive Fractal Analysis

Authors: Shankha Sanyal, Archi Banerjee, Sayan Biswas, Sourya Sengupta, Sayan Nag, Ranjan Sengupta, Dipak Ghosh

Abstract:

The term ‘gestalt’ has been widely used in the field of psychology to describe the tendency of the human mind to perceive an object not in parts but as a ‘unified’ whole. Music, in general, is polyphonic, i.e., a combination of a number of pure tones (frequencies) mixed together in a manner that sounds harmonious. The study of human brain response to different frequency groups of an acoustic signal can give us excellent insight into the neural and functional architecture of brain functions. Hence, the study of music cognition using neuro-biosensors is becoming a rapidly emerging field of research. In this work, we have tried to analyze the effect of different frequency bands of music on the various frequency rhythms of the human brain obtained from EEG data. Four widely popular Rabindrasangeet clips were subjected to the wavelet transform method to extract five resonant frequency bands from the original music signal. These frequency bands were initially analyzed with Detrended/Adaptive Fractal Analysis (DFA/AFA) methods. A listening test was conducted on a pool of 100 respondents to assess the frequency band in which the music becomes non-recognizable. Next, these resonant frequency bands were presented to 20 subjects as auditory stimuli, and EEG signals were recorded simultaneously at 19 different locations of the brain. The recorded EEG signals were noise-cleaned and again subjected to the DFA/AFA technique in the alpha, theta and gamma frequency ranges. Thus, we obtained the scaling exponents from the two methods in the alpha, theta and gamma EEG rhythms corresponding to different frequency bands of music. From the analysis of the music signal, it is seen that loss of recognition is proportional to the loss of long-range correlation in the signal. From the EEG signal analysis, we obtain frequency-specific arousal-based responses in different lobes of the brain as well as in specific EEG bands corresponding to the musical stimuli. In this way, we seek to identify a specific frequency band beyond which the music becomes non-recognizable and below which, even in the absence of the other bands, the music remains perceivable to the audience. This revelation can be of immense importance in the field of cognitive music therapy and for researchers of creativity.
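
A compact sketch of the detrended fluctuation analysis used to obtain the scaling exponents is shown below; it is a generic DFA implementation (the adaptive variant, AFA, and the wavelet band extraction are not included), and the white-noise test signal is only a sanity check.

```python
import numpy as np

def dfa_exponent(x, scales=None, order=1):
    """Detrended fluctuation analysis: returns the scaling exponent, i.e.
    the slope of log F(s) versus log s (a Hurst-type exponent)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(16),
                                       np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)  # local detrending
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

# Sanity check: white noise should give an exponent near 0.5.
rng = np.random.default_rng(0)
print(dfa_exponent(rng.standard_normal(8192)))
```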

Keywords: AFA, DFA, EEG, gestalt in music, Hurst exponent

Procedia PDF Downloads 332
317 Task Based Functional Connectivity within Reward Network in Food Image Viewing Paradigm Using Functional MRI

Authors: Preetham Shankapal, Jill King, Kori Murray, Corby Martin, Paula Giselman, Jason Hicks, Owen Carmicheal

Abstract:

Activation of reward and satiety networks in the brain while processing palatable food cues, as well as functional connectivity during rest, has been studied using functional magnetic resonance imaging of the brain in various obesity phenotypes. However, functional connectivity within the reward and satiety networks during food cue processing is understudied. Fourteen obese individuals underwent two fMRI scans while viewing Macronutrient Picture System images. Each scan included two blocks of images of High Sugar/High Fat (HSHF), High Carbohydrate/High Fat (HCHF), and Low Sugar/Low Fat (LSLF) foods, as well as non-food images. Seed voxels within seven food-reward-relevant ROIs were isolated based on a prior meta-analysis: the insula, putamen, and the cingulate, precentral, parahippocampal, medial frontal and superior temporal gyri. Beta series correlation for task-related functional connectivity between these seed voxels and the rest of the brain was computed. Voxel-level differences in functional connectivity were calculated between the first and second scans; between individuals who saw novel (N=7) vs. repeated (N=7) images in the second scan; and between the HCHF and HSHF blocks vs. the LSLF and non-food blocks. The analysis showed that during food image viewing, reward network ROIs showed significant functional connectivity with each other and with other regions responsible for attentional and motor control, including the inferior parietal lobe and precentral gyrus. These functional connectivity values were heightened among individuals who viewed novel HSHF images in the second scan. In the second scan session, functional connectivity was reduced within the reward network but increased within attention, memory and recognition regions, suggesting habituation to reward properties and increased recollection of previously viewed images. In conclusion, it can be inferred that functional connectivity within the reward network, and between the reward network and other brain regions, varies with important experimental conditions during food photograph viewing, including habituation to the foods shown.
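
The beta series correlation step can be illustrated with a minimal sketch; the beta arrays below are synthetic placeholders, whereas in the study the per-trial betas would come from a GLM fit to the fMRI time series, and the correlation would be computed voxel-wise rather than ROI-to-ROI.

```python
import numpy as np

# Hypothetical beta series: one beta per food-image trial for each ROI,
# e.g. obtained from a trial-wise GLM on the fMRI data.
rng = np.random.default_rng(0)
n_trials = 40
rois = ["insula", "putamen", "cingulate", "precentral",
        "parahippocampal", "medial_frontal", "superior_temporal"]
betas = {roi: rng.standard_normal(n_trials) for roi in rois}

seed = "insula"
# Task-based functional connectivity: correlate the seed's beta series
# with every other ROI's beta series across trials.
connectivity = {roi: np.corrcoef(betas[seed], betas[roi])[0, 1]
                for roi in rois if roi != seed}
for roi, r in connectivity.items():
    print(f"{seed} - {roi}: r = {r:+.2f}")
```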

Keywords: fMRI, functional connectivity, task-based, beta series correlation

Procedia PDF Downloads 270
316 The Impact of the Application of Blockchain Technology in Accounting and Auditing

Authors: Yusuf Adebayo Oduwole

Abstract:

The main objective of this essay is to evaluate blockchain technology's potential effects on the accounting and auditing fields. It also adds to the existing body of work by examining how these practices alter technological concerns, including cryptocurrency accounting, regulation, governance, accounting practices, and technical challenges. An example of this advancement is the growth of the concept of blockchain and its application in accounting. This technology is considered one of the digital revolutions that could disrupt the world and civilization, as it can transfer large volumes of virtual currencies such as cryptocurrencies without the need for a trusted third party. The basis for this research is a systematic review of articles, using VOSviewer to display and reflect on the bibliometric information of the articles available in the Scopus database. Also, as the practice of using blockchain technology in the field of accounting and auditing is still in its infancy, it may be useful to carry out a more thorough analysis of its implications for accounting and auditing regarding aspects of governance, regulation, and cryptocurrency that have not yet been discussed or addressed to any significant extent. The main findings on the relationship between blockchain and accounting show that the application of smart contracts, for instance in triple-entry accounting, has increased the quality of accounting records as well as the reliance that can be placed on the available information. This results in fewer cyclical assignments, no need for reconciliation, and real-time accounting, among other benefits. To integrate blockchain into a computer system, one must therefore learn continuously and avoid remaining naive when using blockchain-integrated accounting software; this includes learning about how cryptocurrencies are accounted for and regulated. This study presents three original contributions. First, to offer a transparent view of the state of previous relevant studies and research works in accounting and auditing that focus on blockchain, it uses bibliographic visibility analysis and a Scopus narrative analysis. Second, it highlights legislative, governance, and ethical concerns, including education, in relation to the use of blockchain in accounting and auditing. Lastly, it examines the impact of blockchain technologies on the accounting recognition of cryptocurrencies. Users of the technology should therefore take their time to learn how it works and keep abreast of new developments. In addition, the accounting industry should integrate blockchain certification and practice, most likely offline or as part of university education for those intending to become auditors or accountants.

Keywords: blockchain, crypto assets, governance, regulation & smart contracts

Procedia PDF Downloads 27
315 Interior Architecture in the Anthropocene: Engaging the Subnature through the Intensification of Body-Surface Interaction

Authors: Verarisa Ujung

Abstract:

The Anthropocene – defined by scientists as a new geological epoch in which human intervention has the dominant influence on geological, atmospheric, and ecological processes – challenges contemporary discourse in architecture and interiors. This dominant influence is characterised by the incapability to distinguish the notions of nature, subnature, human and non-human. Consequently, living in the Anthropocene demands sensitivity and responsiveness to heighten our sense of the rhythm of transformation and the recognition of our environment as a product of natural, social and historical processes. The notion of subnature is particularly emphasised in this paper to investigate the poetic sense of living with subnature. It can be associated with a critical tool for exploring the aesthetic and programmatic implications of subnature on interiority. The ephemeral and immaterial qualities attached to subnature promote a sense of atmospheric delineation of interiority, the very inner significance of body-surface interaction, which is central to interior architecture discourse. This would then reflect human activities and examine transformative change, architectural motion and the traces left between moments. In this way, engaging the notion of subnature enables us to better understand the critical subject of interiority and might provide an in-depth study of interior architecture. Incorporating an exploration of the form, materiality, and pattern of subnature, this research seeks to grasp the inner significance of micro to macro approaches, so that the future of the interior might be compelled to depend more on the investigation and development of responsive environments. To reflect upon the form, materiality and intensity of subnature, specifically as characterized by natural, social and historical processes, this research examines a volcanic land, White Island/Whakaari, New Zealand, as the chosen site of investigation. Emitting various forms and intensities of subnatures (smoke, mud, sulphur gas), this volcanic land is also open to new inhabitation within the sulphur factory ruins, which reflect humans' past occupation. In this way, temporal and naturally selected manifestations of materiality, artefact, and performance can be traced out and might reveal meaningful relations among space, inhabitation, and the well-being of inhabitants in the Anthropocene.

Keywords: anthropocene, body, intensification, intensity, interior architecture, subnature, surface

Procedia PDF Downloads 176
314 Fast Estimation of Fractional Process Parameters in Rough Financial Models Using Artificial Intelligence

Authors: Dávid Kovács, Bálint Csanády, Dániel Boros, Iván Ivkovic, Lóránt Nagy, Dalma Tóth-Lakits, László Márkus, András Lukács

Abstract:

The modeling practice of financial instruments has seen significant change over the last decade due to the recognition of time-dependent and stochastically changing correlations among the market prices or the prices and market characteristics. To represent this phenomenon, the Stochastic Correlation Process (SCP) has come to the fore in the joint modeling of prices, offering a more nuanced description of their interdependence. This approach has allowed for the attainment of realistic tail dependencies, highlighting that prices tend to synchronize more during intense or volatile trading periods, resulting in stronger correlations. Evidence in statistical literature suggests that, similarly to the volatility, the SCP of certain stock prices follows rough paths, which can be described using fractional differential equations. However, estimating parameters for these equations often involves complex and computation-intensive algorithms, creating a necessity for alternative solutions. In this regard, the Fractional Ornstein-Uhlenbeck (fOU) process from the family of fractional processes offers a promising path. We can effectively describe the rough SCP by utilizing certain transformations of the fOU. We employed neural networks to understand the behavior of these processes. We had to develop a fast algorithm to generate a valid and suitably large sample from the appropriate process to train the network. With an extensive training set, the neural network can estimate the process parameters accurately and efficiently. Although the initial focus was the fOU, the resulting model displayed broader applicability, thus paving the way for further investigation of other processes in the realm of financial mathematics. The utility of SCP extends beyond its immediate application. It also serves as a springboard for a deeper exploration of fractional processes and for extending existing models that use ordinary Wiener processes to fractional scenarios. In essence, deploying both SCP and fractional processes in financial models provides new, more accurate ways to depict market dynamics.
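
A rough sketch of the simulate-then-estimate idea is given below: fractional Gaussian noise is generated via its exact covariance (Cholesky), an approximate Euler scheme builds fOU paths, and an off-the-shelf regressor stands in for the neural network; the generator, scaling and architecture shown here are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fgn(n, hurst, rng):
    """Fractional Gaussian noise via Cholesky of the exact covariance
    (fine for short paths; faster generators are needed at scale)."""
    k = np.arange(n)
    lag = k[:, None] - k[None, :]
    cov = 0.5 * (np.abs(lag + 1) ** (2 * hurst)
                 - 2 * np.abs(lag) ** (2 * hurst)
                 + np.abs(lag - 1) ** (2 * hurst))
    return np.linalg.cholesky(cov + 1e-10 * np.eye(n)) @ rng.standard_normal(n)

def fou_path(n, hurst, theta=1.0, sigma=1.0, dt=0.01, rng=None):
    """Approximate Euler scheme for dX = -theta*X dt + sigma dB^H."""
    noise = sigma * dt ** hurst * fgn(n, hurst, rng)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i - 1] - theta * x[i - 1] * dt + noise[i]
    return x

rng = np.random.default_rng(0)
n_paths, n_len = 400, 200
H = rng.uniform(0.05, 0.45, n_paths)          # "rough" Hurst exponents
X = np.stack([fou_path(n_len, h, rng=rng) for h in H])

# Stand-in for the paper's neural estimator: regress H from the raw path.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X[:300], H[:300])
print("mean abs. error on held-out paths:",
      np.mean(np.abs(model.predict(X[300:]) - H[300:])))
```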

Keywords: fractional Ornstein-Uhlenbeck process, fractional stochastic processes, Heston model, neural networks, stochastic correlation, stochastic differential equations, stochastic volatility

Procedia PDF Downloads 118
313 Preliminary Study of Gold Nanostars/Enhanced Filter for Keratitis Microorganism Raman Fingerprint Analysis

Authors: Chi-Chang Lin, Jian-Rong Wu, Jiun-Yan Chiu

Abstract:

Myopia, a ubiquitous condition that requires eyesight correction with optical lenses, troubles many people in their daily lives. In recent years, younger people have become increasingly interested in contact lenses because of their convenience and aesthetics. Clinically, the risk of eye infection increases owing to incorrect contact lens use and unsupervised cleaning, which raises the risk of corneal infection, known as ocular keratitis. To address the need to identify the causative microorganisms, a new detection or analysis method offering rapid and more accurate identification of clinical microorganisms is needed. In our study, we take advantage of the unique Raman fingerprints of different functional groups as a distinct and fast examination tool for microorganisms. As is well known, Raman scattering signals are normally too weak for detection, especially in the biological field. Here, we applied special SERS enhancement substrates to generate stronger Raman signals. The SERS filter we designed in this work was prepared by depositing silver nanoparticles directly onto a cellulose filter surface, and suspended nanoparticles, gold nanostars (AuNSs), were also introduced to achieve better enhancement for low-concentration analytes (i.e., various bacteria). The research also focuses on the shape effect of the synthesized AuNSs: the needle-like surface morphology may create more hot spots and thus a higher SERS enhancement ability. We utilized the newly designed SERS technology to distinguish bacteria from ocular keratitis at the strain level, and the specific Raman and SERS fingerprints were grouped by a pattern recognition process. We report a new method combining different SERS substrates that can be applied to clinical microorganism detection at the strain level with simple, rapid preparation and low cost. The SERS technology presented here not only shows great potential for clinical bacteria detection but can also be used for environmental pollution and food safety analysis.

Keywords: bacteria, gold nanostars, Raman spectroscopy, surface-enhanced Raman scattering filter

Procedia PDF Downloads 167
312 Glyco-Biosensing as a Novel Tool for Prostate Cancer Early-Stage Diagnosis

Authors: Pavel Damborsky, Martina Zamorova, Jaroslav Katrlik

Abstract:

Prostate cancer is annually the most common newly diagnosed cancer among men. An extensive body of evidence suggests that the traditional serum prostate-specific antigen (PSA) assay still suffers from a lack of sufficient specificity and sensitivity, resulting in vast over-diagnosis and overtreatment. Thus, early-stage detection of prostate cancer (PCa) undisputedly plays a critical role in successful treatment and improved quality of life. Over the last decade, particular altered glycans have been described that are associated with a range of chronic diseases, including cancer and inflammation. These glycan differences enable a distinction to be made between physiological and pathological states and suggest a valuable biosensing tool for diagnosis and follow-up purposes. Aberrant glycosylation is one of the major characteristics of disease progression. Consequently, the aim of this study was to develop a more reliable tool for early-stage PCa diagnosis employing lectins as glyco-recognition elements. Biosensor and biochip technology putting lectin-based glyco-profiling to use is one of the most promising strategies for providing fast and efficient analysis of glycoproteins. Proof-of-concept experiments were performed based on a sandwich assay employing an anti-PSA antibody and an aptamer as capture molecules, followed by lectin glycoprofiling. We present a lectin-based biosensing assay for glycoprofiling of the serum biomarker PSA using different biosensor and biochip platforms, such as label-free surface plasmon resonance (SPR) and microarray with fluorescent labelling. The results suggest significant differences in the interaction of particular lectins with PSA. Antibody-based assays are frequently associated with sensitivity, reproducibility, and cross-reactivity issues. Aptamers provide remarkable advantages over antibodies due to their nucleic acid origin, stability and lack of glycosylation. All these data are a further step toward the construction of highly selective, sensitive and reliable sensors for early-stage diagnosis. The experimental set-up also holds promise for the development of comparable assays for other glycosylated disease biomarkers.

Keywords: biomarker, glycosylation, lectin, prostate cancer

Procedia PDF Downloads 406
311 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning, implemented by a sparse autoencoder, learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences; two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN and remains ~93% at 0 dB SNR.
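
A stripped-down sketch of the two stages described (matching pursuit over a dictionary of atoms, followed by classification on atomic-index probability vectors) is given below; the random dictionary and random "utterances" stand in for the learned Gabor/envelope dictionary and the TIMIT sentences.

```python
import numpy as np

rng = np.random.default_rng(0)

def matching_pursuit(signal, dictionary, n_atoms=20):
    """Greedy matching pursuit: returns (indices, weights) of selected atoms.
    `dictionary` holds unit-norm atoms as columns."""
    residual = signal.copy()
    idx, wts = [], []
    for _ in range(n_atoms):
        corr = dictionary.T @ residual
        k = int(np.argmax(np.abs(corr)))
        idx.append(k)
        wts.append(corr[k])
        residual = residual - corr[k] * dictionary[:, k]
    return np.array(idx), np.array(wts)

def index_probabilities(signal, dictionary):
    """Normalised histogram of selected atomic indices (the classification feature)."""
    idx, _ = matching_pursuit(signal, dictionary)
    p = np.bincount(idx, minlength=dictionary.shape[1]).astype(float)
    return p / p.sum()

# Toy dictionary (random unit-norm atoms) standing in for learned atoms.
dim, n_dict = 256, 128
D = rng.standard_normal((dim, n_dict))
D /= np.linalg.norm(D, axis=0)

# Toy training vectors, one per "speaker", computed from stand-in utterances.
train = {spk: index_probabilities(rng.standard_normal(dim), D) for spk in ("A", "B")}
test_p = index_probabilities(rng.standard_normal(dim), D)

# Classify by lowest Euclidean distance between index probability vectors.
label = min(train, key=lambda s: np.linalg.norm(train[s] - test_p))
print("predicted speaker:", label)
```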

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 289
310 The Comparative Study of Attitudes toward Entrepreneurial Intention between ASEAN and Europe: An Analysis Using GEM Data

Authors: Suchart Tripopsakul

Abstract:

This paper uses data from the Global Entrepreneurship Monitor (GEM) to investigate differences in attitudes towards entrepreneurial intention (EI). EI is generally assumed to be the single most relevant predictor of entrepreneurial behavior. The aim of this paper is to examine the effect of a range of attitudes on individuals’ intent to start a new venture. A cross-cultural comparison between Asia and Europe is used to further investigate possible differences between potential entrepreneurs from these distinct national contexts. The empirical analysis uses a GEM data set of 10 countries (n = 10,306) collected in 2013. Logistic regression is used to investigate the effect of individuals’ attitudes on EI. Independent variables include individuals’ perceived capabilities, the ability to recognize business opportunities, entrepreneurial networks, risk perceptions, and a range of socio-cultural attitudes. Moreover, a cross-cultural comparison of the model is conducted, including six ASEAN countries (Malaysia, Indonesia, the Philippines, Singapore, Vietnam and Thailand) and four European nations (Spain, Sweden, Germany, and the United Kingdom). The findings support the relationship between individuals’ attitudes and their entrepreneurial intention. Individuals’ capability, opportunity recognition, networks and a range of socio-cultural perceptions all influence EI significantly. The perceived media attention to entrepreneurship was found to influence EI in ASEAN, but not in Europe. Conversely, fear of failure was found to influence EI in Europe, but not in ASEAN. The paper develops and empirically tests a model of attitudes toward entrepreneurial intention in ASEAN and Europe. Interestingly, fear of failure has no significant effect in ASEAN, whereas media attention to entrepreneurship does influence EI there. Moreover, the resistance of ASEAN entrepreneurs to otherwise high rates of fear of failure, together with the high impact of media attention, is proposed as a factor explaining the relatively high rates of entrepreneurial activity in ASEAN reported by GEM. The paper utilizes a representative sample of 10,306 individuals in 10 countries. A range of attitudes was found to significantly influence entrepreneurial intention. Many of these perceptions, such as media attention to entrepreneurship, can be influenced by government policy. The paper also suggests strategies by which Asian economies in particular can benefit from the apparently high impact of media attention on entrepreneurship.
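
For illustration only, the kind of logistic regression reported here can be sketched with hypothetical GEM-style variables; the variable names, coding and synthetic data below are assumptions, not the actual GEM items.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical GEM-style individual-level data; all names and coding assumed.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "ei":            rng.integers(0, 2, n),   # entrepreneurial intention (0/1)
    "perceived_cap": rng.integers(0, 2, n),   # believes they have the skills to start a firm
    "opportunity":   rng.integers(0, 2, n),   # sees good business opportunities
    "knows_entre":   rng.integers(0, 2, n),   # knows an entrepreneur (network)
    "fear_failure":  rng.integers(0, 2, n),
    "media":         rng.integers(0, 2, n),   # sees media attention to entrepreneurship
    "region":        rng.choice(["ASEAN", "Europe"], n),
})

# Separate logits per region mirror the cross-cultural comparison in the paper.
for region, sub in df.groupby("region"):
    model = smf.logit("ei ~ perceived_cap + opportunity + knows_entre"
                      " + fear_failure + media", data=sub).fit(disp=0)
    print(region)
    print(model.params.round(3))
```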

Keywords: entrepreneurial intention, attitude, GEM, ASEAN and Europe

Procedia PDF Downloads 311
309 The Influence of Alvar Aalto on the Early Work of Álvaro Siza

Authors: Eduardo Jorge Cabral dos Santos Fernandes

Abstract:

The expression ‘Porto School’, usually associated with an educational institution, the School of Fine Arts of Porto, is applied for the first time in the sense of an architectural trend by Nuno Portas, in a text published in 1983. The expression is used to characterize a set of works by Porto architects in which common elements are found, namely the desire to reuse the languages and forms of the German and Dutch rationalism of the twenties, using the work of Alvar Aalto as a mediation for the reinterpretation of these models. In the same year, in a text published in Jornal de Letras, Artes e Ideias, Álvaro Siza describes the Finnish architect as an agent of miscegenation who transforms experienced models and introduces them to different realities. The influence of foreign models and their adaptation to the context has been a recurrent theme in Portuguese architecture, one that finds important contributions in the writings of Alexandre Alves Costa at this time. However, the identification of these characteristics in Siza’s work is not limited to Portuguese theoretical production: it is the recognition of this attitude towards the context that leads Kenneth Frampton to include Siza in the restricted group of architects who embody Critical Regionalism (in his book Modern Architecture: A Critical History). For Frampton, his work focuses on the territory and on the consequences of the intervention in the context, viewing architecture as a tectonic fact rather than a series of scenographic episodes and emphasizing site-specific aspects (topography, light, climate). The focus of this paper is therefore the dichotomous opposition between foreign influences and adaptation to the context in the early work of Álvaro Siza (designed in the sixties), in which the theoretical, methodological, and formal influence of Alvar Aalto manifests itself in form and language: the pool at Quinta da Conceição, the Seaside Pools and the Tea House (three works in Leça da Palmeira), and the Lordelo Cooperative (in Porto). This work is part of a more comprehensive project, which considers several case studies throughout the Portuguese architect's vast career, built in Portugal and abroad, in order to obtain a holistic view.

Keywords: Alvar Aalto, Álvaro Siza, foreign influences, adaptation to the context

Procedia PDF Downloads 30
308 Human Rights in the United States: Challenges and Lessons from the Period 1948-2018

Authors: Mary Carmen Peloche Barrera

Abstract:

Since its early years as an independent nation, the United States has been one of the main promoters of the recognition, legislation, and protection of human rights. In the matter of freedom, the founding father Thomas Jefferson envisioned the role of the U.S. as a defender of freedom and equality throughout the world. This founding ideal shaped America's domestic and foreign policy in the 19th and 20th centuries and became an aspiration of the country to expand its values and institutions. The history of the emergence of human rights cannot be studied without reference to leaders such as Woodrow Wilson, Franklin and Eleanor Roosevelt, and Martin Luther King. Throughout its history, this country has proclaimed that the protection of the freedoms of men, both inside and outside its borders, is practically the reason for its existence. Although the United States was one of the first countries to recognize the existence of inalienable rights of individuals, as well as the main promoter of the Universal Declaration of Human Rights of 1948, the country has gone through critical moments that have led to questioning its commitment to the issue. Racial segregation, international military interventions, national security strategy, and national legislation on immigration are some of the most controversial issues arising from decisions and actions driven by the United States, which sit uneasily with its role as an advocate of human rights, both in the Americas and in the rest of the world. The aim of this paper is to study the oscillation of the efforts and commitments of the United States towards human rights. The paper will analyze the history and evolution of human rights in the United States in order to study the greatest challenges the country faces in this matter. It will focus on both domestic policy (related to demographic issues) and foreign policy (concerning the country's role in a post-war world). Currently, more countries are joining multilateral efforts for the promotion and protection of human rights. At the same time, the United States is one of the least committed countries in this respect, having ratified only 5 of the 18 treaties emanating from the United Nations. The last ratification was carried out in 2002 and, since then, the country has been losing ground, at an ever faster pace, in its credibility and, even worse, in its role as leader of 'the free world'. With or without the United States, the protection of human rights should remain the main goal of the international community.

Keywords: United States, human rights, foreign policy, domestic policy

Procedia PDF Downloads 117
307 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has been gradually shifting from the semantics-based approach to a deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, due to its lack of semantic understanding, has difficulty noticing and expressing a novel business case beyond a pre-defined scope. In order to meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time-consuming. It is very difficult to improve the capabilities of conversational AI in a short time, and it is even more difficult to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines both semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which a chat bot (conversational AI) delivers a given set of business cases, it is set to self-measure its performance and rethink every unknown dialog flow in order to improve the service by retraining on those new business cases. If the training process reaches a bottleneck or runs into difficulties, human personnel are informed and asked for further instructions; they may retrain the chat bot with newly configured programs or new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establishes the necessary ontology for the new service. With the newly learned programs, it completes the understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chat bot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture. The chat bot serves as a concierge that holds polite conversations with visitors. As a proof of concept, we have demonstrated completion of 90% of reception services with limited self-learning capability.
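The sketch below is only a minimal illustration of the self-learning cycle described above, not the authors' implementation: the keyword-overlap scorer stands in for their semantic analysis and deep learning components, and all class, method, intent, and reply names are hypothetical. It shows the loop of self-measuring confidence, logging unknown dialog flows, retraining from human-labelled examples, and escalating what remains.

```python
# Sketch of the self-learning cycle: answer confidently, log unknowns, retrain or escalate.
class SelfLearningBot:
    def __init__(self, threshold=0.3):
        self.threshold = threshold
        self.intents = {            # intent -> (keywords, canned reply)
            "greeting":   ({"hello", "hi", "morning"}, "Welcome! How can I help you?"),
            "directions": ({"where", "find", "room"}, "The meeting rooms are on level 2."),
        }
        self.unknown_log = []       # utterances the bot could not handle confidently

    def _score(self, utterance, keywords):
        """Crude confidence: fraction of intent keywords present in the utterance."""
        words = set(utterance.lower().split())
        return len(words & keywords) / max(len(keywords), 1)

    def respond(self, utterance):
        best_score, best_reply = 0.0, ""
        for keywords, reply in self.intents.values():
            s = self._score(utterance, keywords)
            if s > best_score:
                best_score, best_reply = s, reply
        if best_score < self.threshold:               # self-measurement of performance
            self.unknown_log.append(utterance)        # record an unknown dialog flow
            return "Let me check with a colleague and get back to you."
        return best_reply

    def review_cycle(self, human_labels=None):
        """Retrain from logged unknowns; return those needing further human instructions."""
        human_labels, escalated = human_labels or {}, []
        for utterance in self.unknown_log:
            if utterance in human_labels:             # human supplies a new dialog flow
                intent, reply = human_labels[utterance]
                keywords, _ = self.intents.get(intent, (set(), reply))
                self.intents[intent] = (keywords | set(utterance.lower().split()), reply)
            else:
                escalated.append(utterance)
        self.unknown_log = []
        return escalated

bot = SelfLearningBot()
print(bot.respond("hello there"))
print(bot.respond("when does the cafeteria open"))    # out of scope -> logged
print(bot.review_cycle({"when does the cafeteria open":
                        ("cafeteria", "The cafeteria opens at 8 a.m.")}))
print(bot.respond("cafeteria open when"))             # handled after retraining
```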

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 136
306 Molecular Diagnosis of a Virus Associated with Red Tip Disease and Its Detection by Non Destructive Sensor in Pineapple (Ananas comosus)

Authors: A. K. Faizah, G. Vadamalai, S. K. Balasundram, W. L. Lim

Abstract:

Pineapple (Ananas comosus) is a common crop in tropical and subtropical areas of the world. Malaysia ranked as one of the top three pineapple producers in the world in the 1960s and early 1970s, after Hawaii and Brazil. Moreover, the government recognized the pineapple crop as one of the priority commodities to be developed for domestic and international markets in the National Agriculture Policy. However, the pineapple industry in Malaysia still faces numerous challenges, one of which is the management of diseases and pests. Red tip disease of pineapple was first recognized about 20 years ago in a commercial pineapple stand located in Simpang Renggam, Johor, Peninsular Malaysia. Since its discovery, the causal agent of the disease has not been confirmed, and its epidemiology is still not fully understood. Nevertheless, the disease symptoms and the spread within the field seem to point toward a viral infection. A bioassay test on nucleic acid extracted from red tip-affected pineapple was done on Nicotiana tabacum cv. Coker by rubbing the extracted sap. Localised lesions were observed three weeks after inoculation. Negative staining of the freshly inoculated Nicotiana tabacum cv. Coker showed the presence of membrane-bound spherical particles with an average diameter of 94.25 nm under the transmission electron microscope. The shape and size of the particles were similar to a tospovirus. SDS-PAGE analysis of partially purified virions from inoculated N. tabacum produced a strong and a faint protein band with molecular masses of approximately 29 kDa and 55 kDa. Partially purified virions from symptomatic pineapple leaves collected in the field showed bands with molecular masses of approximately 29 kDa, 39 kDa, and 55 kDa. These bands may indicate the nucleocapsid protein identity of a tospovirus. Furthermore, a handheld sensor, the GreenSeeker, was used to detect red tip symptoms on pineapple non-destructively based on spectral reflectance, measured as the Normalized Difference Vegetation Index (NDVI). Red tip severity was estimated and correlated with NDVI, and linear regression models were calibrated and tested in order to estimate red tip disease severity from NDVI. Results showed a strong positive relationship between red tip disease severity and NDVI (r = 0.84).
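For illustration only, the sketch below shows how such an NDVI-severity regression can be set up. The NDVI formula is the standard one; the sensor readings and severity scores are invented and do not reproduce the study's data or its reported r = 0.84.

```python
# Sketch: NDVI definition plus a simple severity-vs-NDVI linear regression on made-up data.
import numpy as np
from scipy.stats import linregress

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Hypothetical paired observations (illustrative values only, not the study's data):
# sensor NDVI readings and visually estimated red tip severity scores (%).
ndvi_readings = np.array([0.42, 0.48, 0.51, 0.57, 0.60, 0.66, 0.71, 0.78])
severity      = np.array([10,   18,   22,   35,   41,   55,   63,   76])

fit = linregress(ndvi_readings, severity)      # severity = slope * NDVI + intercept
print(f"severity ~ {fit.slope:.1f} * NDVI + {fit.intercept:.1f}  (r = {fit.rvalue:.2f})")
```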

Keywords: pineapple, diagnosis, virus, NDVI

Procedia PDF Downloads 791
305 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects that differ in size, category, layout, and number, so it is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works best in a different scale range. A scale-wise CNN adaptation is therefore reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the scale representations are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, a simple yet efficient way to classify the scene categories. Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
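A minimal sketch of the aggregation step is given below, assuming the per-scale Fisher Vectors have already been computed from the CNN activations. The array shapes, the scale_wise_merge helper, and the random toy data are assumptions made purely to show scale-wise normalization, average pooling over scales, and the linear SVM classifier.

```python
# Sketch: scale-wise L2 normalization of per-scale Fisher Vectors, average pooling
# over scales, and a linear SVM on the merged representation (toy random data).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_images, n_scales, fv_dim = 200, 2, 512          # toy sizes, not the paper's
labels = rng.integers(0, 5, size=n_images)        # 5 dummy scene categories

# Stand-in for per-scale Fisher Vectors: shape (n_images, n_scales, fv_dim).
fv_per_scale = rng.normal(size=(n_images, n_scales, fv_dim))

def scale_wise_merge(fv):
    """L2-normalize each scale's Fisher Vector independently, then average-pool
    across scales so every scale contributes equally to the final vector."""
    norms = np.linalg.norm(fv, axis=2, keepdims=True) + 1e-12
    normalized = fv / norms                        # scale-wise normalization
    return normalized.mean(axis=1)                 # average pooling over scales

X = scale_wise_merge(fv_per_scale)
clf = LinearSVC(C=1.0).fit(X, labels)              # linear SVM on the merged representation
print("training accuracy on toy data:", clf.score(X, labels))
```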

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 331
304 The Role of Accounting and Auditing in Anti-Corruption Strategies: The Case of ECOWAS

Authors: Edna Gnomblerou

Abstract:

Given the current scale of the corruption epidemic in West African economies, governments are seeking immediate and effective measures to reduce its prevalence within the region. Generally, accountants and auditors are expected to help organizations detect illegal practices. However, their role in the fight against corruption is sometimes limited due to the collusive nature of corruption. The Danish anti-corruption model shows that the implementation of additional controls over public accounts and independent, efficient audits improves transparency and increases the probability of detection. This study reviews the existing anti-corruption policies of the Economic Community of West African States (ECOWAS) in order to observe the role attributed to accounting, auditing, and other managerial practices in its anti-corruption drive. It further discusses the usefulness of accounting and auditing in helping anti-corruption commissions control misconduct and increase the likelihood of detecting irregularities within public administration. The purpose of this initiative is to identify and assess the relevance of accounting and auditing in curbing corruption. To meet this purpose, the study was designed to answer the questions of whether accounting and auditing processes were included in the reviewed anti-corruption strategies and, if so, whether they were effective in the detection process. A descriptive research method was adopted in examining the role of accounting and auditing in West African anti-corruption strategies. The analysis reveals that proper recognition of accounting standards and implementation of financial audits are viewed as strategic mechanisms in tackling corruption. Additionally, codes of conduct, whistle-blowing, and information disclosure to the public are among the most common managerial practices used throughout anti-corruption policies to address the problem effectively and efficiently. These observations imply that sound anti-corruption strategies cannot ignore the value of including accounting and auditing processes. On the one hand, this suggests that governments should employ all possible resources to improve accounting and auditing practices in the management of public sector organizations. On the other hand, governments must ensure that accounting and auditing practices are not limited to the private sector but, when properly implemented, constitute crucial mechanisms to control and reduce corrupt incentives in the public sector.

Keywords: accounting, anti-corruption strategy, auditing, ECOWAS

Procedia PDF Downloads 255
303 Place-Based Practice: A New Zealand Rural Nursing Study

Authors: Jean Ross

Abstract:

Rural nursing is not an identified professional identity in the UK, unlike in the USA, Canada, and Australia, which recognize rural nursing as a specialty scope of practice. In New Zealand, rural nursing is an underrepresented aspect of nursing practice, is misunderstood, and does not fit easily within the wider nursing profession and the policies governing practice. This study, situated within the New Zealand context, adds to the international studies aligned with rural nursing practice. The study addresses a gap in the literature by striving to identify and strengthen awareness of rural nurses' changing and adapting identity, to increase their understanding and articulation of that identity, and to provide an opportunity to appreciate their contribution to the delivery of rural health care. In addition, this study adds to the growing global rural nursing knowledge and theoretical base. This research is a continuation of the author's academic involvement and ongoing relationships with the rural nursing sector, national policy analysts, and health care planners since the 1990s. These relationships have led to an awareness that, despite rural nurses' efforts to explain the particular nuances which make up their practice, there has been little recognition by the profession of the need to establish rural nursing as a specialty. The research explored why nurses who practiced in the rural Otago region of New Zealand between the 1990s and early 2000s moved away from their traditional identities as district, practice, or public health nurses and looked towards a more appropriate identity which reflected their emerging practice. This qualitative research, situated within the interpretive paradigm, embeds the retrospective study within the discipline of nursing and engages with the concepts of place and governmentality. Interviews with national key informants and Otago regional rural nurses generated data that were analyzed using thematic analysis. Stemming from the analysis, an analytical diagrammatic matrix was developed demonstrating rural nursing as a 'place-based practice' governed both from within and beyond location, presenting how the nurse aligns the self in the rural community as a meaningful provider of health care. Promoting this matrix may encourage a focal discussion point within the international spectrum of nursing, and likewise between rural and non-rural nurses, which it is hoped will generate further debate in relation to the different nuances aligned with rural nursing practice. Further, insights from this paper may capture key aspects and issues related to identity formation in respect of rural nurses from the UK, New Zealand, Canada, USA, and Australia.

Keywords: matrix, place, nursing, rural

Procedia PDF Downloads 140