Search results for: handwritten word recognition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2426


356 From Oral to Written: Translating the Dawot (Epic Poem), Revitalizing Appreciation for Indigenous Literature

Authors: Genevieve Jorolan-Quintero

Abstract:

The recording and preservation of indigenous literature is an important task, as it deals with a significant heritage of pre-colonial culture. The beliefs and traditions of a people are reflected in their oral narratives, such as the folk epic, which must be written down to ensure their preservation. The epic poem known as dawot among the Mandaya, one of the indigenous communities in the southern region of the Philippines, narrates the customs, ways of life, and adventures of an ancient people. Nabayra, an expert on Philippine folkloric studies, stresses that the dawot, still extant after centuries and of unknown origin, was handed down to the magdadawot (bard) by word of mouth and forms the greatest bulk of Mandaya oral tradition. Unhampered by modern means of communication to distract her/him, the magdadawot has a sharp memory of the intricacies of the ancient art of chanting the panayday (verses) of the epic poem. The dawot has several hullubaton (episodes), each of which takes several nights to chant. The language used in these oral traditions is archaic Mandaya, no longer spoken or clearly understood by the present generation. There is urgency to the task of recording and writing down what remains of the epic poem, since the singers and storytellers who have retained the memory and skill of chanting and narrating the dawot and other forms of oral tradition in their original forms are growing fewer. The few who are gifted and skilled enough to transmit these ancient arts and wisdom are old and dying. Unlike other Philippine epics (i.e., the Darangen, the Ulahingan, the Hinilawod, etc.), the Mandaya epic has yet to be recognized and given its rightful place among the recorded epics of Philippine folk literature.
The general aim of this study was to put together and preserve an intangible heritage, the Mandaya hullubaton (episodes of the dawot), in order to preserve and promote appreciation for the oral traditions and cultural legacy of the Mandaya. The study recorded, transcribed, and translated four hullubaton of the folk epic into two languages, Visayan and English, to ensure understanding of their contents and significance among non-Mandaya audiences. Evident in the contents of the episodes are the cultural practices, ideals, life values, and traditions of the ancient Mandaya. While the conquests and adventures of the Mandaya heroes Lumungtad, Dilam, and Gambong highlight heroic virtues, the role of the Mandaya matriarch in family affairs is likewise stressed. The recording and translation of the hullubaton and the dawot into commonly spoken languages will not only promote knowledge and understanding of Mandaya culture but will also stimulate in the members of this cultural community a sense of pride in their literature and culture. Knowledge about indigenous cultural systems and philosophy derived from oral literature will serve as a springboard for further comparative research on indigenous mores and belief systems among the different tribes of the Philippines, Asia, Africa, and other parts of the world.

Keywords: Dawot, epic poem, Mandaya, Philippine folk literature

Procedia PDF Downloads 445
355 “It Isn’t a State Problem”: The Minas Conga Mine Controversy and Exemplifying the Need for Binding International Obligations on Corporate Actors

Authors: Cindy Woods

Abstract:

After years of implacable neoliberal globalization, multinational corporations have moved from the periphery to the center of the international legal agenda. Human rights advocates have long called for greater corporate accountability in the international arena. The creation of the Global Compact in 2000, while aimed at fostering greater corporate respect for human rights, did not silence these calls. After multiple unsuccessful attempts to adopt a set of norms relating to the human rights responsibilities of transnational corporations, the United Nations succeeded in 2008 with the Guiding Principles on Business and Human Rights (Guiding Principles). The Guiding Principles, praised by some within the international human rights community for their recognition of an individual corporate responsibility to respect human rights, have not escaped their share of criticism. Many view the Guiding Principles to be toothless, failing to directly impose obligations upon corporations, and call for binding international obligations on corporate entities. After decades of attempting to promulgate human rights obligations for multinational corporations, the existing legal frameworks in place fall short of protecting individuals from the human rights abuses of multinational corporations. The Global Compact and Guiding Principles are proof of the United Nations’ unwillingness to impose international legal obligations on corporate actors. In June 2014, the Human Rights Council adopted a resolution to draft international legally binding human rights norms for business entities; however, key players in the international arena have already announced they will not cooperate with such efforts. This Note, through an overview of the existing corporate accountability frameworks and a study of Newmont Mining’s Minas Conga project in Peru, argues that binding international human rights obligations on corporations are necessary to fully protect human rights. 
Where states refuse to or simply cannot uphold their duty to protect individuals from transnational businesses’ human rights transgressions, there must exist mechanisms to pursue justice directly against the multinational corporation.

Keywords: business and human rights, Latin America, international treaty on business and human rights, mining, human rights

Procedia PDF Downloads 501
354 Co-produced Databank of Tailored Messages to Support Engagement with Digital Health Interventions

Authors: Menna Brown, Tania Domun

Abstract:

Digital health interventions are effective across a wide array of health conditions spanning physical health, lifestyle behaviour change, and mental health and wellbeing; furthermore, they are rapidly increasing in volume within both the academic literature and society, as commercial apps continue to proliferate in the digital health market. However, adherence and engagement with digital health interventions remain problematic. Technology-based personalised and tailored reminder strategies can support engagement with digital health interventions. Interventions which support individuals’ mental health and wellbeing are of critical importance in the wake of the COVID-19 pandemic. Students’ and young people’s mental health has been negatively affected, and digital resources continue to offer a cost-effective means to address wellbeing at a population level. The aim was to develop a databank of digital, co-produced, tailored messages to support engagement with a range of digital health interventions, including those focused on mental health and wellbeing and on lifestyle behaviour change. A qualitative research design was used. Participants discussed their views of health and wellbeing, and of engagement and adherence to digital health interventions, focused around a 12-week wellbeing intervention, via a series of focus group discussions. They worked together to co-create content following a participatory design approach. Three focus group discussions were facilitated with (n=15) undergraduate students at one Welsh university to provide an empirically derived, co-produced databank of (n=145) tailored messages. Messages were explored and categorised thematically, and the following ten themes emerged: Autonomy, Recognition, Guidance, Community, Acceptance, Responsibility, Encouragement, Compassion, Impact and Ease. The findings provide empirically derived, co-produced tailored messages.
These have been made available for use via ‘ACTivate your wellbeing’, a digital, automated, 12-week health and wellbeing intervention programme based on acceptance and commitment therapy (ACT). The purpose is to support future research evaluating the impact of thematically categorised tailored messages on engagement and adherence to digital health interventions.

Keywords: digital health, engagement, wellbeing, participatory design, positive psychology, co-production

Procedia PDF Downloads 121
353 Simulation of Elastic Bodies through Discrete Element Method, Coupled with a Nested Overlapping Grid Fluid Flow Solver

Authors: Paolo Sassi, Jorge Freiria, Gabriel Usera

Abstract:

In this work, a finite volume fluid flow solver is coupled with a discrete element method module for the simulation of the dynamics of free and elastic bodies in interaction with the fluid and with one another. The open source fluid flow solver, caffa3d.MBRi, includes the capability to work with nested overlapping grids in order to easily refine the grid in the region where the bodies are moving. To do so, it is necessary to implement a recognition function able to identify the specific mesh block in which each body is moving. The set of overlapping finer grids can be displaced along with the set of bodies being simulated. The interaction between the bodies and the fluid is computed through a two-way coupling. The velocity field of the fluid is first interpolated to determine the drag force on each object. After solving the objects’ displacements, subject to the elastic bonding among them, the force is applied back onto the fluid through a Gaussian smoothing over the cells near the position of each object. The fishnet is represented as lumped masses connected by elastic lines. The internal forces are derived from the elasticity of these lines, and the external forces are due to drag, gravity, buoyancy and the load acting on each element of the system. When solving the system of ordinary differential equations that represents the motion of the elastic and flexible bodies, it was found that the fourth-order Runge-Kutta solver is the best tool in terms of performance, but it requires a finer grid than the fluid solver to make the system converge, which demands greater computing power. The coupled solver is demonstrated by simulating the interaction between the fluid, an elastic fishnet and a set of free bodies captured by the net as they are dragged by the fluid. The deformation of the net, as well as the wake produced in the fluid stream, is well captured by the method, without requiring the fluid solver mesh to adapt to the evolving geometry.
Application of the same strategy to the simulation of elastic structures subject to the action of wind is also possible with the method presented, and one such application is currently under development.
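The lumped-mass representation with elastic lines and a fourth-order Runge-Kutta integrator described above can be illustrated in miniature. The following is a minimal sketch, not the authors' caffa3d.MBRi coupling: a one-dimensional chain of lumped masses joined by linear elastic segments, advanced with a classic RK4 step. All names, stiffnesses and time steps are illustrative assumptions.

```python
import numpy as np

def spring_forces(x, k, rest):
    # internal elastic forces of a 1-D chain of lumped masses
    f = np.zeros_like(x)
    for i in range(len(x) - 1):
        ext = (x[i + 1] - x[i]) - rest   # extension of segment i
        f[i] += k * ext
        f[i + 1] -= k * ext
    return f

def rk4_step(x, v, m, k, rest, dt):
    # classic fourth-order Runge-Kutta step for positions and velocities
    def acc(x):
        return spring_forces(x, k, rest) / m
    k1x, k1v = v, acc(x)
    k2x, k2v = v + 0.5 * dt * k1v, acc(x + 0.5 * dt * k1x)
    k3x, k3v = v + 0.5 * dt * k2v, acc(x + 0.5 * dt * k2x)
    k4x, k4v = v + dt * k3v, acc(x + dt * k3x)
    x_new = x + dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
    v_new = v + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x_new, v_new
```

In the full problem, drag, gravity, buoyancy and load would be added to the internal forces, and the two-way coupling would feed the reaction force back to the fluid solver.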

Keywords: computational fluid dynamics, discrete element method, fishnets, nested overlapping grids

Procedia PDF Downloads 417
352 Exploring the Rhinoceros Beetles of a Tropical Forest of Eastern Himalayas

Authors: Subhankar Kumar Sarkar

Abstract:

Beetles of the subfamily Dynastinae, under the family Scarabaeidae of the insect order Coleoptera, are popularly known as ‘rhinoceros beetles’ because of the characteristic horn borne by the males on their head. These horns are used in mating battles against other males and have evolved as a result of phenotypic plasticity. Scarabaeidae is the largest of all families under Coleoptera and is composed of 11 subfamilies, of which the subfamily Dynastinae is represented by approximately 300 species. Some of these beetles have been reported to cause considerable damage to agriculture and forestry in both their larval and adult stages, while many of them are beneficial, as they pollinate plants and recycle plant materials. The Eastern Himalayas are regarded as one of the 35 biodiversity hotspot zones of the world and one of the four of India, as exhibited by their rich and megadiverse tropical forests. However, our knowledge of the faunal diversity of these forests is very limited, particularly for the insect fauna. One such tropical forest of the Eastern Himalayas is the Buxa Tiger Reserve of India, located between latitudes 26°30′ and 26°55′ North and longitudes 89°20′ and 89°35′ East, occupying an area of about 759.26 square kilometers. It is against this background that an attempt has been made to explore the insect fauna of the forest. Insect sampling was carried out in each beat and range of Buxa Tiger Reserve in all three seasons, viz., premonsoon, monsoon, and postmonsoon. Sample collections were done by sweep nets, hand picking and pitfall traps. A UV light trap was used to collect nocturnal insects. Morphological examinations of the collected samples were carried out with stereozoom binocular microscopes (Zeiss SV6 and SV11), and specimens were identified to species level with the aid of relevant literature. The survey of the insect fauna of the forest resulted in the recognition of 76 scarab species, of which 8 belong to the subfamily dealt with herein.
Each of the 8 species represents a separate genus. The forest is dominated by Xylotrupes gideon (Linnaeus), which is represented by the highest number of individuals. The recorded taxa show about 12% endemism and are mainly Oriental in distribution. Premonsoon is the most favorable season for their occurrence and activity, followed by monsoon and postmonsoon.

Keywords: Dynastinae, Scarabaeidae, diversity, Buxa Tiger Reserve

Procedia PDF Downloads 190
351 A Comprehensive Framework for Fraud Prevention and Customer Feedback Classification in E-Commerce

Authors: Samhita Mummadi, Sree Divya Nagalli, Harshini Vemuri, Saketh Charan Nakka, Sumesh K. J.

Abstract:

One of the most significant challenges faced by people in today’s digital era is the alarming increase in fraudulent activities on online platforms. The appeal of online shopping as a way to avoid long queues in shopping malls, the availability of a variety of products, and home delivery of goods have paved the way for a rapid increase in vast online shopping platforms. This has had a major impact on increasing fraudulent activities as well. For instance, consider a store account that orders thousands of products all at once, where the massive number of items purchased and their transactions later turn out to be fraudulent, leading to a huge loss for the seller. Scenarios like these underscore the urgent need for machine learning approaches to combat fraud in online shopping. By leveraging robust algorithms, namely KNN, decision trees, and random forests, which are highly effective at generating accurate results, this research endeavors to discern patterns indicative of fraudulent behavior within transactional data. The primary motive and main focus is a comprehensive solution that empowers e-commerce administrators in timely fraud detection and prevention. In addition, sentiment analysis is harnessed in the model so that the e-commerce administrator can respond to customers’ concerns, feedback, and comments, allowing the administrator to improve the user experience. The ultimate objective of this study is to harden online shopping platforms against fraud and ensure a safer shopping experience. The paper reports a model accuracy of 84%. The findings and observations noted during this work lay the groundwork for future advancements in the development of more resilient and adaptive fraud detection systems, which will become crucial as technologies continue to evolve.
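A random-forest classifier of the kind named above can be sketched on synthetic transactional data. This is a generic illustration, not the authors' dataset, features or reported 84% model: the feature names and the fraud rule below are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
amount = rng.exponential(100.0, n)   # order value (hypothetical feature)
items = rng.poisson(3, n) + 1        # number of items in the order
age = rng.exponential(300.0, n)      # account age in days
# synthetic labelling rule: bulk orders from very young accounts are fraud
fraud = ((items > 6) & (age < 60)).astype(int)

X = np.column_stack([amount, items, age])
X_tr, X_te, y_tr, y_te = train_test_split(X, fraud, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

Note that real fraud data is heavily skewed toward the legitimate class (the "imbalanced classification" keyword below), so in practice precision and recall on the minority class are more informative than raw accuracy.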

Keywords: behavior analysis, feature selection, fraudulent pattern recognition, imbalanced classification, transactional anomalies

Procedia PDF Downloads 32
350 Impact of Sufism on Indian Cinema: A New Cultural Construct for Mediating Conflict

Authors: Ravi Chaturvedi, Ghanshyam Beniwal

Abstract:

Without going into much detail about the long history of Sufism in the world or the etymological definition of the word ‘Sufi’, it is sufficient to underline that the concept of Sufism was to focus mystic power on the spiritual dimension of Islam, with a view to shielding believers from the outward and unrealistic dogma of the faith. Sufis adopted a rather liberal view in propagating the religious order of Islam, suited to the cultural and social environment of the land. It is, in fact, a mission of the higher religious order of any faith, which disdains strife and conflict in any form. The joy of self-realization, being the essence of religion, is experienced after long spiritual practice. India had Sufi and Bhakti (devotion) traditions in Islam and Hinduism, respectively. Both the Sufi and Bhakti traditions were based on respect for different religions. Poorer and lower-caste Hindus and Muslims were greatly influenced by these traditions. Unlike the ulemas and Brahmans, the Sufi and Bhakti saints were highly tolerant and open to the truth in other faiths. They never adopted sectarian attitudes, were never involved in power struggles, and kept away from power structures. Sufism has been integrated with Indian cinema since its earliest days. In the earliest Bollywood movies, Sufism was represented in the form of qawwali, which made its way from dargahs (shrines). Mixing it with pop influences, Hindi movies began using Sufi music in a big way only in the current decade. Of late, songs with Sufi influences have become de rigueur in almost every film released, irrespective of genre, whether a romantic Gangster or a cerebral Corporate. ‘Sufi is in the DNA of the Indian sub-continent’, according to several contemporary filmmakers, critics, and spectators. The inherent theatricality motivates the performer of the Sufi rituals toward dramatic behavior.
The theatrical force of these stages of Sufi practice is so powerful that even the spectator cannot resist being moved. In a multi-cultural country like India, mediating streams have acquired a multi-layered importance in recent history. The second half of India’s post-colonial era has witnessed a regular chain of conflicting religio-political waves arising from various sectarian camps in the country, which have compelled counter-forces to activate in order to keep the spirit of a composite cultural ethos alive. The study has revealed that Sufi practice methodology is also being adapted for the inclusion of spirituality in life, on par with yoga practice. This paper, part of a research study, is an attempt to establish that Sufism in Indian cinema is one such mediating voice, active and alive throughout the length and breadth of the country, continuously bridging the gap between various religious and social factions, and that it has a significant role to play in the future as well.

Keywords: Indian cinema, mediating voice, Sufi, yoga practice

Procedia PDF Downloads 497
349 Adopting Data Science and Citizen Science to Explore the Development of African Indigenous Agricultural Knowledge Platform

Authors: Steven Sam, Ximena Schmidt, Hugh Dickinson, Jens Jensen

Abstract:

The goal of this study is to explore the potential of data science and citizen science approaches to develop an interactive, digital, open infrastructure that pulls together African indigenous agriculture and food systems data from multiple sources, making it accessible and reusable for policy, research and practice in modern food production efforts. The World Bank has recognised that African Indigenous Knowledge (AIK) is innovative and unique among local and subsistence smallholder farmers, and that it is central to sustainable food production and to enhancing biodiversity and natural resources in many poor, rural societies. AIK refers to tacit knowledge held in different languages, cultures and skills, passed down from generation to generation by word of mouth. AIK is a key driver of food production, preservation, and consumption for more than 80% of citizens in Africa, and it can therefore assist modern efforts to reduce food insecurity and hunger. However, the documentation and dissemination of AIK remain a big challenge confronting librarians and other information professionals in Africa, and there is a risk of losing AIK owing to urban migration, modernisation, land grabbing, and the emergence of relatively small-scale commercial farming businesses. There is also a clear disconnect between AIK and scientific knowledge and modern efforts for sustainable food production. The study combines data science and citizen science approaches through active community participation to generate and share AIK, facilitating learning and promoting knowledge that is relevant for policy intervention and sustainable food production through a curated digital platform based on FAIR principles. The study adopts key informant interviews along with a participatory photo and video elicitation approach, where farmers are given digital devices (mobile phones) to record and document their everyday practices involving agriculture, food production, processing, and consumption by traditional means.
Data collected are analysed using the UK Science and Technology Facilities Council’s proven methodology of citizen science (Zooniverse) and data science. Outcomes are presented in participatory stakeholder workshops, where the researchers outline plans for creating the platform and developing the knowledge sharing standard framework and copyrights agreement. Overall, the study shows that learning from AIK, by investigating what local communities know and have, can improve understanding of food production and consumption, in particular in times of stress or shocks affecting the food systems and communities. Thus, the platform can be useful for local populations, research, and policy-makers, and it could lead to transformative innovation in the food system, creating a fundamental shift in the way the North supports sustainable, modern food production efforts in Africa.

Keywords: Africa indigenous agriculture knowledge, citizen science, data science, sustainable food production, traditional food system

Procedia PDF Downloads 83
348 Deploying a Transformative Learning Model in Technological University Dublin to Assess Transversal Skills

Authors: Sandra Thompson, Paul Dervan

Abstract:

Ireland’s first technological university (TU Dublin) was established on 1 January 2019, and its creation is an exciting new milestone in Irish higher education. TU Dublin is now Ireland’s biggest university, supporting 29,000 students across three campuses with 3,500 staff. The University aspires to create work-ready graduates who are socially responsible, open-minded global thinkers, ambitious to change the world for the better. As graduates, they will be enterprising and daring in all their endeavors, ready to play their part in transforming the future. Feedback from Irish employers and students, coupled with evidence from other authoritative sources such as the World Economic Forum, points to a need for greater focus on the development of students’ employability skills as they prepare for today’s work environment. Moreover, with an increased focus on Universal Design for Learning (UDL) and inclusiveness, there is recognition that students are more than a numeric grade value. Robust grading systems have been developed to track a student’s performance in discipline knowledge, but there is little or no global consensus on a definition of transversal skills, nor on a unified framework to assess them. The education and industry sectors are often assessing one or two skills, and some are developing their own frameworks to capture the learner’s achievement in this area. TU Dublin has discovered and implemented a framework that allows students to develop, assess and record their transversal skills using transformative learning theory. The model implemented is an adaptation of the Student Transformative Learning Record (STLR), which originated at the University of Central Oklahoma (UCO). The purpose of this paper, therefore, is to examine the views of students, staff and employers in the context of deploying a transformative learning model within the University to assess transversal skills.
It will examine the initial impact the transformative learning model is having socially, personally, and on the University as an organization. Crucially, it will also identify lessons learned from the deployment in order to assist other universities and higher education institutes that may be considering a focused adoption of transformative learning to meet the challenge of preparing students for today’s work environment.

Keywords: assessing transversal skills, higher education, transformative learning, students

Procedia PDF Downloads 130
347 “Everything Everywhere All at Once”: Hollywoodization and Lack of Authenticity in Today’s Mainstream Cinema

Authors: Haniyeh Parhizkar

Abstract:

When Sarris formulated the "auteur theory" in 1962, he emphasized that the utmost premise of auteur theory is the inner meanings and concepts of a film, and that a film is purely an art form. Today's mainstream movies are conceptually closer to what the Frankfurt School scholars regarded years ago as "reproduced" and "mass culture". Hollywood continues to be a huge movie-making machine that sets the dominant paradigms of film throughout the world, and cinema is far from art. Although there are still movies, directors, and audiences who favor art cinema over Hollywood and mainstream movies, it is an almost undeniable fact that, for the most part, people's perception of movies is widely influenced by their American depiction and Hollywood's legacy of mass culture. With the rise of Hollywood studios as the forerunners of the movie industry, and with cinema largely dependent on economics rather than artistic values, the distinctive role of cinema has diminished and been replaced with a global standard. The blockbuster 2022 film 'Everything Everywhere All at Once' is now the most-awarded movie of all time, winning seven Oscars at the 95th Academy Awards. Despite its mainly Asian cast, the movie was produced by an American company and is heavily influenced by Hollywood's dominant themes of superheroes, fantasy, action, and adventure. The New Yorker film critic Richard Brody called the movie "a pitch for a Marvel" and critiqued the film for being "universalized" and "empty of history and culture". Critics at Variety pinpointed the movie's similarities to Marvel, particularly in their multiverse storylines, which manifest traces of the American legacy. As argued by these critics, 'Everything Everywhere All at Once' might appear a unique and authentic film at first glance, but it can be argued that it is yet another version of a Marvel movie.
While the movie's universal acclaim was regarded as recognition and acknowledgment of its Asian cast, the question that arises is whether, when Hollywood influences and American themes are so robust in the film, the movie industry is honoring another culture or merely celebrating Hollywood's dominant paradigm once again. This essay will employ a critical approach to Hollywood's dominance and mass-produced culture, which has deprived non-American movies of authenticity and is constantly reproducing the same formula of success.

Keywords: Hollywoodization, universalization, blockbuster, dominant paradigm, Marvel, authenticity, diversity

Procedia PDF Downloads 90
346 A Semantic and Concise Structure to Represent Human Actions

Authors: Tobias Strübing, Fatemeh Ziaeetabar

Abstract:

Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, a semantic framework is needed. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which considers the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information about static (e.g., top, bottom) and dynamic spatial relations (e.g., moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a massive set of the spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which ultimately involve two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a joint activity representation structure. For this purpose, a statistical analysis is needed to modify the current eSEC by summarizing it while preserving its features, yielding a new version called the enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row in a statistical way, to see whether a particular row can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the distinctness of the predefined manipulation actions.
Therefore, by performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on the interactions between humans and robots. It also creates a comprehensive platform for integration with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
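The row-removal side of the analysis described above can be mimicked in a few lines. This is a hypothetical sketch, not the authors' code: given one descriptor matrix per action, it checks which single rows can be dropped while all actions remain pairwise distinguishable.

```python
import numpy as np

def removable_rows(descriptors):
    """Return indices of rows that can be dropped from every descriptor
    matrix while keeping all actions pairwise distinguishable."""
    n_rows = descriptors[0].shape[0]
    removable = []
    for r in range(n_rows):
        keep = [i for i in range(n_rows) if i != r]
        reduced = [d[keep] for d in descriptors]
        # distinguishable if no two reduced matrices are identical
        distinct = all(
            not np.array_equal(a, b)
            for i, a in enumerate(reduced)
            for b in reduced[i + 1:]
        )
        if distinct:
            removable.append(r)
    return removable
```

A statistical version over many recorded instances per action, as in the abstract, would replace the exact-equality check with a measure of how often two actions collide after the reduction.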

Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis

Procedia PDF Downloads 126
345 DNA Nano Wires: A Charge Transfer Approach

Authors: S. Behnia, S. Fathizadeh, A. Akhshani

Abstract:

In recent decades, DNA has attracted increasing interest for potential technological applications not directly related to its coding for functional proteins, that is, the expression of genetic information. One of the most interesting applications of DNA is the construction of nanostructures of high complexity and the design of functional nanostructures in nanoelectronic devices, nanosensors and nanocircuits. In this field, DNA is of fundamental interest to the development of DNA-based molecular technologies, as it possesses ideal structural and molecular recognition properties for use in self-assembling nanodevices with a definite molecular architecture. Moreover, the robust, one-dimensional, flexible structure of DNA can be used to design electronic devices, serving as a wire, transistor switch, or rectifier depending on its electronic properties. In order to understand the mechanism of charge transport along DNA sequences, numerous studies have been carried out. In this regard, the conductivity properties of the DNA molecule can be investigated in a simple but chemically specific approach that is intimately related to the Su-Schrieffer-Heeger (SSH) model. In the SSH model, the dependence of the non-diagonal matrix elements on the intersite displacements is considered. In this approach, the coupling between the charge and the lattice deformation is along the helix. This model is a tight-binding linear nanoscale chain originally established to describe conductivity phenomena in doped polyacetylene. It is based on the assumption of a classical harmonic interaction between sites, which is linearly coupled to a tight-binding Hamiltonian. In this work, the Hamiltonian and the corresponding equations of motion are nonlinear and highly sensitive to initial conditions. We have therefore moved toward nonlinear dynamics and phase space analysis. Nonlinear dynamics and chaos theory, regardless of any approximation, could open new horizons for understanding the conductivity mechanism in DNA.
For a detailed study, we have investigated the current flowing in DNA and the characteristic I-V diagram. As a result, it is shown that there are (quasi-)ohmic regions in the I-V diagram. On the other hand, regions with a negative differential resistance (NDR) are also detectable in the diagram.
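As a rough illustration of the tight-binding picture behind the SSH description (the parameters below are illustrative and not taken from this work), a dimerized chain with alternating hopping amplitudes already shows the gapped spectrum on which such transport studies build:

```python
import numpy as np

def ssh_hamiltonian(n_sites, t=1.0, dt=0.2):
    """Tight-binding SSH chain with alternating hoppings t+dt and t-dt."""
    H = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        hop = t + dt if i % 2 == 0 else t - dt
        H[i, i + 1] = H[i + 1, i] = -hop
    return H

# Dimerized chain: the spectrum opens a gap of roughly 2|t1 - t2| = 4*dt
energies = np.linalg.eigvalsh(ssh_hamiltonian(100))
gap = energies[energies > 0].min() - energies[energies < 0].max()
```

The full model adds the harmonic lattice term and the charge-lattice coupling on top of this static electronic part, which is what makes the equations of motion nonlinear.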

Keywords: DNA conductivity, Landauer resistance, negative differential resistance, chaos theory, mean Lyapunov exponent

Procedia PDF Downloads 426
344 Bank Liquidity Creation in a Dual Banking System: An Empirical Investigation

Authors: Lianne M. Q. Lee, Mohammed Sharaf Shaiban

Abstract:

The importance of bank liquidity management took center stage as policy makers promoted a more resilient global banking system after the market turmoil of 2007. The growing recognition of Islamic banks’ function of intermediating funds in the economy warrants the need to investigate their balance sheet structure, which is distinct from that of their conventional counterparts. Given that asymmetric risk transformation is inevitable, Islamic banks need to identify the liquidity risk within their distinctive balance sheet structure. Thus, there is a strong need to quantify and assess the liquidity position to ensure the proper functioning of a financial institution; measuring bank liquidity is vital because liquid banks face less liquidity risk. We examine this issue using two alternative quantitative measures of liquidity creation, “cat fat” and “cat nonfat”, constructed by Berger and Bouwman (2009). “Cat fat” measures both on- and off-balance-sheet items, whilst “cat nonfat” measures only on-balance-sheet items. Liquidity creation is measured over the period 2007-2014 in 14 countries where Islamic and conventional commercial banks coexist, and also separately by bank size class, as empirical studies have shown that liquidity creation varies by bank size. An interesting and important finding is that all size classes of Islamic banks have, on average, increased their creation of aggregate liquidity in real dollar terms over the years for both liquidity creation measures, especially large banks, indicating that Islamic banks actually generate more liquidity for the economy than their conventional counterparts, including from off-balance-sheet items. The liquidity creation from off-balance-sheet items by conventional banks may have been affected by the global financial crisis, when derivatives markets were severely hit. 
The results also suggest that Islamic banks have a higher volume of assets and deposits, and that borrowing and bond issuance are lower in Islamic banks than in conventional banks because most such products are interest-based. As Islamic banks appear to create more liquidity than conventional banks under both measures, this indicates that the development of Islamic banking has been significant over the decades since its inception. This finding is encouraging as, despite Islamic banking’s overall size, it represents growth opportunities for these countries.
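Computationally, the “cat nonfat” measure reduces to a weighted sum over balance-sheet categories. The ±1/2 and 0 weights below follow Berger and Bouwman’s published scheme; the balance-sheet figures are invented purely for illustration:

```python
# Berger-Bouwman (2009) style weights: +1/2 for illiquid assets and
# liquid liabilities, 0 for semiliquid items, -1/2 for liquid assets
# and illiquid liabilities plus equity.
WEIGHTS = {
    "illiquid_assets": 0.5, "semiliquid_assets": 0.0, "liquid_assets": -0.5,
    "liquid_liabilities": 0.5, "semiliquid_liabilities": 0.0,
    "illiquid_liabilities_and_equity": -0.5,
}

def liquidity_creation(balance_sheet):
    """'Cat nonfat': on-balance-sheet liquidity creation in dollar terms."""
    return sum(WEIGHTS[k] * v for k, v in balance_sheet.items())

# Hypothetical bank (figures in $m): funds illiquid loans with deposits
bank = {
    "illiquid_assets": 600, "semiliquid_assets": 200, "liquid_assets": 200,
    "liquid_liabilities": 700, "semiliquid_liabilities": 100,
    "illiquid_liabilities_and_equity": 200,
}
lc = liquidity_creation(bank)  # 0.5*(600 + 700) - 0.5*(200 + 200) = 450
```

The “cat fat” variant adds off-balance-sheet items (e.g. loan commitments) with their own weights to the same sum.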

Keywords: financial institution, liquidity creation, liquidity risk, policy and regulation

Procedia PDF Downloads 353
343 Turn-Taking and Leading Roles in Early Cognition: Interaction of Social Cognition and Language in Development

Authors: Zsuzsanna Schnell, Francesca Ervas

Abstract:

Background: Our study aims to clarify how language fosters further cognitive development, how we eventually arrive at the complex, human-specific skill of pragmatic competence, and what levels of mentalization and theory of mind are in place before language. Method: Our experimental pragmatic investigation maps the interaction of mentalization and pragmatic competence. We map the different levels of mentalization that empower different levels of pragmatic meaning construction and evaluate the results with statistical analysis (Mann-Whitney and ANOVA). Analyzing the comprehension of literal and non-compositional (figurative) utterances, we apply linguistic trials, among them metaphor, irony, irony with surface cue, humor, and recognition of maxim infringements, in neurotypical (NT) preschoolers with a coherent and comparative methodology. Results: The findings reveal the relationship and direction of interaction between language and theory of mind: on the one hand, social-cognitive skills enhance, facilitate and provide a basis for language acquisition, and in return linguistic structures (DeVilliers 2000, 2007) provide a framework for the further development of mentalizing skills. Conclusions: The findings confirm that this scaffolding becomes a mutually supportive system in which language and social cognition develop in interaction. Certain stages in ToM development serve as precursors to understanding grammatically complex sentences, like embedded phrases, which mirror embedded mental states; this, in turn, facilitates the development of pragmatic competence, thus the social use of language, integrating social, cognitive, linguistic and psychological factors in discourse. Future implications: Our investigation functions as a differential-diagnostic measure, with typically developing results serving as a baseline in further empirical research on atypical cases. 
This enables the study of populations where language and ToM development is disturbed, reveals how language and ToM are acquired and interact, and gives an insight into what this has to do with clinical symptoms. This in turn can reveal the causal link to the syndrome at hand, which can set directions for therapeutic development and training.
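The Mann-Whitney comparison used in the analysis above is a rank-based test; a minimal sketch of computing its U statistic (the group scores here are invented, not the study’s data) is:

```python
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic via rank sums (ties get average ranks)."""
    combined = np.concatenate([x, y])
    order = combined.argsort()
    ranks = np.empty(len(combined))
    ranks[order] = np.arange(1, len(combined) + 1)
    # average the ranks of tied values
    for v in np.unique(combined):
        mask = combined == v
        ranks[mask] = ranks[mask].mean()
    r1 = ranks[: len(x)].sum()
    u1 = r1 - len(x) * (len(x) + 1) / 2
    u2 = len(x) * len(y) - u1
    return min(u1, u2)

# Illustrative comprehension scores for two hypothetical groups of children
group_a = np.array([12, 15, 11, 14, 13])
group_b = np.array([8, 9, 10, 7, 12])
u = mann_whitney_u(group_a, group_b)  # small U -> groups differ strongly
```

In practice one would use a library routine (e.g. SciPy’s `mannwhitneyu`) that also supplies the p-value; the sketch shows only the statistic itself.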

Keywords: theory of mind, language development, mentalization, language philosophy, experimental pragmatics

Procedia PDF Downloads 32
342 Gestalt in Music and Brain: A Non-Linear Chaos Based Study with Detrended/Adaptive Fractal Analysis

Authors: Shankha Sanyal, Archi Banerjee, Sayan Biswas, Sourya Sengupta, Sayan Nag, Ranjan Sengupta, Dipak Ghosh

Abstract:

The term ‘gestalt’ has been widely used in the field of psychology to describe the tendency of the human mind to perceive any object not in parts but as a 'unified' whole. Music, in general, is polyphonic, i.e. a combination of a number of pure tones (frequencies) mixed together in a manner that sounds harmonious. The study of the human brain's response to different frequency groups of an acoustic signal can give us excellent insight into the neural and functional architecture of brain functions. Hence, the study of music cognition using neuro-biosensors is a rapidly emerging field of research. In this work, we have analyzed the effect of different frequency bands of music on the various frequency rhythms of the human brain obtained from EEG data. Four widely popular Rabindrasangeet clips were subjected to the wavelet transform method to extract five resonant frequency bands from the original music signal. These frequency bands were initially analyzed with Detrended/Adaptive Fractal Analysis (DFA/AFA) methods. A listening test was conducted on a pool of 100 respondents to assess the frequency band in which the music becomes non-recognizable. Next, these resonant frequency bands were presented to 20 subjects as auditory stimuli, and EEG signals were recorded simultaneously at 19 different locations on the scalp. The recorded EEG signals were noise-cleaned and again subjected to the DFA/AFA technique in the alpha, theta and gamma frequency ranges. Thus, we obtained the scaling exponents from the two methods in the alpha, theta and gamma EEG rhythms corresponding to the different frequency bands of music. From the analysis of the music signal, it is seen that loss of recognition is proportional to the loss of long-range correlation in the signal. From the EEG analysis, we obtain frequency-specific arousal-based responses in different lobes of the brain as well as in specific EEG bands corresponding to the musical stimuli. 
In this way, we look to identify a specific frequency band beyond which the music becomes non-recognizable and below which, even in the absence of the other bands, the music remains perceivable to the audience. This finding can be of immense importance in the field of cognitive music therapy and to researchers of creativity.
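The DFA step used on both the music bands and the EEG rhythms can be sketched as follows. This minimal version (linear detrending, fixed box sizes, and synthetic white noise rather than the study’s signals) recovers a scaling exponent near 0.5, the value expected for uncorrelated noise; long-range-correlated signals yield exponents above 0.5:

```python
import numpy as np

def dfa_exponent(signal, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(s) vs log s."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated profile
    flucts = []
    for s in scales:
        n_boxes = len(profile) // s
        rms = []
        for b in range(n_boxes):
            seg = profile[b * s:(b + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(4096))  # white noise -> ~0.5
```

The adaptive variant (AFA) replaces the piecewise linear fits with smoothly stitched polynomial trends but follows the same log-log scaling logic.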

Keywords: AFA, DFA, EEG, gestalt in music, Hurst exponent

Procedia PDF Downloads 332
341 Task Based Functional Connectivity within Reward Network in Food Image Viewing Paradigm Using Functional MRI

Authors: Preetham Shankapal, Jill King, Kori Murray, Corby Martin, Paula Giselman, Jason Hicks, Owen Carmicheal

Abstract:

Activation of reward and satiety networks in the brain while processing palatable food cues, as well as functional connectivity during rest, has been studied using functional Magnetic Resonance Imaging in various obesity phenotypes. However, functional connectivity within the reward and satiety networks during food cue processing is understudied. Fourteen obese individuals underwent two fMRI scans while viewing Macronutrient Picture System images. Each scan included two blocks of images of High Sugar/High Fat (HSHF), High Carbohydrate/High Fat (HCHF), and Low Sugar/Low Fat (LSLF) foods, as well as non-food images. Seed voxels within seven food-reward-relevant ROIs (the insula, putamen, and the cingulate, precentral, parahippocampal, medial frontal and superior temporal gyri) were isolated based on a prior meta-analysis. Beta series correlation for task-related functional connectivity between these seed voxels and the rest of the brain was computed. Voxel-level differences in functional connectivity were calculated between the first and second scans; between individuals who saw novel (N=7) vs. repeated (N=7) images in the second scan; and between the HCHF and HSHF blocks vs. the LSLF and non-food blocks. The analysis showed that during food image viewing, reward network ROIs showed significant functional connectivity with each other and with other regions responsible for attentional and motor control, including the inferior parietal lobe and precentral gyrus. These functional connectivity values were heightened among individuals who viewed novel HSHF images in the second scan. In the second scan session, functional connectivity was reduced within the reward network but increased within attention, memory and recognition regions, suggesting habituation to reward properties and increased recollection of previously viewed images. 
In conclusion, it can be inferred that functional connectivity within the reward network, and between the reward network and other brain regions, varies with important experimental conditions during food photograph viewing, including habituation to the foods shown.
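Computationally, beta series correlation reduces to correlating trial-wise beta estimates across regions. A toy sketch with synthetic beta series (three invented ROIs sharing a common task modulation, not the study’s data) is:

```python
import numpy as np

def beta_series_connectivity(betas):
    """Correlate trial-wise beta estimates across ROIs.

    betas: array of shape (n_trials, n_rois); returns an ROI x ROI
    correlation matrix, the beta-series connectivity estimate.
    """
    return np.corrcoef(betas, rowvar=False)

rng = np.random.default_rng(1)
shared = rng.standard_normal(40)                  # common task modulation
betas = np.column_stack([
    shared + 0.3 * rng.standard_normal(40),       # hypothetical ROI 1
    shared + 0.3 * rng.standard_normal(40),       # hypothetical ROI 2
    rng.standard_normal(40),                      # unrelated region
])
conn = beta_series_connectivity(betas)
```

In the real pipeline the betas come from a trial-wise GLM fit to the fMRI time series; the correlation step itself is exactly this.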

Keywords: fMRI, functional connectivity, task-based, beta series correlation

Procedia PDF Downloads 273
340 The Impact of the Application of Blockchain Technology in Accounting and Auditing

Authors: Yusuf Adebayo Oduwole

Abstract:

The evaluation of blockchain technology's potential effects on the accounting and auditing fields is the main objective of this essay. It also adds to the existing body of work by examining how these practices alter technological concerns, including cryptocurrency accounting, regulation, governance, accounting practices, and technical challenges. Examples of this advancement include the growth of the concept of blockchain and its application in accounting. This technology is considered one of the digital revolutions that could disrupt the world and civilization, as it can transfer large volumes of virtual currencies, such as cryptocurrencies, without the help of a third party. The basis for this research is a systematic review of articles using VOSviewer to display and reflect on the bibliometric information of the articles available in the Scopus database. Also, as the practice of using blockchain technology in accounting and auditing is still in its infancy, it may be useful to carry out a more thorough analysis of any implications for accounting and auditing regarding aspects of governance, regulation, and cryptocurrency that have not yet been discussed or addressed to any significant extent. The main findings on the relationship between blockchain and accounting show that the application of smart contracts, such as triple-entry accounting, has increased the quality of accounting records as well as the reliability of the information available. This results in fewer cyclical assignments, no need for reconciliation, and real-time accounting, among others. To integrate blockchain into a computer system, one must therefore continuously learn when using blockchain-integrated accounting software, including learning how cryptocurrencies are accounted for and regulated. In this study, three original contributions are presented. 
First, to offer a transparent view of the state of previous relevant studies and research works in accounting and auditing that focus on blockchain, it uses bibliometric visibility analysis and a Scopus narrative analysis. Second, it highlights legislative, governance, and ethical concerns, including education, in tackling the use of blockchain in accounting and auditing. Lastly, it examines the impact of blockchain technologies on the accounting recognition of cryptocurrencies. Users of the technology should, therefore, take their time to learn how it works and keep abreast of new developments. In addition, the accounting industry must integrate blockchain certification and practice, most likely offline or as part of university education for those intending to become auditors or accountants.
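To make the tamper-evidence property behind these claims concrete, here is a minimal hash-chained ledger sketch (a toy illustration of the mechanism, not an accounting system): each entry commits to the previous entry's hash, so altering any posted record invalidates the whole chain from that point on.

```python
import hashlib
import json

def entry_hash(payload):
    """Deterministic SHA-256 digest of a ledger payload."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_entry(ledger, record):
    """Append a record, chaining it to the previous entry's hash."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"record": record, "prev_hash": prev}
    entry["hash"] = entry_hash({"record": record, "prev_hash": prev})
    ledger.append(entry)
    return ledger

def verify(ledger):
    """Recompute every hash; any tampered entry breaks the chain."""
    prev = "0" * 64
    for e in ledger:
        recomputed = entry_hash({"record": e["record"], "prev_hash": e["prev_hash"]})
        if e["prev_hash"] != prev or e["hash"] != recomputed:
            return False
        prev = e["hash"]
    return True

ledger = []
append_entry(ledger, {"debit": "cash", "credit": "sales", "amount": 100})
append_entry(ledger, {"debit": "inventory", "credit": "cash", "amount": 40})
```

Triple-entry accounting builds on this idea: the cryptographically sealed shared entry acts as the third record alongside each party's own books.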

Keywords: blockchain, crypto assets, governance, regulation, smart contracts

Procedia PDF Downloads 30
339 Interior Architecture in the Anthropocene: Engaging the Subnature through the Intensification of Body-Surface Interaction

Authors: Verarisa Ujung

Abstract:

The Anthropocene, which scientists define as a new geological epoch in which human intervention has the dominant influence on geological, atmospheric, and ecological processes, challenges contemporary discourse in architecture and the interior. This dominant influence is characterised by the inability to distinguish the notions of nature, subnature, human and non-human. Consequently, living in the Anthropocene demands sensitivity and responsiveness to heighten our sense of the rhythm of transformation and the recognition of our environment as a product of natural, social and historical processes. The notion of subnature is particularly emphasised in this paper to investigate the poetic sense of living with subnature; it can be employed as a critical tool for exploring the aesthetic and programmatic implications of subnature on interiority. The ephemeral immateriality attached to subnature promotes a sense of atmospheric delineation of interiority, the very inner significance of body-surface interaction, which is central to interior architecture discourse. This would then reflect human activities and examine transformative change, architectural motion and the traces left between moments. In this way, engaging the notion of subnature enables us to better understand the critical subject of interiority and might provide an in-depth study of interior architecture. Incorporating exploration of the form, materiality, and pattern of subnature, this research seeks to grasp the inner significance of micro-to-macro approaches, so that the future of the interior might come to depend more on the investigation and development of responsive environments. To reflect upon the form, materiality and intensity of subnature specifically characterised by natural, social and historical processes, this research examines a volcanic land, White Island/Whakaari, New Zealand, as the chosen site of investigation. 
Emitting various forms and intensities of subnature (smoke, mud, sulphur gas), this volcanic land is also open to new inhabitation within the sulphur factory ruins that reflect past human occupation. In this way, selected temporal and natural manifestations of materiality, artefact, and performance can be traced out and might reveal meaningful relations among space, inhabitation, and the well-being of inhabitants in the Anthropocene.

Keywords: anthropocene, body, intensification, intensity, interior architecture, subnature, surface

Procedia PDF Downloads 176
338 Fast Estimation of Fractional Process Parameters in Rough Financial Models Using Artificial Intelligence

Authors: Dávid Kovács, Bálint Csanády, Dániel Boros, Iván Ivkovic, Lóránt Nagy, Dalma Tóth-Lakits, László Márkus, András Lukács

Abstract:

The modeling practice of financial instruments has seen significant change over the last decade due to the recognition of time-dependent and stochastically changing correlations among the market prices or the prices and market characteristics. To represent this phenomenon, the Stochastic Correlation Process (SCP) has come to the fore in the joint modeling of prices, offering a more nuanced description of their interdependence. This approach has allowed for the attainment of realistic tail dependencies, highlighting that prices tend to synchronize more during intense or volatile trading periods, resulting in stronger correlations. Evidence in statistical literature suggests that, similarly to the volatility, the SCP of certain stock prices follows rough paths, which can be described using fractional differential equations. However, estimating parameters for these equations often involves complex and computation-intensive algorithms, creating a necessity for alternative solutions. In this regard, the Fractional Ornstein-Uhlenbeck (fOU) process from the family of fractional processes offers a promising path. We can effectively describe the rough SCP by utilizing certain transformations of the fOU. We employed neural networks to understand the behavior of these processes. We had to develop a fast algorithm to generate a valid and suitably large sample from the appropriate process to train the network. With an extensive training set, the neural network can estimate the process parameters accurately and efficiently. Although the initial focus was the fOU, the resulting model displayed broader applicability, thus paving the way for further investigation of other processes in the realm of financial mathematics. The utility of SCP extends beyond its immediate application. It also serves as a springboard for a deeper exploration of fractional processes and for extending existing models that use ordinary Wiener processes to fractional scenarios. 
In essence, deploying both SCP and fractional processes in financial models provides new, more accurate ways to depict market dynamics.
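A minimal simulation of the fOU training data described above might look like the following. The Cholesky route is the simple, O(n³) option rather than the fast generator the authors developed, and all parameters here are illustrative:

```python
import numpy as np

def fgn_increments(n, hurst, dt, rng):
    """Fractional Gaussian noise via Cholesky of the fGN covariance."""
    k = np.arange(n)
    # autocovariance of unit-step fGN at lags 0..n-1
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                   - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for stability
    return (dt ** hurst) * (L @ rng.standard_normal(n))

def simulate_fou(n, hurst, theta, sigma, dt, rng):
    """Euler scheme for dX = -theta * X dt + sigma dB^H, with X_0 = 0."""
    db = fgn_increments(n, hurst, dt, rng)
    x = np.zeros(n + 1)
    for i in range(n):
        x[i + 1] = x[i] - theta * x[i] * dt + sigma * db[i]
    return x

rng = np.random.default_rng(2)
# Low Hurst exponent gives the "rough" paths discussed above
path = simulate_fou(n=500, hurst=0.1, theta=1.0, sigma=0.5, dt=0.01, rng=rng)
```

In the paper's setup, many such paths would be generated and fed to a neural network that learns to map a path to its (hurst, theta, sigma) parameters.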

Keywords: fractional Ornstein-Uhlenbeck process, fractional stochastic processes, Heston model, neural networks, stochastic correlation, stochastic differential equations, stochastic volatility

Procedia PDF Downloads 120
337 Preliminary Study of Gold Nanostars/Enhanced Filter for Keratitis Microorganism Raman Fingerprint Analysis

Authors: Chi-Chang Lin, Jian-Rong Wu, Jiun-Yan Chiu

Abstract:

Myopia, a ubiquitous condition that requires eyesight correction with optical lenses, troubles many people in their daily lives. In recent years, younger people have taken a growing interest in contact lenses because of their convenience and aesthetics. Clinically, the risk of eye infection increases owing to incorrect contact lens use and unsupervised cleaning, which raise the risk of infection of the cornea, known as ocular keratitis. In order to meet these identification needs, new detection or analysis methods offering rapid and more accurate identification of clinical microorganisms are greatly needed. In our study, we take advantage of Raman spectroscopy, which yields a unique fingerprint for different functional groups, as a distinct and fast examination tool for microorganisms. As is well known, Raman scattering signals are normally too weak for detection, especially in the biological field. Here, we applied special SERS enhancement substrates to generate stronger Raman signals. The SERS filter designed in this article was prepared by depositing silver nanoparticles directly onto a cellulose filter surface, and suspension nanoparticles, gold nanostars (AuNSs), were also introduced to achieve better enhancement for low-concentration analytes (i.e., various bacteria). The research also focuses on the shape effect of the synthetic AuNSs, whose needle-like surface morphology may create more hot-spots and hence higher SERS enhancement. We utilized the newly designed SERS technology to distinguish the bacteria causing ocular keratitis at the strain level, and specific Raman and SERS fingerprints were grouped by a pattern recognition process. We report a new method combining different SERS substrates that can be applied to clinical microorganism detection at the strain level with simple, rapid preparation and low cost. 
The presented SERS technology not only shows great potential for clinical bacteria detection but can also be used for environmental pollution and food safety analysis.

Keywords: bacteria, gold nanostars, Raman spectroscopy, surface-enhanced Raman scattering filter

Procedia PDF Downloads 169
336 Glyco-Biosensing as a Novel Tool for Prostate Cancer Early-Stage Diagnosis

Authors: Pavel Damborsky, Martina Zamorova, Jaroslav Katrlik

Abstract:

Prostate cancer is annually the most common newly diagnosed cancer among men. An extensive body of evidence suggests that the traditional serum prostate-specific antigen (PSA) assay still suffers from a lack of sufficient specificity and sensitivity, resulting in vast over-diagnosis and overtreatment. Thus, early-stage detection of prostate cancer (PCa) undisputedly plays a critical role in successful treatment and improved quality of life. Over the last decade, particular altered glycans have been described that are associated with a range of chronic diseases, including cancer and inflammation. These glycan differences enable a distinction to be made between physiological and pathological states and suggest a valuable biosensing tool for diagnostic and follow-up purposes. Aberrant glycosylation is one of the major characteristics of disease progression. Consequently, the aim of this study was to develop a more reliable tool for early-stage PCa diagnosis employing lectins as glyco-recognition elements. Biosensor and biochip technology employing lectin-based glyco-profiling is one of the most promising strategies aimed at providing fast and efficient analysis of glycoproteins. Proof-of-concept experiments based on a sandwich assay employing an anti-PSA antibody and an aptamer as capture molecules, followed by lectin glycoprofiling, were performed. We present a lectin-based biosensing assay for glycoprofiling of the serum biomarker PSA using different biosensor and biochip platforms, such as label-free surface plasmon resonance (SPR) and fluorescently labeled microarrays. The results suggest significant differences in the interaction of particular lectins with PSA. Antibody-based assays are frequently associated with sensitivity, reproducibility, and cross-reactivity issues. Aptamers provide remarkable advantages over antibodies owing to their nucleic acid origin, stability and lack of glycosylation. 
All these data are a further step toward the construction of highly selective, sensitive and reliable sensors for early-stage diagnosis. The experimental set-up also holds promise for the development of comparable assays for other glycosylated disease biomarkers.

Keywords: biomarker, glycosylation, lectin, prostate cancer

Procedia PDF Downloads 407
335 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set, selected randomly from a single district. Each speaker has ten sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and for each test sentence, and classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and those from the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR), while testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short training and test sequences as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, and the algorithm produces ~93% accuracy at 0 dB SNR.
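The matching pursuit transform at the heart of this pipeline can be sketched in a few lines. Here a random unit-norm dictionary stands in for the learned Gabor/envelope dictionary, and the two-atom test signal is synthetic:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching pursuit over unit-norm dictionary columns.

    Returns the chosen atom indices, their weights, and the residual.
    """
    residual = signal.astype(float).copy()
    indices, weights = [], []
    for _ in range(n_atoms):
        corr = dictionary.T @ residual          # correlate with every atom
        k = int(np.argmax(np.abs(corr)))        # best-matching atom
        indices.append(k)
        weights.append(corr[k])
        residual -= corr[k] * dictionary[:, k]  # subtract its contribution
    return indices, weights, residual

rng = np.random.default_rng(3)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
signal = 2.0 * D[:, 5] - 1.5 * D[:, 40]         # sparse ground truth
idx, w, res = matching_pursuit(signal, D, n_atoms=10)
```

The sparse index/weight pairs correspond to the populated positions of the T-F vector described above; the classification stage then works on the statistics of those indices rather than on the raw waveform.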

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 290
334 The Comparative Study of Attitudes toward Entrepreneurial Intention between ASEAN and Europe: An Analysis Using GEM Data

Authors: Suchart Tripopsakul

Abstract:

This paper uses data from the Global Entrepreneurship Monitor (GEM) to investigate differences in attitudes towards entrepreneurial intention (EI), which is generally assumed to be the single most relevant predictor of entrepreneurial behavior. The aim of this paper is to examine the effect of a range of attitudes on an individual's intent to start a new venture. A cross-cultural comparison between Asia and Europe is used to further investigate possible differences between potential entrepreneurs from these distinct national contexts. The empirical analysis uses a GEM data set covering 10 countries (n = 10,306) collected in 2013. Logistic regression is used to investigate the effect of individuals' attitudes on EI. Independent variables include perceived capabilities, the ability to recognize business opportunities, entrepreneurial networks, risk perceptions, and a range of socio-cultural attitudes. The cross-cultural comparison of the model covers six ASEAN nations (Malaysia, Indonesia, the Philippines, Singapore, Vietnam and Thailand) and four European nations (Spain, Sweden, Germany, and the United Kingdom). The findings support the relationship between individuals' attitudes and their entrepreneurial intention: capability, opportunity recognition, networks and a range of socio-cultural perceptions all influence EI significantly. Interestingly, the impact of media attention on entrepreneurship was found to influence EI in ASEAN, but not in Europe; conversely, fear of failure was found to influence EI in Europe, but not in ASEAN. 
Moreover, ASEAN entrepreneurs' resistance to fear of failure, together with the high impact of media attention, is proposed as an explanation for the relatively high rates of entrepreneurial activity in ASEAN reported by GEM. The paper utilizes a representative sample of 10,306 individuals in 10 countries. A range of attitudes was found to significantly influence entrepreneurial intention, and many of these perceptions, such as the impact of media attention on entrepreneurship, can be shaped by government policy. The paper therefore suggests strategies by which Asian economies in particular can benefit from the apparently high impact of media attention on entrepreneurship.
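The logistic regression underlying this analysis can be sketched with plain gradient descent on synthetic survey-style data. The predictors and coefficients below are invented stand-ins for the GEM variables, not the study's estimates:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted P(intent = 1)
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of the log-loss
    return w

rng = np.random.default_rng(4)
n = 400
# Hypothetical binary predictors, GEM-survey style: perceived capability,
# opportunity recognition, knows-an-entrepreneur
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit = -2.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
w = fit_logistic(X, y)   # fitted coefficients should recover positive effects
```

Cross-group comparisons like ASEAN vs. Europe then amount to fitting this model separately per group (or adding interaction terms) and comparing coefficient significance.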

Keywords: entrepreneurial intention, attitude, GEM, ASEAN and Europe

Procedia PDF Downloads 314
333 The Influence of Alvar Aalto on the Early Work of Álvaro Siza

Authors: Eduardo Jorge Cabral dos Santos Fernandes

Abstract:

The expression ‘Porto School’, usually associated with an educational institution, the School of Fine Arts of Porto, was first applied in the sense of an architectural trend by Nuno Portas in a text published in 1983. The expression is used to characterize a set of works by Porto architects in which common elements are found, namely the desire to reuse the languages and forms of German and Dutch rationalism of the twenties, using the work of Alvar Aalto as a mediation for the reinterpretation of these models. In the same year, in a text published in Jornal de Letras, Artes e Ideias, Álvaro Siza described the Finnish architect as a miscegenation agent who transforms experienced models and introduces them to different realities. The influence of foreign models and their adaptation to the context has been a recurrent theme in Portuguese architecture, which at this time found important contributions in the writings of Alexandre Alves Costa. However, the identification of these characteristics in Siza's work is not limited to Portuguese theoretical production: it is the recognition of this attitude towards the context that leads Kenneth Frampton to include Siza in the restricted group of architects who embody Critical Regionalism in his book Modern Architecture: A Critical History. For Frampton, Siza's work focuses on the territory and on the consequences of intervention in the context, viewing architecture as a tectonic fact rather than a series of scenographic episodes and emphasizing site-specific aspects (topography, light, climate). 
Therefore, the motto of this paper is the dichotomous opposition between foreign influences and adaptation to the context in the early work of Álvaro Siza (designed in the sixties) in which the influence (theoretical, methodological, and formal) of Alvar Aalto manifests itself in the form and the language: the pool at Quinta da Conceição, the Seaside Pools and the Tea House (three works in Leça da Palmeira) and the Lordelo Cooperative (in Porto). This work is part of a more comprehensive project, which considers several case studies throughout the Portuguese architect's vast career, built in Portugal and abroad, in order to obtain a holistic view.

Keywords: Alvar Aalto, Álvaro Siza, foreign influences, adaptation to the context

Procedia PDF Downloads 37
332 Human Rights in the United States: Challenges and Lessons from the Period 1948-2018

Authors: Mary Carmen Peloche Barrera

Abstract:

Since its early years as an independent nation, the United States has been one of the main promoters of the recognition, legislation, and protection of human rights. In the matter of freedom, the founding father Thomas Jefferson envisioned the role of the U.S. as a defender of freedom and equality throughout the world. This founding ideal shaped America's domestic and foreign policy in the 19th and 20th centuries and became an aspiration of the country to expand its values and institutions. The history of the emergence of human rights cannot be studied without reference to leaders such as Woodrow Wilson, Franklin and Eleanor Roosevelt, and Martin Luther King. Throughout its history, this country has proclaimed that the protection of the freedoms of men, both inside and outside its borders, is practically the reason for its existence. Although the United States was one of the first countries to recognize the existence of inalienable individual rights, as well as the main promoter of the Universal Declaration of Human Rights of 1948, the country has gone through critical moments that have led to questioning its commitment to the issue. Racial segregation, international military interventions, national security strategy, as well as national legislation on immigration, are some of the most controversial issues related to decisions and actions driven by the United States, which at the same time are at odds with its role as an advocate of human rights, both in the Americas and in the rest of the world. The aim of this paper is to study the swings in the efforts and commitments of the United States towards human rights. The paper will analyze the history and evolution of human rights in the United States in order to study the greatest challenges for the country in this matter, focusing on both domestic policy (related to demographic issues) and foreign policy (its role in a post-war world). 
Currently, more countries are joining multilateral efforts for the promotion and protection of human rights. At the same time, the United States is one of the least committed countries in this respect, having ratified only 5 of the 18 treaties emanating from the United Nations. The last ratification took place in 2002 and, since then, the country has been rapidly losing ground in its credibility and, even worse, in its role as leader of 'the free world'. With or without the United States, the protection of human rights should remain the main goal of the international community.

Keywords: United States, human rights, foreign policy, domestic policy

Procedia PDF Downloads 120
331 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has been gradually shifting from the semantics-based approach to a deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, owing to its lack of semantic understanding, has difficulty recognizing and expressing a novel business case within a pre-defined scope. In order to meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time-consuming. It is very difficult to improve the capabilities of conversational AI in a short time, and it is even more difficult to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which the chatbot (conversational AI) delivers a given set of business cases, it is prompted to self-measure its performance and reconsider every unknown dialog flow in order to improve the service by retraining with those new business cases. If the training process reaches a bottleneck and runs into difficulties, human personnel are informed and may intervene with further instructions: they may retrain the chatbot with newly configured programs, or with new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establishes the necessary ontology for the new service. 
With the newly learned programs, it completes the understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chatbot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture. The chatbot serves as a concierge offering polite conversation for visitors. As a proof of concept, we demonstrated completion of 90% of reception services with limited self-learning capability.
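The self-learning cycle described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' implementation; the class and method names, and the bottleneck threshold, are our own assumptions. The bot logs unknown dialog flows during service, then either retrains on them or escalates to a human when a bottleneck threshold is exceeded.

```python
from dataclasses import dataclass, field

@dataclass
class SelfLearningBot:
    """Hypothetical sketch of the paper's self-measure/retrain cycle."""
    known_flows: set = field(default_factory=set)
    unknown_log: list = field(default_factory=list)

    def handle(self, utterance: str) -> bool:
        """Return True if the dialog flow is already known; otherwise
        trap the unknown flow for later self-training."""
        if utterance in self.known_flows:
            return True
        self.unknown_log.append(utterance)
        return False

    def cycle(self, max_failures: int = 3) -> str:
        """One self-measurement cycle: retrain on logged unknowns, or
        escalate to human personnel when a bottleneck is hit."""
        if len(self.unknown_log) > max_failures:
            return "escalate_to_human"   # bottleneck: ask for new programs
        self.known_flows.update(self.unknown_log)  # self-train new cases
        self.unknown_log.clear()
        return "retrained"

bot = SelfLearningBot(known_flows={"greet", "directions"})
bot.handle("greet")        # known flow, served directly
bot.handle("book a room")  # unknown flow, logged
print(bot.cycle())
print(bot.handle("book a room"))
```

After one cycle the previously unknown flow is handled as a known case, mirroring how the paper's chatbot adapts to business cases originally out of scope.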

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 136
330 Molecular Diagnosis of a Virus Associated with Red Tip Disease and Its Detection by Non Destructive Sensor in Pineapple (Ananas comosus)

Authors: A. K. Faizah, G. Vadamalai, S. K. Balasundram, W. L. Lim

Abstract:

Pineapple (Ananas comosus) is a common crop in tropical and subtropical areas of the world. Malaysia once ranked among the top three pineapple producers in the world in the 1960s and early 1970s, after Hawaii and Brazil. Moreover, the government recognized the pineapple crop as one of the priority commodities to be developed for the domestic and international markets in the National Agriculture Policy. However, the pineapple industry in Malaysia still faces numerous challenges, one of which is the management of disease and pests. Red tip disease on pineapple was first recognized about 20 years ago in a commercial pineapple stand located in Simpang Renggam, Johor, Peninsular Malaysia. Since its discovery, the causal agent of this disease has not been confirmed. The epidemiology of red tip disease is still not fully understood. Nevertheless, the disease symptoms and the spread within the field seem to point toward viral infection. A bioassay test on nucleic acid extracted from red tip-affected pineapple was performed on Nicotiana tabacum cv. Coker by rubbing the extracted sap. Localised lesions were observed 3 weeks after inoculation. Negative staining of the freshly inoculated Nicotiana tabacum cv. Coker showed the presence of membrane-bound spherical particles with an average diameter of 94.25 nm under the transmission electron microscope. The shape and size of the particles were similar to those of a tospovirus. SDS-PAGE analysis of partially purified virions from inoculated N. tabacum produced a strong and a faint protein band with molecular masses of approximately 29 kDa and 55 kDa. Partially purified virions from symptomatic pineapple leaves collected in the field showed bands with molecular masses of approximately 29 kDa, 39 kDa and 55 kDa. These bands may indicate the nucleocapsid protein identity of a tospovirus. 
Furthermore, a handheld sensor, GreenSeeker, was used to detect red tip symptoms on pineapple non-destructively based on spectral reflectance, measured as the Normalized Difference Vegetation Index (NDVI). Red tip severity was estimated and correlated with NDVI. Linear regression models were developed and tested in order to estimate red tip disease severity based on NDVI. Results showed a strong positive relationship between red tip disease severity and NDVI (r = 0.84).
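The sensing step above can be sketched numerically. This is an illustrative NumPy example only: the reflectance readings are synthetic placeholders, not the study's data, and the fitted coefficients carry no agronomic meaning. NDVI is computed from near-infrared and red reflectance, and a linear model relating severity to NDVI is fitted.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Synthetic readings: as disease severity rises, canopy NIR reflectance
# drops and red reflectance climbs, so NDVI falls.
severity = np.array([0, 10, 25, 40, 60, 80])          # % leaf area affected
nir = np.array([0.80, 0.74, 0.66, 0.58, 0.47, 0.38])  # near-infrared band
red = np.array([0.10, 0.12, 0.15, 0.18, 0.23, 0.27])  # red band

x = ndvi(nir, red)
slope, intercept = np.polyfit(x, severity, 1)  # severity ≈ slope*NDVI + intercept
r = np.corrcoef(x, severity)[0, 1]             # strength of the linear relation
```

Because NDVI is a normalized ratio of two bands, it is largely insensitive to overall illumination level, which is what allows a handheld reflectance sensor to read it directly in the field.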

Keywords: pineapple, diagnosis, virus, NDVI

Procedia PDF Downloads 793
329 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) built on low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects of varying size, category, layout, and number. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the per-scale representations are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since a different amount of features is extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. 
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which suggests that the representation can be applied to other visual recognition tasks.
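The scale-wise normalization step can be illustrated with a minimal NumPy sketch. This is our own simplification, not the authors' code: the per-scale Fisher Vector is abstracted as an already-aggregated vector. Each scale's vector is L2-normalized independently before average pooling, so a scale that accumulated far more local activations cannot dominate the merged representation.

```python
import numpy as np

def l2_normalize(v, eps=1e-12):
    """L2-normalize one scale's aggregated (Fisher-style) vector."""
    v = np.asarray(v, dtype=float)
    return v / (np.linalg.norm(v) + eps)

def scale_wise_merge(per_scale_vectors):
    """Scale-wise normalization followed by average pooling:
    normalize each scale's vector on its own, then take the mean,
    so every scale contributes with equal weight."""
    normalized = [l2_normalize(v) for v in per_scale_vectors]
    return np.mean(normalized, axis=0)

# Two scales whose raw aggregates differ wildly in magnitude, e.g. because
# many more local CNN activations were accumulated at the finer scale.
coarse = np.array([1.0, 2.0, 2.0])    # ||coarse|| = 3
fine = np.array([300.0, 0.0, 400.0])  # ||fine|| = 500
merged = scale_wise_merge([coarse, fine])
```

Without the per-scale normalization, a plain average of `coarse` and `fine` would be dominated by the finer scale's magnitude; after normalization, each scale contributes a unit-norm vector, which is the balancing effect the abstract attributes to scale-wise normalization.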

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 333
328 The Role of Accounting and Auditing in Anti-Corruption Strategies: The Case of ECOWAS

Authors: Edna Gnomblerou

Abstract:

Given the current scale of the corruption epidemic in West African economies, governments are seeking immediate and effective measures to reduce its prevalence within the region. Generally, accountants and auditors are expected to help organizations detect illegal practices. However, their role in the fight against corruption is sometimes limited due to the collusive nature of corruption. The Danish anti-corruption model shows that the implementation of additional controls over public accounts, together with independent and efficient audits, improves transparency and increases the probability of detection. This study is aimed at reviewing the existing anti-corruption policies of the Economic Community of West African States (ECOWAS) in order to observe the role attributed to accounting, auditing and other managerial practices in its anti-corruption drive. It further discusses the usefulness of accounting and auditing in helping anti-corruption commissions to control misconduct and improve the detection of irregularities within public administration. The purpose of this initiative is to identify and assess the relevance of accounting and auditing in curbing corruption. To meet this purpose, the study was designed to answer the questions of whether accounting and auditing processes were included in the reviewed anti-corruption strategies and, if so, whether they were effective in the detection process. A descriptive research method was adopted in examining the role of accounting and auditing in West African anti-corruption strategies. The analysis reveals that proper recognition of accounting standards and implementation of financial audits are viewed as strategic mechanisms in tackling corruption. Additionally, codes of conduct, whistle-blowing and information disclosure to the public are among the most common managerial practices used throughout anti-corruption policies to effectively and efficiently address the problem. 
These observations imply that sound anti-corruption strategies cannot ignore the value of including accounting and auditing processes. On the one hand, this suggests that governments should employ all possible resources to improve accounting and auditing practices in the management of public sector organizations. On the other hand, governments must ensure that accounting and auditing practices are not limited to the private sector, since, when properly implemented, they constitute crucial mechanisms to control and reduce corrupt incentives in the public sector.

Keywords: accounting, anti-corruption strategy, auditing, ECOWAS

Procedia PDF Downloads 258
327 Fake News Domination and Threats on Democratic Systems

Authors: Laura Irimies, Cosmin Irimies

Abstract:

The public space all over the world is currently confronted with an aggressive assault of fake news, which has lately impacted public agenda setting, collective decisions and social attitudes. Top leaders constantly call out most mainstream news as “fake news”, and public opinion grows more confused. "Fake news" is generally defined as false, often sensational, information disseminated under the guise of news reporting; it was declared the word of the year 2017 by Collins Dictionary and has been one of the most debated socio-political topics of recent years. Websites which, deliberately or not, publish misleading information are often shared on social media, where they substantially increase their reach and influence. According to international reports, exposure to fake news is an undeniable reality all over the world: exposure to completely invented information reaches 31 percent in the US, is even higher in Eastern European countries such as Hungary (42%) and Romania (38%) and in Mediterranean countries such as Greece (44%) and Turkey (49%), and is lower in Northern and Western European countries: Germany (9%), Denmark (9%) and the Netherlands (10%). While the study of fake news (its mechanisms and effects) is still in its infancy, it has become truly relevant as the phenomenon seems to have a growing impact on democratic systems. Studies conducted by the European Commission show that 83% of respondents, out of a total of 26,576 interviewees, consider the existence of news that misrepresents reality a threat to democracy. Studies recently conducted at Arizona State University show that people with higher education can more easily spot fake headlines, but over 30 percent of them can still be trapped by fake information. 
If we were to refer only to some of the most recent situations in Romania, fake news issues and hidden-agenda suspicions related to the massive and extremely violent public demonstrations held on August 10th, 2018, with strong participation of the Romanian diaspora, were widely reflected by the international media and generated serious debates within the European Commission. Considering the above framework, the study raises four main research questions: 1. Is fake news a problem or just a natural consequence of mainstream media decline and the abundance of sources of information? 2. What are the implications for democracy? 3. Can fake news be controlled without restricting fundamental human rights? 4. How could the public be properly educated to detect fake news? The research uses mostly qualitative but also quantitative methods: content analysis of studies, websites and media content, official reports and interviews. The study will demonstrate the real threat that fake news represents, as well as the need for proper media literacy education, and will draw basic guidelines for developing a new and essential skill: that of detecting fake news in a society overwhelmed by sources that constantly roll out massive amounts of information, increasing the risk of misinformation and leading to inadequate public decisions that could affect democratic stability.

Keywords: agenda setting, democracy, fake news, journalism, media literacy

Procedia PDF Downloads 131