Search results for: linguistic translation approaches
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5186

1796 Enhanced Flight Dynamics Model to Simulate the Aircraft Response to Gust Encounters

Authors: Castells Pau, Poetsch Christophe

Abstract:

The effect of gust and turbulence encounters on aircraft is a wide field of study which allows different approaches, from high-fidelity multidisciplinary simulations to more simplified models adapted to industrial applications. The main goal is typically to predict the gust loads on the aircraft in order to ensure a safe design and achieve certification. Another widely studied topic is the reduction of gust loads through an active control law. The impact of gusts on aircraft handling qualities is also of interest in the analysis of in-service events, so as to evaluate the aircraft response and the performance of the flight control laws. Traditionally, gust loads and handling qualities are addressed separately with different models adapted to the specific needs of each discipline. In this paper, an assessment of the differences between both models is presented and a strategy to better account for the physics of gust encounters in a typical flight dynamics model is proposed, based on the model used for gust loads analysis. The applied corrections aim to capture the gust unsteady aerodynamics and propagation as well as the effect of dynamic flexibility at low frequencies. Results from the gust loads model at different flight conditions and measurements from real events are used for validation. An assessment of a possible extension of steady aerodynamic nonlinearities to the low-frequency range is also addressed. The proposed corrections provide a meaningful means to evaluate the performance and possible adjustments of the flight control laws.
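For readers who want a concrete picture of the kind of disturbance such models are driven with, the sketch below evaluates the standard "1-cosine" discrete gust velocity profile used in certification-style load analyses. It is generic background, not the authors' flight dynamics model, and the gust gradient and amplitude values are purely illustrative.

```python
import numpy as np

def one_minus_cosine_gust(s, gust_gradient_h, u_ds):
    """Standard '1-cosine' discrete gust velocity profile.

    s               : penetration distance into the gust [m] (scalar or array)
    gust_gradient_h : gust gradient H, i.e. half the gust length [m]
    u_ds            : design gust velocity amplitude [m/s]
    """
    s = np.asarray(s, dtype=float)
    # Inside the gust (0 <= s <= 2H) the velocity follows the 1-cos shape,
    # outside the gust it is zero.
    return np.where((s >= 0) & (s <= 2 * gust_gradient_h),
                    0.5 * u_ds * (1.0 - np.cos(np.pi * s / gust_gradient_h)),
                    0.0)

# Example: sample the profile along 200 m of flight path for H = 50 m, Uds = 10 m/s
s = np.linspace(0, 200, 401)
gust = one_minus_cosine_gust(s, gust_gradient_h=50.0, u_ds=10.0)
print(gust.max())  # peaks at Uds when s = H
```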

Keywords: flight dynamics, gust loads, handling qualities, unsteady aerodynamics

Procedia PDF Downloads 144
1795 Ensuring Compliancy in Traditional Tibetan Medicine Treatment Through Patient Education

Authors: Nashalla Gwyn Nyinda

Abstract:

The ancient system of Tibetan Medicine, known as Sowa Rigpa across the Himalayan regions, is a systematic approach to healing that encourages balance primarily through diet and behavior modifications. With the rise in popularity of Tibetan Medicine, compliance is critical to successful treatment outcomes. As patients learn more about who they are as individuals and how their elemental balances or imbalances affect disorders and mental-emotional balance, they develop faith in and dedication to their healing process. Specifically, regarding diet and behavior and the basic principles of the medical system, patient compliance increases dramatically in all treatment areas when patients understand why a treatment or dietary prescription is effective. Successful responses to Tibetan treatment rely on buy-in from the patient. Trust among the slower process of traditional medicine treatment, the Tibetan physician and the patient is a cornerstone of treatment. The resulting decrease in the use of allopathic medicine and better health outcomes for acute and chronic disorders are well documented. This paper addresses essential points of the Tibetan Medicine system and the dialogue between doctor and patient focused on appropriate, seasonally changing dietetics. Such fluctuating treatment approaches, based on external elemental factors, dramatically improve treatment outcomes. Specifically, this work addresses why allopathic medicine models may need more trust development between practitioner and patient.

Keywords: compliancy in treatment, diet and lifestyle medicine, nature and elements as medicine, seasonal diets, Sowa Rigpa, traditional Tibetan medicine, treatment outcomes

Procedia PDF Downloads 60
1794 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
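To make the switching idea concrete, here is a minimal Thompson-sampling sketch for a logistic contextual bandit that swaps its posterior-update routine once enough data have been seen. It uses a diagonal Gaussian posterior with simplified stand-in updates, not the paper's actual EP and ADF derivations, and the switch point and class interface are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class SwitchingLogisticTS:
    """Toy Thompson-sampling logistic bandit with a diagonal Gaussian posterior.

    Mimics the idea of switching update rules once the data volume grows
    (EP-style early on, cheaper ADF/online-style later).  The two update
    routines below are simplified stand-ins, not the paper's EP/ADF math.
    """

    def __init__(self, dim, switch_after=500, prior_var=1.0):
        self.mu = np.zeros(dim)              # posterior mean
        self.var = np.full(dim, prior_var)   # posterior variance (diagonal)
        self.n_seen = 0
        self.switch_after = switch_after

    def sample_weights(self):
        return rng.normal(self.mu, np.sqrt(self.var))

    def choose(self, contexts):
        """contexts: (n_arms, dim) feature matrix; pick arm maximising sampled score."""
        w = self.sample_weights()
        return int(np.argmax(contexts @ w))

    def update(self, x, y):
        """One observation: context x (dim,), click y in {0, 1}."""
        self.n_seen += 1
        if self.n_seen <= self.switch_after:
            # "accurate but slower" phase: a few damped coordinate-wise Newton passes
            for _ in range(3):
                p = 1.0 / (1.0 + np.exp(-(x @ self.mu)))
                grad = (y - p) * x - self.mu / self.var
                hess = p * (1 - p) * x * x + 1.0 / self.var
                self.mu = self.mu + 0.5 * grad / hess
        else:
            # cheap single-pass (ADF-flavoured) moment update
            p = 1.0 / (1.0 + np.exp(-(x @ self.mu)))
            grad = (y - p) * x
            hess = p * (1 - p) * x * x
            self.var = 1.0 / (1.0 / self.var + hess)
            self.mu = self.mu + self.var * grad

# Tiny usage example with invented contexts
bandit = SwitchingLogisticTS(dim=5)
ctx = rng.normal(size=(3, 5))          # 3 candidate items, 5 context features
arm = bandit.choose(ctx)
bandit.update(ctx[arm], y=1)           # pretend the user clicked
```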

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 105
1793 Techno-Apocalypse in Christian End-Time Literature

Authors: Sean O'Callaghan

Abstract:

Around 2011/2012, a whole new genre of Christian religious writing began to emerge, focused on the role of advanced technologies, particularly the GRIN technologies (Genetics, Robotics, Information Technology and Nanotechnology), in bringing about a techno-apocalypse, leading to catastrophic events which would usher in the end of the world. This genre, at first niche, has now begun to grow in significance in many quarters of the more fundamentalist and biblically literalist branches of evangelicalism. It approaches science and technology with more than extreme skepticism. It accuses transhumanists of being in league with satanic powers and a satanic agenda and contextualizes transhumanist scientific progress in terms of its service to what it believes to be a soon-to-come Antichrist figure. The genre has moved beyond literature, and videos about its message can be found on YouTube and other forums, where many of the presentations get well over a quarter of a million views. This paper will examine the genre and its genesis, referring to the key figures involved in spreading the anti-intellectualist and anti-scientific message. It will demonstrate how this genre of writing is similar in many respects to other forms of apocalyptic writing which have emerged in the twentieth and twenty-first centuries, all in response to both scientific and political events which are interpreted in the light of biblical prophecy. It will also set the genre in the context of a contemporary preoccupation with conspiracy theory. The conclusions of the research conducted in this field by the author are that it does a grave disservice to both the scientific and Christian audiences which it targets, by misrepresenting scientific advances and by creating a hermeneutic of suspicion which makes it impossible for Christians to place their trust in scientific claims.

Keywords: antichrist, catastrophic, Christian, techno-apocalypse

Procedia PDF Downloads 202
1792 Utilization of Sorghum and White Bean Flour for the Production of Gluten Free and Iron Rich Cookies

Authors: Tahra Elobeid, Emmerich Berghofer

Abstract:

The aim of this study is to find innovative approaches for the production of iron-rich foods using natural iron sources. The vehicle used for fortification was sorghum, whereas the iron fortificant was white bean. Fortified sorghum cookies were produced from five different mixtures; iron content, iron bioavailability, cookie texture and acceptability were measured. Cookies were prepared from three fortified flours, 90% sorghum + 10% white bean (S9WB1), 75% sorghum + 25% white bean (S3WB1) and 50% sorghum + 50% white bean (S1WB1), as well as from 100% sorghum and 100% white bean. The functional properties gave good results in all the formulations. Statistical analysis of the iron content in the five different cookies showed that there was a significant difference at the 95% confidence level (ANOVA). The iron content improved in all the recipes, including the 100% sorghum, with the increase ranging from 112% in the 100% sorghum cookies to 476% in the 100% white bean cookies. This shows that increasing the amount of white bean used for fortification improves the iron content of the cookies. The bioavailability of iron ranged from 21.3% in the 100% sorghum to 28.6% in the 100% white bean cookies. In the 100% sorghum cookies, the iron bioavailability increased with reference to raw sorghum due to the addition of eggs. The bioavailability of iron in raw sorghum is 16.2%; therefore, the percentage increase ranged from 5.1% to 28.6%. The cookies prepared from 10% white bean (S9WB1) scored the lowest (3.7) in terms of acceptability. They were the least preferred due to their somewhat soft texture. The 25% white bean cookies (S3WB1) gave results comparable to the 50% (S1WB1) and 100% white bean cookies. Cookies prepared with a high percentage of white bean (50% and 100% white bean) gave the best results. Therefore, cookie formulations from sorghum and white bean are successful in improving the iron status of anaemic individuals.

Keywords: sorghum, white bean, iron content, bioavailable iron, cookies

Procedia PDF Downloads 410
1791 Policies Promoting the Development of Green Buildings in Sub-Saharan Africa: A South African Case-Study

Authors: Peter Adekunle, Clinton Aigbavboa, Matthew Ikuabe, Opeoluwa Akinradewo

Abstract:

Contemporary building methods typically pay little attention to the built environment's greater economic, environmental, or social impacts or energy efficiency. Green construction aims to sever ties with these conventions. In order to provide better living and working conditions and lessen environmental consequences, green building today combines numerous building design, construction, and operation and maintenance approaches. As one of Sub-Saharan Africa's most industrialized nations, South Africa has a good number of green building projects. Therefore, this study examines the elements impacting the adoption of green buildings and regulations created to encourage the growth of green buildings using South Africa as a case study. The study has a survey-style design. A total of one hundred fifty (150) questionnaires were distributed to professionals in the construction industry in South Africa, of which one hundred and twenty-four (128) were returned and judged appropriate for investigation. The gathered data was examined using percentage, mean item scores, standard deviation, and Kruskal-Wallis. The findings show that cost and market circumstances are the two main elements impacting the adoption of green construction, while leadership advice is the most important policy. The study concluded that in order to encourage the construction of green buildings, additional Sub-Saharan nations should adopt these suggested policies.

Keywords: green building, Sub-Saharan Africa, building design, environmental conditions

Procedia PDF Downloads 107
1790 A Student Centered Learning Environment in Engineering Education: Design and a Longitudinal Study of Impact

Authors: Tom O'Mahony

Abstract:

This article considers the design of a student-centered learning environment in engineering education. The learning environment integrates a number of components, including project-based learning, collaborative learning, two-stage assignments, active learning lectures, and a flipped-classroom. Together these elements place the individual learner and their learning at the center of the environment by focusing on understanding, enhancing relevance, applying learning, obtaining rich feedback, making choices, and taking responsibility. The evolution of this environment from 2014 to the present day is outlined. The impact of this environment on learners and their learning is evaluated via student questionnaires that consist of both open and closed-ended questions. The closed questions indicate that students found the learning environment to be really interesting and enjoyable (rated as 4.7 on a 5 point scale) and encouraged students to adopt a deep approach towards studying the course materials (rated as 4.0 on a 5 point scale). A content analysis of the open-ended questions provides evidence that the project, active learning lectures, and flipped classroom all contribute to the success of this environment. Furthermore, this analysis indicates that the two-stage assessment process, in which feedback is provided between a draft and final assignment, is the key component and the dominant theme. A limitation of the study is the small class size (less than 20 learners per year), but, to some degree, this is compensated for by the longitudinal nature of the study.

Keywords: deep approaches, formative assessment, project-based learning, student-centered learning

Procedia PDF Downloads 109
1789 A Deep Learning Approach to Online Social Network Account Compromisation

Authors: Edward K. Boahen, Brunel E. Bouya-Moko, Changda Wang

Abstract:

The major threat to online social network (OSN) users is account compromisation. Spammers now spread malicious messages by exploiting the trust relationship established between account owners and their friends. The challenge for service providers in detecting a compromised account is validating the trusted relationship established between the account owners, their friends, and the spammers. Another challenge is the increase in the human interaction required for feature selection. The available research on supervised learning (machine learning) has limitations with feature selection and with accounts that cannot be profiled, such as application programming interface (API) accounts. Therefore, this paper discusses the various behaviours of OSN users and the current approaches to detecting a compromised OSN account, emphasizing their limitations and challenges. We propose a deep learning approach that addresses and resolves the constraints faced by the previous schemes. We detail our proposed optimized nonsymmetric deep auto-encoder (OPT_NDAE) for unsupervised feature learning, which reduces the required level of human interaction in the selection and extraction of features. We evaluated our proposed classifier using the NSL-KDD and KDDCUP'99 datasets in a graphical-user-interface-enabled Weka application. The results obtained indicate that our proposed approach outperformed most of the traditional schemes in OSN compromised account detection, with an accuracy rate of 99.86%.
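As an illustration of what a nonsymmetric auto-encoder used for unsupervised feature learning can look like, here is a minimal Keras sketch with a deeper encoder than decoder. The layer sizes, placeholder random data and downstream feature-extraction step are assumptions for illustration only, not the paper's OPT_NDAE architecture or its tuning.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 41          # e.g. the NSL-KDD feature count after numeric encoding
X_train = np.random.rand(1000, n_features).astype("float32")  # placeholder data

# Non-symmetric autoencoder: a deeper encoder than decoder, so the learned
# bottleneck does most of the work while decoding stays cheap.
inputs = keras.Input(shape=(n_features,))
h = layers.Dense(32, activation="relu")(inputs)
h = layers.Dense(16, activation="relu")(h)
code = layers.Dense(8, activation="relu", name="bottleneck")(h)
decoded = layers.Dense(n_features, activation="sigmoid")(code)  # single decoding layer

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X_train, X_train, epochs=5, batch_size=64, verbose=0)

# Reuse the encoder as an unsupervised feature extractor for a downstream
# compromised-account / intrusion classifier.
encoder = keras.Model(inputs, code)
features = encoder.predict(X_train, verbose=0)
print(features.shape)  # (1000, 8)
```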

Keywords: computer security, network security, online social network, account compromisation

Procedia PDF Downloads 112
1788 Refined Edge Detection Network

Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni

Abstract:

Edge detection is one of the most challenging tasks in computer vision, due to the complexity of detecting edges or boundaries in real-world images that contain objects of different types and scales, such as trees and buildings, as well as various backgrounds. It is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved the detection of edges compared with traditional methods such as Sobel and Canny. However, images of complex scenes still represent a challenge for these methods. Also, the edges detected by existing approaches suffer from unrefined results, with output images containing many erroneous edges. To overcome this, in this paper a refined edge detection network (RED-Net) is proposed, based on the mechanism of residual learning. By maintaining the high resolution of edges during the training process, and conserving the resolution of the edge image during the network stage, we connect the pooling outputs at each stage with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for the homogeneous regions in the image. The proposed method is evaluated using the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform existing edge detection networks in terms of performance metrics and the quality of output images.
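The residual-learning idea behind the network can be sketched in a few lines: a resolution-preserving residual block with batch normalization (whose learnable affine scale and shift loosely play the smoothing role described above), stacked into a per-pixel edge-probability head. This is a generic Keras sketch under assumed layer counts and filter sizes, not the published RED-Net architecture.

```python
from tensorflow import keras
from tensorflow.keras import layers

def residual_edge_block(x, filters=32):
    """A residual convolution block that keeps spatial resolution, so edge
    maps are not degraded by downsampling; BatchNormalization (with its
    learnable affine scale/shift) smooths responses in homogeneous regions."""
    shortcut = x
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.BatchNormalization()(y)
    if shortcut.shape[-1] != filters:               # match channel count for the skip
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    return layers.Activation("relu")(layers.Add()([y, shortcut]))

inputs = keras.Input(shape=(None, None, 3))         # arbitrary image size
x = residual_edge_block(inputs)
x = residual_edge_block(x)
edge_map = layers.Conv2D(1, 1, activation="sigmoid")(x)  # per-pixel edge probability
model = keras.Model(inputs, edge_map)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```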

Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone

Procedia PDF Downloads 98
1787 Ethnography of the Social and Cultural Perspectives of Childhood Neuro-Developmental Disorders: Implications for Health Seeking

Authors: Denis Nono, Catherine Abbo, Thomas Wenzel

Abstract:

Introduction: The study explored socio-cultural perspectives of childhood disorders and their implications for health seeking. Emphasis was on exploring local understanding and perceptions and how these ideas affect health seeking. Study aim: To explore the socio-cultural perspectives of neuro-developmental disorders and their implications for health-seeking behaviour. Methods: The methods used in this study included key informant interviews conducted with health professionals. Parents of children aged 6-15 years with neuro-developmental disorders were recruited from the hospital to participate in focus group discussions, participant observation and individual in-depth interviews. Results: The study found that stigma extended from the children to parents and caregivers, who were also shunned by community members. Some participants described their children as “a gift from God”, while others described them as “a test from God”. The communities perceived the disorders as a spiritual affliction and insisted that the children be taken for Acholi cultural and traditional cleansing rituals, believing that mental illness has spiritual linkages. Conclusion: This study gives unique insights into the perceptions of neuro-developmental disorders and health-seeking behavior in Gulu District and neighboring communities. The results showed that communities linked the disorders to spiritual affliction, misunderstandings between families, bewitching, and other supernatural forces. Some of the participants highly recommended biomedical approaches to prevention, management and control of the disorders.

Keywords: ethnography, health seeking, neuro-developmental disorders, socio-cultural

Procedia PDF Downloads 137
1786 Mother Tongues and the Death of Women: Applying Feminist Theory to Historically, Linguistically, and Philosophically Contextualize the Current Abortion Debate in Bolivia

Authors: Jennifer Zelmer

Abstract:

The debate regarding the morality, and therefore legality, of abortion has many social, political, and medical ramifications worldwide. In a developing country like Bolivia, carrying a pregnancy to delivery is incredibly risky. Given the very high maternal mortality rate in Bolivia, greater consideration has been given to the (de)criminalization of abortion – a contributing cause of maternal death. In the spring of 2017, the Bolivian government proposed to loosen restrictions on women’s access to receiving a safe abortion, which was met with harsh criticism from 'pro-vida' (pro-life) factions. Although the current Bolivian government Movimiento al Socialismo (Movement Toward Socialism) portrays an agenda of decolonization, or to seek a 'traditionally-modern' society, nevertheless, Bolivia still has one of the highest maternal mortality rates in the Americas, because of centuries of colonial and patriarchal order. Applying a feminist critique and using the abortion debate as the central point, this paper argues that the 'traditionally-modern' society Bolivia strives towards is a paradox, and in fact only contributes to the reciprocal process of the death of 'mother tongues' and the unnecessary death of women. This claim is supported by a critical analysis of historical texts about Spanish Colonialism in Bolivia; the linguistic reality of reproductive educational strategies, and the philosophical framework which the Bolivian government and its citizens implement. This analysis is demonstrated in the current state of women’s access to reproductive healthcare in Cochabamba, Bolivia based on recent fieldwork which included audits of clinics and hospitals, interviews, and participant observation. This paper has two major findings: 1) the language used by opponents of abortion in Bolivia is not consistent with the claim of being 'pro-life' but more accurately with being 'pro-potential'; 2) when the topic of reproductive health appears in Cochabamba, Bolivia, it is often found written in the Spanish language, and does not cater to the many indigenous communities that inhabit or visit this city. Finally, this paper considers the crucial role of public health documentation to better inform the abortion debate, as well as the necessity of expanding reproductive health information to more than text-based materials in Cochabamba. This may include more culturally appropriate messages and mediums that cater to the oral tradition of the indigenous communities, who historically and currently have some of the highest fertility rates. If the objective of one who opposes abortion is to save human lives, then preventing the death of women should equally be of paramount importance. But rather, the 'pro-life' movement in Bolivia is willing to risk the lives of to-be mothers, by judicial punishment or death, for the chance of a potential baby. Until abortion is fully legal, safe, and accessible, there will always be the vestiges of colonial and patriarchal order in Bolivia which only perpetuates the needless death of women.

Keywords: abortion, feminist theory, Quechua, reproductive health education

Procedia PDF Downloads 162
1785 Composite Approach to Extremism and Terrorism Web Content Classification

Authors: Kolade Olawande Owoeye, George Weir

Abstract:

Terrorism and extremism activities on the internet are becoming the most significant threats to national security because of their potential dangers. In response to this challenge, law enforcement and security authorities are actively implementing comprehensive measures to counter the use of the internet for terrorism. To implement these measures, there is a need for intelligence gathering via the internet. This includes real-time monitoring of potential websites that are used for recruitment and information dissemination, among other operations by extremist groups. However, with billions of active webpages, real-time monitoring of all webpages becomes almost impossible. To narrow down the search domain, there is a need for efficient webpage classification techniques. This research proposes a new approach, tagged the SentiPosit-based method. The SentiPosit-based method combines features of the Posit-based method and the SentiStrength-based method for classification of terrorism and extremism webpages. The experiment was carried out on 7,500 webpages obtained through the TENE webcrawler by the International Cyber Crime Research Centre (ICCRC). The webpages were manually grouped into three classes, namely ‘pro-extremist’, ‘anti-extremist’ and ‘neutral’, with 2,500 webpages in each category. A supervised learning algorithm was then applied to the classified dataset in order to build the model. The results obtained were compared with existing classification methods using prediction accuracy and runtime. It was observed that our proposed hybrid approach produced better classification accuracy than existing approaches within a reasonable runtime.
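To show the shape of such a hybrid lexical-plus-sentiment pipeline, the sketch below trains a supervised classifier on toy pages labelled with the paper's three classes. The tiny corpus, the TF-IDF features standing in for Posit-style lexical statistics, and the note about appending a sentiment-strength column are all illustrative assumptions, not the SentiPosit implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for crawled webpages; labels follow the paper's
# three classes: pro-extremist, anti-extremist, neutral.
pages = [
    "join the struggle and fight the enemies of our cause",
    "we condemn all violence and urge peaceful dialogue",
    "the city council meets on tuesday to discuss road repairs",
]
labels = ["pro", "anti", "neutral"]

# Word/bigram frequency features (a rough stand-in for Posit-style lexical
# statistics) feeding a supervised classifier; a sentiment-strength score per
# page could simply be appended as an extra feature column.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(pages, labels)
print(model.predict(["they call for peace and reject extremist violence"]))
```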

Keywords: sentiposit, classification, extremism, terrorism

Procedia PDF Downloads 274
1784 Classification of Hyperspectral Image Using Mathematical Morphological Operator-Based Distance Metric

Authors: Geetika Barman, B. S. Daya Sagar

Abstract:

In this article, we propose a pixel-wise classification of hyperspectral images using mathematical-morphology-operator-based distance metrics called “dilation distance” and “erosion distance”. This method involves measuring the spatial distance between the spectral features of a hyperspectral image across the bands. The key concept of the proposed approach is that the “dilation distance” is the maximum distance a pixel can be moved without changing its classification, whereas the “erosion distance” is the maximum distance that a pixel can be moved before changing its classification. The spectral signature of the hyperspectral image carries unique class information and shape for each class. This article demonstrates how easily the dilation and erosion distances can measure spatial distance compared to other approaches. This property is used to calculate the spatial distance between hyperspectral image feature vectors across the bands. The dissimilarity matrix is then constructed using both measures extracted from the feature spaces. The measured distance metric is used to distinguish between the spectral features of various classes and to precisely distinguish each class. This is illustrated using both toy data and real datasets. Furthermore, we investigated the role of flat vs. non-flat structuring elements in capturing the spatial features of each class in the hyperspectral image. To validate the approach, we compared it to other existing methods and demonstrated empirically that morphological-operator-based distance metric classification provides competitive results and outperforms some of them.
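One common way to realise a morphological "dilation distance" between two binary sets is to count how many unit dilations of one set are needed before it covers the other, as in the sketch below. The toy grid, the default cross-shaped structuring element and this particular definition are assumptions for illustration and may differ from the authors' exact formulation.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def dilation_distance(a, b, max_iter=100):
    """Number of unit dilations of binary set `a` needed until it covers
    binary set `b`; one common realisation of a morphological 'dilation
    distance'.  Returns np.inf if coverage is not reached within max_iter."""
    a = a.astype(bool)
    b = b.astype(bool)
    for k in range(max_iter + 1):
        if np.all(b <= a):          # b is contained in the k-times dilated a
            return k
        a = binary_dilation(a)
    return np.inf

# Toy example: two single-pixel "classes" on a small grid
a = np.zeros((9, 9), bool); a[4, 2] = True
b = np.zeros((9, 9), bool); b[4, 6] = True
print(dilation_distance(a, b))   # 4 with the default cross-shaped structuring element
```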

Keywords: dilation distance, erosion distance, hyperspectral image classification, mathematical morphology

Procedia PDF Downloads 80
1783 Expressing Locality in Learning English: A Study of English Textbooks for Junior High School Year VII-IX in Indonesia Context

Authors: Agnes Siwi Purwaning Tyas, Dewi Cahya Ambarwati

Abstract:

This paper concerns language learning as habit formation and a constructive process that also exercises an oppressive power to construct the learners. As a locus of discussion, the investigation problematizes the transfer of the English language to Indonesian junior high school students through the use of the English textbooks ‘Real Time: An Interactive English Course for Junior High School Students Year VII-IX’. English has long served as a global language, and non-native speakers are expected to master it if they desire to become internationally recognized individuals. Generally, English teachers teach the language in accordance with the nature of language learning, in which they are trained and expected to teach within the culture of the target language. This provides a potential soft cultural penetration of a foreign ideology through language transmission. In the context of Indonesia, learning English as an international language is considered dilemmatic. Most English textbooks in Indonesia incorporate cultural elements of the target language, which to some extent may challenge sensitivity towards local cultural values. On the other hand, local teachers demand more English textbooks for junior high school students that can facilitate the dissemination of both local and global values and promote learners’ cultural traits from both cultures, to avoid misunderstanding and confusion. It also aims to support language learning as a bidirectional process instead of an instrument of oppression. However, sensitizing and localizing this foreign language is not sufficient to restrain its soft infiltration. In due course, domination persists, making English an authoritative language and positioning the locality as ‘the other’. Such a critical premise has led to a discursive analysis of how the cultural elements of the target language are presented in the textbooks and whether the local characteristics of Indonesia are able to gradually reduce the degree of foreign oppressive ideology. The three textbooks researched were written by a non-Indonesian author, edited by two Indonesian editors and published by a local commercial publishing company, PT Erlangga. The analytical elaboration examines the cultural characteristics in the form of names, terminologies, places, objects and imagery – not the linguistic aspect – of both cultural domains: English and Indonesian. Comparisons as well as categorizations were made to identify the cultural traits of each language and scrutinize the contextual analysis. In the analysis, 128 foreign elements and 27 local elements were found in the textbook for grade VII, 132 foreign elements and 23 local elements were found in the textbook for grade VIII, while 144 foreign elements and 35 local elements were found in the grade IX textbook, demonstrating the unequal distribution of the two cultures. Even though the ideal pedagogical approach to English learning moves in a different direction by means of inserting local elements, the learners are continuously exposed to the culture of the target language and forced to internalize values under its influence, which tends to marginalize their native culture.

Keywords: bidirectional process, English, local culture, oppression

Procedia PDF Downloads 262
1782 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
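As a toy illustration of the "schema-based record extraction" step described above, the sketch below pulls a few fields out of free text with hand-written rules; production pipelines would replace the regexes with the learned table-parsing and entity-detection models mentioned in the abstract. The schema fields, patterns and sample sentence are invented for illustration.

```python
import re

# A minimal, rule-based sketch of schema-based record extraction: given text
# from an unstructured filing, pull fields defined by a schema.  Real systems
# swap these regexes for learned table-parsing and entity-detection models.
SCHEMA = {
    "coupon_rate": r"coupon rate of (\d+(?:\.\d+)?)\s?%",
    "maturity_year": r"matur\w* in (\d{4})",
    "issuer": r"issued by ([A-Z][\w&.\- ]+?)(?:,|\.)",
}

def extract_record(text: str) -> dict:
    record = {}
    for field, pattern in SCHEMA.items():
        m = re.search(pattern, text, flags=re.IGNORECASE)
        record[field] = m.group(1).strip() if m else None
    return record

text = ("The notes, issued by Example Holdings Ltd, carry a coupon rate of "
        "4.25% and mature in 2031.")
print(extract_record(text))
# {'coupon_rate': '4.25', 'maturity_year': '2031', 'issuer': 'Example Holdings Ltd'}
```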

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 106
1781 A Fuzzy Multi-Criteria Model for Sustainable Development of Community-Based Tourism through the Homestay Program in Malaysia

Authors: Azizah Ismail, Zainab Khalifah, Abbas Mardani

Abstract:

Sustainable community-based tourism through the homestay programme is a growing niche market that has impacted destinations in many countries, including Malaysia. With demand predicted to continue increasing, the importance of the homestay product will grow in the tourism industry. This research examines the sustainability criteria for the homestay programme in Malaysia, covering economic, socio-cultural and environmental dimensions. This research applied a two-stage methodology for data analysis. Specifically, the researcher implements a hybrid method which combines two multi-criteria decision-making approaches. In the first stage of the methodology, the Decision Making Trial and Evaluation Laboratory (DEMATEL) technique is applied. Then, the Analytical Network Process (ANP) is employed to achieve the objective of the current research. After factor identification and problem formulation, DEMATEL is used to detect complex relationships and to build a Network Relation Map (NRM). Then ANP is used to prioritize and find the weights of the criteria and sub-criteria of the decision model. The research verifies the multi-criteria framework for sustainable community-based tourism from the perspective of stakeholders. The result also provides a different perspective on the importance of sustainability criteria from the view of multiple stakeholders. Practically, this research provides a framework model and helps stakeholders to improve and innovate the homestay programme and promote community-based tourism.
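The core DEMATEL computation behind the Network Relation Map reduces to a few matrix operations, shown in the sketch below with an invented 4x4 direct-influence matrix; real applications would use the aggregated expert scores for the homestay sustainability criteria.

```python
import numpy as np

# Minimal DEMATEL step: from a direct-influence matrix (expert scores 0-4
# between criteria) to the total-relation matrix T = D (I - D)^-1, from which
# prominence (r + c) and relation (r - c) are read off to build the network
# relation map.  The 4x4 scores below are purely illustrative.
A = np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [2, 1, 2, 0]], dtype=float)

D = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalised direct matrix
T = D @ np.linalg.inv(np.eye(len(A)) - D)               # total-relation matrix

r = T.sum(axis=1)   # influence given by each criterion
c = T.sum(axis=0)   # influence received by each criterion
print("prominence (r + c):", np.round(r + c, 3))
print("relation   (r - c):", np.round(r - c, 3))
```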

Keywords: community-based tourism, homestay programme, sustainable tourism criteria, sustainable tourism development

Procedia PDF Downloads 128
1780 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

Machine learning techniques based on the convolutional neural network (CNN) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation and object recognition. Classical visual information processing, ranging from low-level tasks to high-level ones, has been widely developed in the deep learning framework. It is generally considered a challenging problem to derive visual interpretation from high-dimensional imagery data. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers, the connections of which are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation invariance characteristics. However, it is often computationally intractable to optimize the network, in particular with a large number of convolution layers, due to the large number of unknowns to be optimized with respect to a training set that generally has to be large enough to effectively generalize the model under consideration. It is also necessary to limit the size of the convolution kernels due to the computational expense, despite the recent development of effective parallel processing machinery, which leads to the use of constantly small convolution kernels throughout the deep CNN architecture. However, it is often desired to consider different scales in the analysis of visual features at different layers in the network. Thus, we propose a CNN model where different sizes of convolution kernels are applied at each layer based on random projection. We apply random filters with varying sizes and associate the filter responses with scalar weights that correspond to the standard deviation of the random filters. We are thus able to use a large number of random filters at the cost of one scalar unknown per filter. The computational cost of the back-propagation procedure does not increase with larger filter sizes, even though additional computational cost is required for the convolution in the feed-forward procedure. The use of random kernels with varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments in which a quantitative comparison is performed between well-known CNN architectures and our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with a smaller number of unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks based on the CNN framework. Acknowledgement—This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF2017R1A2B4006023.
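A stripped-down version of the idea, random filters of several sizes whose responses are combined only through learned scalar weights, can be sketched without any deep learning framework at all. The toy striped-image task, the pooling of each response map into one scalar per filter and the ridge-regression readout below are simplifications for illustration, not the authors' CNN training procedure.

```python
import numpy as np
from scipy.signal import convolve2d
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

def random_filter_bank(sizes=(3, 5, 7, 9), per_size=8):
    """Fixed (untrained) random kernels of several sizes; only one scalar
    weight per filter response is learned downstream."""
    return [rng.standard_normal((s, s)) / s for s in sizes for _ in range(per_size)]

def responses(image, bank):
    # One scalar per filter: here the mean absolute response over the image.
    return np.array([np.abs(convolve2d(image, k, mode="same")).mean() for k in bank])

# Toy task: distinguish vertically- from horizontally-striped noisy images.
def striped(vertical):
    img = np.zeros((32, 32))
    img[:, ::4] = 1.0 if vertical else 0.0
    img[::4, :] = 0.0 if vertical else 1.0
    return img + 0.1 * rng.standard_normal((32, 32))

bank = random_filter_bank()
X = np.stack([responses(striped(v), bank) for v in ([True] * 20 + [False] * 20)])
y = np.array([1] * 20 + [0] * 20)

readout = Ridge(alpha=1.0).fit(X, y)     # learns only the scalar weights
print(readout.score(X, y))
```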

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 284
1779 Regulating Nanocarrier and Mononuclear Phagocyte System Interactions through Esomeprazole-Based Preconditioning Strategy

Authors: Zakia Belhadj, Bing He, Hua Zhang, Xueqing Wang, Wenbing Dai, Qiang Zhang

Abstract:

The mononuclear phagocyte system (MPS) forms a formidable obstacle hampering the tumor delivery efficiency of nanoparticles. Passively targeted nanocarriers have received clinical approval over the past 20 years. However, none of the actively targeted nanocarriers have entered clinical trials. Thus, it is important to endow actively targeted approaches with effective targeting ability by overcoming the biological barriers to nanoparticle drug delivery. Here, an Esomeprazole-based preconditioning strategy is presented for regulating the nanocarrier-MPS interaction to substantially prolong circulation time and enhance tumor targeting of nanoparticles. In vitro, the clinically approved proton pump inhibitor Esomeprazole (“ESO”) was demonstrated to reduce interactions between macrophages and subsequently injected targeted vesicles by interfering with their lysosomal trafficking. Of note, in vivo studies demonstrated that ESO pretreatment greatly decreased liver and spleen uptake of c(RGDm7)-modified vesicles and highly enhanced their tumor accumulation, thereby providing superior therapeutic efficacy of c(RGDm7)-modified vesicles co-loaded with Doxorubicin (DOX) and Gefitinib (GE). This MPS-preconditioning strategy using ESO provides deeper insights into regulating nanoparticle interactions with the phagocytic system and enhancing their accessibility to cancer cells for anticancer therapy.

Keywords: esomeprazole (ESO), mononuclear phagocyte system (MPS), preconditioning strategy, targeted lipid vesicles

Procedia PDF Downloads 173
1778 Design and Preliminary Evaluation of Benzoxazolone-Based Agents for Targeting Mitochondrial-Located Translocator Protein

Authors: Nidhi Chadha, A. K. Tiwari, Marilyn D. Milton, Anil K. Mishra

Abstract:

Translocator protein (18 kDa) TSPO is highly expressed during microglia activation in neuroinflammation. Although a number of PET ligands have been developed for the visualization of activated microglia, one advantageous approach is to develop a potential optical imaging (OI) probe. Our study involves the computational screening, synthesis and evaluation of TSPO ligands through various imaging modalities, namely PET/SPECT/optical. The initial computational screening involves pharmacophore modeling of a designed library containing oxo-benzooxazol-3-yl-N-phenyl-acetamide groups, followed by synthesis, in order to assess the efficacy of these compounds as multimodal imaging probes. Structure modeling of the monomer, the Ala147Thr mutant, and parallel and anti-parallel TSPO dimers was performed, and docking analysis was carried out for distinct binding sites. The computational analysis showed a pattern of variable binding profiles of known diagnostic ligands and NBMP via interactions with conserved residues, along with TSPO's natural polymorphism Ala147→Thr, which showed alteration in binding affinity due to considerable changes in the tertiary structure. Preliminary in vitro binding studies show binding affinities in the range of 1-5 nM, and selectivity was also confirmed by blocking studies. In summary, this skeleton was found to be a potential probe for TSPO imaging due to its ease of synthesis, appropriate lipophilicity and ability to reach specific regions of the brain.

Keywords: TSPO, molecular modeling, imaging, docking

Procedia PDF Downloads 455
1777 An Integrated Approach for Risk Management of Transportation of HAZMAT: Use of Quality Function Deployment and Risk Assessment

Authors: Guldana Zhigerbayeva, Ming Yang

Abstract:

Transportation of hazardous materials (HAZMAT) is inevitable in the process industries. Statistics show that a significant number of accidents have occurred during the transportation of HAZMAT. This makes risk management of HAZMAT transportation an important topic. Tree-based methods, including fault trees, event trees and cause-consequence analysis, as well as Bayesian networks, have been applied to risk management of HAZMAT transportation. However, there is limited work on the development of a systematic approach. The existing approaches fail to build up the linkages between the regulatory requirements and the development of safety measures. The analysis of historical data from past accident report databases would limit our focus to specific incidents and their specific causes. Thus, we may overlook some essential elements in risk management, including regulatory compliance, field expert opinions, and suggestions. A systematic approach is needed to translate the regulatory requirements of HAZMAT transportation into specified safety measures (both technical and administrative) to support the risk management process. This study aims to first adapt the House of Quality (HoQ) into a House of Safety (HoS) and proposes a new approach, Safety Function Deployment (SFD). The results of SFD will be used in a multi-criteria decision-support system to find an optimal route for HAZMAT transportation. The proposed approach will be demonstrated through a hypothetical transportation case in Kazakhstan.
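For readers unfamiliar with House-of-Quality arithmetic, the sketch below shows the basic calculation a House-of-Safety-style matrix would rest on: weighted requirements times a requirement-versus-measure relationship matrix, summed column-wise to rank the measures. The requirement names, measures, weights and scores are invented placeholders, not the paper's SFD content.

```python
import numpy as np

# Minimal House-of-Quality-style calculation, reused here in the spirit of a
# "House of Safety": regulatory requirements (rows) are weighted, a
# relationship matrix scores how strongly each safety measure (column)
# addresses each requirement, and column totals rank the measures.
requirements = ["packaging integrity", "driver competence", "route restrictions"]
req_weights = np.array([5, 4, 3])                 # importance of each requirement

measures = ["UN-certified containers", "driver training", "routing software", "escort vehicle"]
relationship = np.array([[9, 1, 0, 1],            # 9 = strong, 3 = medium, 1 = weak
                         [0, 9, 1, 3],
                         [0, 1, 9, 3]])

scores = req_weights @ relationship               # technical importance of each measure
for m, s in sorted(zip(measures, scores), key=lambda t: -t[1]):
    print(f"{m:25s} {s}")
```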

Keywords: hazardous materials, risk assessment, risk management, quality function deployment

Procedia PDF Downloads 138
1776 Introgressive Hybridisation between Two Widespread Sharks in the East Pacific Region

Authors: Diana A. Pazmino, Lynne vanHerwerden, Colin A. Simpfendorfer, Claudia Junge, Stephen C. Donnellan, Mauricio Hoyos-Padilla, Clinton A. J. Duffy, Charlie Huveneers, Bronwyn Gillanders, Paul A. Butcher, Gregory E. Maes

Abstract:

With just a handful of documented cases of hybridisation in cartilaginous fishes, shark hybridisation remains poorly investigated. Small amounts of admixture have been detected between Galapagos (Carcharhinus galapagensis) and dusky (Carcharhinus obscurus) sharks previously, generating a hypothesis of ongoing hybridisation. We sampled a large number of individuals from areas where both species co-occur (contact zones) across the Pacific Ocean and used both mitochondrial and nuclear-encoded SNPs to examine genetic admixture and introgression between the two species. Using empirical, analytical approaches and simulations, we first developed a set of 1,873 highly informative and reliable diagnostic SNPs for these two species to evaluate the degree of admixture between them. Overall, results indicate a high discriminatory power of nuclear SNPs (FST=0.47, p < 0.05) between the two species, unlike mitochondrial DNA (ΦST = 0.00 p > 0.05), which failed to differentiate between these species. We identified four hybrid individuals (~1%) and detected bi-directional introgression between C. galapagensis and C. obscurus in the Gulf of California along the eastern Pacific coast of the Americas. We emphasize the importance of including a combination of mtDNA and diagnostic nuclear markers to properly assess species identification, detect patterns of hybridisation, and better inform management and conservation of these sharks, especially given the morphological similarities within the genus Carcharhinus.

Keywords: elasmobranchs, single nucleotide polymorphisms, hybridisation, introgression, misidentification

Procedia PDF Downloads 190
1775 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity

Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib

Abstract:

Foundation design of a structure needs soil investigation to avoid failures due to settlements. This soil investigation is expensive and time-consuming. The development of new residential societies involves extensive leveling of large sites, accompanied by heavy land filling. Poor landfill practices at deep depths cause differential settlements and consolidation of the underlying soil that sometimes result in the collapse of structures. The extent of filling remains unknown to the individual developer unless soil investigation is carried out. Soil investigation cannot be performed on each available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas. Geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches have been used to develop maps using extrapolation and interpolation techniques for bearing capacity, underground recharge, soil classification, geological hazards, landslide hazards, socio-economic factors, and soil liquefaction mapping. Standard penetration test (SPT) data from surrounding sites were already available. Google Earth was used for digitization of the collected data. A few points were reserved for data calibration and validation. The resulting geographic information system (GIS)-based guidance maps are helpful for anticipating bearing capacity in the real estate industry.
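Since the keywords point to inverse distance weighting, the sketch below shows that interpolation step in a few lines: estimating bearing capacity at unsampled locations from scattered SPT-derived values. The borehole coordinates and capacities are invented numbers for illustration, and the power parameter is the conventional default rather than anything calibrated in the paper.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of, e.g., allowable bearing capacity
    (kPa) at query locations from scattered SPT-derived values."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.atleast_2d(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                      # query coincides with a borehole
            out.append(values[d.argmin()])
            continue
        w = 1.0 / d**power
        out.append(float(w @ values / w.sum()))
    return np.array(out)

# Boreholes (x, y in metres) with bearing capacities (kPa) -- illustrative numbers
boreholes = [(0, 0), (100, 0), (0, 100), (100, 100)]
qa = [150, 180, 120, 160]
print(idw(boreholes, qa, [(50, 50), (10, 90)]))
```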

Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function

Procedia PDF Downloads 132
1774 Dental Ethics versus Malpractice, as Phenomenon with a Growing Trend

Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo

Abstract:

Dealing with emerging cases of dental malpractice justified by appeal to the clear rules of dental ethics is a phenomenon with an increasing trend in today's dental practice. Dentists should clearly understand where the limit of malpractice lies, with minimal or major consequences for the affected patient, and when an outcome can legitimately be justified as a complication of dental treatment in accordance with the rules of dental ethics in the dental office. Indeed, malpractice can occur in cases of lack of professionalism, but it can also come as a consequence of anatomical and physiological limitations in the implementation of the dental protocols that are predetermined and indicated for the patient in the treatment plan section of his or her personal record. This study is a review of the latest findings published in the literature on this problem. Keywords were combined so as to collect the relevant information from publication databases in this field, always from the point of view of the dentist rather than that of the lawyer or jurist. The findings included in this article show that the approach to this phenomenon varies between countries, depending on their legal frameworks. There is a lack of, or only a small number of, articles that touch on this topic, and those articles present a limited amount of data. Conclusions: Dental malpractice should not be hidden under the guise of various dental complications that are justified by the strict rules of ethics for patients treated in the dental chair. Individual experiences of dental malpractice must be published so that they can serve as a source of experience for future generations of dentists.

Keywords: dental ethics, malpractice, professional protocol, random deviation

Procedia PDF Downloads 93
1773 Conceptual Synthesis as a Platform for Psychotherapy Integration: The Case of Transference and Overgeneralization

Authors: Merav Rabinovich

Abstract:

Background: Psychoanalytic and cognitive therapy approach problems from different points of view. In the recent decade, the integration movement has been gaining momentum. However, only little has been studied regarding the theoretical interrelationship among these therapy approaches. Method: 33 transference case-studies that were published in peer-reviewed academic journals were coded by Luborsky's Core Conflictual Relationship Theme (CCRT) method (components of wish, response from other – real or imaginal – and response of self). CCRT analysis was conducted through a tailor-made method, a valid tool to identify transference patterns. Rabinovich and Kacen's (2010, 2013) Relationship Between Categories (RBC) method was used to analyze the relationship of these transference patterns with the cognitive and behavioral components appearing in those psychoanalytic case-studies. Results: 30 of 33 cases (90%) were found to connect the transference themes with cognitive overgeneralization. In these cases, overgeneralizations were organized around Luborsky's transference themes of response from other and response of self. Additionally, overgeneralization was found to be an antithesis of the wish component, and the tension between them was found to be linked with powerful behavioral and emotional reactions. Conclusion: The findings indicate that thinking distortions of overgeneralization (cognitive therapy) are actual expressions of transference patterns. These findings point to a theoretical junction, a platform for clinical integration. Awareness of this junction can help therapists promote good psychotherapy outcomes by relying on the accumulated wisdom of the different therapies.

Keywords: transference, overgeneralization, theoretical integration, case-study metasynthesis, CCRT method, RBC method

Procedia PDF Downloads 138
1772 Terrorism in German and Italian Press Headlines: A Cognitive Linguistic Analysis of Conceptual Metaphors

Authors: Silvia Sommella

Abstract:

Islamic terrorism has gained a lot of media attention in recent years, also because of the striking increase in terror attacks since 2014. The main aim of this paper is to illustrate the phenomenon of Islamic terrorism by applying frame semantics and metaphor analysis to German and Italian press headlines of the two online weekly publications Der Spiegel and L’Espresso between 2014 and 2019. This study focuses on how media discourse – through the use of conceptual metaphors – gives rise to a particular reception of the phenomenon of Islamic terrorism and leads people to accept governmental strategies and policies, perceiving terrorists as evildoers, as members of an uncivilised group ‘other’ opposed to the civilised group ‘we’: two groups that are perceived as opposed. The press headlines are analyzed on the basis of cognitive linguistics, namely Lakoff and Johnson’s conceptualization of metaphor, to distinguish between abstract conceptual metaphors and specific metaphorical expressions. The study focuses on contexts, frames, and metaphors. The method adopted in this study is Konerding’s frame semantics (1993). On the basis of dictionaries – in particular the Duden Deutsches Universalwörterbuch (Duden Universal German Dictionary) – Konerding carried out, in a pilot lexicological study, a hyperonym reduction of substantives, working exclusively with nouns because hyperonyms usually occur in dictionary meaning explanations as the main elements of nominal phrases. The result of Konerding’s hyperonym type reduction is a small set of German nouns, corresponding to the highest hyperonyms, the so-called categories or matrix frames: ‘object’, ‘organism’, ‘person/actant’, ‘event’, ‘action/interaction/communication’, ‘institution/social group’, ‘surroundings’, ‘part/piece’, ‘totality/whole’, ‘state/property’. The second step of Konerding’s pilot study consists of determining the potential reference points of each category so that conventionally expectable routinized predications arise as predictors. Konerding found out which predicators the ascertained noun types can be linked to. For the purpose of this study, metaphorical expressions will be listed and categorized under conceptual metaphors and under the matrix frames that correspond to each particular conceptual metaphor. All of the corpus analyses are carried out using the AntConc corpus software. The research will verify some previously analyzed metaphors such as TERRORISM AS WAR, A CRIME, A NATURAL EVENT, A DISEASE and will identify new conceptualizations and metaphors about Islamic terrorism, especially in the Italian language, such as TERRORISM AS A GAME, WARES, A DRAMATIC PLAY. Through the identification of particular frames and their construction, the research seeks to understand the public reception of, and the way of handling, the discourse about Islamic terrorism in the above-mentioned online weekly publications under a contrastive analysis of German and Italian.

Keywords: cognitive linguistics, frame semantics, Islamic terrorism, media

Procedia PDF Downloads 170
1771 A Review of Effective Gene Selection Methods for Cancer Classification Using Microarray Gene Expression Profile

Authors: Hala Alshamlan, Ghada Badr, Yousef Alohali

Abstract:

Cancer is a dreadful disease that causes a considerable death rate in humans. DNA microarray-based gene expression profiling has emerged as an efficient technique for cancer classification, as well as for diagnosis, prognosis, and treatment purposes. In recent years, the DNA microarray technique has gained more attention in both scientific and industrial fields. It is important to determine the informative genes that cause cancer in order to improve early cancer diagnosis and to give effective chemotherapy treatment. In order to gain deep insight into the cancer classification problem, it is necessary to take a closer look at the proposed gene selection methods. We believe that they should be an integral preprocessing step for cancer classification. Furthermore, finding an accurate gene selection method is a very significant issue in the cancer classification area because it reduces the dimensionality of the microarray dataset and selects informative genes. In this paper, we classify and review the state-of-the-art gene selection methods. We proceed by evaluating the performance of each gene selection approach based on its classification accuracy and the number of informative genes. In our evaluation, we use four benchmark microarray datasets for cancer diagnosis (leukemia, colon, lung, and prostate). In addition, we compare the performance of the gene selection methods to identify the effective gene selection method that has the ability to select a small set of marker genes and ensure high cancer classification accuracy. To the best of our knowledge, this is the first attempt to compare gene selection approaches for cancer classification using microarray gene expression profiles.
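A typical filter-style evaluation of the kind the review describes can be reproduced in a few lines with scikit-learn: select the top-k genes by a univariate score, train a classifier, and cross-validate. The synthetic dataset below merely stands in for the benchmark microarray data (leukemia, colon, lung, prostate), and the choice of ANOVA F-score plus a linear SVM is an illustrative assumption rather than the paper's protocol.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for a microarray dataset: few samples, thousands of genes.
X, y = make_classification(n_samples=72, n_features=5000, n_informative=40,
                           random_state=0)

# Filter-type gene selection (ANOVA F-score) followed by an SVM classifier;
# keeping the selector inside the pipeline avoids selection bias in the CV.
for k in (20, 50, 100):
    clf = make_pipeline(SelectKBest(f_classif, k=k), SVC(kernel="linear"))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{k:4d} genes -> CV accuracy {acc:.2f}")
```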

Keywords: gene selection, feature selection, cancer classification, microarray, gene expression profile

Procedia PDF Downloads 447
1770 An Inductive Study of Pop Culture Versus Visual Art: Redefined from the Lens of Censorship in Bangladesh

Authors: Ahmed Tahsin Shams

Abstract:

The right to dissent through any form of art has been facing challenges through various strict legal measures, particularly since 2018, when the Government of Bangladesh passed the Digital Security Act 2018 (DSA). Therefore, references to ‘popular’ culture mostly include mainstream religious and national festivals and exclude critical intellectual representation of specific political allusions in any form of storytelling, whether wall art or fiction writing, in the post-DSA period in Bangladesh. Through inductive quantitative and qualitative methodological approaches, this paper aims to study the pattern of censorship, detention or custodial torture against artists and the banning approach of the Bangladeshi government in the last five years, specifically against static visual arts, i.e., cartoons and wall art. The pattern drawn from these data attempts to redefine the popular notion of ‘pop culture’ as an unorganized folk or mass culture. The results also hypothesize how the post-DSA period forcefully constructs ‘pop culture’ as a very organized, repetitive deception of enlightenment or entertainment. Thus the argument theorizes that this censoring trend is a fascist approach that renders artists subaltern. So, in this socio-political context, these two similar and overlapping elements, culture and art, are vastly separated into two streams: the former appreciated by those in power, and the latter a source of fear for them. Therefore, the purpose of art also shifts from entertainment to an act of rebellion, adding more layers to the new postmodern definition of ‘pop culture.’

Keywords: popular culture, visual arts, censoring trend, fascist approach, subaltern, digital security act

Procedia PDF Downloads 75
1769 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents

Authors: Subir Gupta, Subhas Ganguly

Abstract:

In this paper, we demonstrate a new area of application of image processing to metallurgical images, creating more opportunity for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been experimentally developed, encompassing the variation of ferrite and pearlite volume fractions and taking images at different magnifications (500X, 1000X, 15000X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction has been achieved using four different plain carbon steels containing 0.1, 0.22, 0.35 and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts to develop training and testing sets of micrographs. The statistical recognition features for the ferrite and pearlite constituents have been developed by learning from the training set of micrographs. The obtained features for microstructure pattern recognition are then applied to the test set of micrographs. The analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and variation of volume fractions of the constituents in the structure, with an accuracy of about +/- 5%.
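To give a flavour of the kind of tooling involved, the sketch below runs a basic phase segmentation and region-labelling pass with scikit-image. It uses the library's built-in 'coins' image as a stand-in micrograph and plain Otsu thresholding, so the threshold choice, cleanup size and the equating of area fraction with volume fraction are illustrative assumptions, not the paper's trained recognition strategy.

```python
import numpy as np
from skimage import data, filters, measure, morphology

# Stand-in image: skimage's built-in 'coins' grayscale image plays the role of
# an SEM micrograph, since no micrograph from the paper's dataset is available.
image = data.coins()

# Phase segmentation by Otsu thresholding, small-object cleanup, then region
# labelling; the area fraction of the bright phase approximates a volume
# fraction under the usual stereological assumption.
threshold = filters.threshold_otsu(image)
bright_phase = morphology.remove_small_objects(image > threshold, min_size=64)
labels = measure.label(bright_phase)

area_fraction = bright_phase.mean()
region_sizes = [r.area for r in measure.regionprops(labels)]
edges = filters.sobel(image)          # gradient map usable for boundary detection

print(f"bright-phase area fraction: {area_fraction:.2%}")
print(f"regions detected: {labels.max()}, mean area: {np.mean(region_sizes):.0f} px")
```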

Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure

Procedia PDF Downloads 195
1768 Work in the Industry of the Future-Investigations of Human-Machine Interactions

Authors: S. Schröder, P. Ennen, T. Langer, S. Müller, M. Shehadeh, M. Haberstroh, F. Hees

Abstract:

Since a bit over a year ago, Festo AG and Co. KG, Festo Didactic SE, robomotion GmbH, the researchers of the Cybernetics-Lab IMA/ZLW and IfU, as well as the Human-Computer Interaction Center at the RWTH Aachen University, have been working together on the focal point of assembly competences to realize different scenarios in the field of human-machine interaction (HMI). In the framework of project ARIZ, questions concerning the future of production within the fourth industrial revolution are dealt with. There are many perspectives on human-robot collaboration that constitute Industry 4.0 at the individual, organizational and enterprise levels, and these will be addressed in ARIZ. The aim of the ARIZ projects is to link AI approaches to assembly problems and to implement them as prototypes in demonstrators. To do so, island-based and flow-based production scenarios will be simulated and realized as prototypes. These prototypes will serve as applications of flexible robotics as well as of AI-based planning and control of production processes. Using the demonstrators, human interaction strategies will be examined with an information system on the one hand, and a robotic system on the other. During the tests, prototypes of workspaces that illustrate prospective forms of production work will be presented. The human being will remain a central element in future production and will increasingly be in charge of managerial tasks. Questions thus arise within the overall perspective, primarily concerning the role of humans within these technological revolutions, their ability to act and to shape such systems, and the acceptance of such systems. Roles such as the 'trainer' of intelligent systems may become a possibility in such assembly scenarios.

Keywords: human-machine interaction, information technology, island based production, assembly competences

Procedia PDF Downloads 199
1767 Comparative Performance Analysis for Selected Behavioral Learning Systems versus Ant Colony System Performance: Neural Network Approach

Authors: Hassan M. H. Mustafa

Abstract:

This piece of research addresses an interesting comparative analytical study, which considers two concepts of diverse algorithmic computational intelligence approaches related closely to neural and non-neural systems. The first algorithmic intelligent approach is concerned with practical results observed from the activities of three neural (animal) systems: namely, Pavlov's and Thorndike's experimental work, as well as a mouse's trials while moving inside a figure-of-eight (8) maze to reach an optimal solution to a reconstruction problem. Conversely, the second algorithmic intelligent approach originates from the observed activity results of the non-neural Ant Colony System (ACS), obtained after reaching an optimal solution while solving the Traveling Salesman Problem (TSP). Interestingly, the effect of increasing the number of agents (either neurons or ants) on learning performance is shown to be similar for both introduced systems. Finally, the performance of both intelligent learning paradigms is shown to be in agreement with the learning convergence process of the least mean square error (LMS) algorithm, as applied to the training of some Artificial Neural Network (ANN) models. Accordingly, the adopted ANN modeling is a relevant and realistic tool to investigate observations and analyze performance for both selected computational intelligence (biological behavioral learning) systems.
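Since the LMS convergence curve is used above as the common yardstick, the sketch below reproduces that behaviour on a single linear neuron: the instantaneous squared error shrinks as the LMS weight update proceeds. The dimensionality, learning rate and noise level are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Least-mean-square (LMS) learning on a single linear neuron: the squared
# error decays as training proceeds, which is the convergence behaviour used
# here as a common yardstick for both learning systems.
n_inputs, n_samples, eta = 5, 2000, 0.01
w_true = rng.standard_normal(n_inputs)
w = np.zeros(n_inputs)

errors = []
for _ in range(n_samples):
    x = rng.standard_normal(n_inputs)
    d = w_true @ x + 0.05 * rng.standard_normal()   # desired response (noisy)
    e = d - w @ x                                   # instantaneous error
    w += eta * e * x                                # LMS weight update
    errors.append(e**2)

print("mean squared error, first 100 steps :", np.mean(errors[:100]))
print("mean squared error, last 100 steps  :", np.mean(errors[-100:]))
```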

Keywords: artificial neural network modeling, animal learning, ant colony system, traveling salesman problem, computational biology

Procedia PDF Downloads 467