Search results for: Text mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2254

514 An Investigation into Slow ESL Reading Speed in Pakistani Students

Authors: Hina Javed

Abstract:

This study investigated the different strategies used by Pakistani students learning English as a second language at the secondary school level. Its basic premise is that ESL students face tremendous difficulty while reading a text in English. It also aims to dig into the causes of their slow reading, which may range from word-reading accuracy, mental translation, lexical density, cultural gaps, and complex syntactic constructions to back skipping. Sixty Grade 7 students, thirty boys and thirty girls, were selected from two mainstream secondary schools in Lahore. They were administered reading-related and reading-speed pre- and post-tests. The purpose of the tests was to gauge their performance on different reading tasks, to see what strategies they used, if any, and to ascertain the causes hampering their performance. In the pretests, they were given simple texts with considerable lexical density and a moderately complex sentential layout. In the post-tests, the reading tasks contained comic strips, texts with visuals, texts with controlled vocabulary, and an evenly distributed range of simple, compound, and complex sentences. Both tests were timed. The results corroborated the researchers' basic hunch: students performed significantly better on the post-tests than on the pretests. The findings suggest that the morphological structure of words and lexical density are the main sources of reading comprehension difficulties in poor ESL readers. They also confirm that texts accompanied by pictorial visuals greatly facilitate students' reading speed and comprehension. There is no substantial evidence that ESL readers adopt any specific strategy while reading in English.

Keywords: slow ESL reading speed, mental translation, complex syntactic constructions, back skipping

Procedia PDF Downloads 69
513 Estimation of Natural Pozzolan Reserves in the Volcanic Province of the Moroccan Middle Atlas Using a Geographic Information System in Order to Valorize Them

Authors: Brahim Balizi, Ayoub Aziz, Abdelilah Bellil, Abdellali El Khadiri, Jamal Mabrouki

Abstract:

Mio-Plio-Quaternary volcanism of the Tabular Middle Atlas, which corresponds to prospective levels of exploitable raw minerals, is a feature of Morocco's Middle Atlas, especially the Azrou-Timahdite region. Given its importance to national policy in terms of human development, supporting the sociological and economic component, this area has been the focus of various research and prospecting efforts aimed at developing these reserves. The outcome of this labour is a massive amount of data that needs to be managed appropriately, because it comes from multiple sources and formats, including survey points, contour lines, geology, hydrogeology, hydrology, geological and topographical maps, satellite photos, and more. In this regard, putting a Geographic Information System (GIS) in place is essential: it offers a site plan showing the most recent topography of the area being exploited, allows the daily exploitation volume to be computed, and supports decision-making with the fewest possible restrictions, so that the reserves can be used for the production of ecological light mortars. The three sites will be mined following the contour lines in five descending steps, each six metres high. Each quarry is anticipated to produce about 90,000 m3/year; for a single quarry, this translates to a daily production of about 450 m3 (200 days/year). The potential net exploitable volume in place is about 3,540,240 m3 for a single quarry and 10,620,720 m3 for the three exploitable zones combined.
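The production figures quoted in the abstract are internally consistent; a quick arithmetic check using only the quantities stated there:

```python
# Quantities stated in the abstract.
annual_production_m3 = 90_000        # per quarry, per year
working_days = 200                   # days/year
single_quarry_volume_m3 = 3_540_240  # net exploitable volume, one quarry
n_quarries = 3

# Daily production per quarry: 90,000 m3 spread over 200 working days.
daily_production_m3 = annual_production_m3 / working_days
print(daily_production_m3)  # 450.0

# Net exploitable volume for the three zones combined.
total_volume_m3 = single_quarry_volume_m3 * n_quarries
print(total_volume_m3)  # 10620720

# Rough exploitation horizon for a single quarry at the stated rate.
years_single = single_quarry_volume_m3 / annual_production_m3
print(round(years_single, 1))  # 39.3
```

The ~39-year horizon is not stated in the abstract; it simply follows from the two figures that are.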

Keywords: GIS, topography, exploitation, quarrying, lightweight mortar

Procedia PDF Downloads 18
512 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates

Authors: Bongs Lainjo

Abstract:

Educational institutions and the authorities mandated to run education systems in various countries need to implement curricula that acknowledge the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate various predictive models carried out globally on selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia, and Europe. Some of the reasons put forward for children dropping out include the notion of being successful in life without necessarily going through the education process. Such a mentality, coupled with a tough curriculum that does not take care of all students, has led to poor school attendance (truancy), which continuously leads to dropouts. This study focuses on developing a model that school administrations can systematically implement to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can easily be changed so that they focus on the better future that their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models installed in every educational system with a view to preventing an imminent school dropout just before it happens. In the competency-based curricula that most advanced nations are trying to implement, education systems embrace wholesome ideas of learning that reduce the dropout rate.

Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum

Procedia PDF Downloads 170
511 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupant in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by carmakers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.
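The missing-data step in a KDD pipeline like the one described can be sketched as follows. The parameter names and the mean-imputation rule are illustrative assumptions only; the paper proposes and validates its own procedure on real data sets:

```python
from statistics import mean

# Hypothetical sled-test records: each maps a parameter name to a measured
# value, with None marking a missing measurement (the abstract mentions
# ~30 such parameters per test; these names are invented for the sketch).
tests = [
    {"belt_load_kN": 5.1, "airbag_fire_ms": 18.0, "head_accel_g": None},
    {"belt_load_kN": None, "airbag_fire_ms": 17.5, "head_accel_g": 62.0},
    {"belt_load_kN": 4.9, "airbag_fire_ms": None, "head_accel_g": 58.0},
]

def impute_missing(records):
    """Replace each missing value with the per-parameter mean of the
    observed values -- a stand-in for the paper's validated procedure,
    used here only to show the shape of the step."""
    params = {k for r in records for k in r}
    means = {p: mean(r[p] for r in records if r[p] is not None) for p in params}
    return [{p: (r[p] if r[p] is not None else means[p]) for p in r}
            for r in records]

completed = impute_missing(tests)
print(completed[0]["head_accel_g"])  # 60.0 (mean of 62.0 and 58.0)
```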

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 378
510 A Methodological Approach to Development of Mental Script for Mental Practice of Micro Suturing

Authors: Vaikunthan Rajaratnam

Abstract:

Introduction: Motor imagery (MI) and mental practice (MP) can be an alternative route to mastery of surgical skills. One component of this technique is the use of a mental script. The aim of this study was to design and develop a mental script for basic micro suturing training for skill acquisition using a low-fidelity rubber glove model, and to describe the detailed methodology for this process. Methods: This study was based on a design and development research framework. The mental script was developed by having 5 expert surgeons perform a cognitive walkthrough of the repair of a vertical opening in a rubber glove model using 8/0 nylon, followed by a hierarchical task analysis. A draft script was created, and face and content validity were assessed with a checking-back process. The final script was validated by recruiting 28 participants, assessed using the Mental Imagery Questionnaire (MIQ). Results: The creation of the mental script is detailed in the full text. After assessment by the expert panel, the mental script had good face and content validity. The average overall MIQ score was 5.2 ± 1.1, demonstrating the validity of generating mental imagery from the mental script developed in this study for micro suturing in the rubber glove model. Conclusion: The methodological approach described in this study is based on an instructional design framework for teaching surgical skills. This MP model is inexpensive and easily accessible, addressing the challenge of reduced opportunities to practice surgical skills. However, while motor skills are important, other non-technical expertise required of the surgeon is not addressed by this model. Thus, the model should augment surgical training, not replace it.

Keywords: mental script, motor imagery, cognitive walkthrough, verbal protocol analysis, hierarchical task analysis

Procedia PDF Downloads 100
509 A Comparative Study for Various Techniques Using WEKA for Red Blood Cells Classification

Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are the most common type of blood cell and the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as "anemia". Abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA, an open-source toolkit comprising different machine learning algorithms for data mining applications. The algorithms tested are the Radial Basis Function neural network, the Support Vector Machine, and the K-Nearest Neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell is spherical or non-spherical. The second set, consisting mainly of textural features, was used to recognize the types of spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the Support Vector Machine, the Radial Basis Function neural network, and the K-Nearest Neighbors algorithm, respectively.
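The paper runs these classifiers in WEKA (a Java toolkit). As a rough illustration of the first stage, spherical versus non-spherical classification from geometrical features, here is a minimal k-nearest-neighbours sketch; the feature values are invented, not taken from the hospital dataset:

```python
from math import dist
from collections import Counter

# Illustrative geometric features per cell: (circularity, area_ratio).
training = [
    ((0.95, 1.00), "spherical"),
    ((0.92, 0.98), "spherical"),
    ((0.60, 0.75), "non-spherical"),
    ((0.55, 0.70), "non-spherical"),
]

def knn_predict(x, data, k=3):
    """Classify x by majority vote among its k nearest training points."""
    neighbours = sorted(data, key=lambda item: dist(x, item[0]))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

print(knn_predict((0.90, 0.97), training))  # spherical
print(knn_predict((0.58, 0.72), training))  # non-spherical
```

WEKA's IBk classifier performs the same kind of instance-based vote; the RBF network and SVM stages would replace `knn_predict` with trained models.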

Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm

Procedia PDF Downloads 477
508 Analyses of the Constitutional Identity in Hungary: A Case Study on the Concept of Constitutionalism and Legal Continuity in New Fundamental Law of Hungary

Authors: Zsuzsanna Fejes

Abstract:

The aim of this paper is to provide an overview of the legal history of constitutionalism in Hungary, focusing on the democratic transition of 1989-1990, describing the historical and political background of the changes and presenting the most important features of the new democracy and its institutional and legal orders. In Hungary, the political, economic and moral crisis that evolved prior to the constitutional years 2010-11 was such a constitutional moment, leading to a change that was at once opportune and unavoidable. The Hungarian constitutional power intended to adopt a new constitution capable of creating a common constitutional identity and expressing national unity. On 18th April 2011, the Hungarian Parliament passed the new Fundamental Law. The new Fundamental Law, rich in national values, posed a new challenge for academics, lawyers, and political scientists. Not only classical political science but also constitutional law and theory have to struggle with the interpretation of the new declarations of national constitutional values in the Fundamental Law. The main features and structure of the new Fundamental Law are analysed, and a detailed interpretation is given of the Preamble as a declaration of constitutional values. The examination of the Preamble clarifies the components of Hungarian statehood and national unity, individual and collective human rights, the practical and theoretical claims to national sovereignty, and the content of, and possibilities for interpreting, the achievements of the historical Constitution. These problems are presented through an examination of the text of the National Avowal, the preamble of the Fundamental Law.
It is also examined whether the Fundamental Law itself is a suitable and sufficient means for the citizens of Hungary to express the ideas therein as their own, and it is analysed how the national and European common traditions, values and principles stated in the Fundamental Law can sustain Hungary's participation in European integration.

Keywords: common constitutional values, constitutionalism, national identity, national sovereignty, national unity, statehood

Procedia PDF Downloads 291
507 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Fully encrypting a large volume of messages results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance-level value within each XML document. Classified content is then processed using element-wise encryption for selected parts with "High", "Medium" or "Low" importance-level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm that overcomes the problem of computational overhead: the SubBytes and ShiftRows steps remain as in the original AES, while the MixColumns operation is replaced by a 128-bit permutation followed by the AddRoundKey operation. An implementation was conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in processing time when encrypting XML documents.
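The element-wise selection the abstract describes can be sketched as follows. The tag names, importance levels and cipher are illustrative stand-ins: the toy keystream below is NOT the paper's modified AES, it only keeps the sketch runnable with the standard library:

```python
import hashlib
import xml.etree.ElementTree as ET

# Toy e-banking transaction; tag names are invented for the sketch.
doc = ET.fromstring(
    "<txn><account>12345678</account><amount>950.00</amount>"
    "<memo>monthly rent</memo></txn>"
)

# Importance levels as a classifier might assign them (hard-coded here;
# the paper derives them with data-mining classification algorithms).
importance = {"account": "High", "amount": "Medium", "memo": "Low"}

def toy_encrypt(plaintext: str, key: bytes) -> str:
    """Stand-in stream cipher (SHA-256 keystream XOR). The paper uses AES
    with a modified MixColumns step; this placeholder only illustrates
    where the cipher plugs into the element-wise flow."""
    stream = hashlib.sha256(key).digest()
    data = plaintext.encode()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data)).hex()

def encrypt_selected(root, levels, key, selected=("High", "Medium")):
    """Encrypt only elements whose importance is in `selected`,
    leaving low-importance content in the clear."""
    for elem in root:
        if levels.get(elem.tag) in selected:
            elem.text = toy_encrypt(elem.text, key)
            elem.set("enc", "true")
    return root

encrypt_selected(doc, importance, b"secret-key")
print(doc.find("memo").text)           # monthly rent (left in the clear)
print(doc.find("account").get("enc"))  # true
```

The point of the design is visible even in the toy: only the classified-important subset of elements pays the encryption cost.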

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 405
506 Analyzing Factors Impacting COVID-19 Vaccination Rates

Authors: Dongseok Cho, Mitchell Driedger, Sera Han, Noman Khan, Mohammed Elmorsy, Mohamad El-Hajj

Abstract:

Since the approval of the COVID-19 vaccines in late 2020, vaccination rates have varied around the globe. Access to a vaccine supply, mandated vaccination policy, and vaccine hesitancy all contribute to these rates. This study used COVID-19 vaccination data from Our World in Data and the Multilateral Leaders Task Force on COVID-19 to create two COVID-19 vaccination indices. The first is the Vaccine Utilization Index (VUI), which measures how effectively each country has utilized its vaccine supply to doubly vaccinate its population. The second is the Vaccination Acceleration Index (VAI), which evaluates how efficiently each country vaccinated its population within its first 150 days. Pearson correlations were computed between these indices and country indicators obtained from the World Bank. The correlations show that countries with stronger health indicators, such as lower mortality rates, lower age dependency ratios, and higher rates of immunization against other diseases, display higher VUI and VAI scores than countries with weaker indicators. VAI scores are also positively correlated with governance and economic indicators, such as regulatory quality, control of corruption, and GDP per capita. As represented by the VUI, proper utilization of the COVID-19 vaccine supply is observed in countries that display excellence in health practices. A country's motivation to accelerate its vaccination rate within the first 150 days, as represented by the VAI, was largely a product of the governing body's effectiveness and economic status, as well as overall excellence in health practices.
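A Pearson correlation between an index and a country indicator, as used in the study, can be computed directly from its definition; the index and indicator values below are invented for illustration:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative values only: a vaccination index per country against a
# governance indicator (the study uses Our World in Data / World Bank series).
vui = [0.82, 0.74, 0.55, 0.40, 0.31]
regulatory_quality = [1.6, 0.9, 0.7, -0.2, -0.4]

r = pearson(vui, regulatory_quality)
print(round(r, 3))
```

Python 3.10+ provides the same quantity as `statistics.correlation`.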

Keywords: data mining, Pearson correlation, COVID-19, vaccination rates and hesitancy

Procedia PDF Downloads 112
505 Phytoextraction of Copper and Zinc by Willow Varieties in a Pot Experiment

Authors: Muhammad Mohsin, Mir Md Abdus Salam, Pertti Pulkkinen, Ari Pappinen

Abstract:

Soil and water contamination by heavy metals is a major environmental challenge. Phytoextraction is an emerging, environmentally friendly and cost-efficient technology in which plants are used to remove pollutants from soil and water. We aimed to assess the copper (Cu) and zinc (Zn) removal efficiency of two willow varieties, Klara (S. viminalis x S. schwerinii x S. dasyclados) and Karin ((S. schwerinii x S. viminalis) x (S. viminalis x S. burjatica)), under different soil treatments (control/unpolluted, polluted, lime with polluted, wood ash with polluted). In a 180-day pot experiment, these willow varieties were grown in highly polluted soil collected from the Pyhasalmi mining area in Finland. Lime and wood ash were added to the polluted soil to improve the soil pH and to observe their effects on metal accumulation in plant biomass. An Inductively Coupled Plasma Optical Emission Spectrometer (ELAN 6000 ICP-OES, Perkin-Elmer Corporation) was used to assess the heavy metal concentrations in the plant biomass. The results show that both willow varieties can accumulate considerable amounts of Cu and Zn, varying from 36.95 to 314.80 mg kg⁻¹ and 260.66 to 858.70 mg kg⁻¹, respectively. The application of lime and wood ash substantially stimulated plant height, dry biomass and the deposition of Cu and Zn in total plant biomass. In addition, lime application appeared to increase Cu and Zn concentrations in the shoots and leaves of both willow varieties when planted in polluted soil, whereas wood ash application was found to be more efficient at mobilizing the metals into the roots of both varieties. The study recommends willow plantations for rehabilitating Cu- and Zn-polluted soils.

Keywords: heavy metals, lime, phytoextraction, wood ash, willow

Procedia PDF Downloads 235
504 Increasing Creativity in Virtual Learning Space for Developing Creative Cities

Authors: Elham Fariborzi, Hoda Anvari Kazemabad

Abstract:

Today, ICT plays an important role in all matters and affects the development of creative cities. The virtual space this technology provides is used especially for formats such as smart schools, virtual universities, web-based training and virtual classrooms, which run in parallel with traditional teaching. Nowadays, educational systems in different countries, such as Iran, are changing and starting to increase creativity in the learning environment. This will contribute to the development of innovative ideas and thinking among people in this environment; such opportunities may lead to scientific discovery and development. Creativity here means the ability to generate numerous new and suitable ideas and solutions for solving the problems of real and virtual individuals and society, which can play a significant role in the development of today's creative physical cities or tomorrow's virtual ones. The purpose of this paper is to study strategies for increasing creativity in virtual learning in order to develop a creative city. A citation/library study was used. The full description given in the text covers how to create and enhance learning creativity in a virtual classroom: by reflecting on performance and progress, attending to self-directed learning guidelines, making efficient use of social networks and systematic discussion groups, and targeting controls through the factors involved, all of which may make the teaching process more effective with regard to creativity. Meanwhile, in creating a virtual classroom, the style of the class should formally recognize creativity. The use of a shared model of creative thinking between students and teachers is also effective in solving the problems of the virtual classroom. It is recommended that virtual education authorities in Iran give special attention to reviewing the virtual curriculum, increasing creativity in educational content and classes, so that more creativity is witnessed in Iran's cities.

Keywords: virtual learning, creativity, e-learning, bioinformatics, biomedicine

Procedia PDF Downloads 358
503 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is a greater risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines, supplemented with health information about COVID-19 collected from social media sources. We started with data preprocessing and tested various vectorization methods, such as Count and TF-IDF vectorization. We implemented 3 Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and a constant 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin, and there was no evidence of improvement in accuracy; however, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment and substituting a deep learning neural network model for Naive Bayes.
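A minimal from-scratch sketch of the experimental setup: a Bernoulli Naive Bayes classifier with an optional coarse sentiment token standing in for the study's sentiment feature. The training headlines, labels and negative-word list are invented for the sketch:

```python
from math import log

def featurize(text, with_sentiment=False, neg_words=("danger", "toxic", "fake")):
    """Bag-of-words set; optionally add a coarse sentiment token
    (a stand-in for the polarity score the study appends as a feature)."""
    tokens = set(text.lower().split())
    if with_sentiment:
        tokens.add("<NEG>" if tokens & set(neg_words) else "<NONNEG>")
    return tokens

def train_bernoulli_nb(docs):
    """Bernoulli Naive Bayes with add-one smoothing.
    docs: list of (token_set, label)."""
    vocab = set().union(*(t for t, _ in docs))
    model = {}
    for lab in {l for _, l in docs}:
        class_docs = [t for t, l in docs if l == lab]
        prior = log(len(class_docs) / len(docs))
        cond = {w: (sum(w in t for t in class_docs) + 1) / (len(class_docs) + 2)
                for w in vocab}
        model[lab] = (prior, cond)
    return vocab, model

def predict(tokens, vocab, model):
    def score(lab):
        prior, cond = model[lab]
        return prior + sum(log(cond[w]) if w in tokens else log(1 - cond[w])
                           for w in vocab)
    return max(model, key=score)

# Tiny illustrative training set (not the study's dataset).
raw = [("miracle cure is fake and toxic", "fake"),
       ("vaccine danger hidden by doctors", "fake"),
       ("trial shows vaccine is safe", "real"),
       ("study confirms treatment works", "real")]
docs = [(featurize(t, with_sentiment=True), lab) for t, lab in raw]
vocab, model = train_bernoulli_nb(docs)
print(predict(featurize("toxic cure is fake", with_sentiment=True), vocab, model))  # fake
```

Building `docs` with `with_sentiment=False` gives the benchmark model without the sentiment feature, mirroring the study's comparison.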

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 92
502 Lexical Based Method for Opinion Detection on Tripadvisor Collection

Authors: Faiza Belbachir, Thibault Schienhinski

Abstract:

The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinion, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommender systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data; the difficulty lies in determining an approach that returns opinionated documents. Generally, two approaches are used for opinion detection: lexicon-based approaches and machine learning-based approaches. In lexicon-based approaches, a dictionary of sentiment words is used, with weights associated with the words; the opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, with features such as word n-grams, part-of-speech tags, and logical forms. The majority of these works rely on document text alone to determine the opinion score, but do not take into account whether that text is really trustworthy. It is therefore interesting to exploit other information to improve opinion detection. In our work, we develop a new way to compute the opinion score by introducing the notion of a trust score: we determine not only whether documents are opinionated, but also whether these opinions are trustworthy information in relation to their topics. For this, we use the SentiWordNet lexicon to calculate opinion scores, and we compute different features about users (number of comments, number of useful comments, average usefulness of reviews) to derive trust scores. We then combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the Tripadvisor collection.
Our experimental results show that combining the opinion score with the trust score improves opinion detection.
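One way the described combination of opinion and trust scores might look. The lexicon entries, trust formula and weights below are illustrative assumptions, not the paper's exact definitions (which use SentiWordNet and the listed user features):

```python
def opinion_score(text, lexicon):
    """Mean absolute polarity of sentiment-bearing words (SentiWordNet-style
    lexicon: word -> polarity in [-1, 1])."""
    hits = [abs(lexicon[w]) for w in text.lower().split() if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

def trust_score(n_comments, n_useful, avg_useful_review):
    """Toy trust: useful-comment ratio blended with average usefulness."""
    ratio = n_useful / n_comments if n_comments else 0.0
    return 0.5 * ratio + 0.5 * avg_useful_review

def final_score(op, tr, alpha=0.6):
    """Linear combination; alpha weights opinion against trust."""
    return alpha * op + (1 - alpha) * tr

lexicon = {"great": 0.8, "terrible": -0.9, "clean": 0.4}
op = opinion_score("great hotel terrible breakfast", lexicon)
tr = trust_score(n_comments=40, n_useful=30, avg_useful_review=0.7)
print(round(final_score(op, tr), 3))  # 0.8
```

Documents whose final score passes a threshold would then be returned as trusted opinionated documents.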

Keywords: Tripadvisor, opinion detection, SentiWordNet, trust score

Procedia PDF Downloads 195
501 Supply Chain Optimisation through Geographical Network Modeling

Authors: Cyrillus Prabandana

Abstract:

Supply chain optimisation must consider multiple factors or constraints, including but not limited to demand forecasting, raw material fulfilment, production capacity, inventory levels, facility locations, transportation means, and manpower availability. By knowing all the manageable factors involved and modelling the uncertainty with pre-defined percentage factors, an integrated supply chain model can be developed to manage various business scenarios. This paper analyses the use of a geographical point of view to develop an integrated supply chain network model that optimises the distribution of finished products according to forecasted demand and available supply. The supply chain optimisation model shows that a small change in one supply chain constraint can have a large impact on other constraints, and the new information from the model should support the decision-making process. The model focused on three areas: raw material fulfilment, production capacity, and finished product transportation. To validate its suitability, the model was implemented in a project aimed at optimising the concrete supply chain at a mining location. The high level of operational complexity and the involvement of multiple stakeholders in the concrete supply chain are believed sufficient to illustrate the larger scope. The implementation of this geographical supply chain network modelling resulted in an optimised concrete supply chain from raw material fulfilment through to the distribution of finished products to each customer, indicated by a lower percentage of missed concrete order fulfilments.
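A geographical assignment step of the kind such a model performs might be sketched as follows. The plant locations, capacities and greedy nearest-plant rule are illustrative assumptions, not the project's data or method (a production model would use proper optimisation, e.g. min-cost flow):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

# Invented batching plants ((lat, lon), daily capacity in m3) and orders.
plants = {"plant_A": ((-6.2, 106.8), 120), "plant_B": ((-6.9, 107.6), 80)}
orders = [("site_1", (-6.3, 106.9), 60), ("site_2", (-6.8, 107.5), 70),
          ("site_3", (-6.25, 106.85), 50)]

def assign_orders(plants, orders):
    """Greedy geographic assignment: each order goes to the nearest plant
    that still has capacity; orders no plant can serve are marked missed."""
    capacity = {p: cap for p, (_, cap) in plants.items()}
    assignment = {}
    for name, loc, qty in orders:
        for p in sorted(plants, key=lambda p: haversine_km(loc, plants[p][0])):
            if capacity[p] >= qty:
                capacity[p] -= qty
                assignment[name] = p
                break
        else:
            assignment[name] = None  # missed order fulfilment
    return assignment

print(assign_orders(plants, orders))
```

The fraction of `None` entries corresponds to the missed-fulfilment percentage the abstract uses as its performance indicator.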

Keywords: decision making, geographical supply chain modeling, supply chain optimisation, supply chain

Procedia PDF Downloads 344
500 Morphemic Analysis Awareness: A Boon or Bane on ESL Students’ Vocabulary Learning Strategy

Authors: Chandrakala Varatharajoo, Adelina Binti Asmawi, Nabeel Abdallah Mohammad Abedalaziz

Abstract:

This study investigated the impact of inflectional and derivational morphemic analysis awareness on ESL secondary school students' vocabulary learning strategy. The quasi-experimental study was conducted with 106 low-proficiency secondary school students in two experimental groups (inflectional and derivational) and one control group. The students' vocabulary acquisition was assessed through two measures, a Morphemic Analysis Test and a Vocabulary-Morphemic Test, administered as pretest and posttest before and after an intervention programme. Results of ANCOVA revealed that both experimental groups achieved significant scores on the Morphemic Analysis Test and the Vocabulary-Morphemic Test, although the inflectional group obtained a somewhat higher score than the derivational group. The results thus indicated that ESL low-proficiency secondary school students performed better on inflectional morphemic awareness than on derivational morphemic awareness, and that awareness of inflectional morphology contributed more to vocabulary acquisition. Importantly, learning inflectional morphology can help ESL low-proficiency secondary school students develop both morphemic awareness and vocabulary. Theoretically, these findings show that not all morphemes are equally useful to students for their language development. Practically, they indicate that morphological instruction should at least be included in remediation and instructional efforts with struggling learners across all grade levels, allowing them to focus on meaning within the word before they attempt the text at large for better comprehension. Methodologically, by conducting individualized intervention and assessment, this study provides fresh empirical evidence to support the existing literature on morphemic analysis awareness and vocabulary learning strategy.
Thus, a major pedagogical implication of the study is that the morphemic analysis awareness strategy is a definite boon for ESL secondary school students learning English vocabulary.

Keywords: ESL, instruction, morphemic analysis, vocabulary

Procedia PDF Downloads 396
499 Assessment of Drinking Water Quality in Relation to Arsenic Contamination in Drinking Water in Liberia: Achieving the Sustainable Development Goal of Ensuring Clean Water and Sanitation

Authors: Victor Emery David Jr., Jiang Wenchao, Daniel Mmereki, Yasinta John

Abstract:

Access to safe and clean drinking water is fundamental to public health. The presence of arsenic and other contaminants in drinking water poses a potential risk to public health and the environment, particularly in developing countries where access to safe, clean water and adequate sanitation is limited. Liberia has taken steps to improve its drinking water status so as to achieve the Sustainable Development Goals (SDGs) target of ensuring clean water and effective sanitation, but there is still a lot to be done. The Sustainable Development Goals are a United Nations initiative, also known as 'Transforming our world: the 2030 Agenda for Sustainable Development', containing seventeen goals with 169 targets to be met by the respective countries. Liberia is situated within the gold belt region, where arsenic and other contaminants are present in the groundwater owing to mining and other related activities. While there are few or no epidemiological studies in Liberia confirming illness or death as a result of arsenic contamination, it remains a public health concern. This paper assesses drinking water quality and the presence of arsenic in groundwater/drinking water in Liberia, proposes strategies for mitigating contaminants in drinking water, and suggests options for improvement with regard to achieving the Sustainable Development Goal of ensuring clean water and effective sanitation in Liberia by 2030.

Keywords: arsenic, action plan, contaminants, environment, groundwater, sustainable development goals (SDGs), Monrovia, Liberia, public health, drinking water

Procedia PDF Downloads 256
498 Socio-Cultural Representations through Lived Religions in Dalrymple’s Nine Lives

Authors: Suman

Abstract:

In the continuous interaction between past and present that constitutes historiography, each rewriting of history produces a new representation. This new representation reflects earlier archives and their interpretations, fragmented remembrances of the past, and reactions to the present. Memory, or the lack thereof, and stereotyping generally play a major role in this representation. William Dalrymple’s Nine Lives: In Search of the Sacred in Modern India (2009) is one such written account that sets out to narrate the representations of religion and culture in India and contemporary reactions to them. Dalrymple’s nine saints belong to different castes, sects, religions, and regions. By dealing with their religions and the expressions of those religions, and through the lived mysticism of these nine individuals, the book engages with important issues such as class, caste, and gender in the contexts provided by historical as well as present-day India. The paper studies the development of religion and the accompanying feeling of religiosity in modern as well as historical contexts through a study of these elements in the book. Since the language used in the creation of texts, and the literary texts thus produced, create a new reality that questions the stereotypes of the past and in turn often ends up creating new stereotypes or stereotypical representations, the paper actively engages with the text in order to identify and study such stereotypes, along with their changing representations. Through a detailed examination of the book, the paper seeks to unravel whether certain socio-cultural stereotypes existed earlier, and whether new stereotypes develop from Dalrymple’s point of view as an outsider writing on issues deeply rooted in the cultural milieu of the country. For this analysis, the paper draws on psycho-literary theories of stereotyping and representation.

Keywords: stereotyping, representation, William Dalrymple, religion

Procedia PDF Downloads 307
497 Job in Modern Arabic Poetry: A Semantic and Comparative Approach to Two Poems Referring to the Poet Al-Sayyab

Authors: Jeries Khoury

Abstract:

The use of legendary, folkloric, and religious symbols is one of the most important phenomena in modern Arabic poetry. Interestingly, most of the pioneers of modern Arabic poetry were so fascinated by biblical symbols that they deployed a range of modern techniques to make these symbols adequate to their personal lives on the one hand and compatible with their Islamic beliefs on the other. One of the most famous poets to do so was al-Sayya:b. This study discusses the way he employed one of these symbols, Job, the new features he added to this character, and the link between this character and his personal life. In addition, the study examines the influence of al-Sayya:b on another modern poet, Saadi Yusuf, who, following al-Sayya:b, used the character of Job in a special way, mixing its features with al-Sayya:b’s personal features and thereby creating a new, composite character. A semantic, cultural, and comparative analysis of the poems written by al-Sayya:b himself and by the other poets who evoked the mixed image of al-Sayya:b-Job can reveal the changes Arab poets made to the original biblical figure of Job to bring it closer to Islamic culture. The paper makes intensive use of the concepts of intertextuality in order to shed light on the network of relations between three kinds of texts (in effect, three palimpsests: 1- the biblical primary text; 2- al-Sayya:b’s poetic secondary version; 3- Saadi Yusuf’s re-poetic tertiary version). The bottom line of this paper is that al-Sayya:b was influenced directly by the dramatic biblical story of Job more than by the brief Quranic version of the story. In fact, the ‘new’ character of Job designed by al-Sayya:b differs from the original one in so many respects that we can safely say it is a Sayyabian Job, one that cannot be found in the poems of other poets unless they are evoking the tragedy of al-Sayya:b himself, as Saadi Yusuf did.

Keywords: Arabic poetry, intertextuality, job, meter, modernism, symbolism

Procedia PDF Downloads 193
496 A Case Study of Coalface Workers' Attitude towards Occupational Health and Safety Key Performance Indicators

Authors: Gayan Mapitiya

Abstract:

Maintaining good occupational health and safety (OHS) performance is significant at the coalface, especially in industries such as mining, power, and construction. Coalface workers are exposed to high OHS risks in their everyday work, such as working at heights, working with mobile plant and vehicles, working with underground and above-ground services, chemical emissions, radiation hazards, and explosions. To improve the OHS performance of workers, OHS key performance indicators (KPIs) (for example, lost time injuries (LTI), serious injury frequency rate (SIFR), total reportable injury frequency rate (TRIFR), and number of near misses) are widely used by managers in making OHS business decisions such as investing in safety equipment and training programs. However, in many organizations, workers at the coalface hardly see any relevance or value of OHS KPIs to their everyday work. Therefore, the aim of the study was to understand why coalface workers perceive that OHS KPIs are not practically relevant to their jobs. Accordingly, this study was conducted as a qualitative case study focusing on a large electricity and gas firm in Australia. Semi-structured face-to-face interviews were conducted with selected coalface workers to gather data on their attitudes towards OHS KPIs. The findings of the study revealed that workers at the coalface generally have no understanding of the purpose of KPIs, the meaning of each KPI, the origin of KPIs, or how KPIs correlate with organizational performance. Indeed, KPIs are perceived as ‘meaningless obstacles’ imposed on workers by managers without a rationale. It is recommended that a fair number of coalface worker representatives be engaged in both the KPI-setting and revision processes, while maintaining a continuous dialogue between workers and managers regarding OHS KPIs.
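The abstract does not define how the frequency-rate KPIs it names are computed; a common convention, sketched here with made-up figures rather than data from the study, normalizes the incident count per million hours worked:

```python
# Conventional frequency-rate KPI: incidents per million hours worked.
# The figures below are illustrative, not values from the study.
def frequency_rate(incident_count, hours_worked, per_hours=1_000_000):
    return incident_count * per_hours / hours_worked

# e.g. 3 recordable injuries over 1.5 million worked hours
print(frequency_rate(3, 1_500_000))  # 2.0
```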

Keywords: KPIs, coalface, OHS risks, case study

Procedia PDF Downloads 111
495 Depletion Behavior of Potassium by Continuous Cropping Using Rice as a Test Crop

Authors: Rafeza Begum, Mohammad Mokhlesur Rahman, Safikul Moula, Rafiqul Islam

Abstract:

Potassium (K) is crucial for soil health and plant growth. However, K fertilization is either disregarded or underutilized in Bangladeshi agriculture, despite high crop demand. This could eventually result in significant depletion of the soil's potassium reserves, irreversible alteration of potassium-bearing minerals, and detrimental effects on crop productivity. Soil K mining in Bangladesh is a worrying problem that needs to be evaluated thoroughly so that remedies can be found. A pot culture experiment was conducted in the greenhouse of the Bangladesh Institute of Nuclear Agriculture (BINA) using eleven soil series of Bangladesh in order to examine the depletion behaviour of potassium (K) under continuous cropping, using rice (var. Iratom-24) as the test crop. The soil series were Ranishankhail, Kaonia, Sonatala, Silmondi, Gopalpur, Ishurdi, Sara, Kongsha, Nunni, Lauta, and Amnura, on which four successive rice crops (45 days' duration) were raised with (100 ppm K) or without added potassium. Nitrogen, phosphorus, sulfur, and zinc were applied as basal doses to all pots. Potassium application resulted in higher dry matter yield and increased K concentration and uptake in all the soils compared with the no-K treatment; these gradually decreased in the subsequent harvests. Furthermore, plants take up K not only from the exchangeable pool but also from non-exchangeable sites, and a minimum replenishment of K from the soil reserve was observed. Continuous cropping resulted in the depletion of the soil's available K. The results indicate that, in order to sustain higher crop yields under intensive cultivation, the addition of potash fertilizer is necessary.

Keywords: potassium, exchangeable pool, depletion behavior, soil series

Procedia PDF Downloads 122
494 Co-Disposal of Coal Ash with Mine Tailings in Surface Paste Disposal Practices: A Gold Mining Case Study

Authors: M. L. Dinis, M. C. Vila, A. Fiúza, A. Futuro, C. Nunes

Abstract:

The present paper describes the study of paste tailings prepared in the laboratory using gold tailings, produced in a Finnish gold mine, with the incorporation of coal ash. Natural leaching tests were conducted with the original materials (tailings, fly and bottom ashes) and also with paste mixtures prepared with different percentages of tailings and ashes. After leaching, the solid wastes were physically and chemically characterized, and the results were compared to those selected as the blank, the unleached samples. The tailings and the coal ash, as well as the prepared mixtures, were characterized, in addition to the textural parameters, by the following measurements: grain size distribution, chemical composition, and pH. The mixtures were also tested in order to characterize their mechanical behavior by measuring the flexural strength, the compressive strength, and the consistency. The original tailing samples presented an alkaline pH because, during their processing, they had been submitted to pressure oxidation with destruction of the sulfides. Therefore, it was not possible to ascertain the effect of the coal ashes on acid mine drainage. However, it was possible to verify that the paste reactivity was affected mostly by the bottom ash, and that tailings blended with bottom ash present lower mechanical strength than when blended with a combination of fly and bottom ash. Surface paste disposal offers an attractive alternative to traditional methods, in addition to the environmental benefits of incorporating large-volume wastes (e.g., bottom ash). However, a comprehensive characterization of the paste mixtures is crucial to optimizing paste design in order to enhance their engineering and environmental properties.

Keywords: coal ash, mine tailings, paste blends, surface disposal

Procedia PDF Downloads 291
493 Factors That Contribute to Noise Induced Hearing Loss Amongst Employees at the Platinum Mine in Limpopo Province, South Africa

Authors: Livhuwani Muthelo, R. N. Malema, T. M. Mothiba

Abstract:

Long-term exposure to excessive noise in the mining industry increases the risk of noise-induced hearing loss (NIHL), with consequences for employees' health, productivity, and overall quality of life. Objective: The objective of this study was to investigate the factors that contribute to noise-induced hearing loss amongst employees at the platinum mine in the Limpopo Province, South Africa. Study method: A qualitative, phenomenological, exploratory, descriptive, contextual design was applied in order to explore and describe the contributory factors. Purposive non-probability sampling was used to select 10 male employees who had been diagnosed with NIHL in 2014 in four mine shafts, and 10 managers who were involved in a hearing conservation programme. The data were collected using semi-structured one-on-one interviews and analysed following Tesch's qualitative approach. Results: The following themes emerged: experiences and challenges faced by employees in the work environment, hearing protective device factors, and management and leadership factors. Hearing loss was caused by only partial application of the guidelines, policies, and procedures of the Department of Minerals and Energy. Conclusion: The study results indicate that although guidelines, policies, and procedures are available, failure to implement any one element will affect the development and maintenance of employees' hearing. It is recommended that mine management apply the guidelines, policies, and procedures and promptly repair broken hearing protective devices.

Keywords: employees, factors, noise induced hearing loss, noise exposure

Procedia PDF Downloads 120
492 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of this optimization, which is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using both real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
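As a concrete classical reference point for the transition the abstract describes, an autoregressive AR(1) model can be written in state-space (matrix) form and its latent state recovered with a Kalman filter. The sketch below uses illustrative parameters, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a latent AR(1) state x_t = phi * x_{t-1} + w_t and noisy
# observations y_t = x_t + v_t (all parameters are illustrative).
phi, q, r, n = 0.8, 0.5, 1.0, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), n)

def kalman_filter(y, phi, q, r):
    """Scalar Kalman filter for the state-space form of an AR(1) model."""
    m, p = 0.0, 1.0                      # state mean and variance
    means = np.empty(len(y))
    for t, obs in enumerate(y):
        m, p = phi * m, phi**2 * p + q   # predict step
        k = p / (p + r)                  # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p  # update step
        means[t] = m
    return means

est = kalman_filter(y, phi, q, r)
mse_filter = np.mean((est - x) ** 2)
mse_obs = np.mean((y - x) ** 2)
# Filtered estimates track the latent state more closely than raw observations.
print(mse_filter < mse_obs)
```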

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 465
491 In-Situ Determination of Radioactivity Levels and Radiological Hazards in and around the Gold Mine Tailings of the West Rand Area, South Africa

Authors: Paballo M. Moshupya, Tamiru A. Abiye, Ian Korir

Abstract:

Mining and processing of naturally occurring radioactive materials can result in elevated levels of natural radionuclides in the environment. The aim of this study was to evaluate, on a large scale, the radioactivity levels in the West Rand District in South Africa, which is dominated by abandoned gold mine tailings, and the consequential radiological exposures to members of the public. The activity concentrations of ²³⁸U, ²³²Th, and ⁴⁰K in mine tailings, soil, and rocks were assessed using the BGO Super-Spec (RS-230) gamma spectrometer. The measured activity concentrations for ²³⁸U, ²³²Th, and ⁴⁰K in the studied mine tailings were found to range from 209.95 to 2578.68 Bq/kg, 19.49 to 108.00 Bq/kg, and 31.30 to 626.00 Bq/kg, respectively. In surface soils, the overall average activity concentrations were found to be 59.15 Bq/kg, 34.91 Bq/kg, and 245.64 Bq/kg for ²³⁸U, ²³²Th, and ⁴⁰K, respectively. For the rock samples analyzed, the mean activity concentrations were 32.97 Bq/kg, 32.26 Bq/kg, and 351.52 Bq/kg for ²³⁸U, ²³²Th, and ⁴⁰K, respectively. High radioactivity levels were found in mine tailings, with ²³⁸U contributing significantly to the overall activity concentration. The external gamma radiation received from surface soil in the area is generally low, with an average of 0.07 mSv/y. The highest annual effective doses were estimated from the tailings dams, and the levels varied between 0.14 mSv/y and 1.09 mSv/y, with an average of 0.51 mSv/y. In certain locations, the recommended dose constraint of 0.25 mSv/y from a single source to the average member of the public within the exposed population was exceeded, indicating the need for further monitoring and regulatory control measures specific to these areas to ensure the protection of resident members of the public.
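The abstract does not spell out its dose model; one common approach (assumed here, not confirmed by the study) converts soil activity concentrations to an outdoor absorbed dose rate with the UNSCEAR (2000) coefficients and then to an annual effective dose. Plugging in the soil averages quoted above happens to reproduce the reported ~0.07 mSv/y:

```python
# Sketch of the standard outdoor external-dose estimate from soil activity
# concentrations, using UNSCEAR (2000) conversion coefficients. Whether the
# study used exactly this procedure is an assumption.
def annual_effective_dose(a_u238, a_th232, a_k40, occupancy=0.2, conv=0.7):
    """Return (absorbed dose rate in nGy/h, annual effective dose in mSv/y)."""
    # Dose-rate coefficients in nGy/h per Bq/kg for U-238, Th-232 and K-40.
    d = 0.462 * a_u238 + 0.604 * a_th232 + 0.0417 * a_k40
    # 8760 h/y, outdoor occupancy factor, 0.7 Sv/Gy conversion, nGy -> mSv.
    e = d * 8760 * occupancy * conv * 1e-6
    return d, e

# Average surface-soil concentrations quoted in the abstract (Bq/kg).
d, e = annual_effective_dose(59.15, 34.91, 245.64)
print(round(e, 3))  # 0.072, of the order of the reported 0.07 mSv/y
```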

Keywords: activity concentration, gold mine tailings, in-situ gamma spectrometry, radiological exposures

Procedia PDF Downloads 122
490 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis in vitro, in Silico Pharmacokinetic, and Docking Studies

Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid

Abstract:

Five novel triorganotin (IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated, assuming a trigonal-bipyramidal geometry. Using in silico derived parameters, the objective of our study was to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin (IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to assess compound promiscuity, in order to reduce the drug attrition rate in designing antibacterials. Xanthine oxidase and human glucose-6-phosphatase were the only true-positive off-target hits identified by the ChEMBL database and others utilizing the similarity ensemble approach. Propensity towards the a-3 receptor, human macrophage migration factor, and thiazolidinedione were found to be false-positive off-targets, with E-value > 10⁻⁴ for compounds 1, 3, and 4. Further, the positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and by docked protein targets with sequence similarity and compositional matrix alignment via BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets.

Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance

Procedia PDF Downloads 500
489 Cadaveric Assessment of Kidney Dimensions Among Nigerians - A Preliminary Report

Authors: Rotimi Sunday Ajani, Omowumi Femi-Akinlosotu

Abstract:

Background: The usually paired human kidneys are retroperitoneal urinary organs with some endocrine functions. Standard anatomy textbooks ascribe a single value to each of the dimensions of length, width, and thickness. Research question: These values take no account of racial and genetic variability in human morphology and may thus mislead students and clinicians working with Nigerians. Objectives: The study aimed to establish reference values of kidney length, width, and thickness for Nigerians using the cadaveric model. Methodology: The length, width, thickness, and weight of sixty kidneys harvested from the cadavers of thirty adult Nigerians (male:female, 27:3) were measured, and the respective volumes were calculated using the ellipsoid formula. Results: The mean length of the kidney was 9.84±0.89 cm (9.63±0.88 right; 10.06±0.86 left), width 5.18±0.70 cm (5.21±0.72 right; 5.14±0.70 left), thickness 3.45±0.56 cm (3.36±0.58 right; 3.53±0.55 left), weight 125.06±22.34 g (122.36±21.70 right; 127.76±24.02 left), and volume 95.45±24.40 cm³ (91.73±26.84 right; 99.17±25.75 left). Discussion: Although the values of the measured parameters were higher for the left kidney (except for width), the differences were not statistically significant. The various parameters obtained in this study differ from those of similar studies from other continents. Conclusion: Stating a single value for each of the parameters of kidney length, width, and thickness, as anatomy textbooks currently do, may be incomplete and hence misleading. There is thus a need to emphasize racial differences when stating normal kidney dimensions in anatomy textbooks. Implication for research and innovation: The results show that kidney dimensions (length, width, and thickness) vary between populations, differing from those of similar studies and from the values stated in standard textbooks of human anatomy.
Future direction: This is a preliminary report, and the study will continue so that more data can be obtained.
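The ellipsoid formula referred to in the methodology is conventionally V = (π/6)·L·W·T. Applying it to the mean dimensions quoted above gives a figure close to, but not identical with, the reported mean volume, since the study averaged per-kidney volumes rather than applying the formula to averaged dimensions:

```python
import math

# Ellipsoid approximation of kidney volume: V = (pi/6) * L * W * T,
# applied to the mean dimensions reported in the abstract (cm).
def ellipsoid_volume(length, width, thickness):
    return math.pi / 6 * length * width * thickness

v = ellipsoid_volume(9.84, 5.18, 3.45)
print(round(v, 1))  # 92.1 cm^3, versus the reported mean of 95.45 cm^3
```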

Keywords: kidney dimensions, cadaveric estimation, adult Nigerians, racial differences

Procedia PDF Downloads 92
488 Investigation of Wood Chips as Internal Carbon Source Supporting Denitrification Process in Domestic Wastewater Treatment

Authors: Ruth Lorivi, Jianzheng Li, John J. Ambuchi, Kaiwen Deng

Abstract:

Nitrogen removal from wastewater is accomplished by nitrification and denitrification processes. Successful denitrification requires carbon; therefore, if denitrification is placed after biochemical oxygen demand (BOD) removal and nitrification, a carbon source has to be re-introduced into the water. To avoid adding an external carbon source, denitrification is usually placed before the BOD and nitrification processes, but this configuration requires recycling the nitrified effluent. In this study, wood chips were used as an internal carbon source, which enabled placement of denitrification after the BOD and nitrification processes without effluent recycling. To investigate the efficiency of a wood-packed aerobic-anaerobic baffled reactor in removing carbon and nutrients from domestic wastewater, a three-compartment baffled reactor was used. Each of the three compartments was packed with 329 g of 1x1 cm wood chips acting as an internal carbon source for denitrification. The proposed mode of operation was aerobic-anoxic-anaerobic (OAA) with no effluent recycling. The operating temperature, hydraulic retention time (HRT), dissolved oxygen (DO), and pH were 24 ± 2 °C, 24 h, less than 4 mg/L, and 7 ± 1, respectively. The removal efficiencies attained for chemical oxygen demand (COD), ammonia nitrogen (NH₄⁺-N), and total nitrogen (TN) were 99, 87, and 83%, respectively. TN removal was limited by nitrification, as 97% of the ammonia converted into nitrate and nitrite was denitrified. These results show that wood chips are an efficient internal carbon source for wastewater treatment processes.
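Removal efficiency as quoted in such studies follows the standard (influent − effluent)/influent ratio; the concentrations below are illustrative placeholders, not measured values from this study:

```python
# Standard removal-efficiency calculation for a treated wastewater stream.
# The influent/effluent concentrations used here are illustrative only.
def removal_efficiency(influent_mg_l, effluent_mg_l):
    return 100 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# e.g. a COD drop from 400 mg/L to 4 mg/L corresponds to the ~99% reported
print(round(removal_efficiency(400, 4)))  # 99
```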

Keywords: aerobic-anaerobic baffled reactor, denitrification, nitrification, wood chip

Procedia PDF Downloads 293
487 Spirituality Enhanced with Cognitive-Behavioural Techniques: An Effective Method for Women with Extramarital Infidelity: A Literature Review

Authors: Setareh Yousife

Abstract:

Introduction: Studies suggest that extramarital infidelity (EMI) variants, such as sexual and emotional infidelity, are increasing in marriage relationships. To our knowledge, less is known about which therapies and mental-hygiene factors can most effectively prevent and address this behavior. Spiritual and cognitive-behavioural health approaches have been shown to reduce marital conflict and increase marital satisfaction and commitment. Objective: This study aims to discuss the effectiveness of spiritual counseling combined with cognitive-behavioural techniques in addressing extramarital infidelity. Method: Descriptive, analytical, and intervention articles indexed in the SID, Noormags, Scopus, Iranmedex, Web of Science, and PubMed databases, and in Google Scholar, were searched. We focused on studies of women with extramarital relationships, including studies restricted to heterosexual married couples, in which spirituality/religion and CBT were used as coping techniques for EMI therapy. Finally, the full text of all eligible articles was obtained and discussed in this review. Results: 25 publications were identified, and their textual analysis was facilitated through four thematic approaches: the nature of EMI in women; the meaning of spirituality in the context of mental health, human behavior, and psychotherapy; spirituality integrated into the cognitive-behavioural approach; and the role of spirituality as a deterrent to EMI. Conclusions: The integration of the findings discussed herein suggests that the application of cognitive and behavioral skills in addressing these destructive family relationships is indispensable. As treatments based on religion/spirituality or cognition/behavior alone do not seem adequately effective in dealing with EMI, the combination of these approaches may achieve higher efficacy in fewer sessions and a shorter time.

Keywords: spirituality, religion, cognitive behavioral therapy, extramarital relation, infidelity

Procedia PDF Downloads 252
486 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Authors: Chen Chou, Feng-Tyan Lin

Abstract:

Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC), one of the Intelligent Transportation System (ITS) applications in Taiwan, is used to record the starting point, end point, distance, and travel time of vehicles on the national freeway. This study takes advantage of ETC big data, combined with urban planning theory, to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are numerous, complete, and frequently updated. Traditionally, a living area has been delimited by location, population, area, and subjective consciousness; however, these factors cannot appropriately reflect people's actual movement paths in daily life. In this study, the concept of 'living area' is replaced by 'influence range' to capture the dynamics and variation of activities over time and by purpose. The study uses data mining with Python and Excel, and visualizes trip counts with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It puts the concepts of 'central place theory' and 'living area' into dialogue, presents a new point of view, and integrates the application of big data, urban planning, and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
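The kind of trip aggregation described can be sketched with pandas; the column names and values below are hypothetical placeholders, since the real ETC open data records gantry-level start and end points rather than city names:

```python
import pandas as pd

# Hypothetical ETC-style trip records: origin, destination, travel time.
trips = pd.DataFrame({
    "origin":      ["Tainan", "Tainan", "Kaohsiung", "Tainan", "Chiayi"],
    "destination": ["Kaohsiung", "Chiayi", "Tainan", "Kaohsiung", "Tainan"],
    "travel_min":  [38, 45, 40, 36, 47],
})

# Trip counts per origin-destination pair approximate the "influence range":
# which cities people actually move between, rather than static boundaries.
counts = (trips.groupby(["origin", "destination"])
               .size()
               .sort_values(ascending=False))
print(counts.iloc[0])  # 2: the most frequent O-D pair occurs twice
```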

Keywords: big data, ITS, influence range, living area, central place theory, visualization

Procedia PDF Downloads 276
485 Context and Culture in EFL Learners' and Native Speakers' Discourses

Authors: Emad A. S. Abu-Ayyash

Abstract:

Cohesive devices, the linguistic tools usually employed to hold the different parts of a text together, have been the focus of a significant number of discourse analysis studies. These linguistic tools have attracted researchers' attention since the inception of the first and most comprehensive model of cohesion in 1976. However, some cohesive devices (e.g., endophoric reference, conjunctions, ellipsis, substitution, and lexical ties), being regarded as more prominent than others (e.g., exophoric reference), have been over-researched. The present paper explores the usage of two cohesive devices that have been almost entirely absent from discourse analysis studies: exophoric and homophoric reference, linguistic items that are interpreted in terms of the physical and cultural contexts of discourse. The significance of the current paper therefore stems from the fact that it attempts to fill a gap in the research conducted so far on cohesive devices. This study explains the concepts of the cohesive devices employed in the existing research on cohesion, elucidates the relevant context-related concepts, and identifies the gap in cohesive devices research. Exophora and homophora, the least studied cohesive devices, were qualitatively and quantitatively explored in six opinion articles: four produced by eight postgraduate English as a Foreign Language (EFL) students at a university in the United Arab Emirates and two by professional native-speaker (NS) writers in the Independent and the Guardian. The six pieces, which responded to the UK Independence Party (UKIP) leader's call to ban the burqa in the UK, were analysed vis-à-vis the employment and function of homophora and exophora.
The study found that both the EFL students and the native speakers employed exophora and homophora considerably in their writing to serve a variety of functions, including building assumptions, supporting main ideas, and involving the readers, among others.

Keywords: cohesive devices, context, culture, exophoric reference, homophoric reference

Procedia PDF Downloads 120