Search results for: probabilistic classification vector machines
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3863


503 Case Presentation Ectopic Cushing's Syndrome Secondary to Thymic Neuroendocrine Tumors Secreting ACTH

Authors: Hasan Frookh Jamal

Abstract:

This is a case of a 36-year-old Bahraini gentleman diagnosed with Cushing's Syndrome and a large anterior mediastinal mass. He was sent abroad to the Speciality Hospital in Jordan, where he underwent diagnostic video-assisted thoracoscopy, partial thymectomy and pericardial fat excision. Histopathology of the mass was reported as an atypical carcinoid tumor with a low Ki67 proliferation index of 5%, mitotic activity of 4 MF/10 HPF and pathological stage classification (pTNM) pT1aN1. MRI of the pituitary gland showed an ill-defined, non-enhancing focus of about 3 mm on the right side of the pituitary on coronal images, with a similar but smaller one on the left side, which could be due to the enhancement pattern rather than a real lesion, as reported. The patient underwent a Ga-68 DOTATATE PET/CT scan post-operatively, which showed multiple somatostatin receptor-positive lesions within the tail, body and head of the pancreas and somatostatin receptor-positive lymph nodes located between the pancreatic head and the IVC. No uptake was detected at the anterior mediastinum or at the site of thymic mass resection, and there was no evidence of positive somatostatin uptake in soft tissue or lymph nodes elsewhere. The patient underwent IPSS, which confirmed that the source is, in fact, an ectopic source of ACTH secretion. Unfortunately, the patient's serum cortisol remained elevated after surgery and failed to be suppressed by the 1 mg overnight dexamethasone suppression test and by the 2-day low-dose dexamethasone suppression test, with a high ACTH value. The patient was started on Osilodrostat for treatment of hypercortisolism for the time being, and his future treatment plan, Lutetium-177 DOTATATE therapy vs. bilateral adrenalectomy, is to be considered in an MDT meeting.

Keywords: cushing syndrome, neuroendocrine tumor, carcinoid tumor, thymoma

Procedia PDF Downloads 81
502 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with proper treatment. Drug-resistant TB arises when the bacteria become resistant to the drugs used to treat the disease. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science can offer new approaches to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates within minutes so that the patient can be given the right treatment immediately. The study used whole genome sequences (WGS) of TB isolates, extracted from the NCBI repository, as training data. Samples from different countries were included in order to generalize over a large group of TB isolates from different regions of the world; this exposes the model to different behaviors of the TB bacteria and makes it robust. Model training considered three pieces of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and resistance-associated gene information for the particular drug. Two major datasets were constructed from these three types of information: F1 and F2 were treated as two independent datasets, and the third type of information was used as the class label for both. Five machine learning algorithms were considered: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost.
The models were trained on the datasets F1, F2, and F1F2, the latter being F1 and F2 merged. Additionally, an ensemble approach was used: F1 and F2 were each run through the Gradient Boosting algorithm, the outputs were combined into a single dataset, called the F1F2 ensemble dataset, and models were trained on this dataset with the five algorithms. As the experiments show, the ensemble model trained with the Gradient Boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient Boosting model, for predicting the antibiotic resistance phenotypes of TB isolates, as it outperformed the rest of the models.
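The two-stage ensemble described above can be sketched in plain Python. This is only an illustrative sketch, not the authors' pipeline: base_model_score is a trivial stand-in for a trained gradient-boosting model, the meta-step is a simple average-and-threshold rule, and all function names, weights and feature values are hypothetical.

```python
def base_model_score(features, weights):
    """Stand-in for a trained base model (e.g. gradient boosting):
    a weighted sum squashed to [0, 1] with a logistic function."""
    s = sum(f * w for f, w in zip(features, weights))
    return 1.0 / (1.0 + 2.718281828 ** (-s))

def stacked_predict(f1_features, f2_features, w1, w2, meta_threshold=0.5):
    """Stage 1: score the F1 and F2 feature sets separately.
    Stage 2: combine the two scores (the 'F1F2 ensemble' row for this
    isolate) and threshold to a resistant (1) / susceptible (0) call."""
    s1 = base_model_score(f1_features, w1)
    s2 = base_model_score(f2_features, w2)
    meta_input = (s1 + s2) / 2.0  # minimal meta-learner: plain average
    return 1 if meta_input >= meta_threshold else 0
```

In a real pipeline, the stage-2 learner would itself be trained on the base models' out-of-fold predictions rather than fixed to an average.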

Keywords: machine learning, MTB, WGS, drug-resistant TB

Procedia PDF Downloads 48
501 Understanding the Semantic Network of Tourism Studies in Taiwan by Using Bibliometrics Analysis

Authors: Chun-Min Lin, Yuh-Jen Wu, Ching-Ting Chung

Abstract:

The formulation of tourism policies requires objective academic research and evidence as support, especially research from local academia. Taiwan is a small island whose economic growth relies heavily on tourism revenue, and the Taiwanese government has devoted itself to the promotion of the tourism industry over the past few decades. Scientific research outcomes by Taiwanese scholars can help lay the foundations for drafting future tourism policy. In this study, a total of 120 full journal articles published between 2008 and 2016 in the Journal of Tourism and Leisure Studies (JTSL) were examined to explore the trend of tourism research in Taiwan. JTSL is one of the most important Taiwanese journals in the tourism discipline; it focuses on tourism-related issues and uses traditional Chinese as the publication language. The co-word analysis method from bibliometrics was employed for semantic analysis. When analyzing Chinese words and phrases, word segmentation is a crucial step: it must be carried out first, and precisely, in order to obtain meaningful words or word chunks for subsequent frequency calculation. A word segmentation system based on an N-gram algorithm was developed in this study, and the 100 groups of meaningful phrases with the highest recurrence rates were identified. Subsequently, co-word analysis was employed for semantic classification. The results showed that the themes of recent tourism research in Taiwan cover tourism education, environmental protection, hotel management, information technology, and senior tourism. The results give insight into the related issues and can serve as a reference for tourism-related policy making and follow-up research.
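The N-gram segmentation and frequency-counting step can be illustrated with a small sketch. This is a generic illustration, not the authors' system (which worked on traditional Chinese text); the function names are hypothetical.

```python
from collections import Counter

def char_ngrams(text, n):
    """All overlapping character n-grams of a string, whitespace removed."""
    s = "".join(text.split())
    return [s[i:i + n] for i in range(len(s) - n + 1)]

def top_chunks(corpus, n, k):
    """The k most frequent n-grams across a corpus of documents --
    candidate 'word chunks' for subsequent co-word analysis."""
    counts = Counter()
    for doc in corpus:
        counts.update(char_ngrams(doc, n))
    return counts.most_common(k)
```

A real system would run this for several values of n and filter the candidates against linguistic criteria before treating them as words.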

Keywords: bibliometrics, co-word analysis, word segmentation, tourism research, policy

Procedia PDF Downloads 227
500 In situ Grazing Incidence Small Angle X-Ray Scattering Study of Permalloy Thin Film Growth on Nanorippled Si

Authors: Sarathlal Koyiloth Vayalil, Stephan V. Roth, Gonzalo Santoro, Peng Zhang, Matthias Schwartzkopf, Bjoern Beyersdorff

Abstract:

Nanostructured magnetic thin films have gained significant relevance due to their applications in magnetic storage and recording media. Self-organized arrays of nanoparticles and nanowires can be produced by depositing metal thin films on nano-rippled substrates. The substrate topography strongly affects the film growth, giving rise to anisotropic properties (optical, magnetic, electronic transport). The ion-beam erosion (IBE) method can provide large-area patterned substrates with the valuable possibility of widely modifying the pattern length scale simply by adjusting the ion beam parameters (energy, ion species, geometry, etc.). In this work, the growth mechanism of Permalloy thin films on such nano-rippled Si (100) substrates was investigated using in situ grazing incidence small angle x-ray scattering (GISAXS). In situ GISAXS measurements during thin film deposition were carried out at the P03/MiNaXS beamline of the PETRA III storage ring at DESY, Hamburg. Nanorippled Si substrates prepared by low-energy ion beam sputtering, with an average ripple wavelength of 33 nm and an amplitude of 1 nm, were used as templates. It was found that the film replicates the substrate morphology up to large thickness regimes and that the growth is highly anisotropic along and normal to the ripple wave vector. Various growth regimes were observed. Further, magnetic measurements were performed using the magneto-optical Kerr effect while rotating the sample in the azimuthal direction. Strong uniaxial magnetic anisotropy, with its easy axis normal to the ripple wave vector, was observed. The strength of the magnetic anisotropy decreases with increasing film thickness. The mechanism of the observed strong uniaxial magnetic anisotropy and its dependence on film thickness is explained by correlating it with the GISAXS results.
In conclusion, we have carried out a detailed growth analysis of Permalloy thin films deposited on nanorippled Si templates and explained the correlation between structure and morphology on the one hand and the observed magnetic properties on the other.

Keywords: grazing incidence small angle x-ray scattering, magnetic thin films, magnetic anisotropy, nanoripples

Procedia PDF Downloads 309
499 The Nimbārka School of Vedānta and the Indian Classical Dance: The Philosophical Relevance through Rasa Theory

Authors: Shubham Arora

Abstract:

This paper illustrates the relationship between the Dvaitādvaita (dualistic non-dualistic) doctrine of the Nimbārka school of Vedānta and the philosophy of Indian classical dance, through the Rasa theory. There is a separate focus on the philosophies of both disciplines, followed by an analysis of Rasa theory as a connexion between them. The paper presents ideas regarding the similarity between Brahman and the dancer, the manifestation of the enacted character and the Jīva (soul), the existence of the phenomenal world and the imaginary world, the classification of rasa on the basis of the three modes of nature, and the feelings and expressions depicting Dvaita and Advaita. The reason behind choosing such a topic is the intention to explore the real-world relevance of the Vedantic philosophy of this school. It is important to study the practical implications and relevance of the doctrine with respect to other disciplines in order to perceive it cogently. In our daily lives, we use various forms of facial expressions and bodily gestures to communicate, along with oral and written means of communication. What happens when gestures and expressions mingle with musical beats in order to present an idea? Indian classical dance is highly rich in expressing emotions, using extraordinary expressions, unconventional bodily gestures and mesmerizing musical beats. Ancient scriptures like the Nāṭyaśāstra of Bharata Muni and the Abhinava Bhāratī of Abhinavagupta recount aesthetics within a well-defined and structured account of acting and dancing and also set out the grammar of rasa theory. Indian classical dance is not only for entertainment; it is deeply in contact with divinity. During the period of the Bhakti movement in India, this art form was used as a means to narrate vignettes from epics like the Rāmāyana and the Mahābhārata and from the Purānas. Even in the present era, this art has a deep-rooted philosophy within it.

Keywords: Advaita, Brahman, Dvaita, Jiva, Nimbarka, Rasa, Vedanta

Procedia PDF Downloads 298
498 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine

Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer

Abstract:

Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for a cold call, specifically in a studio setup where students work on different projects independently and present work in progress at scheduled critiques. Cold calling often proves to be an effective tool for eliciting a response without imposing judgment on the recipients. While students who are cold-called exhibit a mixed range of behavior, with responses ranging from anxiety to inspiration, a greater understanding is needed of how to use these exchanges to bring about fruitful and engaging studio discussions. This study aims to unravel the dimensions of the cold-call approach in didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an arts and design school. The impact of cold calling on students' participation was assessed through various parameters, including course choice, participation frequency, students' comfort, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative faculty perspective. It was concluded that cold calling increases students' participation frequency and their preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and students' participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. The findings suggest that cold calling can be used extensively without making students uncomfortable. As a result, this study supports the use of this instructional method to encourage more students to participate in class discussions.

Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement

Procedia PDF Downloads 34
497 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with lung cancer (LC) have an average life expectancy of about five years, provided there is timely diagnosis, detection and prediction, which reduces the reliance on risky invasive surgery among the treatment options and increases the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter together with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE), which gives the best results, is used for image enhancement. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) determines whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) identifies the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
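As one concrete piece of the texture step, a GLCM for a single pixel offset can be computed as below. This is a minimal plain-Python sketch, not the paper's implementation (real work would typically use a library routine such as scikit-image's graycomatrix), and the contrast feature shown is just one of the GLCM features in common use.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray-Level Co-occurrence Matrix for one offset: m[i][j] counts
    pixel pairs where image[r][c] == i and image[r+dy][c+dx] == j."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows - dy):
        for c in range(cols - dx):
            m[image[r][c]][image[r + dy][c + dx]] += 1
    return m

def glcm_contrast(m):
    """Contrast feature: (i - j)^2 weighted by the normalized counts."""
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * m[i][j] / total
               for i in range(len(m)) for j in range(len(m)))
```

Other GLCM features (energy, homogeneity, correlation) are different weightings of the same matrix.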

Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI

Procedia PDF Downloads 153
496 Impacts of Urbanization on Forest and Agriculture Areas in Savannakhet Province, Lao People's Democratic Republic

Authors: Chittana Phompila

Abstract:

The current population increase drives growing demand for natural resources and living space. In Laos, urban areas have been expanding rapidly in recent years, and rapid urbanization can have negative impacts on landscapes, including forest and agricultural lands. The primary objectives of this research were 1) to map current urban areas in a large city in Savannakhet province, Laos, 2) to compare changes in urbanization between 1990 and 2018, and 3) to estimate the forest and agricultural areas lost due to the expansion of urban areas over the last twenty-plus years within the study area. Landsat 8 data was used, and existing GIS data was collected, including spatial data on rivers, lakes, roads, vegetated areas and other land uses/land covers, obtained from government sectors. An object-based classification (OBC) approach was applied in eCognition for image processing and analysis of the urban area. Historical data from other Landsat instruments (Landsat 5 and 7) allowed us to compare changes in urbanization across 1990, 2000, 2010 and 2018 in the study area. Only three main land cover classes were considered and classified, namely forest, agriculture and urban areas. A change detection approach was applied to illustrate the changes in built-up areas over these periods. Our study shows that the overall accuracy of the map was assessed at 95% (kappa ≈ 0.8). We found that there is ineffective control over forest and land-use conversion from forest and agriculture to urban areas in many main cities across the province, and a large area of agriculture and forest has been lost to this conversion. Uncontrolled urban expansion and inappropriate land use planning put pressure on resource utilisation and, as a consequence, can lead to food insecurity and a national economic downturn in the long term.
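The change-detection step amounts to cross-tabulating per-pixel classes between two dates. A minimal sketch, with the maps flattened to lists of labels and the class names matching the three classes above; this is an illustration, not the study's workflow:

```python
from collections import Counter

def change_matrix(before, after, classes=("forest", "agriculture", "urban")):
    """Cross-tabulate per-pixel class transitions between two classified
    maps, both given as flat lists of class labels for the same pixels."""
    counts = Counter(zip(before, after))
    return {(a, b): counts.get((a, b), 0) for a in classes for b in classes}
```

Off-diagonal entries such as ("forest", "urban") are the conversions of interest; multiplying their counts by the pixel area gives the hectares lost.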

Keywords: urbanisation, forest cover, agriculture areas, Landsat 8 imagery

Procedia PDF Downloads 157
495 Disclosure Extension of Oil and Gas Reserve Quantum

Authors: Ali Alsawayeh, Ibrahim Eldanfour

Abstract:

This paper examines the extent of disclosure of oil and gas reserve quantum in the annual reports of international oil and gas exploration and production companies, particularly companies in untested international markets such as Canada, the UK and the US, and seeks to determine the underlying factors that affect the level of disclosure of oil reserve quantum. The study is concerned with the usefulness of disclosure of oil and gas reserve quantum to investors and other users. Given the primacy of the annual report (10-K) as a source of supplemental reserves data about the company and as the channel through which companies disseminate information about their performance, the annual reports for one year (2009) were the central focus of the study. This comparative study seeks to establish whether differences exist between the sample companies, based on the new disclosure requirements of the Securities and Exchange Commission (SEC) with respect to reserves classification and definition. The extent of reserve disclosure is documented and compared among the selected companies, and statistical analysis is performed to determine whether any differences exist in the extent of disclosure under the determinant variables. The study shows that several factors affect the extent of disclosure of reserve quantum in the above-mentioned countries, namely company size, leverage and auditor quality. Companies that disclose reserve quantum in detail tend to be larger, the level of leverage affects companies' reserve quantum disclosure, and companies that provide detailed reserve quantum disclosure tend to employ a high-quality auditor. In addition, the study found a significant independent variable, Profit Sharing Contracts (PSC); this factor could explain variations in the level of disclosure of oil reserve quantum between contractors and host governments.
The implementation of the SEC oil and gas reporting requirements does not enhance companies' valuation, because the new rules are based only on past and present reserves information (proven reserves); hence, the future valuation of oil and gas companies is missing for the market.

Keywords: comparison, company characteristics, disclosure, reserve quantum, regulation

Procedia PDF Downloads 404
494 Balanced Scorecard (BSC) Project: A Methodological Proposal for Decision Support in a Corporate Scenario

Authors: David de Oliveira Costa, Miguel Ângelo Lellis Moreira, Carlos Francisco Simões Gomes, Daniel Augusto de Moura Pereira, Marcos dos Santos

Abstract:

Strategic management is a fundamental process for global companies that intend to remain competitive in an increasingly dynamic and complex market. To do so, it is necessary to maintain alignment with their principles and values. The Balanced Scorecard (BSC) proposes to ensure that overall business performance is assessed from different perspectives (financial, customer, internal processes, and learning and growth). However, relying solely on the BSC may not be enough to ensure the success of strategic management. It is essential that companies also evaluate and prioritize the strategic projects that need to be implemented, to ensure they are aligned with the business vision and contribute to achieving the established goals and objectives. In this context, the proposal incorporates the SAPEVO-M multicriteria method to indicate the degree of relevance of the different perspectives, so that the strategic objectives linked to the more relevant perspectives carry greater weight in the classification of structural projects. Additionally, the concept of the Impact & Probability Matrix (I&PM) is applied to structure the evaluation and ensure that strategic projects are assessed according to their relevance and impact on the business. Structuring the business's strategic management in this way secures the alignment and prioritization of the projects and actions related to strategic planning, so that resources are directed towards the most relevant and impactful initiatives. The objective of this article is therefore to present a proposal integrating the BSC methodology, the SAPEVO-M multicriteria method, and the prioritization matrix to establish a concrete weighting of strategic planning and to achieve coherence in defining strategic projects aligned with the business vision, ensuring a robust decision-support process.
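A minimal sketch of the combined scoring idea follows. The weights and projects are hypothetical, and SAPEVO-M itself derives perspective weights from pairwise ordinal comparisons; here they are simply given as inputs.

```python
def priority_score(impact, probability, perspective_weight):
    """Impact & Probability score, scaled by the weight of the BSC
    perspective that the project's strategic objective belongs to."""
    return impact * probability * perspective_weight

def rank_projects(projects):
    """Sort (name, impact, probability, weight) tuples by descending
    priority score."""
    return sorted(projects, key=lambda p: priority_score(p[1], p[2], p[3]),
                  reverse=True)
```

The ranking is what feeds the portfolio decision: projects at the top of the list are funded first.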

Keywords: MCDA process, prioritization problematic, corporate strategy, multicriteria method

Procedia PDF Downloads 80
493 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data

Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos

Abstract:

Introduction: Despite the highest share of low birth weight (LBW) for neonatal mortality and morbidity, predicting births with LBW for better intervention preparation is challenging. This study aims to predict LBW using a dataset encompassing 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, including maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of livebirths, number of stillbirths, antenatal care frequency, and sex of the fetus to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, outlier detection, and ensuring data integrity in birth weight records. Model performance was evaluated through metrics such as accuracy, precision, recall, F1-score, and area under the Receiver Operating Characteristic curve (ROC AUC) using 10-fold cross-validation. Results: The results demonstrated that the decision tree, J48, logistic regression, and gradient boosted trees model achieved the highest accuracy (94.5% to 94.6%) with a precision of 93.1% to 93.3%, F1-score of 92.7% to 93.1%, and ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall rates achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions to achieve the sustainable developmental goal (SDG) related to neonatal mortality.
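The evaluation metrics reported above reduce to simple counts over the binary confusion matrix, and 10-fold cross-validation reduces to partitioning the record indices. A minimal sketch (the study itself used WEKA's built-in evaluation; labels here are 1 = LBW, 0 = normal):

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 from binary labels (1 = LBW)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return acc, prec, rec, f1

def kfold_indices(n, k):
    """Index lists for k-fold cross-validation (contiguous folds;
    the last fold absorbs the remainder)."""
    fold = n // k
    return [list(range(i * fold, (i + 1) * fold if i < k - 1 else n))
            for i in range(k)]
```

In cross-validation, each fold serves once as the test set while the model is trained on the rest, and the metrics are averaged over the k runs.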

Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia

Procedia PDF Downloads 20
492 Application of Vector Representation for Revealing the Richness of Meaning of Facial Expressions

Authors: Carmel Sofer, Dan Vilenchik, Ron Dotsch, Galia Avidan

Abstract:

Studies investigating emotional facial expressions typically reveal consensus among observers regarding the meaning of basic expressions, whose number ranges from 6 to 15 emotional states. Given this limited number of discrete expressions, how is it that the human vocabulary of emotional states is so rich? The present study argues that perceivers use sequences of these discrete expressions as the basis for a much richer vocabulary of emotional states. Mechanisms in which a relatively small number of basic components is expanded into a much larger number of possible combinations of meanings exist in other human communication modalities, such as spoken language and music: letters and notes, the basic components of spoken language and music respectively, are temporally linked, giving rise to the richness of expression. In the current study, participants were presented on each trial with a sequence of two images containing facial expressions, in different combinations sampled from the eight static basic expressions (64 in total; 8×8). On each trial, participants were required to judge, using a single word, the 'state of mind' portrayed by the person whose face was presented. Utilizing word embedding methods (Global Vectors for Word Representation, GloVe) from the field of Natural Language Processing, and relying on machine learning computational methods, it was found that the perceived meaning of a sequence of facial expressions was a weighted average of the single expressions comprising it, yielding 22 new emotional states in addition to the eight classic basic expressions. An interaction between the first and the second expression in each sequence indicated that each single facial expression modulated the effect of the other, leading to a different interpretation ascribed to the sequence as a whole.
These findings suggest that the vocabulary of emotional states conveyed by facial expressions is not restricted to the small number of discrete facial expressions; rather, the vocabulary is rich, as it results from combinations of these expressions. In addition, the present research suggests that word embedding can be a powerful, accurate and efficient tool in social perception studies for capturing explicit and implicit perceptions and intentions. Acknowledgment: The study was supported by a grant from the Ministry of Defense in Israel to GA and CS. CS is also supported by the ABC initiative at Ben-Gurion University of the Negev.
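The weighted-average finding can be sketched with toy vectors. The study used pretrained GloVe embeddings of the participants' response words; the two-dimensional vectors and weights below are purely illustrative.

```python
def weighted_average(v1, v2, w1=0.5, w2=0.5):
    """Weighted average of two embedding vectors -- the paper's model of
    how a two-expression sequence maps to one perceived-meaning vector."""
    return [w1 * a + w2 * b for a, b in zip(v1, v2)]

def cosine(u, v):
    """Cosine similarity, the usual closeness measure in embedding space."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)
```

In the study's terms, the averaged vector is compared (e.g. by cosine similarity) against the vectors of single-expression responses to see whether the sequence acquired a distinct meaning.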

Keywords: GloVe, face perception, facial expression perception, facial expression production, machine learning, word embedding, word2vec

Procedia PDF Downloads 175
491 The Types of Annuities with Flexible Premium

Authors: Deniz Ünal Özpalamutcu, Burcu Altman

Abstract:

Actuarial science uses mathematics, statistics and financial information when analyzing the financial impacts of uncertainties, risks, insurance and pension-related issues. In other words, it deals with the likelihood of potential risks, their financial impacts and, especially, the financial measures against them. Handling these measures requires long-term payments and investments, so it is inevitable to plan periodic payments at equal time intervals while also considering the changing value of money over time. A series of payments made at specific intervals of time is called an annuity (rant). In the literature, rants are classified based on start and end dates, start times, payment times, and payment amount or frequency. The classification of rants by payment amount distinguishes constant, descending and ascending payment methods. The literature on handling annuities is very limited, yet in daily life, especially in today's world where economic issues have gained prominence, it is crucial to use the variable annuity method in line with the demands of customers. In this study, the types of annuities with flexible payment are discussed. In other words, we focus on calculating each period's payment by adding a certain percentage to the previous period's payment. In studying this problem, formulas were created for both cash (present) value and accumulated value, considering payments made at the start and at the end of each period. The problem of annuities (rants) in which each period's payment increases over the previous period's payment at an interest rate r has been analyzed, and the cash value and accumulated value were computed separately for period-start and period-end payments, with their relations expressed by formulas.
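The geometric-increase case the paper analyzes can be sketched numerically as follows. This is a generic illustration, not the paper's own closed-form formulas: g is the growth rate of the payments, i the discount rate, and due=True places each payment at the start of its period rather than the end.

```python
def pv_increasing_annuity(first_payment, g, i, n, due=False):
    """Present (cash) value of an n-period annuity whose payment grows
    by rate g each period, discounted at rate i. Payment t (0-based) is
    first_payment * (1 + g)**t, received at the end of period t + 1;
    due=True shifts every payment one period earlier (annuity-due)."""
    pv = sum(first_payment * (1 + g) ** t / (1 + i) ** (t + 1)
             for t in range(n))
    return pv * (1 + i) if due else pv
```

A useful sanity check: when g equals i, growth and discounting cancel, so the present value is simply n payments of first_payment discounted one period (or undiscounted, for the due case).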

Keywords: actuaria, annuity, flexible payment, rant

Procedia PDF Downloads 219
490 Analysis of Influencing Factors on Infield-Logistics: A Survey of Different Farm Types in Germany

Authors: Michael Mederle, Heinz Bernhardt

Abstract:

The management of machine fleets and autonomous vehicle control will considerably increase efficiency in future agricultural production. Entire process chains in particular, e.g. harvesting complexes with several interacting combine harvesters, grain carts, and removal trucks, offer a lot of optimization potential, and organization and pre-planning make these efficiency reserves accessible. One way to achieve this is to optimize infield path planning. Autonomous machinery especially requires precise specifications about infield logistics in order to be navigated effectively and process-optimized in the fields, individually or in machine complexes. In the past, a lot of theoretical optimization has been done regarding infield logistics, mainly based on field geometry. However, there are reasons why farmers often do not apply the infield strategy suggested by mathematical route planning tools. To make the computational optimization more useful for farmers, this study focuses on these influencing factors through expert interviews, so that practice-oriented navigation becomes possible not only to the field but also within the field. The survey study is intended to cover the entire range of German agriculture: rural mixed farms with simple technical equipment are considered, as well as large agricultural cooperatives that farm thousands of hectares using track guidance and various other electronic assistance systems. First results show that farm managers using guidance systems increasingly align their infield logistics with direction-giving obstacles such as power lines; in consequence, they can avoid inefficient boom flipping while applying plant protection with the sprayer. Livestock farmers rather focus on the application of organic manure, with its specific requirements concerning road conditions, landscape terrain and field access points.
The cultivation of sugar beets makes great demands on infield patterns because of particularities such as the row crop system and high logistics demands. Furthermore, several machines working in the same field simultaneously influence each other, regardless of whether they are of the same type. Specific infield strategies are always based on the interaction of several different influences and decision criteria; single working steps like tillage, seeding, plant protection or harvest mostly cannot each be considered individually, and the entire production process has to be taken into consideration to determine the right infield logistics. One long-term objective of this examination is to integrate the identified influences on infield strategies as decision criteria into an infield navigation tool. In this way, path planning will become more practical for farmers, which is a basic requirement for automatic vehicle control and increased process efficiency.

Keywords: autonomous vehicle control, infield logistics, path planning, process optimizing

Procedia PDF Downloads 232
489 Changing from Crude (Rudimentary) to Modern Method of Cassava Processing in the Ngwo Village of Njikwa Sub Division of North West Region of Cameroon

Authors: Loveline Ambo Angwah

Abstract:

The processing of cassava tubers or roots into food using crude and rudimentary methods (hand peeling, grating, frying, and sun drying) is a cumbersome and difficult process. The crude methods are time consuming and labour intensive. Modern processing, on the other hand, which uses machines for the various steps of washing, peeling, grinding, oven drying, fermentation, and frying, is easier, less time consuming, and less labour intensive. Traditionally, cassava roots are processed into numerous products and utilized in various ways according to local customs and preferences. For the people of Ngwo village, cassava is transformed locally into a flour or powder called ‘cumcum’. It is also soaked in water to give a food called ‘water fufu’ and fried to give ‘garri’. The leaves are consumed as vegetables. In addition, its relatively high yields and its ability to stay underground for long periods after maturity give cassava a considerable advantage as a commodity used by the rural poor of the community to fight poverty. It plays a major role in efforts to alleviate the food crisis because of its efficient production of food energy, year-round availability, tolerance to extreme stress conditions, and suitability to present farming and food systems in Africa. Improvement of cassava processing and utilization techniques would greatly increase the labor efficiency, incomes, and living standards of cassava farmers and the rural poor, as well as enhance the shelf life of products, facilitate their transportation, increase marketing opportunities, and help improve human and livestock nutrition. This paper presents a general overview of the crude cassava processing and utilization methods now used by subsistence and small-scale farmers in Ngwo village of the North West region of Cameroon, and examines the opportunities for improving processing technologies.
Cassava needs processing because the roots cannot be stored for long: they rot within 3-4 days of harvest. They are bulky, with about 70% moisture content, so transporting the tubers to markets is difficult and expensive. The roots and leaves contain varying amounts of cyanide, which is toxic to humans and animals, and raw cassava roots and uncooked leaves are not palatable. Therefore, cassava must be processed into various forms in order to increase the shelf life of the products, facilitate transportation and marketing, reduce cyanide content, and improve palatability.

Keywords: cassava roots, crude ways, food system, poverty

Procedia PDF Downloads 166
488 A Study on Exploring Employees' Well-Being in Gaming Workplaces Prior to and after the Chinese Government Crackdowns on Corruption

Authors: Ying Chuan Wang, Zhang Tao

Abstract:

This article explores differences in the well-being of employees in casino hotels before and after the Chinese government began to fight corruption. The researcher also attempted to establish the relationship between work pressure and the well-being of employees in gambling workplaces before and after the government crackdown on corruption. Three categories of well-being (life well-being, workplace well-being, and psychological well-being) were used to analyze the well-being of employees in gaming workplaces. In addition, a psychological pressure classification was applied, and the Job Content Questionnaire (JCQ) was adopted to investigate employees' work pressure in terms of decision latitude, psychological demands, and workplace support. This is a quantitative study conducted in March 2017 using purposive sampling. A total of 339 valid responses were collected, all from casino hotel employees. The findings showed that decision latitude differed significantly before and after the crackdown on corruption. Moreover, workplace support was strongly related to employees' well-being before the crackdown, while decision latitude was strongly related to employees' well-being after it. The findings suggest that employees' work pressure affects their well-being. In particular, workplace support may alleviate employees' work pressure and shape their perceptions of well-being, but only prior to the crackdown; decision latitude became the essential factor affecting well-being after the crackdown. It is finally hoped that the findings of this study provide suggestions to the managerial levels of hospitality industries. It is important to enhance employees' decision latitude.
Offering training courses to develop employees' skills could be one way to reduce work pressure. In addition, establishing a career path for employees to pursue is essential for their self-development and the improvement of their well-being. This would be crucial for casino hotels' sustainable development and for strengthening their competitiveness.

Keywords: well-being, work pressure, casino hotel employees, gaming workplace

Procedia PDF Downloads 223
487 Characterization of a Lipolytic Enzyme of Pseudomonas nitroreducens Isolated from Mealworm's Gut

Authors: Jung-En Kuan, Whei-Fen Wu

Abstract:

In this study, a symbiotic bacterium able to grow on minimal-tributyrin medium was isolated from the mid-gut of the yellow mealworm (Tenebrio molitor). After PCR amplification of its 16S rDNA, the resulting nucleotide sequences were analyzed by phylogenetic tree construction, and the isolate was designated Pseudomonas nitroreducens D-01. Next, by searching the lipolytic enzymes in its protein data bank, one potential lipolytic α/β hydrolase was identified, again using PCR amplification and nucleotide sequencing. To express this lipolytic gene from a plasmid, target-gene primers carrying the C-terminal his-tag sequences were designed. Using the vector pET21a, a recombinant lipolytic hydrolase D gene with his-tag nucleotides was successfully cloned under the control of the T7 promoter. After transformation of the resulting plasmid into Escherichia coli BL21 (DE3), IPTG was used to induce the recombinant protein. The protein product was purified on a metal-ion affinity column, and the purified protein was found capable of forming a clear zone on a tributyrin agar plate. Enzyme activity was then determined by the degradation of p-nitrophenyl esters, with the yellow end-product, p-nitrophenol, measured at O.D. 405 nm. This lipolytic enzyme most efficiently targets p-nitrophenyl butyrate, and it is most active at 40°C and pH 8 in potassium phosphate buffer. In thermal stability assays, its activity drops dramatically above 50°C. In metal ion assays, MgCl₂ and NH₄Cl enhance the enzyme's activity, while MnSO₄, NiSO₄, CaCl₂, ZnSO₄, CoCl₂, CuSO₄, FeSO₄, and FeCl₃ reduce it; NaCl has no effect on its activity.
Most organic solvents, such as hexane, methanol, ethanol, acetone, isopropanol, chloroform, and ethyl acetate, decrease the activity of this enzyme; however, its activity increases in the presence of DMSO. All of the surfactants tested (Triton X-100, Tween 80, Tween 20, and Brij35) decrease its lipolytic activity. Using the Lineweaver-Burk double-reciprocal method, the enzyme kinetic parameters were determined as Km = 0.488 mM, Vmax = 0.0644 mM/min, and kcat = 3.01x10³ s⁻¹, giving a catalytic efficiency kcat/Km of 6.17x10³ mM⁻¹s⁻¹. Finally, based on phylogenetic analyses, this lipolytic protein is classified as a type IV lipase by the homologous conserved region of that lipase family.
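The kinetic constants above follow from the Lineweaver-Burk linearization, 1/v = (Km/Vmax)(1/[S]) + 1/Vmax. As a sketch of that calculation (the substrate concentrations below are hypothetical, not the study's measurements), the reported Km and Vmax can be recovered from a double-reciprocal fit and the catalytic efficiency checked:

```python
import numpy as np

# Constants reported in the abstract
Km, Vmax, kcat = 0.488, 0.0644, 3.01e3   # mM, mM/min, s^-1

# Catalytic efficiency kcat/Km (~6.17e3 mM^-1 s^-1)
efficiency = kcat / Km

# Lineweaver-Burk: 1/v = (Km/Vmax)*(1/[S]) + 1/Vmax
S = np.array([0.1, 0.2, 0.5, 1.0, 2.0])  # hypothetical substrate concentrations (mM)
v = Vmax * S / (Km + S)                   # ideal Michaelis-Menten rates
slope, intercept = np.polyfit(1.0 / S, 1.0 / v, 1)
Km_fit, Vmax_fit = slope / intercept, 1.0 / intercept
```

On noise-free Michaelis-Menten data the double-reciprocal fit recovers the parameters exactly; with real assay data, a nonlinear fit of v against [S] is usually preferred because the reciprocal transform amplifies error at low substrate concentrations.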

Keywords: enzyme, esterase, lipolytic hydrolase, type IV

Procedia PDF Downloads 132
486 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe

Authors: Elsadig Naseraddeen Ahmed Mohamed

Abstract:

In contrast to the uncertainty and complementarity principles, it will be shown in the present paper that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy, at any definite instant of time, can be described by a binary definite function. This function is equivalent to the difference between the numbers of occupation and evacuation epochs up to that time, and also to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at every point of the time real line, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges mark the boundaries of the occupation and evacuation epochs, from which the binary signals can be calculated, using the fact that the universe's events actually extend along the positive and negative time real line in a single direction of extension as the number of exchanges increases. Hence there exists a noninvertible transformation matrix, defined as the product of an invertible rotation matrix and a noninvertible scaling matrix, which change the direction and the magnitude of the exchange-event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which we can navigate the universe's events (transformed by actual transformations) backward and forward along the time real line. These information transformations will therefore be derived as elements of a group associated with their corresponding actual transformations.
The actual and information model of the universe will be derived by assuming the existence of a time instant zero, before and at which no coordinate is occupied by any definite values of momentum and energy; after that time, the universe begins expanding in spacetime. This assumption makes Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict the universe's entire future and past, superfluous. We only need to establish analog-to-digital converters to sense the binary signals that determine the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy, as present events of the universe; from these we can predict its past and future events approximately, with high precision.

Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon

Procedia PDF Downloads 174
485 Analysis of Buddhist Rock Carvings in Diamer Basha Dam Reservoir Area, Gilgit-Baltistan, Pakistan

Authors: Abdul Ghani Khan

Abstract:

This paper focuses on the Buddhist rock carvings in the Diamer-Basha reservoir area of Gilgit-Baltistan, which is perhaps the largest rock art province of the world. The study region has thousands of rock carvings, particularly stupa carvings, engraved by artists, devotees, pilgrims, or merchants who left their marks in the landscape or worked for the propagation of Buddhism. The Pak-German Archaeological Mission prepared, documented, and published extensive catalogues of these carvings. To date, however, very little systematic or statistically driven analysis has been undertaken toward an in-depth understanding of the Buddhist rock carving tradition of the study region. This paper examines stupa carvings and their constituent parts from five selected sites, namely Oshibat, Shing Nala, Gichi Nala, Dadam Das, and Chilas Bridge. The statistical analyses and classification of the stupa carvings and their chronological contexts were carried out with the help of modern software tools such as STATA, FileMaker Pro, and MapSource. The study found that the tradition of stupa carving on the rock surfaces at the five selected sites continued for around 900 years, from the 1st century BCE to the 8th century CE. There is variation within the chronological settings of each of the selected sites, possibly shaped by their use within particular landscapes, whether political (for example, changes in political administration or warfare) or geographical (for example, the shifting of routes). The long persistence of the stupa carving tradition at these specific locations also indicates their central position on trade and communication routes, and they were possibly also linked with the religious ideologies of their particular times. The analyses of the different architectural elements of the stupa carvings in the study area show that this tradition had structural similarities and differences in temporal and spatial contexts.

Keywords: rock carvings, stupa, stupa carvings, Buddhism, Pak-German archaeological mission

Procedia PDF Downloads 221
484 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia

Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman

Abstract:

Progress in pavement design has produced a design method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG). Saudi Arabia's road and highway network is currently evolving as a result of increasing traffic volume, and the MEPDG is therefore being implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires the calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for the calibration of the MEPDG in central Saudi Arabia. The first goal is thus the collection of data for flexible pavement design under the local conditions of the Riyadh region. Since the collected data must then be converted into design inputs, the main goal of this paper is their analysis. The data analysis covers: truck classification, the traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axles of each type (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for local calibration. Detailed descriptions of the input parameters are given in this paper, providing an approach for the successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
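The traffic inputs listed above are aggregates of classified count data. As an illustration with invented monthly counts (not the Riyadh data), AADTT and the monthly adjustment factors MAFi, which scale each month's truck traffic relative to the annual average and sum to 12, can be computed as:

```python
# Hypothetical monthly average daily truck traffic (MADTT), Jan..Dec
madtt = [950, 980, 1000, 1020, 1050, 1100, 1150, 1120, 1060, 1010, 970, 940]

# Annual Average Daily Truck Traffic: mean of the monthly averages
aadtt = sum(madtt) / 12.0

# Monthly Adjustment Factor: MAF_i = 12 * MADTT_i / sum(MADTT); averages to 1.0
maf = [12.0 * m / sum(madtt) for m in madtt]
```

The same normalization pattern extends to the vehicle class distribution (each truck class's share of the classified counts) and to the hourly distribution factors.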

Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh

Procedia PDF Downloads 225
483 Land Use Land Cover Changes in Response to Urban Sprawl within North-West Anatolia, Turkey

Authors: Melis Inalpulat, Levent Genc

Abstract:

In the present study, an attempt was made to quantify the Land Use Land Cover (LULC) transformation over three decades around the urban regions of the Balıkesir, Bursa, and Çanakkale provincial centers (PCs) in Turkey. Landsat imagery acquired in 1984, 1999, and 2014 was used to determine the LULC change. Images were classified using the supervised classification technique into five main LULC classes: forest (F), agricultural land (A), residential area (urban) - bare soil (R-B), water surface (W), and other (O). Change detection analyses were conducted for 1984-1999 and 1999-2014, and the results were evaluated. Conversions of LULC types to the R-B class were investigated. In addition, population changes (1985-2014) were assessed from census data, the relations between population and urban areas were established, and future populations and urban area needs were forecasted for 2030. The results of the LULC analysis indicated that urban areas, which fall under the R-B class, expanded in all PCs. During 1984-1999, the R-B class within the Balıkesir, Bursa, and Çanakkale PCs increased by 7.1%, 8.4%, and 2.9%, respectively. The trend continued in the 1999-2014 term, and the increments reached 15.7%, 15.5%, and 10.2% at the end of the 30-year period (1984-2014). Furthermore, since the A class in all provinces was found to be the principal contributor to the R-B class, urban sprawl led to the loss of agricultural land. Moreover, the areas of the R-B classes were highly correlated with population within all PCs (R²>0.992). On this basis, both future populations and R-B class areas were forecasted. The estimated increases in the R-B class areas for the Balıkesir, Bursa, and Çanakkale PCs were 1,586 ha, 7,999 ha, and 854 ha, respectively.
Accordingly, the forecasted values for 2030 are 7,838 ha, 27,866 ha, and 2,486 ha for Balıkesir, Bursa, and Çanakkale, respectively; thus 7.7%, 8.2%, and 9.7% more R-B class area is expected in the PCs, in the same order.
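The forecasting step leans on the near-linear relation between population and R-B area (R² > 0.992). A minimal sketch of such an extrapolation, with invented figures in place of the study's census and Landsat data, is:

```python
import numpy as np

# Hypothetical observed urban (R-B) areas for one provincial center
years = np.array([1984.0, 1999.0, 2014.0])
urban_ha = np.array([4000.0, 5200.0, 6252.0])  # invented values (ha)

# Fit a linear trend to the observations and extrapolate to 2030
slope, intercept = np.polyfit(years, urban_ha, 1)
forecast_2030 = slope * 2030 + intercept
```

The study itself regresses R-B area on population (and so forecasts population first); the direct time trend above only illustrates the extrapolation step.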

Keywords: landsat, LULC change, population, urban sprawl

Procedia PDF Downloads 261
482 Explanatory Variables for Crash Injury Risk Analysis

Authors: Guilhermina Torrao

Abstract:

An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, the uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. The aim of this literature review is therefore to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of the broad variation in variable selection in CIR analysis. The objective is to demonstrate the variance in variable selection and classification when modeling injury risk for occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications over the past 21 years, the review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison demonstrates that almost half of the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, among those that did, vehicle age/model year was the most commonly selected explanatory variable, used in 41% of the studies. Among the studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit as a proxy for vehicle speed at the moment of the crash, imposing limitations on CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury following a crash, only 22% of studies included airbag deployment data.
A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.

Keywords: crash, explanatory, injury, risk, variables, vehicle

Procedia PDF Downloads 131
481 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed considerably, especially with regard to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, and a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for various use cases are analysed. The investigation covers point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data; it examines which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to these application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and the display of sensor data.
In summary, the paper shows a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 109
480 Vibro-Tactile Equalizer for Musical Energy-Valence Categorization

Authors: Dhanya Nair, Nicholas Mirchandani

Abstract:

Musical haptic systems can enhance a listener’s musical experience while providing an alternative platform for the hearing impaired to experience music. Current tactile music technologies focus on tactile metronomes to synchronize performers or on encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that delivers vibrations to multiple fingertips in synchrony with auditory music. Like an audio equalizer, it filters the signal into different frequency bands, computes the power in each band, and converts it to a corresponding vibration strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs from different parts of the musical spectrum, as classified by their energy and valence, were used to test the effectiveness of the system and to probe the relationship between music and tactile sensation. Three participants were trained on one song categorized as sad (low energy and valence scores) and one song categorized as happy (high energy and valence scores). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile rendering on their fingertips, and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category. The participants were blinded to the songs being tested and received no feedback on the accuracy of their classifications. They were able to classify the music with 100% accuracy.
Although the songs tested were at two opposite ends of the spectrum (sad/happy), these preliminary results show the potential of a vibrotactile equalizer like the one presented for augmenting the musical experience while furthering the current understanding of the music-tactile relationship.
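The processing chain described above (split the audio into frequency bands, compute per-band power, map power to vibration strength) can be sketched in a few lines; the band edges and max-normalization below are illustrative choices, not the authors' exact parameters:

```python
import numpy as np

def band_powers(signal, fs, bands):
    """Power per frequency band via FFT, normalized to [0, 1] vibration strength."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                       for lo, hi in bands])
    return powers / powers.max() if powers.max() > 0 else powers

# A pure 440 Hz tone should drive only the band containing 440 Hz
fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
bands = [(20, 250), (250, 1000), (1000, 4000)]  # hypothetical band edges (Hz)
strengths = band_powers(tone, fs, bands)
```

In a real-time system the same computation would run on short overlapping windows, with each band's strength driving one fingertip actuator.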

Keywords: haptic music relationship, tactile equalizer, tactile music, vibrations and mood

Procedia PDF Downloads 179
479 Value Index, a Novel Decision Making Approach for Waste Load Allocation

Authors: E. Feizi Ashtiani, S. Jamshidi, M.H Niksokhan, A. Feizi Ashtiani

Abstract:

Waste load allocation (WLA) policies may use multi-objective optimization methods to find the most appropriate and sustainable solutions. These usually aim to simultaneously minimize two criteria: total abatement cost (TC) and environmental violation (EV). If other criteria, such as inequity, need to be minimized as well, additional binary optimizations over different scenarios must be introduced. To reduce the number of calculation steps, this study presents the value index as an innovative decision-making approach. Since the value index incorporates both environmental violation and treatment cost, it can be maximized simultaneously with the equity index, meaning that the definition of separate scenarios for environmental violations is no longer required. Furthermore, the solution is not necessarily the point with minimal total cost or environmental violation. The idea is tested on the Haraz River in northern Iran. Here, the dissolved oxygen (DO) level of the river is simulated by the Streeter-Phelps equation in MATLAB. The WLA is determined for fish farms using multi-objective particle swarm optimization (MOPSO) in two scenarios. In the first, the trade-off curves of TC-EV and TC-inequity are plotted separately, as in the conventional approach. In the second, the value-equity curve is derived. The comparative results show that the solutions lie in a similar range of inequity but with lower total costs, owing to the freedom in environmental violation afforded by the value index. As a result, the conventional approach can well be replaced by the value index, particularly for problems optimizing these objectives. This shortens the process of reaching the best solutions and may yield a better classification for scenario definition. It is also concluded that decision makers would do better to focus on the value index, weighting its components to find the most sustainable alternatives based on their requirements.
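The DO simulation mentioned above uses the classic Streeter-Phelps oxygen-sag solution, D(t) = kd L0/(kr - kd) * (e^(-kd t) - e^(-kr t)) + D0 e^(-kr t). A Python sketch with hypothetical reach parameters (the study used MATLAB with site-specific values for the Haraz River):

```python
import numpy as np

def streeter_phelps(L0, D0, kd, kr, t):
    """DO deficit (mg/L) after travel time t (days) downstream of a discharge."""
    return (kd * L0 / (kr - kd)) * (np.exp(-kd * t) - np.exp(-kr * t)) \
        + D0 * np.exp(-kr * t)

# Hypothetical reach parameters
L0, D0 = 10.0, 1.0   # initial BOD and initial DO deficit (mg/L)
kd, kr = 0.3, 0.7    # deoxygenation / reaeration rate constants (1/day)

t = np.linspace(0, 10, 101)
deficit = streeter_phelps(L0, D0, kd, kr, t)
DO_sat = 9.0                    # assumed saturation DO (mg/L)
do_profile = DO_sat - deficit   # the oxygen-sag curve
```

The deficit rises to a maximum at the critical point downstream and then recovers as reaeration outpaces deoxygenation, which is the sag shape the violation criterion is evaluated against.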

Keywords: waste load allocation (WLA), value index, multi-objective particle swarm optimization (MOPSO), Haraz River, equity

Procedia PDF Downloads 421
478 Improvement of the Reliability and the Availability of a Production System

Authors: Lakhoua Najeh

Abstract:

Aims of the work: The aim of this paper is to improve the reliability and availability of a packer production line for cigarettes using two methods: SADT (Structured Analysis Design Technique) and FMECA (Failure Mode, Effects and Criticality Analysis). The first method enables us to describe the functionality of the packer production line, and the second enables us to establish an FMECA analysis. Methods: The methodology proposed in this paper for improving the reliability and availability of the packer production line is based on the SADT and FMECA methods. It consists of a diagnosis of the existing state of all the equipment of a production line in a factory in order to determine the most critical machine. We use, on the one hand, a functional analysis of the production line based on the SADT method and, on the other hand, a diagnosis and classification of the mechanical and electrical failures of the production line by criticality analysis based on the FMECA approach. Results: The result of applying this methodology is the creation and launch of a preventive maintenance plan covering the different elements of the packer production line, the list of preventive intervention activities, and their periods of realization. Conclusion: The diagnosis of the existing state showed that the packing machine used in the packer production line is the most critical machine in the factory. This enabled us, on the one hand, to describe the functionality of the production line with the SADT method and, on the other hand, to apply FMECA to this machine in order to improve its availability and performance.
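FMECA criticality is commonly operationalized as a Risk Priority Number, RPN = severity x occurrence x detection, each scored on a 1-10 scale. The abstract does not give the team's scoring, so the failure modes and scores below are purely hypothetical; the sketch only shows how such a ranking singles out the most critical machine:

```python
# Hypothetical FMECA worksheet entries for a packer line:
# (machine, failure mode, severity, occurrence, detection), each scored 1-10
failure_modes = [
    ("packer", "jammed wrapping foil", 7, 8, 4),
    ("conveyor", "belt misalignment", 5, 4, 3),
    ("sealer", "heater element failure", 8, 3, 6),
]

# Risk Priority Number: severity * occurrence * detection; higher = more critical
ranked = sorted(failure_modes, key=lambda fm: fm[2] * fm[3] * fm[4], reverse=True)
most_critical = ranked[0][0]
```

Failure modes at the top of the ranking are the natural candidates for the preventive maintenance plan's intervention activities.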

Keywords: production system, diagnosis, SADT method, FMECA method

Procedia PDF Downloads 140
477 Assessment of Waste Management Practices in Bahrain

Authors: T. Radu, R. Sreenivas, H. Albuflasa, A. Mustafa Khan, W. Aloqab

Abstract:

The Kingdom of Bahrain, a small island country in the Gulf region, is experiencing fast economic growth, resulting in a sharp increase in population and ever-greater amounts of waste. However, waste management in the country is still very basic, with landfilling being the most common option. Recycling is still scarce, with only small recycling businesses and initiatives emerging in recent years. This scenario is typical of other countries in the region, which produce similar amounts of waste per capita. In this paper, we review current waste management practices in Bahrain by collecting data published by the Government and various authors, and by visiting the country’s only landfill site, Askar. In addition, we performed a survey of residents to learn more about awareness of and attitudes towards sustainable waste management strategies. A review of the available data indicates that the Askar landfill site is nearing its capacity. The site uses open tipping as the method of disposal. The highest percentage of disposed waste comes from the building sector (38.4%), followed by domestic (27.5%) and commercial waste (17.9%). Disposal monitoring and recording are often based on estimates of weight, without proper characterization or classification of the received waste. Moreover, there is a need to assess the environmental impact of the site, with systematic monitoring of pollutants in the area and of their potential spread to the surrounding land, groundwater, and air. The results of the survey indicate low awareness of what happens to the collected waste in the country. However, the respondents showed support for future waste reduction and recycling initiatives. This implies that educating local communities would greatly benefit such governmental initiatives by securing greater participation.
Raising awareness of issues surrounding recycling and waste management and systematic effort to divert waste from landfills are the first steps towards securing sustainable waste management in the Kingdom of Bahrain.

Keywords: landfill, municipal solid waste, survey, waste management

Procedia PDF Downloads 157
476 Examining Relationship between Resource-Curse and Under-Five Mortality in Resource-Rich Countries

Authors: Aytakin Huseynli

Abstract:

The paper reports findings of a study that examined the under-five mortality rate among resource-rich countries. Typically, when countries obtain wealth, citizens gain increased well-being, and societies with new wealth create equal opportunities for everyone, including vulnerable groups. Scholars claim, however, that this is not the case for developing resource-rich countries: natural resources become a curse for them rather than a blessing. Spillovers from the natural resource curse negatively affect the social well-being of vulnerable people, who become excluded from mainstream society and whose situation becomes more precarious. To test this hypothesis, the study compared the under-five mortality rate among resource-rich countries using a one-way ANOVA on independent samples. The data on under-five mortality came from the World Bank. The natural resources considered were oil, gas, and minerals. The list of 67 resource-rich countries was taken from the Natural Resource Governance Institute. The sample was divided into four groups, low, lower-middle, upper-middle, and high-income countries, based on the World Bank's income classification. Results revealed a significant difference in under-five mortality among low, lower-middle, upper-middle, and high-income countries (F(3, 29.01) = 33.70, p < .001). To locate the differences among income groups, the Games-Howell test was performed, and it was found that under-five mortality is an issue for low, lower-middle, and upper-middle income countries but not for high-income countries. The results of this study agree with previous research on the resource curse and the negative effects of resource-based development. The policy implications for social workers, policy makers, academics, and social development specialists are to raise and discuss issues of marginalization and exclusion of vulnerable groups in developing resource-rich countries and to suggest interventions for avoiding them.
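The group comparison above is a one-way ANOVA; the fractional denominator degrees of freedom reported (29.01) suggest a Welch-type correction was applied. A self-contained sketch of the classic (uncorrected) F statistic on synthetic mortality data, not the World Bank figures:

```python
import numpy as np

def one_way_anova_F(groups):
    """Classic one-way ANOVA F statistic: between-group over within-group variance."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

rng = np.random.default_rng(42)
# Hypothetical under-five mortality rates (deaths per 1,000 live births) by income group
groups = [
    rng.normal(70, 10, 15),  # low income
    rng.normal(45, 8, 17),   # lower-middle income
    rng.normal(25, 6, 18),   # upper-middle income
    rng.normal(5, 2, 17),    # high income
]
F = one_way_anova_F(groups)
```

scipy.stats.f_oneway(*groups) returns the same F together with its p-value; a Games-Howell post hoc test then locates which pairs of income groups differ.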

Keywords: children, natural resource, extractive industries, resource-based development, vulnerable groups

Procedia PDF Downloads 252
475 Methodological Deficiencies in Knowledge Representation Conceptual Theories of Artificial Intelligence

Authors: Nasser Salah Eldin Mohammed Salih Shebka

Abstract:

Current problematic issues in AI are mainly due to those of knowledge representation conceptual theories, which are in turn reflected across the entire scope of the cognitive sciences. Knowledge representation methods and tools are derived from theoretical concepts regarding the human scientific perception of the conception, nature, and process of knowledge acquisition, knowledge engineering, and knowledge generation. Although these theoretical conceptions were themselves derived from the study of the human knowledge representation process and related theories, some essential factors were overlooked or underestimated, causing critical methodological deficiencies in the conceptual theories of human knowledge and knowledge representation. The evaluation criteria of human cumulative knowledge, from the perspectives of the nature and theoretical aspects of knowledge representation conceptions, are greatly affected by the materialistic nature of the cognitive sciences. This nature caused what we define as methodological deficiencies in the theoretical aspects of knowledge representation concepts in AI. These deficiencies are not confined to applications of knowledge representation theories throughout AI but also extend to the scientific nature of the cognitive sciences. The methodological deficiencies investigated in our work are: the segregation between cognitive abilities in knowledge-driven models; the insufficiency of the two-valued logic used to represent knowledge, particularly at the machine-language level, in relation to the problematic issues of semantics and theories of meaning; and the deficient consideration of the parameters of existence and time in the structure of knowledge. The latter requires a more detailed introduction of the manner in which the meanings of existence and time are to be considered in the structure of knowledge.
This does not imply that such parameters are easy to apply in knowledge representation systems; rather, outlining a deficiency caused by their absence can be considered an attempt to redefine knowledge representation conceptual approaches or, if that proves impossible, to construct a perspective on the possibility of simulating human cognition on machines. Furthermore, a redirection of the aforementioned expressions is required in order to formulate the exact meaning under discussion. This redirection of meaning shifts the role of the existence and time factors to the framework environment of the knowledge structure, and therefore to knowledge representation conceptual theories. The findings of our work indicate the necessity of differentiating between two comparative concepts when addressing the relation between the existence and time parameters and the structure of human knowledge. The topics presented throughout the paper can also be viewed as evaluation criteria for determining AI's capability to achieve its ultimate objectives. Ultimately, we argue that our findings suggest not that scientific progress has reached its peak, or that human scientific evolution has reached a point where it is no longer possible to discover evolutionary facts about the human brain and detailed descriptions of how it represents knowledge, but simply that, unless these methodological deficiencies are properly addressed, the future of AI's qualitative progress remains questionable.
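The insufficiency of two-valued logic noted above can be made concrete with a standard example: in Kleene's strong three-valued logic, a third value "unknown" propagates through the connectives, so a proposition's truth need not collapse to true or false at the machine level. This is an illustrative aside using a well-known formalism, not the authors' own proposal.

```python
# Kleene's strong three-valued logic: True, False, and U ("unknown").
U = None  # represent "unknown" as None

def k_and(a, b):
    """Conjunction: False dominates; otherwise unknown propagates."""
    if a is False or b is False:
        return False
    if a is U or b is U:
        return U
    return True

def k_not(a):
    """Negation: unknown stays unknown."""
    return U if a is U else (not a)

def k_or(a, b):
    """Disjunction via De Morgan's law: True dominates."""
    return k_not(k_and(k_not(a), k_not(b)))
```

Note how `k_and(False, U)` is decided (False) while `k_and(True, U)` remains unknown, a distinction two-valued machine logic cannot express directly.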

Keywords: cognitive sciences, knowledge representation, ontological reasoning, temporal logic

Procedia PDF Downloads 111
474 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder

Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh

Abstract:

In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for identifying activities in the human brain remains a major challenge because of the random nature of the signals. Feature extraction is the key issue in solving this problem: finding features that give distinctive pictures for different activities and similar pictures for the same activity is very difficult, especially as the number of activities grows. Classifier accuracy depends on the quality of this feature set. Furthermore, a larger number of features results in high computational complexity, while too few features compromise performance. In this paper, a novel approach to selecting an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a large number of features are extracted from the EEG signals and fed to the deep autoencoder network, which encodes the input features into a small set of codes. To avoid the vanishing-gradient problem and the need for dataset normalization, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set to a smaller one, four hidden layers are considered in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined in the responses of the hidden layers. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. The performance of the proposed method is further validated against two other methods recently reported in the literature, and it is found to be far better than both in terms of classification accuracy.
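The core idea, an autoencoder whose weights are fitted by a meta-heuristic search on reconstruction MSE rather than by backpropagation, can be sketched as follows. The data, layer widths, and the (1+1) evolution strategy below are stand-ins for illustration: the paper's HO-DAE uses four hidden layers and its own heuristic, and real inputs would be extracted EEG features.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical EEG feature matrix: 200 epochs x 32 extracted features.
X = rng.normal(size=(200, 32))

# Layer widths: encoder 32 -> 16 -> 8, decoder mirrors back to 32.
# Illustrative topology, not the paper's exact four-hidden-layer network.
dims = [32, 16, 8, 16, 32]
shapes = list(zip(dims[:-1], dims[1:]))
n_params = sum(a * b for a, b in shapes)

def unpack(theta):
    """Slice the flat parameter vector into per-layer weight matrices."""
    mats, i = [], 0
    for a, b in shapes:
        mats.append(theta[i:i + a * b].reshape(a, b))
        i += a * b
    return mats

def reconstruction_mse(theta):
    """Forward pass through encoder and decoder; return reconstruction MSE."""
    h = X
    Ws = unpack(theta)
    for W in Ws[:-1]:
        h = np.tanh(h @ W)        # nonlinear hidden layers
    h = h @ Ws[-1]                # linear output layer
    return np.mean((h - X) ** 2)

# (1+1) evolution strategy: a simple stand-in for the paper's meta-heuristic,
# minimizing MSE without gradients (so no vanishing-gradient problem).
theta = rng.normal(scale=0.1, size=n_params)
init_err = best = reconstruction_mse(theta)
for _ in range(300):
    cand = theta + rng.normal(scale=0.05, size=n_params)
    err = reconstruction_mse(cand)
    if err < best:                # keep the candidate only if it improves MSE
        theta, best = cand, err

# The 8-dimensional code layer is the reduced feature set fed to a classifier.
h = X
for W in unpack(theta)[:2]:       # encoder layers only
    h = np.tanh(h @ W)
codes = h
print(f"MSE: {init_err:.3f} -> {best:.3f}; reduced features: {codes.shape}")
```

Because candidates are accepted only when reconstruction error drops, the search is monotone in MSE; in practice one would use a stronger meta-heuristic (e.g. particle swarm or differential evolution) and far more iterations than this toy loop.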

Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization

Procedia PDF Downloads 112