Search results for: calibration data requirements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26434

21244 Non-Parametric, Unconditional Quantile Estimation of Efficiency in Microfinance Institutions

Authors: Komlan Sedzro

Abstract:

We apply the non-parametric, unconditional, hyperbolic order-α quantile estimator to appraise the relative efficiency of Microfinance Institutions (MFIs) in Africa in terms of outreach. Our purpose is to verify whether these institutions, which must constantly strike a compromise between their social role and financial sustainability, are operationally efficient. Using data on African MFIs extracted from the Microfinance Information eXchange (MIX) database and covering the 2004 to 2006 period, we find that the more efficient MFIs are also the most profitable. This result is in line with the view that social performance is not in contradiction with the pursuit of excellent financial performance. Our results also show that MFIs that are large in terms of assets, and those charging the highest fees, are not necessarily the most efficient.
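The order-α quantile estimator used in this abstract is a robust generalization of the Free Disposal Hull (FDH) efficiency score. As a minimal sketch of the simpler full-frontier FDH score it generalizes (the order-α version replaces the maximum below with an empirical quantile, making it robust to outliers), with one input, one output, and invented data:

```python
# Free Disposal Hull (FDH) output-efficiency scores for a one-input,
# one-output sample. Inputs/outputs are illustrative, not MIX data.

def fdh_output_efficiency(inputs, outputs):
    """Score > 1 means the unit could expand output given its input."""
    scores = []
    for x0, y0 in zip(inputs, outputs):
        # Dominating set: units using no more input than the evaluated unit.
        feasible = [y for x, y in zip(inputs, outputs) if x <= x0]
        scores.append(max(feasible) / y0)
    return scores

inputs = [2.0, 3.0, 4.0, 5.0]      # e.g. operating cost
outputs = [10.0, 12.0, 9.0, 15.0]  # e.g. borrowers reached (outreach)

scores = fdh_output_efficiency(inputs, outputs)
print(scores)  # a score of 1.0 marks a unit on the FDH frontier
```

A unit's score is the factor by which a dominating peer exceeds its output, so the third unit here is inefficient relative to the second.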

Keywords: data envelopment analysis, microfinance institutions, quantile estimation of efficiency, social and financial performance

Procedia PDF Downloads 287
21243 Curvature Based-Methods for Automatic Coarse and Fine Registration in Dimensional Metrology

Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani

Abstract:

Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces with accuracy, reliability, and completeness. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods such as Iterative Closest Point (ICP) and its variants have been established for fine registration. For coarse registration, no conventional method has been adopted yet, despite the significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC Transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce registration error using curvature parameters. A specific distance accounting for curvature similarity has been combined with the Euclidean distance to define the distance criterion used in correspondence searching. Additionally, the objective function has been improved by combining point-to-point (P-P) minimization and point-to-plane (P-Pl) minimization with automatic weights. These weights are determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated data and to real data acquired by a computed tomography (CT) system. The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
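The curvature-augmented correspondence search described in the abstract can be sketched as follows. The matching cost combines a squared Euclidean distance with a curvature-similarity term; the points, curvature values, and the weight `lam` are invented for illustration and do not reproduce the paper's exact criterion:

```python
import numpy as np

# Correspondence step of a curvature-aware ICP variant (sketch):
# each source point is paired with the target point minimizing
# squared Euclidean distance plus a weighted curvature mismatch.

def match(src_pts, src_curv, dst_pts, dst_curv, lam=0.5):
    """Return, for each source point, the index of its best target point."""
    idx = []
    for p, kp in zip(src_pts, src_curv):
        d_eucl = np.linalg.norm(dst_pts - p, axis=1) ** 2
        d_curv = (dst_curv - kp) ** 2
        idx.append(int(np.argmin(d_eucl + lam * d_curv)))
    return idx

src = np.array([[0.0, 0.0], [1.0, 0.0]])
src_k = np.array([0.1, 0.9])                 # discrete curvatures
dst = np.array([[0.05, 0.0], [0.1, 0.0], [1.1, 0.0]])
dst_k = np.array([0.1, 0.8, 0.9])

print(match(src, src_k, dst, dst_k))
```

The curvature term steers each point toward a geometrically alike neighbour, which is the stated motivation for the paper's modified distance criterion.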

Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography

Procedia PDF Downloads 411
21242 Whole Exome Sequencing Data Analysis of Rare Diseases: Non-Coding Variants and Copy Number Variations

Authors: S. Fahiminiya, J. Nadaf, F. Rauch, L. Jerome-Majewska, J. Majewski

Abstract:

Background: Sequencing of the protein-coding regions of the human genome (Whole Exome Sequencing; WES) has demonstrated great success in the identification of causal mutations for several rare genetic disorders in humans. Generally, most WES studies have focused on rare variants in coding exons and splice sites, where missense substitutions lead to alteration of the protein product. Although focusing on this category of variants has revealed the mystery behind many inherited genetic diseases in recent years, a subset of cases has remained inconclusive. Here, we present the results of our WES studies in which analyzing only rare variants in coding regions was not conclusive, and further investigation revealed the involvement of non-coding variants and copy number variations (CNVs) in the etiology of the diseases. Methods: Whole exome sequencing was performed using our standard protocols at the Genome Quebec Innovation Center, Montreal, Canada. All bioinformatics analyses were done using an in-house WES pipeline. Results: To date, we have successfully identified several disease-causing mutations within gene coding regions (e.g. SCARF2: Van den Ende-Gupta syndrome and SNAP29: 22q11.2 deletion syndrome) by using WES. In addition, we showed that variants in non-coding regions and CNVs also have important value and should not be ignored and/or filtered out in the course of bioinformatics analysis of WES data. For instance, in patients with osteogenesis imperfecta type V and in patients with glucocorticoid deficiency, we identified variants in the 5'UTR, resulting in the production of longer or truncated, non-functional proteins. Furthermore, CNVs were identified as the main cause of disease in patients with metaphyseal dysplasia with maxillary hypoplasia and brachydactyly and in patients with osteogenesis imperfecta type VII.
Conclusions: Our study highlights the importance of considering non-coding variants and CNVs during interpretation of WES data, as they can be the only cause of the disease under investigation.

Keywords: whole exome sequencing data, non-coding variants, copy number variations, rare diseases

Procedia PDF Downloads 402
21241 Motor Gear Fault Diagnosis by Measurement of Current, Noise and Vibration on AC Machine

Authors: Sun-Ki Hong, Ki-Seok Kim, Yong-Ho Jo

Abstract:

Motors are widely used in industry, and many researchers have accordingly studied motor failure diagnosis. In this paper, the effect of the measuring environment on the diagnosis of a fault in a gear connected to a motor shaft is studied. The fault diagnosis is carried out by comparing a normal gear with an abnormal gear. The measured FFT data are compared with the normal data and analyzed with respect to q-axis current, noise, and vibration. The diagnosis results obtained in a good and in a bad measuring environment are then compared. These results show that a bad measuring environment may prevent the motor gear fault from being detected exactly. It is therefore emphasized that the measuring environment should be carefully prepared.
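The FFT comparison at the heart of this diagnosis scheme can be sketched as below. The signals, frequencies, and amplitudes are synthetic stand-ins for the measured q-axis current or vibration data; a fault is modelled simply as an extra spectral component:

```python
import numpy as np

# Compare the magnitude spectrum of a "normal" and a "faulty" signal;
# the frequency where the spectra differ most flags the fault component.

fs = 1000                        # sampling rate, Hz (illustrative)
t = np.arange(0, 1, 1 / fs)
normal = np.sin(2 * np.pi * 50 * t)                  # 50 Hz baseline tone
faulty = normal + 0.4 * np.sin(2 * np.pi * 120 * t)  # added 120 Hz peak

def spectrum(x):
    return np.abs(np.fft.rfft(x)) / len(x)

freqs = np.fft.rfftfreq(len(t), 1 / fs)
diff = spectrum(faulty) - spectrum(normal)
peak = freqs[np.argmax(diff)]
print(peak)   # frequency at which the two spectra differ most
```

In a noisy measuring environment, the background floor of `diff` rises, which is one way the paper's point (that a bad environment can mask the fault peak) shows up numerically.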

Keywords: motor fault, diagnosis, FFT, vibration, noise, q-axis current, measuring environment

Procedia PDF Downloads 539
21240 Design of Low-Emission Catalytically Stabilized Combustion Chamber Concept

Authors: Annapurna Basavaraju, Andreas Marn, Franz Heitmeir

Abstract:

The Advisory Council for Aeronautics Research in Europe (ACARE) calls for an overall reduction of NOx emissions by 80% in its Vision 2020. Moreover, small turbo engines have higher fuel-specific emissions than large engines due to their limited combustion chamber size. In order to fulfill these requirements, novel combustion concepts are essential. This motivates the present research on a state-of-the-art catalytically stabilized combustion chamber using hydrogen in small jet engines, which is designed and investigated both numerically and experimentally during this project. Catalytic combustion concepts can also be adopted for low-calorific fuels and are therefore not constrained to hydrogen alone. However, hydrogen has a high heating value and has the major advantage of producing only nitrogen oxides as pollutants during combustion, thus eliminating concern about other emissions such as carbon monoxide. In the present work, the combustion chamber is designed based on the 'Rich Catalytic Lean Burn' concept. The experiments are conducted for the characteristic operating range of an existing engine. This engine has been tested successfully at the Institute of Thermal Turbomachinery and Machine Dynamics (ITTM), Technical University Graz. Since efficient combustion is the result of proper mixing of the fuel-air mixture, considerable significance is given to the selection of an appropriate mixer. This led to the design of three diverse mixer configurations, which are investigated experimentally and numerically. Subsequently, the best mixer will be installed in the main combustion chamber and used throughout the experiments. Furthermore, temperatures and pressures will be recorded at various locations inside the combustion chamber, and the exhaust emissions will also be analyzed.
The instrumented combustion chamber will be inspected at engine-relevant inlet conditions for nine different sets of catalysts at the Hot Flow Test Facility (HFTF) of the institute.

Keywords: catalytic combustion, gas turbine, hydrogen, mixer, NOx emissions

Procedia PDF Downloads 294
21239 A Quantitative Structure-Adsorption Study on Novel and Emerging Adsorbent Materials

Authors: Marc Sader, Michiel Stock, Bernard De Baets

Abstract:

Considering the large amount of adsorption data for adsorbate gases on adsorbent materials in the literature, it is interesting to predict such adsorption data without experimentation. A quantitative structure-activity relationship (QSAR) is developed to correlate molecular characteristics of gases, and existing knowledge of materials, with their respective adsorption properties. The application of Random Forest, a machine learning method, to a set of adsorption isotherms over a wide range of partial pressures and concentrations is studied. The predicted adsorption isotherms are fitted to several adsorption equations to estimate the adsorption properties. To impute the adsorption properties of desired gases on desired materials, leave-one-out cross-validation is employed. Extensive experimental results for a range of settings are reported.
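The leave-one-out scheme used here for imputation can be sketched as follows. Each (gas, material) pair is held out in turn and predicted from the rest; a 1-nearest-neighbour regressor stands in for the paper's Random Forest, and the descriptor and target values are invented:

```python
# Leave-one-out cross-validation (LOO-CV) with a toy 1-NN regressor.

def nn_predict(train_X, train_y, x):
    """Predict with the target of the closest training descriptor."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train_X]
    return train_y[dists.index(min(dists))]

def loo_cv(X, y):
    errors = []
    for i in range(len(X)):
        train_X = X[:i] + X[i + 1:]      # hold sample i out
        train_y = y[:i] + y[i + 1:]
        errors.append(abs(nn_predict(train_X, train_y, X[i]) - y[i]))
    return sum(errors) / len(errors)     # mean absolute error

# toy descriptors (e.g. molecule polarizability, material pore size)
X = [[1.0, 0.5], [1.1, 0.6], [3.0, 2.0], [3.1, 2.1]]
y = [10.0, 11.0, 25.0, 26.0]             # e.g. uptake capacities
print(loo_cv(X, y))
```

LOO-CV is attractive when, as here, each measured isotherm is expensive, since every sample serves as a test case exactly once.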

Keywords: adsorption, predictive modeling, QSAR, random forest

Procedia PDF Downloads 213
21238 Application of Life Cycle Assessment “LCA” Approach for a Sustainable Building Design under Specific Climate Conditions

Authors: Djeffal Asma, Zemmouri Noureddine

Abstract:

In order for building designers to be able to balance environmental concerns with other performance requirements, they need clear and concise information. For certain decisions during the design process, qualitative guidance, such as design checklists or guideline documents, may not be sufficient for evaluating the environmental benefits of different building materials, products, and designs. In this case, quantitative information, such as that generated through a life cycle assessment, provides the most value. LCA provides a systematic approach to evaluating the environmental impacts of a product or system over its entire life. In the case of buildings, the life cycle includes the extraction of raw materials; the manufacturing, transporting, and installing of building components or products; and the operating and maintaining of the building. By integrating LCA into the building design process, designers can evaluate the life cycle impacts of building design, materials, components, and systems, and choose the combinations that reduce the building's life cycle environmental impact. This article attempts to give an overview of the integration of the LCA methodology in the context of building design, and focuses on the use of this methodology for environmental considerations concerning process design and optimization. A multiple case study was conducted in order to assess the benefits of LCA as a decision-making aid during the first stages of building design under the specific climate conditions of the North East region of Algeria. It is clear that the LCA methodology can help to assess and reduce the impact of a building's design and components on the environment, even if the implementation of the process is rather long and complicated and lacks a global approach including human factors. It is also demonstrated that using LCA for multi-objective optimization of the building process will certainly facilitate improvements in design and decision making for both new-design and retrofit projects.

Keywords: life cycle assessment, buildings, sustainability, elementary schools, environmental impacts

Procedia PDF Downloads 531
21237 A Comparative Study on the Dimensional Error of 3D CAD Model and SLS RP Model for Reconstruction of Cranial Defect

Authors: L. Siva Rama Krishna, Sriram Venkatesh, M. Sastish Kumar, M. Uma Maheswara Chary

Abstract:

Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created by 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modelling (FDM), and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns for custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering, it is helpful for product design, tooling, and manufacture; in biomedicine, its applications include the design and development of medical devices, instruments, prosthetics, and implants, and it is also helpful in planning complex surgical operations. The traditional approach limits the full appreciation of the movements of various bony structures, and with the custom implants it produces it is therefore difficult to measure the anatomy of parts and to analyse changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in the cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper aims to carry out a comparative study on the dimensional error of 3D CAD and SLS RP models for the reconstruction of a cranial defect by comparing the virtual CAD model with the physical RP model of the defect.

Keywords: rapid prototyping, selective laser sintering, cranial defect, dimensional error

Procedia PDF Downloads 315
21236 The Transition from National Policy to Institutional Practice of Vietnamese English Language Teacher Education

Authors: Thi Phuong Lan Nguyen

Abstract:

English Language Teacher Education (ELTE) in Vietnam is changing rapidly to address the new requirements of the globalization and socialization era. Although there has been a range of investments and innovations in policy and curriculum, tertiary educators and learners do not fully engage in their enactment, so it is vital to understand practice at the tertiary education level. The study examines higher education curriculum development policy, both in theory and in practice, across four representative ELTE institutions in the north of Vietnam. The lecturers' perceptions of the extent to which the enacted curriculum is aligned with national standards are explored. Nineteen policy documents, seventy surveys, and twelve interviews with lecturers and instructional leaders across these four institutions have been analyzed to investigate how the policy shapes practice. The two most significant findings are (i) a low level of alignment between the curriculum and the soft-skills standards for graduates required by the Vietnamese Ministry of Education and Training (MOET) and (ii) incoherence between current national policy and these institutions' implementation. In order to address these gaps, it is strongly recommended that the curriculum be further developed, focusing more on institutional outcomes, MOET's standards, and social demands in times of globalization. More importantly, professional development in ELTE is vital for a range of curriculum and educational policy stakeholders. The study helps to develop the English teaching profession in Vietnam in a systematic way, from policymakers to implementers and from instructors to learners. Its significance lies in its relevance to English teaching careers, particularly within the researcher's specific context, yet it also remains relevant to ELTE in other parts of Vietnam and in other EFL (English as a Foreign Language) countries.

Keywords: curriculum, English language teaching education, policy implementation, standard, teaching practice

Procedia PDF Downloads 222
21235 Optimization of Doubly Fed Induction Generator Equivalent Circuit Parameters by Direct Search Method

Authors: Mamidi Ramakrishna Rao

Abstract:

The doubly-fed induction generator (DFIG) is currently the choice for many wind turbines. These generators, when connected to the grid through a converter, are subjected to varied power system conditions such as voltage variation, frequency variation, and short-circuit fault conditions. Further, many countries, such as Canada, Germany, the UK, and Scotland, have distinct grid codes relating to wind turbines. Accordingly, following network faults, wind turbines have to supply a definite reactive current. To satisfy these requirements, including the reactive current capability, an optimum electrical design becomes a mandate for the DFIG to function. This paper intends to optimize the equivalent circuit parameters of an electrical design for satisfactory DFIG performance. The direct search method has been used for optimization of the parameters. The variables selected include the electromagnetic core dimensions (diameters and stack length), slot dimensions, radial air gap between stator and rotor, and winding copper cross-section area. Optimization of a 2 MW DFIG has been executed separately for three objective functions: maximum reactive power capability (Case I), maximum efficiency (Case II), and minimum weight (Case III). In the optimization analysis program, voltage variations (10%), leading and lagging power factor (0.95), and speeds corresponding to slips of -0.3 to +0.3 have been considered. The optimum designs obtained for the three objective functions were compared. It can be concluded that the direct search method of optimization helps in determining an optimum electrical design for each objective function, whether efficiency, reactive power capability, or weight minimization.
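Direct search methods of the kind used here need no gradients: each design variable is perturbed up and down, the best neighbour is accepted, and the step shrinks when no neighbour improves. A minimal compass-search sketch on a toy quadratic (not an actual DFIG equivalent-circuit objective):

```python
# Compass-style direct search: derivative-free minimization by axis
# perturbations with step halving on failure.

def direct_search(f, x, step=1.0, tol=1e-6):
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:           # greedy accept of a better neighbour
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2.0               # refine the mesh
    return x, fx

# stand-in objective with a known optimum at (3, -2)
obj = lambda v: (v[0] - 3.0) ** 2 + (v[1] + 2.0) ** 2
best, val = direct_search(obj, [0.0, 0.0])
print(best, val)
```

In the paper's setting the design vector would hold core dimensions, slot dimensions, air gap, and copper area, and `f` would be the (negated) reactive power capability, the (negated) efficiency, or the weight, depending on the case.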

Keywords: direct search, DFIG, equivalent circuit parameters, optimization

Procedia PDF Downloads 240
21234 Predicting Survival in Cancer: How Does the Cox Regression Model Compare to Artificial Neural Networks?

Authors: Dalia Rimawi, Walid Salameh, Amal Al-Omari, Hadeel AbdelKhaleq

Abstract:

Prediction of the survival time of patients with cancer is a core factor that influences oncologists' decisions in different respects, such as the treatment plans offered, patients' quality of life, and medication development. For a long time, proportional hazards Cox regression (ph. Cox) was, and still is, the most well-known statistical method for predicting survival outcomes. But owing to the revolution in data science, new prediction models have been employed and have proved to be more flexible, providing higher accuracy in this type of study. The artificial neural network is one of those models and is suitable for handling time-to-event prediction. In this study, we aim to compare ph. Cox regression with the artificial neural network method in terms of data handling and the accuracy of each model.
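The Cox model scores covariates through its partial likelihood, which is the quantity a neural-network survival model replaces with a learned loss. A sketch of that computation for one covariate and no tied event times, on invented survival data:

```python
import numpy as np

# Cox partial log-likelihood (no ties, single covariate).
# Only observed events contribute; each event term compares the
# subject's risk score against everyone still at risk at that time.

def cox_partial_loglik(beta, times, events, x):
    order = np.argsort(times)
    times, events, x = times[order], events[order], x[order]
    ll = 0.0
    for i in range(len(times)):
        if events[i] == 1:                 # censored subjects skip this
            risk_set = x[i:]               # subjects still at risk at t_i
            ll += beta * x[i] - np.log(np.sum(np.exp(beta * risk_set)))
    return ll

times = np.array([5.0, 8.0, 12.0, 20.0])   # follow-up times
events = np.array([1, 1, 0, 1])            # 0 = right-censored
x = np.array([1.0, 0.0, 1.0, 0.0])         # e.g. treatment indicator
print(cox_partial_loglik(0.5, times, events, x))
```

Fitting the model means maximizing this function over beta; at beta = 0 each event term reduces to minus the log of the risk-set size.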

Keywords: Cox regression, neural networks, survival, cancer

Procedia PDF Downloads 178
21233 Survival and Hazard Maximum Likelihood Estimator with Covariate Based on Right Censored Data of Weibull Distribution

Authors: Al Omari Mohammed Ahmed

Abstract:

This paper focuses on the maximum likelihood estimator with covariates. Covariates are incorporated into the Weibull model. Under this regression model, the maximum likelihood estimates of the covariate parameters, the shape parameter, the survival function, and the hazard rate of the Weibull regression distribution with right-censored data are obtained. The mean square error (MSE) and absolute bias are used to assess the performance of the Weibull regression distribution. For the simulation comparison, the study used various sample sizes and several specific values of the Weibull shape parameter.
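The right-censored Weibull log-likelihood that such an estimator maximizes splits cleanly into two parts: observed events contribute the log-density, while censored observations contribute only the log-survival function. A sketch with invented data and parameter values (covariates omitted for brevity):

```python
import numpy as np

# Right-censored Weibull log-likelihood:
#   event     -> log f(t) = log(k/s) + (k-1) log(t/s) - (t/s)^k
#   censored  -> log S(t) = -(t/s)^k

def weibull_loglik(shape, scale, t, delta):
    """delta = 1 for an observed event, 0 for right censoring."""
    z = t / scale
    log_f = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    log_s = -z ** shape
    return np.sum(delta * log_f + (1 - delta) * log_s)

t = np.array([2.0, 3.5, 4.0, 7.0, 9.0])   # follow-up times
delta = np.array([1, 1, 0, 1, 0])          # two censored subjects
print(weibull_loglik(1.5, 5.0, t, delta))
```

In the regression version, the scale is replaced by a function of covariates (e.g. exp of a linear predictor) and the same likelihood is maximized over all parameters.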

Keywords: Weibull regression distribution, maximum likelihood estimator, survival function, hazard rate, right censoring

Procedia PDF Downloads 425
21232 Size, Shape, and Compositional Effects on the Order-Disorder Phase Transitions in Au-Cu and Pt-M (M = Fe, Co, and Ni) Nanocluster Alloys

Authors: Forrest Kaatz, Adhemar Bultheel

Abstract:

Au-Cu and Pt-M (M = Fe, Co, and Ni) nanocluster alloys are currently being investigated worldwide by many researchers for their interesting catalytic and nanophase properties. The low-temperature behavior of the phase diagrams is not well understood for alloys of nanometer sizes and shapes. These systems have similar bulk phase diagrams, with the L12 structure for the ordered intermetallics Au3Cu, Pt3M, AuCu3, and PtM3 and the L10 structure for the AuCu and PtM intermetallics. We consider three models for low-temperature ordering in the phase diagrams of Au-Cu and Pt-M nanocluster alloys. These models are valid for sizes of about 5 nm and approach bulk values for sizes of about 20 nm. We study the phase transition in nanoclusters with cubic, octahedral, and cuboctahedral shapes, covering the compositions of interest. The models are based on studying the melting temperatures of nanoclusters using the regular-solution mixing model for alloys. Experimentally, it is extremely challenging to determine thermodynamic data for nano-sized alloys. Reasonable agreement is found between these models and recent experimental data on nanometer clusters in the Au-Cu and Pt-M nanophase systems. From our data, experiments on nanocubes about 5 nm in size, of stoichiometric AuCu and PtM composition, could help differentiate between the models. Some available evidence indicates that ordered intermetallic nanoclusters have better catalytic properties than disordered ones. We conclude with a discussion of physical mechanisms whereby ordering could improve the catalytic properties of nanocluster alloys.
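A common first-order picture behind size-dependent transition temperatures of this kind is a depression proportional to the surface-to-volume ratio, T(d) = T_bulk(1 - c/d), which recovers the bulk value for large clusters, consistent with the abstract's statement that the models approach bulk behavior near 20 nm. The functional form, T_bulk, and c below are illustrative assumptions, not the paper's fitted models:

```python
# Toy size-depressed transition temperature: T(d) = T_bulk * (1 - c/d).
# T_bulk and c are made-up values for illustration only.

def transition_temperature(d_nm, t_bulk=683.0, c=1.2):
    """d_nm: cluster diameter in nm; returns a temperature in kelvin."""
    return t_bulk * (1.0 - c / d_nm)

for d in (5, 10, 20):
    print(d, "nm ->", round(transition_temperature(d), 1), "K")
```

The 1/d scaling is why shape matters too: cubes, octahedra, and cuboctahedra of equal volume expose different surface fractions and hence shift differently.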

Keywords: catalytic reactions, gold nanoalloys, phase transitions, platinum nanoalloys

Procedia PDF Downloads 156
21231 National Identity in Connecting the Community through Mural Art for Petronas Dagangan Berhad

Authors: Nadiah Mohamad, Wan Samiati Andriana Wan Mohd Daud, M. Suhaimi Tohid, Mohd Fazli Othman, Mohamad Rizal Salleh

Abstract:

This is a collaborative mural art project between the Department of Fine Art of Universiti Teknologi MARA (UiTM) and Petronas Dagangan Berhad (PDB), the leading retailer and marketer of downstream oil and gas products in Malaysia. Five locations in five different states of Peninsular Malaysia have been identified for showcasing the national identity of Malaysia at Petronas gas stations: Air Keroh in Melaka, Pasir Pekan in Kelantan, Pontian in Johor, Simpang Pulai in Perak, and Wakaf Bharu in Terengganu. This project analyzes the elements of national identity demonstrated in the Petronas murals. The ultimate aim of the murals is to make the community and local people aware of what Malaysians consist of and are proud of, and to show how everyone can connect with the idea through visual art. The method used in this research is the collection of visual data through research and first-hand experience, in order to identify which images can be considered part of the national identity; idea development and visual analysis are then carried out based upon the collected visual data. At this stage, the elements and principles of design are the key to highlighting what is necessary for a work of art. In conclusion, visual images of the national identity of Malaysia can connect with local audiences and also allow people from outside the country to learn about and understand the beauty and diversity of Malaysia as a unique country, through the art on the walls of five Petronas gas stations.

Keywords: community, fine art, mural art, national identity

Procedia PDF Downloads 189
21230 Machine Learning Methods for Network Intrusion Detection

Authors: Mouhammad Alkasassbeh, Mohammad Almseidin

Abstract:

Network security engineers work to keep services available at all times by handling intruder attacks. The Intrusion Detection System (IDS) is one of the available mechanisms used to sense and classify any abnormal actions. Therefore, the IDS must always be up to date with the latest intruder attack signatures to preserve the confidentiality, integrity, and availability of the services. The speed of the IDS is a very important issue, as is the learning of new attacks. This research work illustrates how the Knowledge Discovery in Databases (KDD) dataset is very handy for testing and evaluating different machine learning techniques. It mainly focuses on the KDD preprocessing step in order to prepare a decent and fair experimental data set. The J48, MLP, and Bayes Network classifiers have been chosen for this study. It has been shown that the J48 classifier achieves the highest accuracy rate in detecting and classifying all KDD dataset attacks, which are of the types DOS, R2L, U2R, and PROBE.
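J48 is the Weka implementation of the C4.5 decision-tree algorithm, whose split criterion is entropy-based. A sketch of information gain (C4.5 actually normalizes this to a gain ratio) for one binary feature on made-up connection records labelled normal or attack:

```python
import math

# Entropy and information gain for a candidate decision-tree split.

def entropy(labels):
    total = len(labels)
    probs = [labels.count(c) / total for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def information_gain(feature, labels):
    """Gain from splitting `labels` on a binary `feature` column."""
    gain = entropy(labels)
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain

# hypothetical feature: "connection used a privileged port"
feature = [1, 1, 1, 0, 0, 0, 0, 0]
labels = ["dos", "dos", "dos", "normal", "normal",
          "normal", "normal", "normal"]
print(information_gain(feature, labels))   # this split is perfect
```

Since the split above separates the classes perfectly, the gain equals the full label entropy; on real KDD features the tree greedily picks the split with the highest (normalized) gain at each node.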

Keywords: IDS, DDoS, MLP, KDD

Procedia PDF Downloads 225
21229 Colored Image Classification Using Quantum Convolutional Neural Networks Approach

Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins

Abstract:

Recently, quantum machine learning has received significant attention. Numerous quantum machine learning (QML) models have been created and are being tested for various types of data, including text and images. Images are exceedingly complex data components that demand more processing power. Despite being mature, classical machine learning still has difficulties with big data applications. Furthermore, quantum technology has revolutionized how machine learning is thought of, by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking the production of inaccurate results. To discover the advantages of quantum versus classical approaches, this research has concentrated on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still at a very early stage. Black-and-white benchmark image datasets such as MNIST and Fashion-MNIST have been used in recent research. MNIST and CIFAR-10 were compared for binary classification, but the comparison showed that MNIST performed more accurately than the colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance the real-time applicability of QML. However, deep learning classification models such as the Quantum Convolutional Neural Network (QCNN) have not yet been compared on colored images to determine how much better they are than their classical counterparts; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were translated into grayscale, and 28 × 28-pixel images comprising 10,000 test and 50,000 training images were used.
The objective of this work is to determine how much the quantum approach can outperform a classical approach on a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model adopted a hybrid method and encoded the images into a quantum simulator for feature extraction using quantum gate rotations. The measurements were carried out on the classical computer after the rotations were applied. According to the results, the QCNN approach is about 12% more effective than traditional classical CNN approaches, and it is possible that applying data augmentation may further increase the accuracy. This study has demonstrated that quantum machine and deep learning models can be superior to classical machine learning approaches in terms of processing speed and accuracy when used to perform classification on colored classes.

Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning

Procedia PDF Downloads 106
21228 Using Swarm Intelligence to Forecast Outcomes of English Premier League Matches

Authors: Hans Schumann, Colin Domnauer, Louis Rosenberg

Abstract:

In this study, machine learning techniques were deployed on real-time human swarm data to forecast the likely outcomes of English Premier League matches in the 2020/21 season. These techniques included ensemble models in combination with neural networks and were tested against an industry standard, the Vegas oddsmakers. Predictions made from the collective intelligence of human swarm participants achieved a positive return on investment on matches over a full season, empirically demonstrating the usefulness of a new form of artificial intelligence that values human instinct and intelligence.

Keywords: artificial intelligence, data science, English Premier League, human swarming, machine learning, sports betting, swarm intelligence

Procedia PDF Downloads 193
21227 Community Perception and Knowledge on Oral Cancer Screening Methods in Kuwait

Authors: Lavanya Dharmendran, Shenuka Singh, Sona Baburathanam

Abstract:

The aim of the study is to understand the level of awareness regarding oral cancer and its screening methods in a community in a specific region of Kuwait, so as to enhance the uptake of oral cancer screening. This is a cross-sectional study comprising 100 adult participants residing in the governorate of Farwaniya, Kuwait. Participants above 18 years of age, of both genders, were selected using convenience sampling. Data collection included a self-administered questionnaire comprising three sections, each assessing the participants' knowledge of, attitudes towards, and practices regarding oral cancer and screening methods. Data were analyzed using the Humphris Oral Cancer Knowledge Scale. Inferential statistics were computed using the Chi-square or Fisher's exact test for categorical data, with a level of p < .05 established as significant. All ethical considerations, such as respect for personal confidentiality and informed consent, were applied in this study. The study revealed that although respondents were aware of the term oral cancer, more than half of the participants were unaware of the symptoms associated with the condition. Smoking and alcohol were identified as risk factors for oral cancer, but the majority of participants did not identify the Human Papilloma Virus (HPV) as an added risk factor. This suggests a greater need for dental practitioners to include educational strategies in routine dental visits to ensure greater awareness of oral cancer.

Keywords: oral cancer, oral screening, oral public health, oral health

Procedia PDF Downloads 58
21226 Poverty Dynamics in Thailand: Evidence from Household Panel Data

Authors: Nattabhorn Leamcharaskul

Abstract:

This study aims to examine the determinants of poverty dynamics in Thailand by using panel data on 3,567 households from 2007 to 2017. Four estimation techniques are employed to analyze the poverty situation across households and time periods: the multinomial logit model, the sequential logit model, the quantile regression model, and the difference-in-differences model. Households are categorized into five groups based on their experiences: chronically poor, falling into poverty, re-entering poverty, exiting poverty, and never-poor households. Estimation results emphasize the effects of demographic and socioeconomic factors, as well as of unexpected events, on the economic status of a household. It is found that remittances have a positive impact on a household's economic status, in that they are likely to lower the probability of falling into, or remaining trapped in, poverty, while they tend to increase the probability of exiting from poverty. In addition, receiving a secondary source of household income not only raises the probability of being a never-poor household, but also significantly increases household income per capita for the chronically poor and falling-into-poverty households. Public work programs are recommended as an important tool to relieve household financial burden and uncertainty and thus increase households' chances of escaping poverty.
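The multinomial logit model used here assigns each household a probability over the poverty categories via a softmax of category-specific linear indices, with one base category's coefficients fixed at zero for identification. A sketch with invented coefficients and covariates (the category set is reduced to three for brevity):

```python
import numpy as np

# Multinomial-logit class probabilities: softmax over per-category
# linear utilities. Coefficients and covariates are illustrative.

def mnl_probs(x, betas):
    """x: covariate vector; betas: one coefficient row per category."""
    utilities = np.array([b @ x for b in betas])
    expu = np.exp(utilities - utilities.max())   # numerically stable
    return expu / expu.sum()

x = np.array([1.0, 0.3, 1.0])    # intercept, remittance share, income
betas = np.array([
    [0.0, 0.0, 0.0],             # base category: chronically poor
    [0.2, 1.5, 0.4],             # exiting from poverty (made up)
    [-0.5, 2.0, 0.8],            # never poor (made up)
])
p = mnl_probs(x, betas)
print(p)                          # probabilities over the categories
```

With a positive remittance coefficient on the non-poor categories, raising the remittance covariate shifts probability mass away from the chronically poor base category, which is the direction of effect the abstract reports.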

Keywords: difference in difference, dynamic, multinomial logit model, panel data, poverty, quantile regression, remittance, sequential logit model, Thailand, transfer

Procedia PDF Downloads 94
21225 Factors Affecting Employee’s Effectiveness at Job in Banking Sectors of Pakistan

Authors: Sajid Aman

Abstract:

Jobs in the banking sector in Pakistan are perceived as very demanding, and employee turnover is consequently high. The managerial role is important in shaping employees' attitudes toward turnover. This paper explores the manager's role in influencing employees' effectiveness on the job. It adopts a pragmatic approach combining qualitative and quantitative data, employing an exploratory sequential strategy under a mixed-methods research design. Qualitative data were analyzed using thematic analysis. Five major themes emerged as key factors increasing employees' effectiveness in the banking sector: the manager's attitude towards employees, his leadership style, listening to employees' personal problems, provision of interest-free personal loans, and future career prospects. The quantitative data revealed that the manager's attitude, leadership style, willingness to listen to employees' personal problems, and future career prospects are strongly associated with employees' effectiveness at the job, whereas interest-free personal loans showed no significant association. The study concludes that the manager's role is central to employees' effectiveness in the banking sector. It is suggested that managers should maintain a positive attitude towards employees and make time to listen to their problems, even personal ones.

Keywords: banking sector, employee’s effectiveness, manager’s role, leadership style

Procedia PDF Downloads 15
21224 Study and GIS Development of Geothermal Potential in South Algeria (Adrar Region)

Authors: A. Benatiallah, D. Benatiallah, F. Abaidi, B. Nasri, A. Harrouz, S. Mansouri

Abstract:

The region of Adrar is located in south-western Algeria and covers a total area of 443,782 km², occupied by a population of 432,193 inhabitants. The main activity of the population is agriculture, based primarily on date palm cultivation, which occupies a total area of 23,532 ha. The climate of the Adrar region is continental desert, characterized by a high variation in temperature between the hottest months (July, August), when it exceeds 48°C, and the coldest months (December, January), at around 16°C. Rainfall is very limited in frequency and volume, with an aridity index of 4.6 to 5, corresponding to an arid climate type. Geologically, the Adrar region lies on the north-western edge of the craton and is characterized by a Precambrian basement covered by a transgressive sedimentary deposit of Phanerozoic age. The Touat depression is filled by Paleozoic deposits (Cambrian to Namurian) of a vast sedimentary basin extending from the Saharan Atlas in the north, across the Tinhirt hamada and the Tademaït plateau, to Touat and Gourara in the south-west and the Gulf of Gabes in the north-east. In this work, we have studied the geothermal potential of the Adrar region using borehole data available at various sites across an area of 400,000 km². From these data, we developed a GIS (Adrar_GIS) that plots the various points and boreholes in the region and specifies the geothermal potential available at variable depths.
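The first-pass indicator typically derived from such borehole data is the geothermal gradient. A minimal sketch, with hypothetical site names and readings (not the study's measurements):

```python
# Hypothetical borehole records: (site, surface_temp_C, bottom_temp_C, depth_m)
boreholes = [
    ("Touat-1",    26.0, 54.0,  800),
    ("Gourara-3",  25.5, 61.0, 1100),
    ("Tidikelt-2", 27.0, 48.0,  600),
]

def geothermal_gradient(surface_t, bottom_t, depth_m):
    """Temperature gradient in degC per km, a standard proxy for potential."""
    return (bottom_t - surface_t) / depth_m * 1000

for site, ts, tb, d in boreholes:
    print(f"{site}: {geothermal_gradient(ts, tb, d):.1f} degC/km")
```

A GIS such as the Adrar_GIS described above would attach a value like this to each plotted borehole location.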

Keywords: GIS, geothermal, potential, temperature

Procedia PDF Downloads 454
21223 An Overview of the Wind and Wave Climate in the Romanian Nearshore

Authors: Liliana Rusu

Abstract:

The goal of the proposed work is to provide a more comprehensive picture of the wind and wave climate in the Romanian nearshore, using the results provided by numerical models. The Romanian coastal environment is located on the western side of the Black Sea, the more energetic part of the sea, an area with heavy maritime traffic and various offshore operations. Information about the wind and wave climate in Romanian waters is mainly based on observations at the Gloria drilling platform (70 km from the coast). As regards the waves, the measurements of the wave characteristics are not very accurate due to the method used, and are also available only for a limited period. For this reason, wave simulations covering large temporal and spatial scales represent an option to better describe the wave climate. To assess the wind climate in the target area spanning 1992–2016, data provided by the NCEP-CFSR (U.S. National Centers for Environmental Prediction - Climate Forecast System Reanalysis), consisting of wind fields at 10 m above sea level, are used. The high spatial and temporal resolution of the wind fields is sufficient to represent the wind variability over the area. For the same 25-year period as considered for the wind climate, this study characterizes the wave climate from a wave hindcast data set that uses NCEP-CFSR winds as input for a SWAN (Simulating WAves Nearshore)-based model system. The wave simulation results, obtained with a two-level modelling scale, have been validated against both in situ measurements and remotely sensed data. The second level of the system, with a higher resolution in geographical space (0.02°×0.02°), is focused on the Romanian coastal environment. The main wave parameters simulated at this level are used to analyse the wave climate. The spatial distributions of the wind speed, wind direction, and mean significant wave height have been computed as averages over the full data set. The results show that the target area presents a generally moderate wave climate that is affected by storm events developed in the Black Sea basin. Both the wind and wave climates present high seasonal variability. All the results are presented as maps that help identify the most dangerous areas. A local analysis has also been carried out at some key locations corresponding to highly sensitive areas, for example the main Romanian harbours.
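The climatological averaging described above — mean fields computed over the full hindcast at each grid cell — can be sketched with NumPy. The array below is a synthetic stand-in for SWAN output on a regular grid:

```python
import numpy as np

# Synthetic hindcast stack: (time, lat, lon) significant wave height, metres
rng = np.random.default_rng(1)
hs = rng.gamma(shape=2.0, scale=0.5, size=(365, 40, 60))

mean_hs = hs.mean(axis=0)                # climatological mean per grid cell
p95_hs = np.percentile(hs, 95, axis=0)   # upper tail, dominated by storm events

print(mean_hs.shape, p95_hs.shape)
```

Mapping `mean_hs` (and an upper percentile such as `p95_hs`) over the grid is how spatial-distribution maps of the kind described here are typically produced.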

Keywords: numerical simulations, Romanian nearshore, waves, wind

Procedia PDF Downloads 326
21222 Real-Time Water Quality Monitoring and Control System for Fish Farms Based on IoT

Authors: Nadia Yaghoobi, Seyed Majid Esmaeilzadeh

Abstract:

Due to advancements in wireless communication, new sensor capabilities have been created. In addition to the automation industry, the Internet of Things (IoT) has been applied to environmental problems, enabling communication between different devices for data collection and exchange. Water quality depends on many factors which are essential for maintaining the minimum sustainability of water. Given the great dependence of fish on the quality of their aquatic environment, water quality can directly affect their activity. Therefore, monitoring water quality is an important issue to consider, especially in the fish farming industry. The conventional method of water quality testing is to collect water samples manually and send them to a laboratory for testing and analysis. This time-consuming method wastes manpower and is not cost-effective. The water quality measurement system implemented in this project monitors water quality in real-time through various sensors (parameters: water temperature, water level, dissolved oxygen, humidity and ambient temperature, water turbidity, pH). The Wi-Fi module, ESP8266, transmits the data collected by the sensors wirelessly to ThingSpeak and the smartphone app. Also, with the help of these instantaneous data, water temperature and water level can be controlled by using a heater and a water pump, respectively. This system supports a detailed study of the pollution and condition of water resources and can provide an environment for safe fish farming.
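The heater/pump control described above is, at its core, threshold logic over the latest sensor readings. A minimal sketch in Python (the thresholds and function names are illustrative, not the project's firmware, which runs on the ESP8266):

```python
# Illustrative thresholds for a warm-water fish species
TEMP_MIN = 24.0   # degC, below this the heater switches on
LEVEL_MIN = 30.0  # cm, below this the pump switches on

def control_actions(water_temp, water_level):
    """Map current sensor readings to actuator states, as the device loop might."""
    return {
        "heater_on": water_temp < TEMP_MIN,
        "pump_on": water_level < LEVEL_MIN,
    }

print(control_actions(water_temp=22.5, water_level=45.0))
# {'heater_on': True, 'pump_on': False}
```

On the real device, the same readings would also be pushed to ThingSpeak over Wi-Fi each cycle so the smartphone app can chart them.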

Keywords: dissolved oxygen, IoT, monitoring, ThingSpeak, water level, water quality, WiFi module

Procedia PDF Downloads 176
21221 Sustainability of Urban Affordable Housing in Malaysia

Authors: Lim Poh Im

Abstract:

This paper examines the current strategic and planning issues in the provision of affordable housing in urban centres in Malaysia from the perspective of sustainability. Sustainability here refers to social sustainability, such as the need to address urban poverty and ensure a better quality of life; economic sustainability, in ensuring that the financial mechanisms are healthy and stable in the long run; and, to a lesser extent, environmental sustainability, in reducing pollution-related problems and building footprint. The Malaysian affordable housing sector has undergone tremendous transformations since the sixties, moving from the earlier social housing that catered to the poorer strata of society to the current housing woes plaguing the young urban middle class. The increase in urban land prices and construction costs, coupled with rampant property speculation and market manipulation, has resulted in housing that is largely unaffordable even to the middle-income sector of the urban population. To overcome this, the public and private sectors have in recent years come up with various intermediate and medium-term policies aimed at addressing the pressing housing needs of the urban population. Key strategies include financial intervention in regulating interest rates, imposing property gains taxes, loosening density and other planning requirements, faster approval of projects, and compulsory contributions from developers. Some of these policies are commendable, while others are ad hoc by nature and unable to resolve the long-term socio-economic challenges. This paper discusses and examines the issues from the sustainability perspective, focusing on key fiscal, land use, and planning policies, as well as the more subtle (but important) political and institutional factors shaping the provision of mass housing for the urban population in Malaysia.

Keywords: affordable housing, urban housing, sustainable housing, planning for urban housing

Procedia PDF Downloads 427
21220 Evaluating Classification with Efficacy Metrics

Authors: Guofan Shao, Lina Tang, Hao Zhang

Abstract:

The values of image classification accuracy are affected by class size distributions and classification schemes, making it difficult to compare the performance of classification algorithms across different remote sensing data sources and classification systems. Based on the term efficacy from medicine and pharmacology, we have developed the metrics of image classification efficacy at the map and class levels. The novelty of this approach is that a baseline classification is involved in computing image classification efficacies so that the effects of class statistics are reduced. Furthermore, the image classification efficacies are interpretable and comparable, and thus, strengthen the assessment of image data classification methods. We use real-world and hypothetical examples to explain the use of image classification efficacies. The metrics of image classification efficacy meet the critical need to rectify the strategy for the assessment of image classification performance as image classification methods are becoming more diversified.
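The core idea — normalizing a classifier's accuracy against a baseline classification so that class-size effects are factored out — can be illustrated with a kappa-like formulation. This is an illustrative form only, not necessarily the exact metric the authors derive:

```python
def classification_efficacy(accuracy, baseline_accuracy):
    """Accuracy gain over a baseline, rescaled so that 1.0 means a perfect map
    and 0.0 means no improvement over the baseline (illustrative formulation)."""
    return (accuracy - baseline_accuracy) / (1.0 - baseline_accuracy)

# A 90%-accurate map against a 60%-accurate baseline (e.g. a majority-class map)
print(round(classification_efficacy(0.90, 0.60), 2))  # 0.75
```

Because the baseline absorbs the class-size distribution, two maps with different class statistics become comparable on this scale, which is the comparability property the abstract emphasizes.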

Keywords: accuracy assessment, efficacy, image classification, machine learning, uncertainty

Procedia PDF Downloads 194
21219 Free Fatty Acid Assessment of Crude Palm Oil Using a Non-Destructive Approach

Authors: Siti Nurhidayah Naqiah Abdull Rani, Herlina Abdul Rahim, Rashidah Ghazali, Noramli Abdul Razak

Abstract:

Near infrared (NIR) spectroscopy has always been of great interest in the food and agriculture industries. The development of prediction models has facilitated the estimation process in recent years. In this study, 110 crude palm oil (CPO) samples were used to build a free fatty acid (FFA) prediction model. 60% of the collected data were used for training purposes and the remaining 40% used for testing. The visible peaks on the NIR spectrum were at 1725 nm and 1760 nm, indicating the existence of the first overtone of C-H bands. Principal component regression (PCR) was applied to the data in order to build this mathematical prediction model. The optimal number of principal components was 10. The results showed R2=0.7147 for the training set and R2=0.6404 for the testing set.
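The modelling pipeline described — a 60/40 split, PCA to 10 components, then linear regression on the scores — can be sketched with scikit-learn. The spectra below are synthetic stand-ins for the 110 CPO samples:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
# Synthetic stand-in: 110 spectra x 500 wavelengths; FFA as a noisy projection
X = rng.normal(size=(110, 500))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=110)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.6, random_state=0)

# Principal component regression: project onto 10 PCs, then fit OLS
pcr = make_pipeline(PCA(n_components=10), LinearRegression()).fit(X_tr, y_tr)
print(f"train R2 = {pcr.score(X_tr, y_tr):.3f}, test R2 = {pcr.score(X_te, y_te):.3f}")
```

The pipeline keeps the PCA fitted only on the training spectra, so the test R² is an honest out-of-sample figure, matching the way the study reports separate training and testing R² values.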

Keywords: palm oil, fatty acid, NIRS, regression

Procedia PDF Downloads 489
21218 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed for customers to find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock movies suited to local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral, and social information to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample errors were compared. Out-of-sample errors were also compared under different Vapnik–Chervonenkis (VC) dimensions of the learning algorithm to detect and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use machine learning to predict customers' preferences from a small data set and to design prediction tools for these enterprises.
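The model comparison described — Gaussian (RBF) kernel SVM versus logistic regression on held-out data — can be sketched with scikit-learn. The features are synthetic stand-ins for the customer data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic customer features standing in for demographic/behavioral data
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)                    # Gaussian kernel SVM
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # linear baseline

print(f"SVM acc = {svm.score(X_te, y_te):.2f}, "
      f"logit acc = {logit.score(X_te, y_te):.2f}")
```

Reducing model capacity to curb overfitting — the VC-dimension adjustment mentioned above — would here correspond to tightening the SVM's `C` and `gamma` hyperparameters.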

Keywords: computational social science, movie preference, machine learning, SVM

Procedia PDF Downloads 247
21217 Development of Environmentally Clean Construction Materials Using Industrial Waste from Kazakhstan

Authors: Galiya Zhanzakovna Alzhanova, Yelaman Kanatovich Aibuldinov, Zhanar Baktybaevna Iskakova, Gaziz Galymovich Abdiyussupov, Madi Toktasynuly Omirzak, Aizhan Doldashevna Gazizova

Abstract:

The sustainable use of industrial waste has recently increased due to growing environmental problems at landfills. One of the best ways to utilise waste is as a road base material. Industrial waste is a less costly and more efficient way to strengthen local soils than introducing new additive materials. This study explored the feasibility of utilising red mud, blast furnace slag, and lime production waste to develop environmentally friendly construction materials for stabilising natural loam. Different ratios of red mud (20, 30, and 40%), blast furnace slag (25, 30, and 35%), and lime production waste (4, 6, and 8%), with varied amounts of natural loam, were combined to produce nine different mixtures. The results showed that the sample with 40% red mud, 35% blast furnace slag, and 8% lime production waste had the highest strength. The sample's measured compressive strength at 90 days was 7.38 MPa, its water resistance for the same period was 7.12 MPa, and its frost resistance for the same period was 7.35 MPa; its low linear expansion met the requirements of the Kazakh regulations for first-class building materials. The study of mineral composition showed that there was no contamination with heavy metals or dangerous substances. Road base materials made of red mud, blast furnace slag, lime production waste, and natural loam can be employed because of their durability and environmental performance. The chemical and mineral composition of the raw materials was determined using X-ray diffraction, X-ray fluorescence, scanning electron microscopy, energy dispersive spectroscopy, and atomic absorption spectroscopy; axial compressive strength was also examined.

Keywords: blast furnace slag, lime production waste, natural loam stabilizing, red mud, road base material

Procedia PDF Downloads 84
21216 'Explainable Artificial Intelligence' and Reasons for Judicial Decisions: Why Justifications and Not Just Explanations May Be Required

Authors: Jacquelyn Burkell, Jane Bailey

Abstract:

Artificial intelligence (AI) solutions deployed within the justice system face the critical task of providing acceptable explanations for decisions or actions. These explanations must satisfy the joint criteria of public and professional accountability, taking into account the perspectives and requirements of multiple stakeholders, including judges, lawyers, parties, witnesses, and the general public. This research project analyzes and integrates existing literatures on explanation in order to propose guidelines for explainable AI in the justice system. Specifically, we review three bodies of literature: (i) the literature on the purpose and function of 'explainable AI'; (ii) the relevant case law, judicial commentary, and legal literature focused on the form and function of reasons for judicial decisions; and (iii) the literature focused on the psychological and sociological functions of reasons for judicial decisions from the perspective of the public. Our research suggests that while judicial 'reasons' (arguably accurate descriptions of the decision-making process and factors) do serve explanatory functions similar to those identified in the literature on 'explainable AI', they also serve an important 'justification' function (post hoc constructions that justify the decision that was reached). Further, members of the public look for both justification and explanation in reasons for judicial decisions, and the absence of either feature is likely to contribute to diminished public confidence in the legal system. Therefore, automated judicial decision-making systems that simply attempt to document the process of decision-making are unlikely in many cases to be useful to and accepted within the justice system. Instead, these systems should focus on the post hoc articulation of principles and precedents that support the decision or action, especially in cases where legal subjects' fundamental rights and liberties are at stake.

Keywords: explainable AI, judicial reasons, public accountability, explanation, justification

Procedia PDF Downloads 111
21215 Understanding and Addressing the Tuberculosis Notification Gap in Nepal

Authors: Lok Raj Joshi, Naveen Prakash Shah, Sharad Kumar Sharma, I. Ratna Bhattarai, Rajendra Basnet, Deepak Dahal, Bahagwan Maharjan, Seraphine Kaminsa

Abstract:

Context: Tuberculosis (TB) is a significant health issue in Nepal, a country with a high burden of the disease. Despite efforts to control TB, there is still a gap in the notification of TB cases, which hinders effective control and treatment. This paper aims to address this notification gap and proposes strategies to improve TB control in Nepal. Research Aim: The aim of this research is to understand and address the tuberculosis notification gap in Nepal. The focus is on enhancing the healthcare system, involving the private sector and communities, raising awareness, and addressing social determinants to achieve sustainable TB control. Methodology: The research methodology involved a review of existing epidemiological data and research studies related to TB in Nepal. Additionally, consultation with an expert group from the TB control program in Nepal provided insights into the current state of TB control and challenges in addressing the notification gap. Findings: The findings reveal that only 55% of TB cases were reported in 2022, indicating a significant notification gap. Of the reported cases, only 32% and 19% were referred by the private sector and community, respectively. Furthermore, 20% of diagnosed cases were not treated in the initial phase. The estimated number of cases of multidrug-resistant TB (MDR TB) was 2,800, suggesting a low diagnosis rate. Among the diagnosed MDR TB cases, only 60% were receiving treatment. Additionally, it was observed that 20% of diagnosed MDR TB cases were from India and not enrolling in TB treatment in Nepal, indicating a high rate of defaulters. Theoretical Importance: The study highlights the importance of adopting a holistic strategy to address the notification gap in TB cases in Nepal. 
It emphasizes the need to enhance healthcare infrastructure, raise awareness, involve the private sector and local communities, establish effective methods to trace initial defaulters, implement TB interventions in border regions, and mitigate the social stigma associated with the disease. Data Collection and Analysis Procedures: Data for this study was collected through a review of existing epidemiological data and research studies. The data were then analyzed to identify patterns, trends, and gaps in TB case notification in Nepal.

Keywords: TB, tuberculosis, private sector, community, migrants, Nepal

Procedia PDF Downloads 86