Search results for: land cover classification
3752 Analysis of Pollution in Agricultural Land Using Decagon EM-50 and Rock Magnetism Method
Authors: Adinda Syifa Azhari, Eleonora Agustine, Dini Fitriani
Abstract:
This measurement was carried out to analyze the impact of industrial pollution on the environment. Our research aims to identify soil that has been polluted by industrial activity around the area, especially in Sumedang, West Java. Physical parameters such as total dissolved solids, volumetric water content, bulk electrical conductivity, and frequency dependence (FD) show that the soil has been polluted; they were measured with the Decagon EM-50. The Decagon EM-50 is a geophysical environmental instrument used to interpret soil condition. The experiment yielded the following physical parameters: volumetric water content (m³/m³) = 0.154–0.384; bulk electrical conductivity (dS/m) = 0.29–1.11; dielectric permittivity (DP) = 77.636–78.339. Based on these data, we conclude that the area has, in fact, been contaminated by dangerous materials. VWC is a physical parameter that indicates the water content of the soil. The data show pollution of the soil at the site, whose characteristics are high pH, total dissolved solids (TDS), and electrical conductivity (EC) (>>) and low frequency dependence (FD) (<<); this means the soil is alkaline, coarse-grained, and has a high salt concentration.
Keywords: Decagon EM 50, electrical conductivity, industrial textiles, land, pollution
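A worked illustration of the "frequency dependent (FD)" parameter the abstract relies on. In environmental magnetism, FD is usually the frequency-dependent susceptibility percentage; the formula below is that standard definition and the readings are invented, so treat both as assumptions rather than the authors' method:

```python
# Hypothetical illustration of the frequency-dependent susceptibility FD%
# as commonly defined in environmental magnetism; not taken from the paper.
def fd_percent(chi_lf: float, chi_hf: float) -> float:
    """FD% = 100 * (chi_LF - chi_HF) / chi_LF for mass susceptibilities
    measured at a low and a high frequency."""
    return 100.0 * (chi_lf - chi_hf) / chi_lf

chi_lf, chi_hf = 85.2e-8, 83.9e-8   # m^3/kg, invented readings
print(f"FD% = {fd_percent(chi_lf, chi_hf):.2f}")
# A low FD% (< ~2) points to coarse multi-domain grains, matching the
# abstract's reading of a small FD as coarse-grained, salt-affected soil.
```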
Procedia PDF Downloads 381
3751 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis
Authors: R. Periyasamy, Deepak Joshi, Sneh Anand
Abstract:
Foot posture assessment is important for evaluating the foot types that cause gait and postural defects in all age groups. Although different methods are used to classify foot arch type in clinical/research examination, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for evaluating foot type as a clinical decision-making aid for the diagnosis of flat and normal arches based on the Arch Index (AI) and a foot pressure distribution parameter, the Power Ratio (PR). The accuracy of the system was evaluated for 27 subjects ranging in age from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from foot pressure intensity images using the portable PedoPowerGraph system, and the images were analyzed in the frequency domain to obtain the PR data. From our results, we obtain 100% classification accuracy for normal and flat feet using the linear discriminant analysis method. We observe no misclassification of foot types because foot pressure distribution data are incorporated instead of the arch index (AI) alone. We found that the midfoot pressure distribution ratio and the arch index (AI) correlate well with foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system makes it accurate and easy to determine foot arch type from the arch index (AI) together with the midfoot pressure distribution ratio instead of the physical area of contact. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check their diagnosis of flat foot from the midfoot pressure distribution.
Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis
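To make the classification step concrete, here is a minimal sketch (not the authors' code) of two-class discrimination from AI/PR-style features with scikit-learn's linear discriminant analysis; the sample size mirrors the abstract, but the feature values and class means are synthetic assumptions:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Invented AI/PR pairs: flat arches tend toward higher AI and mid-foot PR.
normal = rng.normal([0.23, 0.10], [0.02, 0.02], size=(27, 2))
flat = rng.normal([0.30, 0.18], [0.02, 0.02], size=(27, 2))
X = np.vstack([normal, flat])
y = np.array([0] * 27 + [1] * 27)          # 0 = normal arch, 1 = flat arch

lda = LinearDiscriminantAnalysis()
print("CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```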
Procedia PDF Downloads 499
3750 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predictable amount of yield in order to meet market requirements accurately. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, which utilizes an L1 penalty, is capable of removing redundant predictors; a modification of the LARS algorithm is devised to solve this problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG data set of tomato yields, where there are many more predictors than data points and there is an urgent need to predict weekly yield. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to perform fairly well.
Keywords: tomato yield prediction, naive Bayes, redundancy, WSG
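The paper's LARS-based L1-penalized naive Bayes is not reproduced here; as a stand-in, this sketch shows the underlying idea, pruning redundant predictors with an L1 path and fitting Gaussian naive Bayes on the survivors, using scikit-learn's LassoCV rather than the modified LARS. All data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, p = 120, 300                                   # many more predictors than samples
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)     # a redundant near-copy
y = (X[:, 0] + X[:, 2] > 0).astype(int)           # yield above/below target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lasso = LassoCV(cv=5).fit(X_tr, y_tr)             # L1 path prunes redundancy
keep = np.flatnonzero(lasso.coef_)
nb = GaussianNB().fit(X_tr[:, keep], y_tr)
print(f"{keep.size} predictors kept, accuracy {nb.score(X_te[:, keep], y_te):.2f}")
```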
Procedia PDF Downloads 237
3749 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods
Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno
Abstract:
The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc, west of the collision zone, and the Halmahera arc to its east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone in terms of the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrences in each partition region, the mean occurrence of earthquakes in each partition region, and the correlation between partition regions. We count the number of earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify partitioned regions based on their correlation into two classes: strong and very strong. This classification can be used for early warning systems in disaster management.
Keywords: Molluca Collision Zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management
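A hypothetical sketch of the correlation step: count shallow events per partition per year, correlate the partitions, and label pairs strong or very strong. The counts and the 0.8 cut-off are invented, since the abstract does not give them:

```python
import numpy as np

rng = np.random.default_rng(2)
years = 50                                         # 1964-2013
base = rng.poisson(20, size=years)                 # shared regional activity
counts = np.array([base + rng.poisson(3, years) for _ in range(4)])  # 4 partitions

r = np.corrcoef(counts)                            # pairwise Pearson correlation
for i in range(4):
    for j in range(i + 1, 4):
        label = "very strong" if r[i, j] >= 0.8 else "strong"  # assumed cut-off
        print(f"partitions {i}-{j}: r = {r[i, j]:.2f} ({label})")
```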
Procedia PDF Downloads 499
3748 Impacts of Tillage on Biodiversity of Microarthropod Communities in Two Different Crop Systems
Authors: Leila Ramezani, Mohammad Saeid Mossadegh
Abstract:
Different human uses of land alter the physico-chemical characteristics of the soil and affect the soil microhabitat. The objective of this study was to evaluate the influence of tillage in three different human land uses on microarthropod biodiversity in Khuzestan province, southwest of Iran. Three microhabitats were compared for the biodiversity of the two main groups of soil microarthropods (Oribatida and Collembola): a permanent grassland surrounded by old date palms under a no-till system, and two wheat fields, one under conservation agricultural practices with a low-till system and the other under conventional agricultural practices (deep tillage). Soil samples were collected from the surface to a depth of 15 cm bimonthly over a period of two years. Significant differences in the microarthropod biodiversity index were observed between the different tillage systems (F = 36.748, P = 0.000). Indeed, the analysis of species diversity showed that the diversity index in the conservation low-till field (2.58 ± 0.01) was higher (p < 0.05) than in the conventionally tilled field (2.45 ± 0.08), and the diversity of the natural grassland was the highest (2.79 ± 0.19, p < 0.05). Moreover, the biodiversity index and population abundance differed significantly between seasons (p < 0.00).
Keywords: biodiversity, Collembola, microarthropods, Oribatida
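The diversity indices quoted above are Shannon-type values; as a worked example (with invented species counts, not the study's data), the index H' = -Σ pᵢ ln pᵢ can be computed as:

```python
import numpy as np

def shannon(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

# Invented microarthropod counts: an even community vs. one dominated
# by a few tillage-tolerant species.
grassland = [40, 35, 30, 25, 20, 18, 15, 12, 10, 8, 6, 5, 4, 3, 2]
deep_till = [90, 40, 20, 10, 5, 3, 2]
print(f"no-till grassland H' = {shannon(grassland):.2f}")
print(f"conventional till H' = {shannon(deep_till):.2f}")
```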
Procedia PDF Downloads 175
3747 Disentangling Biological Noise in Cellular Images with a Focus on Explainability
Authors: Manik Sharma, Ganapathy Krishnamurthi
Abstract:
The cost of some drugs and medical treatments has risen so much in recent years that many patients are having to go without. A classification project could make researchers more efficient. One of the more surprising reasons behind the cost is how long it takes to bring new treatments to market. Despite improvements in technology and science, research and development continues to lag. In fact, finding a new treatment takes, on average, more than 10 years and costs hundreds of millions of dollars. If successful, we could dramatically improve the industry's ability to model cellular images according to their relevant biology, in turn greatly decreasing the cost of treatments and ensuring these treatments reach patients faster. This work aims to solve a part of this problem by creating a cellular image classification model that can decipher the genetic perturbations in cells (occurring naturally or artificially). Another interesting question addressed is what makes the deep-learning model decide in a particular fashion, which can further help in demystifying the mechanism of action of certain perturbations and paves the way towards the explainability of the deep-learning model.
Keywords: cellular images, genetic perturbations, deep-learning, explainability
Procedia PDF Downloads 112
3746 Welfare Estimation in a General Equilibrium Model with Cities
Authors: Oded Hochman
Abstract:
We first show that current measures of welfare changes in the whole economy do not apply to an economy with cities. In addition, since such measures are defined over a partial equilibrium, they capture the effect of a welfare change only partially. We then define a unique and additive measure that we term the modified economic surplus (mES), which fully captures the welfare effects caused by a change in the price of a nationally traded good. We show that the price change causes, on the one hand, a change of land rents in the economy and, on the other hand, an equal change of mES that can be estimated by measuring areas in the price-quantity national demand and supply plane. We construct for each city a cost function from which we derive a city's and, after aggregation, an economy-wide demand and supply function of nationwide prices and of either the unearned incomes (Marshallian functions) or the utility levels (compensated functions).
Keywords: city cost function, welfare measures, modified compensated variation, modified economic surplus, unearned income function, differential land rents, city size
Procedia PDF Downloads 320
3745 Detection and Classification of Rubber Tree Leaf Diseases Using Machine Learning
Authors: Kavyadevi N., Kaviya G., Gowsalya P., Janani M., Mohanraj S.
Abstract:
Hevea brasiliensis, also known as the rubber tree, is one of the foremost crop assets in the world. One of the most significant advantages of the rubber plant in terms of air oxygenation is its capacity to reduce the likelihood of an individual developing respiratory allergies like asthma. To construct a system that can properly identify crop diseases and pests and then build a database of insecticides for each pest and disease, we must first provide treatment for the illness that has been detected. We primarily examine three major leaf diseases because of their economic impact: bird's eye spot, algal spot, and powdery mildew. The proposed work focuses on disease identification on rubber tree leaves, accomplished by employing one of the superior algorithms. The processing pipeline follows the steps of input, preprocessing, image segmentation, feature extraction, and classification, in place of the time-consuming procedures otherwise used to detect the sickness. As a consequence, the main ailments, their underlying causes, and the signs and symptoms of diseases that harm the rubber tree are covered in this study.
Keywords: image processing, python, convolution neural network (CNN), machine learning
Procedia PDF Downloads 76
3744 Classifications of Sleep Apnea (Obstructive, Central, Mixed) and Hypopnea Events Using Wavelet Packet Transform and Support Vector Machines (SVM)
Authors: Benghenia Hadj Abd El Kader
Abstract:
Sleep apnea events, whether obstructive, central, mixed, or hypopnea, are characterized by frequent breathing cessations or reductions in upper airflow during sleep. An advanced method for analyzing the patterning of biomedical signals to recognize obstructive sleep apnea and hypopnea is presented. With the aim of extracting characteristic parameters, which are then used to classify the above-stated types of sleep apnea (obstructive, central, mixed) and hypopnea, the proposed method first analyzes polysomnography signals such as the electrocardiogram (ECG) and electromyogram (EMG) and then classifies the sleep apnea and hypopnea events. The analysis is carried out using the wavelet transform technique to extract characteristic parameters, whereas classification is carried out by applying the support vector machine (SVM) technique. The obtained results show good recognition rates using the characteristic parameters.
Keywords: obstructive, central, mixed, sleep apnea, hypopnea, ECG, EMG, wavelet transform, SVM classifier
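A minimal sketch of the pipeline on synthetic signals (not polysomnography data): wavelet-packet sub-band energies as the characteristic parameters, then an SVM over the four event classes. The wavelet choice, decomposition level, and signal shapes are assumptions:

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def wp_energies(signal, wavelet="db4", level=3):
    # energy of each wavelet-packet node at the chosen level, frequency order
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2)
                     for node in wp.get_level(level, order="freq")])

rng = np.random.default_rng(3)
X, y = [], []
for label in range(4):          # 0 obstructive, 1 central, 2 mixed, 3 hypopnea
    for _ in range(30):
        t = np.linspace(0, 1, 256)
        sig = np.sin(2 * np.pi * (4 + 3 * label) * t) + 0.3 * rng.normal(size=256)
        X.append(wp_energies(sig))
        y.append(label)
print("CV accuracy:",
      cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5).mean())
```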
Procedia PDF Downloads 371
3743 Strategies for Drought Adaptation and Mitigation via Wastewater Management
Authors: Simrat Kaur, Fatema Diwan, Brad Reddersen
Abstract:
The unsustainable and injudicious use of natural renewable resources beyond the self-replenishment limits of our planet has proved catastrophic. Most of the Earth's resources, including land, water, minerals, and biodiversity, have been overexploited. Owing to this, there is a steep rise in global natural calamities of contrasting natures, such as torrential rains, storms, heat waves, rising sea levels, and megadroughts. These are all interconnected through common elements, namely oceanic currents and the land's green cover. Deforestation fueled by the 'economic elites', or the global players, has already cleared massive forests and ecological biomes in every region of the globe, including the Amazon. These were natural carbon sinks prevailing and performing CO₂ sequestration for millions of years. The forest biomes have been turned into mono-cultivation farms to produce feedstock crops such as soybean, maize, and sugarcane, which are among the biggest greenhouse gas emitters. Such unsustainable agricultural practices only provide feedstock for livestock and food processing industries with huge carbon and water footprints, two factors that have 'cause and effect' relationships in the context of climate change. In contrast to organic and sustainable farming, mono-cultivation practices that produce food, fuel, and feedstock using chemicals deprive the soil of its fertility, abstract surface and ground waters beyond the limits of replenishment, emit greenhouse gases, and destroy biodiversity. There are numerous cases across the planet where, due to overuse, the levels of surface water reservoirs, such as Lake Mead in the southwestern USA, and of ground water, such as in Punjab, India, have deeply shrunk. Unlike the rain-fed food production system on which the poor communities of the world rely, blue-water (surface and ground water) dependent mono-cropping for industrial and processed food creates a water deficit that puts the burden on domestic users. Excessive abstraction of both surface and ground waters for high-water-demanding feedstock (soybean, maize, sugarcane), cereal crops (wheat, rice), and cash crops (cotton) has a dual and synergistic impact on global greenhouse gas emissions and the prevalence of megadroughts. Both these factors have elevated global temperatures, which have caused cascading events such as soil water deficits, flash fires, and the unprecedented burning of woods, creating megafires on multiple continents, namely North America, South America, Europe, and Australia. Therefore, it is imperative to reduce the green and blue water footprints of the agriculture and industrial sectors through the recycling of black and gray waters. This paper explores various opportunities for the successful implementation of wastewater management for drought preparedness in high-risk communities.
Keywords: wastewater, drought, biodiversity, water footprint, nutrient recovery, algae
Procedia PDF Downloads 100
3742 Analyzing the Changing Pattern of Nigerian Vegetation Zones and Its Ecological and Socio-Economic Implications Using Spot-Vegetation Sensor
Authors: B. L. Gadiga
Abstract:
This study assesses the major ecological zones in Nigeria with a view to understanding the spatial pattern of vegetation zones and the implications for conservation over a period of sixteen (16) years. The satellite images used for this study were acquired from SPOT-VEGETATION between 1998 and 2013. The annual NDVI images selected for this study were derived from the SPOT-4 sensor and were acquired in the same season (November) in order to reduce differences in spectral reflectance due to seasonal variations. The images were sliced into five classes based on the literature and knowledge of the area (i.e., <0.16 non-vegetated areas; 0.16-0.22 Sahel savannah; 0.22-0.40 Sudan savannah; 0.40-0.47 Guinea savannah; and >0.47 forest zone). Classification of the 1998 and 2013 images into forested and non-forested areas showed that the forested area decreased from 511,691 km² in 1998 to 478,360 km² in 2013. A differencing change detection method was applied to the 1998 and 2013 NDVI images to identify areas of ecological concern. The result shows that areas undergoing vegetation degradation cover 73,062 km², while areas witnessing some form of restoration cover 86,315 km². The result also shows that there is a weak correlation between rainfall and the vegetation zones: the non-vegetated areas have a correlation coefficient (r) of 0.0088, the Sahel savannah belt 0.1988, the Sudan savannah belt -0.3343, the Guinea savannah belt 0.0328, and the forest belt 0.2635. The low correlation can be associated with the encroachment of the Sudan savannah belt into the forest belt of the south-eastern part of the country, as revealed by the image analysis. The degradation of the forest vegetation is therefore responsible for the serious erosion problems witnessed in the south-east. The study recommends constant monitoring of vegetation and strict enforcement of environmental laws in the country.
Keywords: vegetation, NDVI, SPOT-vegetation, ecology, degradation
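The slicing and differencing steps translate directly into code. The class thresholds below are the ones quoted in the abstract; the arrays and the ±0.05 change threshold are invented stand-ins for the SPOT-VEGETATION composites:

```python
import numpy as np

BINS = [0.16, 0.22, 0.40, 0.47]            # thresholds from the abstract
LABELS = ["non-vegetated", "Sahel savannah", "Sudan savannah",
          "Guinea savannah", "forest"]

def classify(ndvi):
    return np.digitize(ndvi, BINS)          # index 0..4 into LABELS

rng = np.random.default_rng(4)
ndvi_1998 = rng.uniform(0.0, 0.8, size=(100, 100))
ndvi_2013 = np.clip(ndvi_1998 + rng.normal(0, 0.05, (100, 100)), 0, 1)

change = ndvi_2013 - ndvi_1998              # image differencing
print(LABELS[classify(np.array([0.45]))[0]])            # -> Guinea savannah
print("degrading px:", (change < -0.05).sum(),          # assumed threshold
      "restoring px:", (change > 0.05).sum())
```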
Procedia PDF Downloads 221
3741 Towards the Production of Least Contaminant Grade Biosolids and Biochar via Mild Acid Pre-treatment
Authors: Ibrahim Hakeem
Abstract:
Biosolids are stabilised sewage sludge produced from wastewater treatment processes. Biosolids contain valuable plant nutrients, which facilitate their beneficial reuse on agricultural land. However, the increasing levels of legacy and emerging contaminants, such as heavy metals (HMs), PFAS, microplastics, pharmaceuticals, and microbial pathogens, are restraining the direct land application of biosolids. Pyrolysis of biosolids can effectively degrade microbial and organic contaminants; however, HMs remain a persistent problem in biosolids and their pyrolysis-derived biochar. In this work, we demonstrate the integrated processing of biosolids involving acid pre-treatment for HM removal and the selective reduction of ash-forming elements, followed by bench-scale pyrolysis of the treated biosolids to produce quality biochar and bio-oil enriched with valuable platform chemicals. Pre-treatment of biosolids using 3% v/v H₂SO₄ at room conditions for 30 min reduced the ash content from 30 wt% in raw biosolids to 15 wt% in the treated sample while removing about 80% of the limiting HMs without degrading the organic matter. The preservation of nutrients and the reduction of HM concentration and mobility via the developed hydrometallurgical process improved the grade of the treated biosolids for beneficial land reuse. The co-removal of ash-forming elements from biosolids positively enhanced the fluidised-bed pyrolysis of the acid-treated biosolids at 700 ℃. Organic matter devolatilisation was improved by 40%, and the produced biochar had a higher surface area (107 m²/g), heating value (15 MJ/kg), fixed carbon (35 wt%), and organic carbon retention (66% dry-ash-free) compared to the raw biosolids biochar, with its surface area of 56 m²/g, heating value of 9 MJ/kg, fixed carbon of 20 wt%, and organic carbon retention of 50%. Pre-treatment also improved the development of the biochar's microporous structure and substantially decreased HM concentration and bioavailability, by at least 50% relative to the raw biosolids biochar. The integrated process is a viable approach to enhancing value recovery from biosolids.
Keywords: biosolids, pyrolysis, biochar, heavy metals
Procedia PDF Downloads 76
3740 Intercropping Immature Oil Palm (Elaeis guineensis) with Banana, Ginger and Turmeric in Galle District, Sri Lanka
Authors: S. M. Dissanayake, I. R. Palihakkara, K. G. Premathilaka
Abstract:
Oil palm (Elaeis guineensis) is the world's leading vegetable-oil-producing plant and is well established as a perennial plantation crop in tropical countries. Oil palm in Sri Lanka has spread over 10,000 hectares in the wet zone of the island. In immature plantations, land productivity can be increased with selected intercrops. At the immature stage of a plantation (up to 3-5 years of age), a large amount of free space is available inside the plantation. This study attempts to determine the suitability of different intercrops during the immature phase of the oil palm. A field experiment is being conducted at Thalgaswella estate (WL2a) in Galle district, Sri Lanka. The objectives of the study are to evaluate and recommend suitable immature oil-palm-based intercropping systems. The experiment was established in a randomized complete block design (RCBD) with four treatments, including a control, in three replicates. Banana, ginger, and turmeric were selected as intercrops. Growth parameters of the intercrops (plant height, length and width of the D-leaf, and yield) and the girth, length, and number of leaflets of the 17th frond of the oil palms were recorded at two-month intervals. In addition, chlorophyll content was measured in both the intercrops and the oil palm trees. Soil chemical parameters were measured annually. Results were statistically analyzed with SAS software. Results revealed that intercropped banana, turmeric, and ginger gave yields of 7.61 Mt/ha, 4.92 Mt/ha, and 4.53 Mt/ha, respectively, corresponding to 16.9%, 24.6%, and 30.2% of the respective mono-crop yields. The results of this study could be used to make appropriate policies to increase unit land productivity in oil palm plantations in the low country wet zone (WL2a) of Sri Lanka.
Keywords: inter-cropping, oil palm, policies, mono-crop, land productivity
Procedia PDF Downloads 159
3739 The Illegal Architecture of Apartheid in Palestine
Authors: Hala Barakat
Abstract:
Architecture plays a crucial role in the colonization and organization of spaces, as well as in the preservation of cultures and history. As a result of 70 years of occupation, Palestinian land, culture, and history are endangered today. The government of Israel has used architecture to squeeze Palestinians out and seize their land. The occupation has managed to fragment the West Bank and leave visible scars on the landscape by creating obstacles, barriers, watchtowers, checkpoints, walls, apartheid roads, border devices, and illegal settlements to unjustly claim land from its indigenous population. The apartheid architecture has divided the Palestinian social and urban fabric into pieces, similar to the Bantustans. The architectural techniques and methods used by the occupation are evidence of prejudice, and while the illegal settlements continue to be condemned by the United Nations, little is being done to officially end this apartheid. Illegal settlements range in scale from individual units to established cities and house more than 60,000 Israeli settlers who immigrated from all over Europe and the United States. Often, architecture by Israel is directed towards expressing ideologies and serving as evidence of its political agenda. More than 78% of what was granted to Palestine after the drawing of the Green Line in 1948 is under Israeli occupation today. This project aims to map the illegal architecture as a criticism of governmental agendas in the West Bank and historic Palestinian land. The paper will also discuss the resistance to the newly developed plan for the last Arab village in Jerusalem, Lifta. The illegal architecture has isolated Palestinians from each other and installed obstacles to control their movement. The architecture of occupation has no ethical or humane logic; it is entirely political and administrative, and the story should not be left for the silenced architecture to tell. Architecture is not being used as a connecting device but rather as a way to implement political injustice and spatial oppression. By narrating stories of the architecture of occupation, we can highlight the spatial injustice of the complex apartheid infrastructure. The Israeli government has managed to co-opt architecture to serve as a divider between cultural groups, allowing unlawful and unethical architecture to define its culture and values. As architects and designers, the roles we play in the development of illegal settlements must align with the spatial ethics we practice. Most importantly, our profession is not performing architecturally when we design a house with a particular roof color to ensure it will not be mistaken for a Palestinian house and attacked accidentally.
Keywords: apartheid, illegal architecture, occupation, politics
Procedia PDF Downloads 151
3738 Discrimination and Classification of Vestibular Neuritis Using Combined Fisher and Support Vector Machine Model
Authors: Amine Ben Slama, Aymen Mouelhi, Sondes Manoubi, Chiraz Mbarek, Hedi Trabelsi, Mounir Sayadi, Farhat Fnaiech
Abstract:
Vertigo is a sensation of feeling off balance; the cause of this symptom is very difficult to interpret and needs a complementary exam. Generally, vertigo is caused by an ear problem. Some of the most common causes include benign paroxysmal positional vertigo (BPPV), Meniere's disease, and vestibular neuritis (VN). In clinical practice, different tests of the videonystagmographic (VNG) technique are used to detect the presence of vestibular neuritis (VN). The topographical diagnosis of this disease presents a large diversity of characteristics, which poses a mixture of problems for the usual etiological analysis methods. In this study, a vestibular neuritis analysis method is proposed for videonystagmography (VNG) applications, using an estimation of pupil movements in the case of uncontrolled motion to obtain efficient and reliable diagnosis results. First, the pupil displacement vectors are estimated with the Hough Transform (HT) to approximate the location of the pupil region. Then, temporal and frequency features are computed from the rotation angle variation of the pupil motion. Finally, optimized features are selected using Fisher criterion evaluation for the discrimination and classification of the VN disease. Experimental results are analyzed using two categories: normal and pathologic. By classifying the reduced features using the support vector machine (SVM), a classification accuracy of 94% is achieved. Compared to recent studies, the proposed expert system is extremely helpful and highly effective in resolving the problem of VNG analysis and providing an accurate diagnostic for medical devices.
Keywords: nystagmus, vestibular neuritis, videonystagmographic system, VNG, Fisher criterion, support vector machine, SVM
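A sketch of the feature-ranking stage: a per-feature Fisher score (between-class over within-class variance), keep the top-ranked features, and classify normal vs. VN with an SVM. The feature matrix is synthetic, and the score definition is the common one, which may differ in detail from the authors':

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def fisher_score(X, y):
    # between-class variance over within-class variance, per feature
    mu = X.mean(axis=0)
    num = sum((y == c).sum() * (X[y == c].mean(axis=0) - mu) ** 2
              for c in np.unique(y))
    den = sum((y == c).sum() * X[y == c].var(axis=0) for c in np.unique(y))
    return num / den

rng = np.random.default_rng(5)
X = rng.normal(size=(80, 20))               # temporal + frequency features
y = np.array([0] * 40 + [1] * 40)           # 0 normal, 1 vestibular neuritis
X[y == 1, :3] += 1.0                        # a few informative features
top = np.argsort(fisher_score(X, y))[::-1][:5]
print("CV accuracy:", cross_val_score(SVC(), X[:, top], y, cv=5).mean())
```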
Procedia PDF Downloads 136
3737 Water Balance in the Forest Basins Essential for the Water Supply in Central America
Authors: Elena Listo Ubeda, Miguel Marchamalo Sacristan
Abstract:
The demand for water doubles every twenty years, a rate twice as fast as the world's population growth. Despite its great importance, water is one of the most degraded natural resources in the world, mainly because of the reduction of natural vegetation coverage, population growth, contamination, and changes in soil use that reduce its capacity to collect water. This situation is especially serious in Central America, as reflected in the Human Development reports. The objective of this project is to assist in the improvement of water production and quality in Central America. To this end, two watersheds in Costa Rica were selected as experiments: that of the Virilla-Durazno River, located in the extreme northeast of the central valley, which has an Atlantic influence; and that of the Jabillo River, which flows directly into the Pacific. The Virilla River watershed lies over andisols and that of the Jabillo River over alfisols; both are of great importance for the water supply of the Greater Metropolitan Area and future tourist resorts, respectively, as well as for agriculture, livestock, and hydroelectricity. The hydrological reaction of different soil-cover complexes, varying from secondary forest to natural vegetation and degraded pasture, was analyzed through the evaluation of soil properties, infiltration, and soil compaction, as well as the effects of the soil-cover complex on erosion, calculated with the C factor of the Revised Universal Soil Loss Equation (RUSLE). A water balance was defined for each watershed, in which the volumes of water that enter and leave were estimated, as well as the evapotranspiration, runoff, and infiltration. Two future scenarios, representing the implementation of reforestation and deforestation plans, were proposed and analyzed for the effects of the soil-cover complex on the water balance in each case. The results obtained show an increase in groundwater recharge in the humid forest areas, and an extension of the study to the dry areas is proposed, since groundwater recharge there is diminishing. These results are of great significance for planning, for the design of Payment for Environmental Services schemes, and for the improvement of existing water supply systems. In Central America, spatial planning of the territory and its watersheds is a priority in order to value the water resource socially and economically and secure its availability for the future.
Keywords: Costa Rica, infiltration, soil, water
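Since the abstract computes the C factor of RUSLE, a worked example of the full equation A = R · K · LS · C · P; every factor value below is hypothetical, chosen only to show how the cover factor dominates the estimate:

```python
# Hypothetical RUSLE factors: rainfall erosivity R, soil erodibility K,
# slope length-steepness LS, and support practice P held fixed; only the
# cover-management factor C varies with the soil-cover complex.
R, K, LS, P = 6500.0, 0.25, 1.8, 1.0
for cover, C in [("degraded pasture", 0.20), ("secondary forest", 0.003)]:
    A = R * K * LS * C * P                   # soil loss, t/ha/yr
    print(f"{cover}: A = {A:.1f} t/ha/yr")
```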
Procedia PDF Downloads 384
3736 Machine Learning Techniques in Bank Credit Analysis
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner
Abstract:
The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study draws on a database of 5,432 companies that are clients of the bank, of which 2,600 are classified as non-defaulters, 1,551 as defaulters, and 1,281 as temporary defaulters, meaning that these clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Network Multilayer Perceptron (ANN-MLP), Artificial Neural Network Radial Basis Function (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). Initially, the data were coded in thermometer code (numerical attributes) or dummy coding (nominal attributes). The methods were then evaluated for different parameter settings, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method in terms of accuracy was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for temporary defaulters). However, the best accuracy does not always represent the best technique: for instance, in the classification of temporary defaulters, ANN-RBF was surpassed in terms of false positives by SVM, which had the lowest rate (0.07%) of false positive classifications. All these details are discussed in light of the results found, and an overview is given in the conclusion of this study.
Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines
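A sketch of the one-against-all set-up on synthetic records. The paper's ANN-RBF is approximated here with an RBF-kernel SVM, which is a different model; the point is the evaluation pattern (per-class confusion counts), not the reported numbers:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(6)
X = rng.normal(size=(500, 15))              # 15 coded attributes per client
y = rng.integers(0, 3, size=500)            # 0 non-default, 1 default, 2 temporary
X[y == 1] += 0.8                            # separability, purely illustrative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = OneVsRestClassifier(SVC(kernel="rbf")).fit(X_tr, y_tr)
print(confusion_matrix(y_te, clf.predict(X_te)))   # read FP/FN per class
```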
Procedia PDF Downloads 103
3735 Research on Land Use Pattern and Employment-Housing Space of Coastal Industrial Town Based on the Investigation of Liaoning Province, China
Authors: Fei Chen, Wei Lu, Jun Cai
Abstract:
During the Twelfth Five-Year Plan period, China promulgated industrial policies promoting the relocation of energy-intensive industries to coastal areas in order to utilize marine shipping resources. Consequently, some major state-owned steel and gas enterprises relocated, resulting in large-scale coastal area development. However, some land may have been over-exploited by seamless coastline projects. To balance employment and housing, new industrial coastal towns were constructed to support this industry-led development. In this paper, we adopt a case-study approach to closely examine the development of several new industrial coastal towns in Liaoning Province, situated in the Bohai Bay area, which is currently undergoing rapid economic growth. Our investigations reveal the common phenomena of long-distance commuting and a massive number of vacant residences. More specifically, large plant relocations caused daily commutes of hundreds of kilometers, and enterprises had to provide housing subsidies and education incentives to motivate employees to relocate to coastal areas. Nonetheless, many employees still refuse to relocate because of job stability, the diverse needs of family members, and access to convenient services. These employees averaged 4 hours of commuting daily, and some who lived further away had to reside in temporary industrial housing units, subject to long-term family separation. As a result, only a small portion of employees purchase new coastal residences, and mostly for investment and retirement purposes, leading to massive vacancy and a ghost-town phenomenon. In contrast to the low demand, coastal areas tend to develop large numbers of residences prior to industrial relocation, which may be directly related to local government finances: some local governments have sold residential land to developers to generate revenue to support the subsequent industrial development. Given the strong preference for ocean views, residential developers tend to select coastline land to construct new residential towns, which further reduces the access of major industrial enterprises to marine resources. This violates the original intent of developing industrial coastal towns and drastically limits the availability of marine resources. Lastly, we analyze the co-existence of over-exploited residential areas and massive vacancies with reference to the demand and supply of land, as well as the demand for residential housing units under the choice criteria of enterprise employees.
Keywords: coastal industry town, commuter traffic, employment-housing space, outer suburb industrial area
Procedia PDF Downloads 221
3734 Machine Learning Approach for Yield Prediction in Semiconductor Production
Authors: Heramb Somthankar, Anujoy Chakraborty
Abstract:
This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complicated semiconductor production process is generally monitored continuously via signals acquired from sensors and measurement sites. A monitoring system produces a variety of signals, which contain useful information, irrelevant information, and noise. With each signal considered a feature, feature selection is used to find the most relevant signals. The open-source UCI SECOM dataset provides 1,567 such samples, of which 104 fail quality assurance. Feature extraction and selection are performed on the dataset, and the useful signals are kept for further study. Afterward, common machine learning algorithms are employed to predict whether a sample passes or fails. The most suitable algorithm is selected for prediction based on the accuracy and loss of the ML model.
Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis
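A sketch of a SECOM-style pipeline under stated assumptions (a synthetic stand-in for the 1567 × 590 sensor matrix, and arbitrary selector and classifier choices): drop near-constant signals, rank the rest, and fit a model:

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.normal(size=(1567, 590))
X[:, 100] = 0.0                                   # a dead sensor channel
y = (rng.random(1567) < 104 / 1567).astype(int)   # ~104 failing samples
X[y == 1, :10] += 0.5                             # illustrative signal

pipe = make_pipeline(VarianceThreshold(1e-6),     # remove constant signals
                     SelectKBest(f_classif, k=40),
                     RandomForestClassifier(random_state=0))
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```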
Procedia PDF Downloads 109
3733 Pattern Recognition Based on Simulation of Chemical Senses (SCS)
Authors: Nermeen El Kashef, Yasser Fouad, Khaled Mahar
Abstract:
No AI-complete system can model the human brain or behavior without looking at the totality of the whole situation and incorporating a combination of senses. This paper proposes a pattern recognition model based on the Simulation of Chemical Senses (SCS) for the separation and classification of sign language. The model is based on the human taste control strategy. The main idea of the introduced model is motivated by the fact that the tongue first clusters an input substance into its basic tastes, and the brain then recognizes its flavor. To implement this strategy, a two-level architecture is proposed, inspired by the taste system. The separation level of the architecture focuses on hand posture clustering, while the classification level recognizes the sign language. The efficiency of the proposed model is demonstrated experimentally by recognizing an American Sign Language (ASL) dataset. The recognition accuracy obtained for the numbers of ASL is 92.9 percent.
Keywords: artificial intelligence, biocybernetics, gustatory system, sign language recognition, taste sense
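A sketch of the two-level idea on synthetic features: an unsupervised separation stage groups posture vectors into coarse clusters (the "basic tastes"), and a per-cluster classifier assigns the final sign (the "flavor"). The cluster count, feature layout, and classifier choices are all assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 12))                    # invented posture features
y = rng.integers(0, 10, size=300)                 # ASL digits 0-9
for d in range(10):
    X[y == d, d] += 2.0                           # separable, purely illustrative

sep = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)   # separation level
clf = {c: KNeighborsClassifier(3).fit(X[sep.labels_ == c], y[sep.labels_ == c])
       for c in range(4)}                                      # classification level
x_new = X[:1]
print("predicted digit:", clf[sep.predict(x_new)[0]].predict(x_new)[0])
```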
Procedia PDF Downloads 294
3732 Pollutant Loads of Urban Runoff from a Mixed Residential-Commercial Catchment
Authors: Carrie Ho, Tan Yee Yong
Abstract:
Urban runoff quality for a mixed residential-commercial land use catchment in Miri, Sarawak was investigated for three storm events in 2011. Samples from the three storm events were tested for five water quality parameters, namely TSS, COD, BOD5, TP, and Pb. Concentrations of the pollutants were found to vary significantly between storms but were generally influenced by the length of the antecedent dry period and the strength of the rainfall intensities. Runoff from the study site showed a significant level of pollution for all the parameters investigated. Based on the National Water Quality Standards for Malaysia (NWQS), stormwater from the study site was polluted, exceeding class III water for TSS and BOD5 with maximum EMCs of 177 and 24 mg/L, respectively. Design pollutant loads based on a design storm of 3-month average recurrence interval (ARI) for TSS, COD, BOD5, TP, and Pb were estimated to be 40, 9.4, 5.4, 1.7, and 0.06 kg/ha, respectively. These design pollutant loads can be used to estimate loadings from similar catchments within Miri City.
Keywords: mixed land-use, urban runoff, pollutant load, national water quality
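A worked example of the load arithmetic implied above: an event load is the event mean concentration times the runoff volume. Only the 177 mg/L TSS EMC comes from the abstract; the storm depth and runoff coefficient are hypothetical:

```python
emc_tss = 177.0            # mg/L (= g/m^3), maximum TSS EMC from the abstract
rain_mm = 30.0             # hypothetical 3-month-ARI design storm depth
runoff_coeff = 0.7         # hypothetical for mixed residential-commercial use

runoff_m3_per_ha = rain_mm / 1000 * runoff_coeff * 10_000   # m^3 per hectare
load_kg_per_ha = emc_tss * runoff_m3_per_ha / 1000          # g -> kg
print(f"TSS load ~ {load_kg_per_ha:.0f} kg/ha")  # same order as the quoted 40 kg/ha
```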
Procedia PDF Downloads 333
3731 Unearthing Air Traffic Control Officers Decision Instructional Patterns From Simulator Data for Application in Human Machine Teams
Authors: Zainuddin Zakaria, Sun Woh Lye
Abstract:
Despite continuous advancements in automated conflict resolution tools, the rate of adoption of automation by Air Traffic Control Officers (ATCOs) remains low. Trust in, or acceptance of, these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that impact their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves the extraction of ATCO executive control actions and the establishment of a strategy classification system based on ATCO radar commands and prevailing flight parameters when deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers for conflict resolution as horizontal maneuvers or a combination of vertical and horizontal maneuvers.
Keywords: air traffic control strategies, conflict resolution, simulator data, strategy classification system
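A hypothetical sketch of the command-based labelling: each resolution is reduced to the radar commands it contained and classed as vertical, horizontal, or combined. The field names and rules are invented stand-ins for the paper's six-strategy system:

```python
from dataclasses import dataclass

@dataclass
class Resolution:
    altitude_changes: int     # climb/descend commands issued
    heading_changes: int      # vectoring commands issued
    speed_changes: int

def classify(res: Resolution) -> str:
    vertical = res.altitude_changes > 0
    horizontal = res.heading_changes > 0 or res.speed_changes > 0
    if vertical and horizontal:
        return "combined"
    return "vertical" if vertical else "horizontal"

print(classify(Resolution(altitude_changes=1, heading_changes=0, speed_changes=0)))
```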
Procedia PDF Downloads 148
3730 Sustainable Crop Production: Greenhouse Gas Management in Farm Value Chain
Authors: Aswathaman Vijayan, Manish Jha, Ullas Theertha
Abstract:
Climate change and global warming have become an issue for both developed and developing countries and are perhaps the biggest threat to the environment. We at ITC Limited believe that a company's performance must be measured by its Triple Bottom Line contribution to building economic, social, and environmental capital. This Triple Bottom Line strategy focuses on embedding sustainability in business practices, investing in social development, and adopting a low-carbon growth path with a cleaner-environment approach. The Agri Business Division - ILTD operates in the tobacco-growing regions of the Andhra Pradesh and Karnataka states of India. The agri value chain of the company comprises two distinct phases: the first phase is the agricultural operations undertaken by ITC-trained farmers, and the second phase is the industrial operations, which include the marketing and processing of the agricultural produce. This research work covers the greenhouse gas (GHG) management strategy of ITC in the agricultural operations undertaken by the farmers. The agriculture sector adds considerably to global GHG emissions through the use of carbon-based energies, the use of fertilizers, and other farming operations such as ploughing. In order to minimize the impact of farming operations on the environment, ITC has taken a big leap in implementing systems and processes for reducing the GHG impact of the farm value chain by partnering with the farming community. The company has undertaken a unique three-pronged approach to GHG management in the farm value chain. 1) GHG inventory of the farm value chain: different sources of GHG emission in the farm value chain were identified and quantified for the baseline year, as per the IPCC guidelines for greenhouse gas inventories. The major sources of emission identified are emissions due to nitrogenous fertilizer application during seedling production and in the main field; emissions due to diesel usage by farm machinery; emissions due to fuel consumption; and emissions due to the burning of crop residues. 2) Identification and implementation of technologies to reduce GHG emissions: various methodologies and technologies were identified for each GHG emission source and implemented at the farm level. The identified methodologies are reducing chemical fertilizer usage on the farm through site-specific nutrient recommendations; using a sharp shovel for land preparation to reduce diesel consumption; implementing energy conservation technologies to reduce fuel requirements; and avoiding the burning of crop residues by incorporating them into the main field. These methodologies were implemented at the farm level, and the GHG emissions were quantified to understand the reduction achieved. 3) Social and farm forestry for CO₂ sequestration: in addition, the company encouraged social and farm forestry on wastelands to convert them into green cover. The plantations are carried out with fast-growing trees, viz. Eucalyptus, Casuarina, and Subabul, at the rate of 10,000 ha of land per year. The above approach minimized a considerable amount of GHG emission in the farm value chain, benefiting farmers, the community, and the environment as a whole. In addition, the CO₂ stock created by the social and farm forestry program has made the farm value chain environment-friendly.
Keywords: CO₂ sequestration, farm value chain, greenhouse gas, ITC limited
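A worked sketch of the activity-data × emission-factor arithmetic behind step 1. The factors below are commonly cited IPCC-style defaults, not ITC's figures; treat every number here as an assumption:

```python
GWP_N2O = 298                  # AR4 100-yr global warming potential of N2O
EF_DIESEL = 2.68               # kg CO2 per litre of diesel (typical default)
EF_N2O_N = 0.01                # kg N2O-N per kg N applied (IPCC Tier 1 default)

def farm_emissions_kgco2e(diesel_litres, n_fertilizer_kg, residue_co2_kg):
    co2_fuel = diesel_litres * EF_DIESEL
    n2o = n_fertilizer_kg * EF_N2O_N * (44 / 28)   # N2O-N mass -> N2O mass
    return co2_fuel + n2o * GWP_N2O + residue_co2_kg

# Invented activity data for one farm-season:
print(f"{farm_emissions_kgco2e(400, 120, 300):.0f} kg CO2e per season")
```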
Procedia PDF Downloads 295
3729 Ten Patterns of Organizational Misconduct and a Descriptive Model of Interactions
Authors: Ali Abbas
Abstract:
This paper presents a descriptive model of organizational misconduct based on observed patterns that occur before and after an ethical collapse. The patterns were classified by categorizing media articles on both for-profit and not-for-profit organizations. Based on the model parameters, the paper describes various organizational deflection strategies under numerous scenarios, including situations where ethical complaints build up, situations in which whistleblowers become more prevalent, situations where large scandals related to leadership occur, and the strategies by which organizations deflect blame when pressure builds up or when the media finds out. The model parameters start with the premise of a tolerance for double standards in unethical acts when conducted by leadership or by members of corporate governance. Following this premise, the model explains how organizations engage in discursive strategies to cover up the potential conflicts that arise, including secret agreements and the weakening of stakeholders who may oppose the organizational acts. Deflection strategies include 'preemptive' and 'post-complaint' secret agreements, the absence of (or vague) documented procedures, blame-shifting and scapegoating, remaining silent on complaints until the media finds out, and being slow (if at all) to acknowledge misconduct and fast to cover it up. The results of this paper may be used to alert organizational leaders to the implications of such shortsighted strategies toward unethical acts, even when those acts are deemed legal. Validation of the model assumptions through numerous media articles is provided.
Keywords: ethical decision making, prediction, scandals, organizational strategies
Procedia PDF Downloads 125
3728 Analysis of Sediment Distribution around Karang Sela Coral Reef Using Multibeam Backscatter
Authors: Razak Zakariya, Fazliana Mustajap, Lenny Sharinee Sakai
Abstract:
A sediment map is quite important for the marine environment: the sediment itself contains a wealth of information that can be used in other research. This study was conducted using a Reson T20 multibeam echo sounder on 15 August 2020 at Karang Sela (a coral reef area) at Pulau Bidong. The study aims to identify the sediment types around the coral reef using bathymetry and backscatter data. Sediment in the study area was collected as ground-truthing data to verify the seabed classification. A dry sieving method with a sieve shaker was used to analyze the sediment samples. PDS 2000 software was used for data acquisition, Qimera QPS version 2.4.5 for processing the bathymetry data, and FMGT QPS version 7.10 for processing the backscatter data. The backscatter data were then analyzed using the maximum likelihood classification tool in ArcGIS version 10.8. The result identified three types of sediment around the coral reef: very coarse sand, coarse sand, and medium sand.
Keywords: sediment type, MBES echo sounder, backscatter, ArcGIS
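A sketch of what a maximum likelihood classifier does with backscatter: fit a Gaussian to the ground-truthed training pixels of each sediment class and assign each pixel to the class with the highest likelihood. The dB values are invented, and ArcGIS's implementation is multivariate rather than this one-band simplification:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
train = {"very coarse sand": rng.normal(-14, 1.5, 200),   # dB, invented
         "coarse sand": rng.normal(-18, 1.5, 200),
         "medium sand": rng.normal(-22, 1.5, 200)}
params = {k: (v.mean(), v.std()) for k, v in train.items()}

def classify(pixel_db):
    # assign to the class whose fitted Gaussian gives the highest density
    return max(params, key=lambda k: norm.pdf(pixel_db, *params[k]))

print(classify(-17.2))    # -> likely "coarse sand"
```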
Procedia PDF Downloads 87
3727 Rules in Policy Integration, Case Study: Victoria Catchment Management
Authors: Ratri Werdiningtyas, Yongping Wei, Andrew Western
Abstract:
This paper contributes to ongoing attempts to bring together land, water, and environmental policy in catchment management. A tension remains in defining the boundaries of policy integration. Much of Integrated Water Resource Management is regarded as rhetorical policy: it is far from being achieved on the ground because the socio-ecological system has not been understood and developed into a complete and coherent problem representation. To clarify the features of integration, this article draws on institutional fit for public policy integration and uses these insights in an empirical setting to identify the mechanisms that can facilitate effective policy integration for catchment management. This research is based on the journey of Victoria's government from 1890 to 2016. A total of 274 Victorian Acts related to land, water, and environmental management published in that period were investigated. Four conditions of integration were identified in their co-evolution: (1) integration policy based on reserves, (2) integration policy based on authority interest, (3) policy based on integrated information, and (4) policy based on coordinated resources, authority, and information. Results suggest that coordination among policy instruments is superior to policy integration in the case of catchment management.
Keywords: catchment management, co-evolution, policy integration, phase
Procedia PDF Downloads 247
3726 Predicting Photovoltaic Energy Profile of Birzeit University Campus Based on Weather Forecast
Authors: Muhammad Abu-Khaizaran, Ahmad Faza’, Tariq Othman, Yahia Yousef
Abstract:
This paper presents a study to provide sufficient and reliable information for constructing a photovoltaic energy profile of the Birzeit University campus (BZU) based on the weather forecast. The developed photovoltaic energy profile helps to predict the energy yield of photovoltaic systems from the weather forecast and hence helps in planning energy production and consumption. Two models are developed in this paper: a Clear-Sky Irradiance model and a Cloud-Cover Radiation model, which predict the irradiance for a clear day and a cloudy day, respectively. The adopted procedure for developing these models considers two levels of abstraction. First, irradiance and weather data were acquired by a sensory (measurement) system installed on the rooftop of the Information Technology College building on the Birzeit University campus. Second, power readings of a fully operational 51 kW commercial photovoltaic system installed on the rooftop of the adjacent College of Pharmacy-Nursing and Health Professions building are used to validate the output of the simulation model and to help refine its structure. A comparison between a mathematical model, which calculates the clear-sky irradiance for the University's location, and two sets of accumulated measured data shows that the simulation system accurately resembles the installed PV power station on clear days. However, these comparisons show a divergence between the expected and actual energy yield in extreme weather conditions, including clouding and soiling effects. Therefore, a more accurate irradiance prediction model was developed that takes into consideration weather factors affecting irradiance, such as relative humidity and cloudiness: the Cloud-Cover Radiation Model (CRM). The equivalent mathematical formulas implement corrections to provide more accurate inputs to the simulation system. The results of the CRM show a very good match with the actual measured irradiance on a cloudy day. The developed photovoltaic profile helps in predicting the output energy yield of the photovoltaic system installed on the University campus based on the predicted weather conditions, and the simulation and practical results for both models match very well.
Keywords: clear-sky irradiance model, cloud-cover radiation model, photovoltaic, weather forecast
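The two site-calibrated models are not published in the abstract; as stand-ins of the same shape, here are the textbook Haurwitz clear-sky model and the Kasten-Czeplak cloud-cover correction, with their generic published coefficients rather than Birzeit's fitted values:

```python
import math

def clear_sky_ghi(zenith_deg):
    """Haurwitz clear-sky global horizontal irradiance, W/m^2."""
    cz = math.cos(math.radians(zenith_deg))
    return 1098.0 * cz * math.exp(-0.059 / cz) if cz > 0 else 0.0

def cloudy_ghi(ghi_clear, okta):
    """Kasten-Czeplak correction; okta is cloud cover on a 0-8 scale."""
    return ghi_clear * (1.0 - 0.75 * (okta / 8.0) ** 3.4)

ghi = clear_sky_ghi(30.0)                 # 30 degree solar zenith angle
print(f"clear: {ghi:.0f} W/m^2, 6 okta: {cloudy_ghi(ghi, 6):.0f} W/m^2")
```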
Procedia PDF Downloads 133
3725 Classification of Political Affiliations by Reduced Number of Features
Authors: Vesile Evrim, Aliyu Awwal
Abstract:
With the evolution of technology, the expression of opinions has shifted to the digital world. The domain of politics, one of the hottest topics of opinion mining research, merges here with behavior analysis for determining affiliation from text, which constitutes the subject of this paper. This study aims to classify texts from news articles and blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which are Linguistic Inquiry and Word Count (LIWC) features, are tested against 14 benchmark classification algorithms. In later experiments, the dimension of the feature vector is reduced using 7 feature selection algorithms. The results show that the Decision Tree, Rule Induction, and M5 Rule classifiers, when used with the SVM and IGR feature selection algorithms, performed best, with up to 82.5% accuracy on a given dataset. Further tests on single features and on the linguistic feature sets showed similar results. The feature 'function', an aggregate feature of the linguistic category, is found to be the most discriminating feature among the 68, achieving 81% accuracy by itself in classifying articles as either Republican or Democrat.
Keywords: feature selection, LIWC, machine learning, politics
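A sketch of the reduction idea on synthetic counts (not the LIWC pipeline): rank features with an information-gain-style score, standing in for IGR, keep a handful, and classify with a decision tree:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
X = rng.random((400, 68))                   # 68 text features per article
y = rng.integers(0, 2, size=400)            # 0 Democrat, 1 Republican
X[y == 1, 0] += 0.4                         # a dominant "function"-like feature

scores = mutual_info_classif(X, y, random_state=0)
top = np.argsort(scores)[::-1][:5]          # keep the 5 best-ranked features
tree = DecisionTreeClassifier(random_state=0)
print("CV accuracy:", cross_val_score(tree, X[:, top], y, cv=5).mean())
```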
Procedia PDF Downloads 383
3724 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views, and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading. It needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity while obtaining better results, the dimensions need to be reduced. One of the best techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all the clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information for our work. The detection performance was improved in two respects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of the dataset's dimensions.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
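A minimal sketch of the four steps on synthetic data, with an assumed cluster count and representative rule: (1) feature similarities, (2) K-means over the feature profiles, (3) one representative per cluster, (4) SVM on the reduced set:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
X = rng.random((600, 200))                  # 600 articles, 200 text features
y = rng.integers(0, 2, size=600)            # 1 = fake, 0 = genuine
X[y == 1, :20] += 0.3                       # illustrative signal

corr = np.corrcoef(X.T)                                           # step 1
km = KMeans(n_clusters=25, n_init=10, random_state=0).fit(corr)   # step 2
reps = [np.flatnonzero(km.labels_ == c)[0] for c in range(25)]    # step 3
print("CV accuracy:",
      cross_val_score(SVC(), X[:, reps], y, cv=5).mean())         # step 4
```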
Procedia PDF Downloads 176
3723 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation
Authors: Jonathan Gong
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19; this underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field suggests that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays under the labels COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 separate images. The training images share an important property: they are cropped beforehand to eliminate distractions during training. The image segmentation model uses an improved U-Net architecture to extract the lung mask from the chest X-ray image; it is trained on 8,577 images with a 20% validation split. Both models are evaluated on the external dataset, and their accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to improve the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
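A minimal Keras sketch of the classification branch: DenseNet201 as a frozen ImageNet feature extractor with a small bottleneck and a three-way head. The input size, layer widths, and the simplified dense bottleneck (in place of the paper's full autoencoder) are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")
base.trainable = False                        # transfer learning: freeze features

inputs = layers.Input((224, 224, 3))
x = base(inputs)
x = layers.Dense(256, activation="relu")(x)   # stand-in for the encoder bottleneck
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(3, activation="softmax")(x)  # COVID-19 / normal / pneumonia

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```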
Procedia PDF Downloads 130