Search results for: data mining techniques
28335 Sentiment Analysis on the East Timor Accession Process to the ASEAN
Authors: Marcelino Caetano Noronha, Vosco Pereira, Jose Soares Pinto, Ferdinando Da C. Saores
Abstract:
One particularly popular social media platform is YouTube, a video-sharing platform where users can upload videos and other users can like, dislike, or comment on them. In this study, we conduct a binary classification task on YouTube video comments and user reviews regarding the accession process of Timor-Leste to become the eleventh member of the Association of South East Asian Nations (ASEAN). We scraped the data directly from public YouTube videos and applied several pre-processing and weighting techniques. Before conducting the classification, we categorized the data into two classes, positive and negative. In the classification step, we applied the Support Vector Machine (SVM) algorithm. Compared with the Naïve Bayes algorithm, the experiment showed that SVM achieved 84.1% accuracy, 94.5% precision, and 73.8% recall.
Keywords: classification, YouTube, sentiment analysis, support vector machine
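The weighting-plus-SVM pipeline the abstract describes might be sketched roughly as follows; the comments, labels, and library choice (scikit-learn's TfidfVectorizer and LinearSVC) are illustrative assumptions, not the study's actual data or code:

```python
# Hypothetical sketch of TF-IDF weighting + SVM classification of YouTube
# comments; the comments and labels below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

comments = [
    "great news for timor leste, welcome to asean",
    "this accession will boost the economy",
    "bad decision, the country is not ready",
    "i do not support this move at all",
]
labels = ["positive", "positive", "negative", "negative"]

# TF-IDF converts each pre-processed comment into a weighted term vector,
# and a linear SVM separates the two sentiment classes.
model = make_pipeline(TfidfVectorizer(lowercase=True), LinearSVC())
model.fit(comments, labels)

print(model.predict(["welcome to asean, great move"])[0])
```

On real data the comments would first pass through the pre-processing steps the abstract mentions, and accuracy, precision, and recall would be computed on a held-out split.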
Procedia PDF Downloads 108
28334 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty
Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos
Abstract:
Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adjust to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and the development threat index from the NASA Socioeconomic Data and Applications Center; this simulation is used to model uncertainty in the problem. The research leverages state-of-the-art techniques from the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers such as Gurobi or CPLEX. Numerical results based on real data on the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN used in practice for biodiversity conservation. Our method may better guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning
Procedia PDF Downloads 208
28333 Allele Mining for Rice Sheath Blight Resistance by Whole-Genome Association Mapping in a Tail-End Population
Authors: Naoki Yamamoto, Hidenobu Ozaki, Taiichiro Ookawa, Youming Liu, Kazunori Okada, Aiping Zheng
Abstract:
Rice sheath blight is one of the most destructive fungal diseases in rice, and resistance to it is thought to be a polygenic trait. Host-pathogen interactions and secondary metabolites such as lignin and phytoalexins are likely to be involved in defense against R. solani. However, to our knowledge, it is still unknown how sheath blight resistance can be enhanced in rice breeding. To seek alternative genetic factors that contribute to sheath blight resistance, we mined relevant allelic variations from rice core collections created in Japan. Based on disease lesion length on detached leaf sheaths, we selected 30 varieties from the top tail-end and 30 from the bottom tail-end of the core collections to perform genome-wide association mapping. Re-sequencing reads for these varieties were used for calling single nucleotide polymorphisms among the 60 varieties to create a SNP panel, which contained 1,137,131 homozygous variant sites after filtering. Association mapping highlighted a locus on the long arm of chromosome 11, which is co-localized with three sheath blight QTLs, qShB11-2-TX, qShB11, and qSBR-11-2. Based on the localization of the trait-associated alleles, we identified an ankyrin repeat-containing protein gene (ANK-M) as an uncharacterized candidate factor for rice sheath blight resistance. Allelic distributions for ANK-M in the whole rice population supported the reliability of the trait-allele associations. Gene expression characteristics were checked to evaluate the functionality of ANK-M. Since an ANK-M homolog (OsPIANK1) in rice seems to be a basal defense regulator against rice blast and bacterial leaf blight, ANK-M may also play a role in the rice immune system.
Keywords: allele mining, GWAS, QTL, rice sheath blight
Procedia PDF Downloads 79
28332 Geostatistical Analysis of Contamination of Soils in an Urban Area in Ghana
Authors: S. K. Appiah, E. N. Aidoo, D. Asamoah Owusu, M. W. Nuonabuor
Abstract:
Urbanization remains one of the predominant factors linked to the destruction of the urban environment and the associated soil contamination by heavy metals through natural and anthropogenic activities. These activities are important sources of toxic heavy metals such as arsenic (As), cadmium (Cd), chromium (Cr), copper (Cu), iron (Fe), manganese (Mn), lead (Pb), nickel (Ni) and zinc (Zn). Often, these heavy metals reach elevated levels in some areas due to atmospheric deposition caused by proximity to industrial plants or the indiscriminate burning of substances. Information on potentially hazardous levels of these heavy metals in soils helps establish their serious health and urban agriculture implications. However, characterization of the spatial variation of soil contamination by heavy metals in Ghana is limited. Kumasi is a metropolitan city in Ghana, West Africa, and is challenged with a recent spate of deteriorating soil quality due to rapid economic development and other human activities such as “Galamsey”, illegal mining operations within the metropolis. The paper seeks to use both univariate and multivariate geostatistical techniques to assess the spatial distribution of heavy metals in soils and the potential risk associated with ingestion of sources of soil contamination in the metropolis. Geostatistical tools have the ability to detect changes in correlation structure, and good knowledge of the study area can help explain the different scales of variation detected. To achieve this task, point-referenced data on heavy metals measured from topsoil samples in a previous study were collected at various locations. Linear models of regionalisation and coregionalisation were fitted to all experimental semivariograms to describe the spatial dependence between the topsoil heavy metals at different spatial scales, which led to ordinary kriging and cokriging at unsampled locations and the production of risk maps of soil contamination by these heavy metals. Results obtained from both the univariate and multivariate semivariogram models showed strong spatial dependence, with ranges of autocorrelation from 100 to 300 meters. The risk maps produced show strong spatial heterogeneity for almost all the soil heavy metals, with extreme risk of contamination found close to areas with commercial and industrial activities. Hence, ongoing pollution interventions should be geared towards these high-risk areas for efficient management of soil contamination to avert further pollution in the metropolis.
Keywords: coregionalization, heavy metals, multivariate geostatistical analysis, soil contamination, spatial distribution
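The experimental semivariograms at the heart of such an analysis can be illustrated with a small sketch on synthetic data; the coordinates, concentrations, and bin widths below are invented, not the Kumasi measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic topsoil samples: coordinates in meters and a metal concentration.
coords = rng.uniform(0, 1000, size=(200, 2))
values = rng.normal(50, 10, size=200)

def experimental_semivariogram(coords, values, lags):
    """Average semivariance 0.5*(z_i - z_j)^2 binned by separation distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sv = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d > lo) & (d <= hi)
        gamma.append(sv[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

lags = np.arange(0, 600, 100)  # 100 m bins, spanning the ~100-300 m ranges reported
gamma = experimental_semivariogram(coords, values, lags)
print(gamma)
```

A model semivariogram (spherical, exponential, etc.) would then be fitted to these binned values before kriging or cokriging.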
Procedia PDF Downloads 300
28331 Vehicular Speed Detection Camera System Using Video Stream
Authors: C. A. Anser Pasha
Abstract:
In this paper, a new vehicular Speed Detection Camera System (SDCS), applicable as an alternative to traditional radars with the same or even better accuracy, is presented. The real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect and track multiple vehicles. The SDCS process can be divided into three successive phases. The first phase is object detection, which uses a hybrid algorithm combining adaptive background subtraction with a three-frame differencing algorithm that rectifies the major drawback of using adaptive background subtraction alone. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. The tracking operation takes into consideration the different possible scenarios of a moving object: simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving while another enters the scene. The third phase is speed calculation, where speed is computed from the number of frames the object takes to pass through the scene.
Keywords: radar, image processing, detection, tracking, segmentation
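The three-frame differencing step of the detection phase can be sketched as follows; the frame sizes, threshold, and synthetic "vehicle" are illustrative assumptions, not the SDCS implementation:

```python
import numpy as np

def three_frame_difference(prev, curr, nxt, thresh=25):
    """Motion mask from three consecutive grayscale frames: a pixel is
    'moving' only if it differs from BOTH neighbours, which suppresses the
    ghosting left behind by plain two-frame differencing."""
    d1 = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > thresh
    d2 = np.abs(nxt.astype(np.int16) - curr.astype(np.int16)) > thresh
    return d1 & d2

def update_background(bg, frame, alpha=0.05):
    """Adaptive background model: running average of the incoming frames."""
    return (1 - alpha) * bg + alpha * frame

# Synthetic 64x64 frames with a bright 8x8 'vehicle' moving right.
frames = []
for x in (10, 16, 22):
    f = np.zeros((64, 64), dtype=np.uint8)
    f[20:28, x:x + 8] = 255
    frames.append(f)

mask = three_frame_difference(*frames)
bg = np.zeros((64, 64))
for f in frames:
    bg = update_background(bg, f)
print(mask.sum())  # number of pixels flagged as moving
```

In the hybrid scheme described above, this mask would be combined with the adaptive-background foreground mask before the segmentation and labeling operations.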
Procedia PDF Downloads 467
28330 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh
Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi
Abstract:
Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region in Iran, was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 for optical data and Sentinel-1a for radar data. We used methods that are commonly used to measure glacier surface movements, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed time-series glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reducing atmospheric noise, and removing the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.
Keywords: active rock glaciers, landsat 8, sentinel-1a, zagros mountainous region
Procedia PDF Downloads 77
28329 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation
Authors: Muhammad Zubair Khan, Yugyung Lee
Abstract:
Deep learning has recently achieved enormous success in semantic image segmentation. Previously developed U-Net-inspired architectures operate with continuous stride and pooling operations, leading to spatial data loss. These methods also lack a mechanism for establishing long-range pixel connections to preserve context knowledge and reduce spatial loss in prediction. This article develops an encoder-decoder architecture with bi-directional LSTM embedded in long skip-connections and densely connected convolution blocks. The network non-linearly combines the feature maps across encoder-decoder paths to find dependency and correlation between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applies batch normalization to reduce internal covariate shift in data distributions. The empirical evidence shows a promising response for our method compared with other semantic segmentation techniques.
Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network
Procedia PDF Downloads 102
28328 Wind Velocity Climate Zonation Based on Observation Data in Indonesia Using Cluster and Principal Component Analysis
Authors: I Dewa Gede Arya Putra
Abstract:
Principal Component Analysis (PCA) is a mathematical procedure that uses orthogonal transformation techniques to change a set of possibly correlated components into components that are uncorrelated with each other. This can have an impact on clustering wind speed characteristics in Indonesia. This study uses 30 years of daily wind speed observations from the meteorological station network. Multicollinearity tests were also performed on all of these data before clustering with PCA. The results show that the four main components explain above 80% of the total variance and are used for clustering. Division of clusters using Ward's method obtained 3 types of clusters. Cluster 1 covers the central part of Sumatra Island, northern Kalimantan, northern Sulawesi, and northern Maluku, with a wind speed climatology that has no annual cycle and weak speeds throughout the year, ranging from 0 to 1.5 m/s. Cluster 2 covers the northern part of Sumatra Island, South Sulawesi, Bali, and northern Papua, with a wind speed climatology that has annual cycle variations with low speeds ranging from 1 to 3 m/s. Cluster 3 covers the eastern part of Java Island, the Southeast Nusa Islands, and the southern Maluku Islands, with a wind speed climatology that has annual cycle variations with high speeds ranging from 1 to 4.5 m/s.
Keywords: PCA, cluster, Ward's method, wind speed
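The PCA-then-Ward workflow can be sketched on synthetic station data; the regime means, noise level, and station counts below are invented, while the "at least 80% explained variance" criterion mirrors the abstract:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
# Synthetic stand-in for the observations: 30 stations x 12 monthly mean wind
# speeds, drawn around three invented climatological regimes.
regimes = np.array([1.0, 2.0, 3.5])
X = np.vstack([rng.normal(m, 0.2, size=(10, 12)) for m in regimes])

# PCA via SVD: keep the fewest components explaining >= 80% of total variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()
k = int(np.searchsorted(np.cumsum(explained), 0.8)) + 1
scores = U[:, :k] * s[:k]

# Ward's method on the retained component scores, cut into 3 clusters.
Z = linkage(scores, method="ward")
clusters = fcluster(Z, t=3, criterion="maxclust")
print(sorted(np.bincount(clusters)[1:].tolist()))
```

On the real data, each cluster's member stations would then be mapped geographically to produce the three zones described.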
Procedia PDF Downloads 195
28327 Effects of Plyometric Exercises on Agility, Power and Speed Improvement of U-17 Female Sprinters in Case of Burayu Athletics Project, Oromia, Ethiopia
Authors: Abdeta Bayissa Mekessa
Abstract:
The purpose of this study was to examine the effects of plyometric exercises on the agility, power, and speed of U-17 female sprinters in the case of the Burayu Athletics project. A true experimental research design was employed. The total population of the study was 14 U-17 female sprinters from the Burayu Athletics project. Because the population was small, the researcher took all of them as a sample using a comprehensive sampling technique. The subjects were assigned to an experimental group (N=7) and a control group (N=7) using simple random sampling. The experimental group participated in plyometric training for 8 weeks, 3 days per week, 60 minutes per day, in addition to their regular training, while the control group followed only their regular training program. The variables selected for this study were agility, power, and speed, measured with the Illinois agility test, the standing long jump test, and the 30 m sprint test, respectively. Both groups were tested before (pre-test) and after (post-test) the 8 weeks of plyometric training. For data analysis, the researcher used SPSS version 26.0. The collected data were analyzed using a paired sample t-test to observe the difference between the pre-test and post-test results, with significance set at p<0.05. The results show that after 8 weeks of plyometric training, significant improvements were found in agility (MD=0.45, p<0.05), power (MD=-1.157, p<0.05), and speed (MD=0.37, p<0.05) for the experimental group. On the other hand, there was no significant change (p>0.05) in those variables in the control group. The findings thus show that eight weeks of plyometric exercises had a positive effect on the agility, power, and speed of female sprinters. Therefore, athletics coaches and athletes are highly recommended to include plyometric exercise in their training programs.
Keywords: plyometric exercise, speed, power, agility, female sprinter
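The paired sample t-test used for the pre/post comparison might look like this in practice; the agility times below are invented placeholders, not the Burayu athletes' measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post Illinois agility times (seconds) for a 7-athlete
# experimental group; all values are invented for illustration only.
pre = np.array([18.9, 19.2, 18.5, 19.0, 18.7, 19.4, 18.8])
post = np.array([18.4, 18.8, 18.1, 18.5, 18.3, 18.9, 18.4])

# Paired sample t-test on the pre/post differences, as in the study design.
t_stat, p_value = stats.ttest_rel(pre, post)
mean_diff = (pre - post).mean()
print(f"MD={mean_diff:.2f}, t={t_stat:.2f}, p={p_value:.4f}")
```

A significant positive mean difference here would indicate improved (shorter) agility times after training, mirroring the reported result.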
Procedia PDF Downloads 38
28326 Using Photogrammetric Techniques to Map the Mars Surface
Authors: Ahmed Elaksher, Islam Omar
Abstract:
For many years, the Mars surface has been a mystery for scientists. Lately, with the help of geospatial data and photogrammetric procedures, researchers have been able to capture some insights about this planet. Two of the most important data sources for exploring Mars are the High Resolution Imaging Science Experiment (HiRISE) and the Mars Orbiter Laser Altimeter (MOLA). HiRISE is one of six science instruments carried by the Mars Reconnaissance Orbiter, launched August 12, 2005, and managed by NASA. The MOLA sensor is a laser altimeter carried by the Mars Global Surveyor (MGS), launched on November 7, 1996. In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images for generating a more accurate and trustworthy surface of Mars. The MOLA data were interpolated using the kriging interpolation technique. Corresponding tie points were digitized from both datasets and employed in co-registering the datasets using GIS analysis tools. We employed three different 3D-to-2D transformation models: the parallel projection (3D affine) transformation model, the extended parallel projection transformation model, and the Direct Linear Transformation (DLT) model. The digitized tie points were split into two sets: Ground Control Points (GCPs), used to estimate the transformation parameters using least squares adjustment techniques, and check points (ChkPs), used to evaluate the computed transformation parameters. Results were evaluated using the RMSEs between the precise horizontal coordinates of the digitized check points and those estimated through the transformation models using the computed transformation parameters. For each set of GCPs, three different configurations of GCPs and check points were tested, and average RMSEs are reported. It was found that for the 2D transformation models, average RMSEs were in the range of five meters. Increasing the number of GCPs from six to ten points improved the accuracy of the results by about two and a half meters; further increasing the number of GCPs did not improve the results significantly. Using the 3D-to-2D transformation parameters provided two to three meters accuracy. The best results were obtained with the DLT transformation model, although increasing the number of GCPs did not have a substantial effect there either. The results support the use of the DLT model as it provides the required accuracy for ASPRS large-scale mapping standards. However, a well-distributed set of GCPs is key to achieving such accuracy. The model is simple to apply and does not need substantial computations.
Keywords: mars, photogrammetry, MOLA, HiRISE
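The DLT model evaluated above can be sketched as a linear least squares problem; the control points and "true" parameters below are invented for a synthetic self-consistency check, not the HiRISE/MOLA data:

```python
import numpy as np

def dlt_solve(obj_pts, img_pts):
    """Estimate the 11 DLT parameters mapping 3D ground points to 2D image
    coordinates by linear least squares; needs at least 6 control points."""
    A, b = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        b.append(x)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        b.append(y)
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def dlt_project(L, pt):
    """Apply the 11-parameter DLT to a 3D point."""
    X, Y, Z = pt
    w = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    return ((L[0] * X + L[1] * Y + L[2] * Z + L[3]) / w,
            (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / w)

# Synthetic check: generate image points from known parameters, then recover them.
rng = np.random.default_rng(1)
L_true = np.array([2.0, 0.1, -0.3, 5.0, -0.2, 1.8, 0.4, -3.0, 1e-4, -2e-4, 5e-5])
gcps = rng.uniform(0, 100, size=(8, 3))
img = np.array([dlt_project(L_true, p) for p in gcps])

L_est = dlt_solve(gcps, img)
check = np.array([dlt_project(L_est, p) for p in gcps])
rmse = np.sqrt(np.mean((check - img) ** 2))
print(f"RMSE on control points: {rmse:.6f}")
```

In the study's setting, the RMSE would instead be computed at the withheld check points (ChkPs) rather than at the GCPs themselves.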
Procedia PDF Downloads 57
28325 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most Data Envelopment Analysis models operate in a static environment with input and output parameters chosen as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to expand crisp Data Envelopment Analysis into Data Envelopment Analysis with a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed model is illustrated with an application on real data from 50 educational institutions.
Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output
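As a rough illustration of the crisp model this work extends, a basic input-oriented CCR efficiency score can be computed with a linear program; the institutions, inputs, and outputs below are invented, and a fuzzy variant would re-solve the model at the lower, modal, and upper values of each triangular number:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (multiplier form), with crisp
    data: maximize u @ Y[j0] subject to v @ X[j0] = 1 and
    u @ Y[j] - v @ X[j] <= 0 for every DMU j."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j0], np.zeros(m)])      # variables: [u, v]; linprog minimizes
    A_ub = np.hstack([Y, -X])                      # ratio constraints for all DMUs
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # normalization v @ X[j0] = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Tiny invented example: 4 institutions, inputs (staff, budget), output (graduates).
X = np.array([[20.0, 300], [30, 200], [40, 500], [25, 350]])
Y = np.array([[60.0], [60], [70], [50]])
scores = [round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))]
print(scores)
```

In a DEA run at least one unit sits on the frontier with efficiency 1; the multi-objective fuzzy formulation in the abstract would report such scores as intervals rather than single numbers.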
Procedia PDF Downloads 57
28324 Groundwater Treatment of Thailand's Mae Moh Lignite Mine
Authors: A. Laksanayothin, W. Ariyawong
Abstract:
Mae Moh Lignite Mine is the largest open-pit mine in Thailand. The mine supplies about 16 million tons of coal per year to the power plant, which can produce electricity accounting for about 10% of the nation's power generation. The mining area of Mae Moh Mine is about 28 km². At present, the deepest area of the pit is about 280 m from ground level (+40 m MSL), and in the future the depth of the pit can reach 520 m from ground level (-200 m MSL). As the size of the pit is quite large, the stability of the pit is seriously important. Furthermore, preliminary drilling and extended drilling in 1989-1996 found a high-pressure aquifer under the pit. As a result, the pressure of the underground water has to be released in order to maintain pit stability. A later study by consulting experts found that 3-5 million m³ per year of the underground water needs to be de-watered for the safety of mining. However, the quality of this discharged water should meet the standard. Therefore, a groundwater treatment facility has been implemented, aiming to reduce the naturally contaminated arsenic (As) in the discharged water below the standard limit of 10 ppb. The treatment system consists of coagulation and filtration processes. The main components include rapid mixing tanks, slow mixing tanks, a sedimentation tank, a thickener tank, and a sludge drying bed. The treatment process uses 40% FeCl₃ as a coagulant. The FeCl₃ adsorbs As(V), forming floc particles that separate from the water as precipitate. After that, the sludge is dried in the sand bed and then disposed of in a secured landfill. Since 2011, the 12,000 m³/day treatment plant has been operated efficiently, with an average removal efficiency of about 95%.
Keywords: arsenic, coagulant, ferric chloride, groundwater, lignite, coal mine
Procedia PDF Downloads 310
28323 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need to be improved to anticipate the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting those challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmarked datasets. First, the acquired raw data is filtered with z-normalization and denoiser methods. Then, statistical, local binary pattern, auto-regressive model, and intrinsic time scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate for the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
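The mRMR feature-selection step can be sketched with a simplified, correlation-based variant; the standard criterion is usually stated in terms of mutual information, and the features below are synthetic, not the MEMS signals:

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy mRMR feature selection using absolute Pearson correlation as a
    simple stand-in for mutual information: at each step, pick the feature
    with the largest (relevance to y) minus (mean redundancy with selected)."""
    n_feat = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

rng = np.random.default_rng(3)
n = 200
f0 = rng.normal(size=n)                    # informative feature
f1 = f0 + rng.normal(scale=0.01, size=n)   # near-duplicate of f0 (redundant)
f2 = rng.normal(size=n)                    # second, independent informative feature
y = f0 + f2
X = np.column_stack([f0, f1, f2, rng.normal(size=n)])
print(mrmr_select(X, y, 2))  # one of the f0/f1 duplicates plus f2, never both duplicates
```

The redundancy penalty is what keeps the near-duplicate feature out of the selected set, which is the point of mRMR over plain relevance ranking.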
Procedia PDF Downloads 20
28322 The Fundamental Research and Industrial Application on CO₂+O₂ in-situ Leaching Process in China
Authors: Lixin Zhao, Genmao Zhou
Abstract:
Traditional acid in-situ leaching (ISL) is not suitable for sandstone uranium deposits with low permeability and high carbonate mineral content because of blocking by calcium sulfate precipitates. Another factor influencing acid in-situ leaching of uranium is that the pyrite in ore rocks reacts with the oxidation reagent and produces large amounts of sulfate ions, which may speed up the precipitation of calcium sulfate and consume much of the oxidation reagent. Due to advantages such as lower chemical reagent consumption and less groundwater pollution, the CO₂+O₂ in-situ leaching method has become one of the important research areas in uranium mining. China is the second country in the world where CO₂+O₂ ISL has been adopted in industrial uranium production. It is shown that CO₂+O₂ ISL in China has been successfully developed: the reaction principle, technical process, well field design and drilling engineering, uranium-bearing solution processing, etc., have been fully studied. At the current stage, several uranium mines use the CO₂+O₂ ISL method to extract uranium from the ore-bearing aquifers. The industrial application and development potential of the CO₂+O₂ ISL method in China are summarized. By using CO₂+O₂ neutral leaching technology, the problem of calcium carbonate and calcium sulfate precipitation during uranium mining has been solved. By reasonably regulating the amounts of CO₂ and O₂, the related ions and hydro-chemical conditions can be controlled within limits that avoid calcium sulfate and calcium carbonate precipitation. On this basis, the demand of CO₂+O₂ uranium leaching has been met to the maximum extent, which not only realizes the effective leaching of uranium but also avoids the precipitation of calcium carbonate and calcium sulfate, realizing the industrial development of sandstone-type uranium deposits.
Keywords: CO₂+O₂ ISL, industrial production, well field layout, uranium processing
Procedia PDF Downloads 176
28321 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform
Authors: David Jurado, Carlos Ávila
Abstract:
Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define adequate treatment to increase the survival probability of the patient. Computer-Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NNs), have shown an overall improvement in digital mammography cancer diagnosis, reducing false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML- and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in the early stages. Automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammograms.
Keywords: breast cancer, segmentation, X-ray imaging, hough transform, image analysis
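The circle Hough transform at the core of the segmentation can be sketched in a few lines; the ring of synthetic edge pixels below stands in for the output of the edge-extraction step, not real mammogram data:

```python
import numpy as np

def hough_circle(edge_points, radius, shape):
    """Minimal circle Hough transform for a known radius: each edge pixel
    votes for every centre position lying at `radius` from it; the
    accumulator peak is the detected centre."""
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    for (y, x) in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return acc

# Synthetic circular object: ring of edge pixels of radius 5 centred at (30, 40).
r, cy0, cx0 = 5, 30, 40
angles = np.linspace(0, 2 * np.pi, 100, endpoint=False)
edges = [(int(round(cy0 + r * np.sin(a))), int(round(cx0 + r * np.cos(a))))
         for a in angles]

acc = hough_circle(edges, radius=r, shape=(64, 64))
peak = np.unravel_index(np.argmax(acc), acc.shape)
print(peak)  # at or immediately adjacent to (30, 40)
```

In practice the radius is also unknown, so the accumulator gains a third dimension over candidate radii, and the detected circles seed the automatic mask design described above.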
Procedia PDF Downloads 83
28320 The Necessity of Retrofitting for Masonry Buildings in Turkey
Authors: Soner Güler, Mustafa Gülen, Eylem Güzel
Abstract:
Masonry buildings constitute a major part of the building stock in Turkey. Masonry buildings were built especially in rural areas and underdeveloped regions for economic reasons. Almost none of these masonry buildings were designed and detailed according to any design guidelines. As a result, masonry buildings totally collapsed or were heavily damaged when subjected to destructive earthquake effects. Thus, the masonry buildings built in our country must be retrofitted to improve their seismic performance. In this study, new seismic retrofitting techniques that are easy to apply and low-cost are summarized, and the importance of seismic retrofitting for existing masonry buildings in Turkey is emphasized.
Keywords: masonry buildings, earthquake effects, seismic retrofitting techniques, seismic performance
Procedia PDF Downloads 343
28319 Exploring the Landscape of Information Visualization through a Mark Lombardi Lens
Authors: Alon Friedman, Antonio Sanchez Chinchon
Abstract:
This bibliometric study takes an artistic and storytelling approach to exploring the term "information visualization." Analyzing over 1,008 titles collected from databases that specialize in data visualization research, we examine the titles of these publications to report on the characteristics and development trends in the field. Employing a qualitative methodology, we extract leading terms from the titles and explore the co-occurrence of these terms to gain deeper insights. By systematically analyzing the leading terms and their relationships within the titles, we shed light on the prevailing themes that shape the landscape of "information visualization", employing the artist Mark Lombardi's techniques to visualize our findings. In doing so, this study provides valuable insights into bibliometric visualization while also opening new avenues for leveraging art and storytelling to enhance data representation.
Keywords: bibliometrics analysis, Mark Lombardi design, information visualization, qualitative methodology
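The extraction of leading terms and their co-occurrence can be sketched as a simple counting exercise; the four titles and the stopword list below are invented stand-ins for the collected corpus:

```python
from collections import Counter
from itertools import combinations

# Toy titles standing in for the collected publication titles.
titles = [
    "interactive information visualization of large networks",
    "information visualization techniques for network data",
    "storytelling with data visualization",
    "visual analytics and information visualization",
]
stopwords = {"of", "for", "with", "and", "the", "a"}

# Leading terms: most frequent non-stopwords across titles.
tokenized = [[w for w in t.split() if w not in stopwords] for t in titles]
term_freq = Counter(w for words in tokenized for w in words)

# Co-occurrence: count unordered term pairs appearing in the same title.
pairs = Counter()
for words in tokenized:
    for a, b in combinations(sorted(set(words)), 2):
        pairs[(a, b)] += 1

print(term_freq.most_common(3))
print(pairs.most_common(2))
```

The resulting pair counts form the weighted network that a Lombardi-style hand-drawn arc diagram would then lay out.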
Procedia PDF Downloads 90
28318 Extended Literature Review on Sustainable Energy by Using Multi-Criteria Decision Making Techniques
Authors: Koray Altintas, Ozalp Vayvay
Abstract:
Increased global issues such as the depletion of resources, environmental problems, and social inequality have triggered public awareness towards finding sustainable solutions to ensure the well-being of current as well as future generations. Since energy plays a significant role in improved social and economic well-being and is imperative for both industrial and commercial wealth creation, it is essential to develop a standardized set of metrics that makes it possible to indicate the present condition relative to conditions in the past and to develop any perspective required to frame actions for the future. This is not an easy task, considering the complexity of the issue, which requires integrating the economic, environmental, and social aspects of sustainable energy. Multi-criteria decision making (MCDM) can be considered a form of integrated sustainability evaluation and a decision support approach that can be used to solve complex problems featuring conflicting objectives, different forms of data and information, and multiple interests and perspectives. On that matter, MCDM methods are useful for providing solutions to complex energy management problems. The aim of this study is to review MCDM approaches that can be used for examining sustainable energy management. The study presents an insight into MCDM techniques and methods that can be useful for engineers, researchers, and policy makers working in the energy sector.
Keywords: sustainable energy, sustainability criteria, multi-criteria decision making, sustainability dimensions
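As one concrete member of the MCDM family such reviews cover, a minimal TOPSIS ranking of hypothetical energy alternatives might look as follows; all criteria, weights, and scores are invented for illustration:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """TOPSIS: rank alternatives by relative closeness to the ideal solution.
    matrix: alternatives x criteria; benefit[j] is True if higher is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
    v = norm * weights                               # weighted normalized scores
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness in (0, 1)

# Invented criteria: cost (lower better), CO2 (lower better), jobs (higher better).
alternatives = ["solar", "wind", "coal"]
matrix = np.array([[70.0, 30, 8],
                   [60.0, 25, 6],
                   [40.0, 90, 5]])
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([False, False, True])

closeness = topsis(matrix, weights, benefit)
ranking = [alternatives[i] for i in np.argsort(-closeness)]
print(dict(zip(alternatives, closeness.round(3))), ranking)
```

This illustrates how MCDM integrates conflicting economic, environmental, and social criteria into a single ranking once weights are agreed upon, which is precisely the step where sustainability metrics matter.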
Procedia PDF Downloads 330
28317 Migration and Policy Dedication in Ethiopia: A Study of Migration of Hadiya People
Authors: Solomon Tagesse
Abstract:
This study, "Migration and Policy Dedication in Ethiopia: A Study of Migration of Hadiya People," aimed to scrutinize the reasons for the Hadiya people's migration from Ethiopia to the Republic of South Africa. A qualitative research approach was employed to address the stated problem and achieve the study's objectives. Purposive and snowball sampling techniques were used to identify the study sites and respondents. Field observations, interviews, focus group discussions, and data recording were used as data collection instruments. It was observed that lack of jobs, peer pressure, lack of interest in learning and working, scarcity, and low wages were push factors, whereas job opportunities, better income, and the presence of families, relatives, and friends were pull factors. Communication, transportation, and the existence of brokers were found to be intermediary factors. Based on the findings, recommendations and policy suggestions have been made in specific areas of the study.
Keywords: migration, reason, migrant, households
Procedia PDF Downloads 184
28316 Pregnant Women in Substance Abuse: Transition of Characteristics and Mining of Association from Teds-a 2011 to 2018
Authors: Md Tareq Ferdous Khan, Shrabanti Mazumder, MB Rao
Abstract:
Background: Substance use during pregnancy is a longstanding public health problem with severe consequences for pregnant women and fetuses. Methods: Eight datasets (2011-2018) on admissions of pregnant women were extracted from TEDS-A. Distributions of sociodemographic, substance abuse, and clinical characteristics were constructed and compared over the years, with trends assessed by the Cochran-Armitage test. Market basket analysis was used to mine associations among polysubstance abuse. Results: Over the years, admissions of pregnant women remained stable as a percentage of total and female admissions, with total annual admissions ranging from 1.54 million to about 2 million and a female share of 33.30% to 35.61%. Pregnant women aged 21-29, with 12 or more years of education, of white race, unemployed, and holding independent living status are among the most vulnerable. Concerns remain over the significant number of polysubstance users, young age at first use, frequency of daily use, and records of prior admissions (60%). Trends in abused primary substances show a significant rise in heroin (66%) and methamphetamine (46%) over the years, although the latest year shows a considerable downturn. On the other hand, significant decreasing patterns are evident for alcohol (43%), marijuana or hashish (24%), cocaine or crack (23%), other opiates or synthetics (36%), and benzodiazepines (29%). Basket analysis reveals patterns of substance co-occurrence that are consistent over the years. Conclusions: This comprehensive study can serve as a reference for identifying the most vulnerable groups based on their characteristics and for addressing the most hazardous substances based on evidence of their co-occurrence.
Keywords: basket analysis, pregnant women, substance abuse, trend analysis
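Market basket analysis rests on two quantities: the support of an itemset (the fraction of records containing it) and the confidence of a rule (how often the consequent appears given the antecedent). A minimal sketch with hypothetical admission records, each a set of substances reported at admission:

```python
# Hypothetical polysubstance admission records (not TEDS-A data).
admissions = [
    {"heroin", "cocaine"},
    {"heroin", "methamphetamine"},
    {"alcohol", "marijuana"},
    {"heroin", "cocaine", "marijuana"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in the itemset."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """P(consequent | antecedent): support of the union over support of the antecedent."""
    return (support(set(antecedent) | set(consequent), transactions)
            / support(antecedent, transactions))
```

Algorithms such as Apriori simply enumerate itemsets whose support clears a threshold before computing rule confidences; the definitions above are the measures being thresholded.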
Procedia PDF Downloads 195
28315 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning
Authors: Joseph George, Anne Kotteswara Roa
Abstract:
Skin disease is one of the most common health issues people face nowadays. Skin cancer (SC) is one such disease; its detection relies on skin biopsy outputs and the expertise of doctors, which consumes time and can yield inaccurate results. Detecting skin cancer at an early stage is a challenging task, yet the disease easily spreads to the whole body and leads to an increased mortality rate; skin cancer is curable when detected early. Correct and accurate classification depends on identifying disease features such as shape, size, color, and symmetry. Because many skin diseases share similar characteristics, selecting important features from skin cancer image datasets is a challenging issue. Hence, diagnostic accuracy can be improved by an automated skin cancer detection and classification framework, which also mitigates the scarcity of human experts. Recently, deep learning (DL) techniques such as convolutional neural networks (CNN), deep belief networks (DBN), artificial neural networks (ANN), recurrent neural networks (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measures are used to evaluate the effectiveness of SC identification using DL techniques. These DL techniques increase classification accuracy while mitigating computational complexity and time consumption.
Keywords: skin cancer, deep learning, performance measures, accuracy, datasets
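The evaluation metrics the survey relies on all derive from the binary confusion matrix (true/false positives and negatives). A minimal sketch of their definitions, with illustrative counts rather than values from any reviewed study:

```python
def metrics(tp, fp, tn, fn):
    """Standard binary classification metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                # also called sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)  # F-measure
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy, "f1": f1}

# Example: 100 lesion images, 40 correctly flagged malignant, 10 false alarms,
# 45 correctly cleared, 5 missed malignancies (illustrative numbers).
example = metrics(tp=40, fp=10, tn=45, fn=5)
```

Reporting sensitivity and specificity together matters here because a screening model with high accuracy can still miss malignant lesions if the classes are imbalanced.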
Procedia PDF Downloads 129
28314 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones
Authors: Mohamed Abdelkareem
Abstract:
Remote sensing data have contributed to predicting prospective areas for water resources. Here, microwave and multispectral data are integrated with climatic, hydrologic, and geological data. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. Image transformation of the Sentinel-2 and Landsat-8 data allowed the different varieties of rock units to be characterized. Integration of microwave remotely-sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones, which play a crucial role in mapping groundwater prospective zones. Fusing Landsat-8 OLI and ALOS/PALSAR data enhanced structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps, offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), and revealed infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach.
The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive classes according to their probability for groundwater, namely very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
Keywords: GIS, remote sensing, groundwater, Egypt
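The knowledge-driven step above is, at its core, a weighted overlay: each thematic layer is a grid of normalized scores, and a per-cell weighted sum is binned into the six GPZ classes. A minimal sketch with tiny 2x2 grids and assumed weights (the paper's actual layer weights and rasters are not given here):

```python
# Illustrative layer weights (sum to 1) and 2x2 grids of normalized scores in [0, 1].
layer_weights = {"rainfall": 0.3, "lineament_density": 0.25,
                 "slope": 0.2, "drainage_density": 0.25}

layers = {
    "rainfall":          [[0.8, 0.2], [0.6, 0.9]],
    "lineament_density": [[0.5, 0.1], [0.7, 0.8]],
    "slope":             [[0.9, 0.3], [0.4, 0.7]],
    "drainage_density":  [[0.6, 0.2], [0.5, 0.9]],
}

CLASSES = ["very low", "low", "moderate", "high", "very high", "excellent"]

def overlay(layers, weights):
    """Per-cell weighted sum across all thematic layers."""
    grid = next(iter(layers.values()))
    rows, cols = len(grid), len(grid[0])
    return [[sum(weights[k] * layers[k][r][c] for k in weights)
             for c in range(cols)] for r in range(rows)]

def classify(score):
    """Bin a composite 0-1 score into one of the six GPZ classes."""
    return CLASSES[min(int(score * len(CLASSES)), len(CLASSES) - 1)]
```

In practice the same computation runs over full rasters in a GIS; equal-width binning is one assumption here, where the study may have used natural breaks or expert-defined thresholds.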
Procedia PDF Downloads 98
28313 Comparison of Bioelectric and Biomechanical Electromyography Normalization Techniques in Disparate Populations
Authors: Drew Commandeur, Ryan Brodie, Sandra Hundza, Marc Klimstra
Abstract:
The amplitude of raw electromyography (EMG) is affected by recording conditions and often requires normalization to make meaningful comparisons. Bioelectric methods normalize with an EMG signal recorded during a standardized task or from the experimental protocol itself, while biomechanical methods often involve measurements with an additional sensor such as a force transducer. Common bioelectric normalization techniques for treadmill walking include maximum voluntary isometric contraction (MVIC), dynamic EMG peak (EMGPeak), or dynamic EMG mean (EMGMean). There are several concerns with using MVICs to normalize EMG, including poor reliability and potential discomfort. A limitation of bioelectric normalization techniques is that they may misrepresent the absolute magnitude of force generated by the muscle and affect the interpretation of EMG between functionally disparate groups. Additionally, methods that normalize to EMG recorded during the task may eliminate some real inter-individual variability due to biological variation. This study compared biomechanical and bioelectric EMG normalization techniques during treadmill walking to assess the impact of the normalization method on the functional interpretation of EMG data. For the biomechanical method, we normalized EMG to a target torque (EMGTS); the bioelectric methods normalized to the mean and peak of the signal during the walking task (EMGMean and EMGPeak). The effects of normalization on muscle activation pattern, EMG amplitude, and inter-individual variability were compared between disparate cohorts of OLD (76.6 yrs, N = 11) and YOUNG (26.6 yrs, N = 11) adults. Participants walked on a treadmill at a self-selected pace while EMG was recorded from the right lower limb.
EMG data from the soleus (SOL), medial gastrocnemius (MG), tibialis anterior (TA), vastus lateralis (VL), and biceps femoris (BF) were phase-averaged into 16 bins (phases) representing the gait cycle, with bins 1-10 associated with right stance and bins 11-16 with right swing. Pearson's correlations showed that activation patterns across the gait cycle were similar between all methods, ranging from r = 0.86 to r = 1.00 with p < 0.05. This indicates that each method can characterize the muscle activation pattern during walking. Repeated measures ANOVA showed a main effect for age in MG for EMGPeak, but no other main effects were observed. Age-by-phase interactions in EMG amplitude between YOUNG and OLD led to different statistical interpretations across methods: EMGTS normalization characterized the fewest differences (four phases across all five muscles), while EMGMean (11 phases) and EMGPeak (19 phases) showed considerably more differences between cohorts. The second notable finding was that the coefficient of variation, representing inter-individual variability, was greatest for EMGTS and lowest for EMGMean, with EMGPeak slightly higher than EMGMean for all muscles. This finding supports our expectation that EMGTS normalization would retain inter-individual variability, which may be desirable; however, it also suggests that even when large differences are expected, a larger sample size may be required to observe them. Our findings clearly indicate that the interpretation of EMG is highly dependent on the normalization method used, and it is essential to consider the strengths and limitations of each method when drawing conclusions.
Keywords: electromyography, EMG normalization, functional EMG, older adults
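The two bioelectric normalizations compared above are simple to state: divide the EMG envelope by its peak (EMGPeak) or by its mean (EMGMean) over the task. A minimal sketch with an invented phase-averaged envelope (arbitrary units, not study data):

```python
# Illustrative phase-averaged EMG envelope over one gait cycle (arbitrary units).
emg_envelope = [0.2, 0.5, 1.0, 0.8, 0.5, 0.2, 0.1, 0.3]

def normalize_to_peak(signal):
    """EMGPeak-style normalization: express each value as a fraction of the task peak."""
    peak = max(signal)
    return [v / peak for v in signal]

def normalize_to_mean(signal):
    """EMGMean-style normalization: express each value relative to the task mean."""
    mean = sum(signal) / len(signal)
    return [v / mean for v in signal]
```

Note the property driving the paper's findings: both methods rescale each participant by their own task signal, so between-participant amplitude differences are compressed, whereas normalizing to an external reference (such as the target-torque EMGTS condition) preserves them.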
Procedia PDF Downloads 91
28312 Enhancing the Dynamic Performance of Grid-Tied Inverters Using Manta Ray Foraging Algorithm
Authors: H. E. Keshta, A. A. Ali
Abstract:
Three-phase grid-tied inverters are widely employed in micro-grids (MGs) as the interface between DC and AC systems. These inverters are usually controlled through a standard decoupled d-q vector control strategy based on proportional-integral (PI) controllers. Recently, advanced meta-heuristic optimization techniques have been used instead of deterministic methods to obtain optimum PI controller parameters. This paper provides a comparative study between the performance of a PI controller tuned by the global Porcellio Scaber algorithm (GPSA) and one tuned by manta ray foraging optimization (MRFO).
Keywords: micro-grids, optimization techniques, grid-tied inverter control, PI controller
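The tuning problem reduces to minimizing a cost (here, integral of absolute error on a step response) over the PI gains. As a sketch, the toy first-order plant and random search below stand in for the inverter model and for GPSA/MRFO, whose update rules are more elaborate; everything here is an illustrative assumption.

```python
import random

def step_response_cost(kp, ki, dt=0.01, steps=500):
    """Integral of absolute error for a unit step on the toy plant y' = -y + u,
    closed around a PI controller u = kp*e + ki*integral(e)."""
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ
        y += (-y + u) * dt      # forward-Euler step of the plant
        cost += abs(e) * dt
    return cost

def tune_pi(trials=200, seed=0):
    """Random search over (kp, ki) as a stand-in for a metaheuristic optimizer."""
    rng = random.Random(seed)
    return min(((rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(trials)),
               key=lambda g: step_response_cost(*g))
```

Swapping random search for MRFO or GPSA changes only how candidate gain pairs are proposed; the cost-function evaluation loop is the same.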
Procedia PDF Downloads 132
28311 Factors That Contribute to Noise Induced Hearing Loss Amongst Employees at the Platinum Mine in Limpopo Province, South Africa
Authors: Livhuwani Muthelo, R. N. Malema, T. M. Mothiba
Abstract:
Long-term exposure to excessive noise in the mining industry increases the risk of noise-induced hearing loss (NIHL), with consequences for employees' health, productivity, and overall quality of life. Objective: The objective of this study was to investigate the factors that contribute to noise-induced hearing loss amongst employees at the Platinum mine in the Limpopo Province, South Africa. Study method: A qualitative, phenomenological, exploratory, descriptive, contextual design was applied in order to explore and describe the contributory factors. Purposive non-probability sampling was used to select 10 male employees who were diagnosed with NIHL in 2014 in four mine shafts, and 10 managers who were involved in a hearing conservation programme. The data were collected using semi-structured one-on-one interviews, and Tesch's approach to qualitative data analysis was followed. Results: The following themes emerged: experiences and challenges faced by employees in the work environment, hearing protective device factors, and management and leadership factors. Hearing loss was caused by partial application of the guidelines, policies, and procedures of the Department of Minerals and Energy. Conclusion: The results indicate that although guidelines, policies, and procedures are available, failure to implement even one element will affect the development and maintenance of employees' hearing. It is recommended that mine management apply the guidelines, policies, and procedures and promptly repair broken hearing protective devices.
Keywords: employees, factors, noise induced hearing loss, noise exposure
Procedia PDF Downloads 127
28310 Examining the Role of Farmer-Centered Participatory Action Learning in Building Sustainable Communities in Rural Haiti
Authors: Charles St. Geste, Michael Neumann, Catherine Twohig
Abstract:
Our primary aim is to examine farmer-centered participatory action learning as a tool to improve agricultural production, build resilience to climate shocks and, more broadly, advance community-driven solutions for sustainable development in rural communities across Haiti. For over six years, sixty plus farmers from Deslandes, Haiti, organized in three traditional work groups called konbits, have designed and tested low-input agroecology techniques as part of the Konbit Vanyan Kapab Pwoje Agroekoloji. The project utilizes a participatory action learning approach, emphasizing social inclusion, building on local knowledge, experiential learning, active farmer participation in trial design and evaluation, and cross-community sharing. Mixed methods were used to evaluate changes in knowledge and adoption of agroecology techniques, confidence in advancing agroecology locally, and innovation among Konbit Vanyan Kapab farmers. While skill and knowledge in application of agroecology techniques varied among individual farmers, a majority of farmers successfully adopted techniques outside of the trial farms. The use of agroecology techniques on trial and individual farms has doubled crop production in many cases. Farm income has also increased, and farmers report less damage to crops and property caused by extreme weather events. Furthermore, participatory action strategies have led to greater local self-determination and greater capacity for sustainable community development. With increased self-confidence and the knowledge and skills acquired from participating in the project, farmers prioritized sharing their successful techniques with other farmers and have developed a farmer-to-farmer training program that incorporates participatory action learning. Using adult education methods, farmers, trained as agroecology educators, are currently providing training in sustainable farming practices to farmers from five villages in three departments across Haiti. 
Konbit Vanyan Kapab farmers have also begun testing the production of value-added food products, including a dried soup mix and tea. Key factors for success include opportunities for farmers to actively participate in all phases of the project, group diversity, resources for the application of agroecology techniques, a focus on group processes, and overcoming local barriers to inclusive decision-making.
Keywords: agroecology, participatory action learning, rural Haiti, sustainable community development
Procedia PDF Downloads 156
28309 Recurrence of Pterygium after Surgery and the Effect of Surgical Technique on the Recurrence of Pterygium in Patients with Pterygium
Authors: Luksanaporn Krungkraipetch
Abstract:
A pterygium is an ocular surface lesion that begins in the limbal conjunctiva and progresses onto the cornea. The lesion is more common at the nasal limbus than the temporal and has a distinctive wing-like appearance. Indications for surgery, in decreasing order of significance, are growth over the corneal center, decreased vision due to corneal deformation, documented growth, sensations of discomfort, and aesthetic concerns. Recurrent pterygium results in lost time, the expense of therapy, and the potential for vision impairment. The objective of this study is to determine how often pterygium recurs after surgery, what effect the surgical technique has, and what causes recurrence in patients with pterygium. Materials and Methods: This retrospective observational case-control study analyzed 164 patient records. Descriptive statistics summarized the details of pterygium surgery and the risk of recurrent pterygium; for factor analysis, odds ratios (OR) with 95% confidence intervals (CI) and ANOVA were used, with p < 0.05 deemed statistically significant. Results: The majority of patients were female (60.4%). Twenty-four of the 164 (14.6%) patients who underwent surgery exhibited recurrent pterygium. The average age was 55.33 years. Postoperative recurrence was reported in 19 cases (79.2%) of bare sclera techniques and five cases (20.8%) of conjunctival autograft techniques. The mean recurrence interval was 10.25 months, most commonly (54.17%) at 12 months. Follow-up was completed in 91.67% of cases. The most common recurrence grade was 1 (25%), and the most common surgical complication was subconjunctival hemorrhage (33.33%). Comparison of the surgical procedures among patients with recurrent pterygium showed no significant difference (F = 1.13, p = 0.339).
Age significantly affected the recurrence of pterygium (OR = 20.78; 95% CI, 6.79-63.56; p < 0.001). Conclusion: This study found a 14.6% rate of pterygium recurrence after pterygium surgery. Across all surgeries and patients, the rate of recurrence was roughly four times higher with the bare sclera method than with conjunctival autograft. The researchers advise selecting a more conventional surgical technique to avoid recurrence.
Keywords: pterygium, recurrence pterygium, pterygium surgery, excision pterygium
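An odds ratio with a 95% CI like the one reported above comes from a 2x2 table of exposure versus outcome; the Woolf (log) method is the standard way to get the interval. The cell counts below are invented for illustration, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical table: 20 older patients with recurrence, 10 without;
# 5 younger patients with recurrence, 25 without.
result = odds_ratio_ci(20, 10, 5, 25)
```

A CI that excludes 1 (as here, and as in the study's 6.79-63.56) is what marks the factor as statistically significant.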
Procedia PDF Downloads 88
28308 Identification and Classification of Medicinal Plants of Indian Himalayan Region Using Hyperspectral Remote Sensing and Machine Learning Techniques
Authors: Kishor Chandra Kandpal, Amit Kumar
Abstract:
The Indian Himalaya region harbours approximately 1748 plants of medicinal importance, of which 112 species are threatened or endangered according to the International Union for Conservation of Nature (IUCN). To ease the pressure on these plants, the government of India is encouraging their in-situ cultivation. Saussurea costus, Valeriana jatamansi, and Picrorhiza kurroa have been prioritized for large-scale cultivation owing to their market demand, conservation value, and medicinal properties. These species are found at elevations from 1000 m to 4000 m in the Indian Himalaya. Identifying these plants in the field requires taxonomic skills, which is one of the major bottlenecks in their conservation and management. In recent years, hyperspectral remote sensing techniques have been used to precisely discriminate plant species with the help of their unique spectral signatures. Against this background, a spectral library of the above three medicinal plants was prepared by collecting spectral data with a handheld spectroradiometer (325 to 1075 nm) from farmers' fields in the Himachal Pradesh and Uttarakhand states of the Indian Himalaya. A random forest (RF) model was applied to the spectral data for classification of the medicinal plants. The standard 80:20 split was followed for training and validation of the RF model, which resulted in a training accuracy of 84.39% (kappa coefficient = 0.72) and a testing accuracy of 85.29% (kappa coefficient = 0.77). The RF classifier identified the green (555 to 598 nm), red (605 nm), and near-infrared (725 to 840 nm) wavelength regions as suitable for discriminating these species. The findings of this study provide a technique for rapid, on-site identification of these medicinal plants in the field. They will also be a key input for classifying hyperspectral remote sensing images to map these species in farmers' fields on a regional scale.
This is a pioneer study on medicinal plants in the Indian Himalaya region in which the applicability of hyperspectral remote sensing has been explored.
Keywords: Himalaya, hyperspectral remote sensing, machine learning, medicinal plants, random forests
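Two mechanical pieces of the evaluation above are easy to sketch without a machine-learning library: the 80:20 train/validation split and the kappa coefficient, which corrects observed agreement for the agreement expected by chance. The toy labels below are illustrative, not the spectral dataset.

```python
from collections import Counter

def split_80_20(samples, labels):
    """Standard 80:20 split: first 80% for training, remaining 20% for testing.
    (Assumes the data are already shuffled.)"""
    cut = int(0.8 * len(samples))
    return (samples[:cut], labels[:cut]), (samples[cut:], labels[cut:])

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(y_true)
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts, pred_counts = Counter(y_true), Counter(y_pred)
    pe = sum(true_counts[c] * pred_counts[c] for c in true_counts) / (n * n)
    return (po - pe) / (1 - pe)
```

Reporting kappa alongside accuracy, as the study does (0.72 and 0.77), guards against inflated accuracy when one species dominates the samples.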
Procedia PDF Downloads 203
28307 Recovery of Au and Other Metals from Old Electronic Components by Leaching and Liquid Extraction Process
Authors: Tomasz Smolinski, Irena Herdzik-Koniecko, Marta Pyszynska, M. Rogowski
Abstract:
Old electronic components are easy to find nowadays. Significant quantities of valuable metals such as gold, silver, and copper are used in the production of advanced electronic devices, and old, useless electronic devices have slowly become a new source of precious metals, very often more efficient than natural ores. For example, it is possible to recover more gold from one ton of personal computers than from seventeen tons of gold ore. This makes the urban mining industry profitable and necessary for sustainable development. For the recovery of metals from waste electronic equipment, various treatment options based on conventional physical, hydrometallurgical, and pyrometallurgical processes are available. Among these, hydrometallurgical processes, with their relatively low capital cost, low environmental impact, potential for high metal recoveries, and suitability for small-scale applications, are very promising options. The Institute of Nuclear Chemistry and Technology has extensive experience in hydrometallurgical processes, especially the recovery of metals from industrial and agricultural wastes, and an urban mining project is currently being carried out. A method for the effective recovery of valuable metals from central processing unit (CPU) components has been developed. The principal processes of acidic leaching and solvent extraction were used for precious metal recovery from old processors and graphics cards. Electronic components were treated with acidic solution under various conditions, and the optimal acid concentration, process time, and temperature were selected. Precious metals were extracted into the aqueous phase and, in the next step, selectively extracted by organic solvents such as oximes or tributyl phosphate (TBP). Multistage mixer-settler equipment was used, and the process was optimized.
Keywords: electronic waste, leaching, hydrometallurgy, metal recovery, solvent extraction
Procedia PDF Downloads 137
28306 Public Behavior When Encountered with a Road Traffic Accident
Authors: H. N. S. Silva, S. N. Silva
Abstract:
Introduction: The latest WHO data, published in 2014, state that Sri Lanka has reached 2,773 total deaths and over 14,000 injuries due to road traffic accidents (RTAs) each year. Previous studies noted that policemen, three-wheel drivers, and pedestrians were the first to respond to RTAs, but victims' conditions were aggravated by the responders' unskilled attempts at wound management, moving and positioning of the victims, and, above all, transportation of the victims. Objective: To observe the practices of the urban public in Sri Lanka when encountered with RTAs. Methods: A qualitative study analyzed public behavior seen in video recordings of accident scenes purposefully selected from social media, news websites, YouTube, and Google. Results: All individuals who tried to help during the RTAs were middle-aged men, mainly pedestrians, motorcyclists, and policemen at the scene. The vast majority were keen to get the victims to hospital as soon as possible and actively participated in providing 'aid', but the first aid attempts were disorganized and uncoordinated. Although all individuals knew how to control external bleeding, none were aware of spinal injury prevention techniques or the management of limb injuries. Most of the transportation methods and transfer techniques used were inappropriate and prone to causing further injury. Conclusions: The public actively engages in providing aid despite inappropriate first aid practices.
Keywords: encountered, pedestrians, road traffic accidents, urban public
Procedia PDF Downloads 286