Search results for: weather classification
969 Diagnosis of Alzheimer Diseases in Early Step Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain. It causes degeneration of nerve cells (neurons), in particular the cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's Disease (AD) raises ethical questions, since there is, at present, no cure to offer patients, and the medicines emerging from therapeutic trials appear to slow the progression of the disease only moderately, sometimes with severe accompanying side effects. In this context, the analysis of medical images has become an essential tool for clinical applications because it provides effective assistance both at diagnosis and during therapeutic follow-up. Computer Assisted Diagnostic (CAD) systems are one of the possible solutions for managing these images efficiently. In our work, we propose an application to detect Alzheimer's disease. To detect the disease at an early stage, we use three anatomical sections: frontal, to extract the hippocampus (H); sagittal, to analyze the corpus callosum (CC); and axial, to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields 90.66% accuracy in the early diagnosis of AD.
Keywords: Alzheimer Diseases (AD), Computer Assisted Diagnostic (CAD), hippocampus, Corpus Callosum (CC), cortex, Support Vector Machine (SVM)
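As a rough illustration of the classification step described in this abstract, the sketch below trains an SVM on pre-extracted region features. The feature layout, synthetic data, and RBF kernel are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of SVM-based classification of pre-extracted MRI region
# features (synthetic data; not the authors' actual feature pipeline).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Assumed feature vector: [hippocampus descriptor, corpus callosum descriptor, cortex descriptor]
X = rng.normal(size=(150, 3))
y = rng.integers(0, 2, size=150)  # 0 = control, 1 = early AD (synthetic labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated accuracy
print(f"mean CV accuracy: {scores.mean():.3f}")
```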
Procedia PDF Downloads 384
968 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). If datasets become imbalanced, which is the usual case for communications networks, the performance tends to become worse. The complexity of reducing the dimensionality of the feature sets to increase performance is also a major problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that the high-dimensional information in intermediate features, which is not utilized as much for classification tasks in ML/DL algorithms, is the main contributor to the improved, state-of-the-art performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, the Mahalanobis distance (MD) offers uniform results for precision, recall, and f₁-score on the unbalanced and sparse NSL-KDD datasets.
Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
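A minimal sketch of a Mahalanobis-distance binary classifier is given below: each class is summarized by its mean and covariance, and a sample is assigned to the class with the smaller Mahalanobis distance. The synthetic, imbalanced data and the simple per-class covariance estimate are assumptions; the paper's exact formulation and NSL-KDD preprocessing are not reproduced here.

```python
# Sketch of a Mahalanobis-distance binary classifier (synthetic data).
import numpy as np

def fit_class(X):
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized covariance
    return mu, np.linalg.inv(cov)

def mahalanobis(x, mu, inv_cov):
    d = x - mu
    return np.sqrt(d @ inv_cov @ d)

rng = np.random.default_rng(1)
X_benign = rng.normal(0.0, 1.0, size=(500, 10))   # majority class
X_attack = rng.normal(2.0, 1.5, size=(50, 10))    # minority (imbalanced) class

params = [fit_class(X_benign), fit_class(X_attack)]

def predict(x):
    dists = [mahalanobis(x, mu, inv) for mu, inv in params]
    return int(np.argmin(dists))  # 0 = benign, 1 = attack

test = rng.normal(2.0, 1.5, size=10)
print("predicted class:", predict(test))
```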
Procedia PDF Downloads 78
967 Evaluation of Features Extraction Algorithms for a Real-Time Isolated Word Recognition System
Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas
Abstract:
This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on an FPGA. Mel-frequency cepstral, linear frequency cepstral, linear predictive, and linear predictive cepstral coefficients were implemented in a hardware/software design. The proposed system was investigated in the speaker-dependent mode for 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing speech records at different signal-to-noise ratios. The experiments on clean records show the highest accuracy for Mel-frequency cepstral and linear frequency cepstral coefficients. For records with a 15 dB signal-to-noise ratio, the linear predictive cepstral coefficients give the best results. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For classification, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies real-time requirements and is suitable for applications in embedded systems.
Keywords: isolated word recognition, features extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW
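The software side of such a pipeline can be sketched as MFCC extraction followed by a dynamic time warping (DTW) match. The file names and parameter values below are placeholders, and the FPGA implementation described in the paper obviously differs.

```python
# Sketch: MFCC feature extraction plus a naive DTW distance for isolated-word
# matching (placeholder file names; parameters are illustrative only).
import numpy as np
import librosa

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=16000)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T  # frames x coefficients

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

ref = mfcc_features("reference_word.wav")   # placeholder recordings
test = mfcc_features("test_word.wav")
print("DTW distance:", dtw_distance(ref, test))
```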
Procedia PDF Downloads 496
966 Formation Flying Design Applied for an Aurora Borealis Monitoring Mission
Authors: Thais Cardoso Franco, Caio Nahuel Sousa Fagonde, Willer Gomes dos Santos
Abstract:
Aurora Borealis is an optical phenomenon consisting of luminous events observed in the night skies of the polar regions. It results from disturbances in the magnetosphere caused by the impact of solar wind particles on the Earth's upper atmosphere: the particles, channeled by the Earth's magnetic field, excite atmospheric molecules, which then emit light across the electromagnetic spectrum and produce the display of lights in the sky. However, several implications of this phenomenon are still under study: high-intensity auroras are often accompanied by geomagnetic storms that cause blackouts on Earth and impair the transmission of signals from the Global Navigation Satellite Systems (GNSS). Auroras are also known to occur on other planets and exoplanets, so the activity is an indication of active space weather conditions that can aid in learning about the planetary environment. In order to improve understanding of the phenomenon, this research aims to design a satellite formation flying solution for collecting and transmitting data for monitoring the aurora borealis in the northern hemisphere, an approach that allows the event to be studied with multipoint data collection over a reduced time interval, so that it can be analyzed from the onset of the phenomenon until its decline. To this end, the ideal number of satellites, the spacing between them, and the ideal topology will be analyzed. Based on an orbital study, approaches with different altitudes, eccentricities, and inclinations will also be considered. Given that controllers tend to fail at large relative distances between satellites in formation, the efficiency of nonlinear adaptive control methods will be studied from the point of view of position maintenance and propellant consumption. The main orbital perturbations considered in the simulation are the non-homogeneity of the Earth's gravitational field, atmospheric drag, the gravitational action of the Sun and the Moon, accelerations due to solar radiation pressure, and relativistic effects.
Keywords: formation flying, nonlinear adaptive control method, aurora borealis, adaptive SDRE method
Procedia PDF Downloads 39
965 Blind Channel Estimation for Frequency Hopping System Using Subspace Based Method
Authors: M. M. Qasaymeh, M. A. Khodeir
Abstract:
Subspace channel estimation methods have been studied widely. They depend on a subspace decomposition of the covariance matrix to separate the signal subspace from the noise subspace. The decomposition is normally done by either an Eigenvalue Decomposition (EVD) or a Singular Value Decomposition (SVD) of the Auto-Correlation Matrix (ACM). However, the subspace decomposition process is computationally expensive. In this paper, the multipath channel estimation problem for a Slow Frequency Hopping (SFH) system is considered using a noise-subspace-based method. An efficient method to estimate the multipath time delays is proposed by applying the MUltiple SIgnal Classification (MUSIC) algorithm to the null space extracted by the Rank-Revealing LU (RRLU) factorization. The RRLU provides accurate information about the rank and the numerical null space, which makes it a valuable tool in numerical linear algebra. The proposed method decreases the computational complexity to approximately half that of RRQR-based methods while keeping the same performance. Computer simulations are included to demonstrate the effectiveness of the proposed scheme.
Keywords: frequency hopping, channel model, time delay estimation, RRLU, RRQR, MUSIC, LS-ESPRIT
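For readers unfamiliar with the subspace step, the sketch below shows the conventional EVD-based MUSIC delay search that the paper's RRLU variant is meant to accelerate. The signal model, delay grid, and noise level are illustrative assumptions, not the paper's SFH setup.

```python
# Sketch of conventional (EVD-based) MUSIC for time-delay estimation.
# The paper replaces the EVD with an RRLU factorization; this toy model
# (two delays, white noise) is only an illustrative assumption.
import numpy as np

N = 64                       # number of frequency bins (e.g., hop frequencies)
true_delays = [3e-6, 7e-6]   # seconds
f = np.arange(N) * 1e5       # frequency grid (Hz)

rng = np.random.default_rng(2)
snapshots = []
for _ in range(200):
    h = sum(np.exp(-2j * np.pi * f * tau) for tau in true_delays)
    snapshots.append(h + 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N)))
X = np.array(snapshots).T                      # N x snapshots
R = X @ X.conj().T / X.shape[1]                # autocorrelation matrix

eigvals, eigvecs = np.linalg.eigh(R)           # ascending eigenvalues
En = eigvecs[:, :-len(true_delays)]            # noise subspace

taus = np.linspace(0, 10e-6, 1000)
spec = np.array([1.0 / np.real(np.exp(-2j * np.pi * f * t).conj()
                               @ En @ En.conj().T
                               @ np.exp(-2j * np.pi * f * t)) for t in taus])
# Pick the two largest local maxima of the pseudospectrum.
peaks = [i for i in range(1, len(spec) - 1) if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
est = taus[sorted(peaks, key=lambda i: spec[i], reverse=True)[:2]]
print("estimated delays (s):", np.sort(est))
```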
Procedia PDF Downloads 410
964 A Different Approach to Smart Phone-Based Wheat Disease Detection System Using Deep Learning for Ethiopia
Authors: Nathenal Thomas Lambamo
Abstract:
More than 85% of the labor force and 90% of export earnings in Ethiopia come from agriculture, so it can be said to be the backbone of the country's overall socio-economic activity. Among the cereal crops that the agricultural sector provides, wheat ranks third, after teff and maize. Today, wheat is in higher demand owing to the expansion of industries that use it as the main ingredient for their products. The local supply of wheat covers only 35 to 40% of this demand, and the remaining 60 to 65% is imported, which drains the country's foreign currency reserves. These facts show that the need for this crop is very high while, conversely, its productivity is very low. Wheat disease is the most devastating factor contributing to this imbalance between demand and supply: it reduces both the yield and the quality of the crop by 27% on average and by up to 37% in severe cases. This study aims to detect the most frequent and damaging wheat diseases, Septoria and leaf rust, using deep learning, the most effective subset of machine learning technology. A state-of-the-art deep learning classification technique, the Convolutional Neural Network (CNN), was used to detect the diseases, and an accuracy of 99.01% was achieved.
Keywords: septoria, leaf rust, deep learning, CNN
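A compact sketch of a CNN classifier of the kind described is shown below, using Keras. The input size, layer sizes, and three-class output (healthy, Septoria, leaf rust) are assumptions, not the authors' exact architecture.

```python
# Sketch of a small CNN for leaf-image classification (assumed architecture,
# assumed 3 classes: healthy, Septoria, leaf rust).
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would then use something like:
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```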
Procedia PDF Downloads 76
963 Epileptic Seizure Onset Detection via Energy and Neural Synchronization Decision Fusion
Authors: Marwa Qaraqe, Muhammad Ismail, Erchin Serpedin
Abstract:
This paper presents a novel architecture for a patient-specific epileptic seizure onset detector using scalp electroencephalography (EEG). The proposed architecture is based on the fusion of decisions derived from energy-related and neural-synchronization-related features. Specifically, one level of the detector calculates the condition number (CN) of an EEG matrix to evaluate the amount of neural synchronization present within the EEG channels. On a parallel level, the detector evaluates the energy contained in four EEG frequency subbands. The information is then fed into two independent (parallel) classification units based on support vector machines to determine the onset of a seizure event. The decisions from the two classifiers are then combined according to two fusion techniques to determine a global decision. Experimental results demonstrate that the detector based on the AND fusion technique outperforms existing detectors, with a sensitivity of 100% and a detection latency of 3 seconds, while achieving a false alarm rate of 2.76 per hour. The OR fusion technique achieves a sensitivity of 100% and significantly improves the detection latency (0.17 seconds), yet it produces 12 false alarms per hour.
Keywords: epilepsy, EEG, seizure onset, electroencephalography, neuron, detection
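The fusion step can be illustrated with a short sketch: two independent binary classifiers (stand-ins for the energy-based and condition-number-based units) produce per-epoch decisions that are combined with AND or OR logic. The synthetic features and default SVM settings are assumptions.

```python
# Sketch of AND/OR decision fusion of two parallel binary detectors
# (synthetic stand-ins for the energy-based and condition-number-based units).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=400)                          # 1 = seizure epoch (synthetic)
X_energy = y[:, None] * 1.5 + rng.normal(size=(400, 4))   # assumed sub-band energies
X_sync = y[:, None] * 1.0 + rng.normal(size=(400, 1))     # assumed condition-number feature

clf_energy = SVC().fit(X_energy[:300], y[:300])
clf_sync = SVC().fit(X_sync[:300], y[:300])

d1 = clf_energy.predict(X_energy[300:])
d2 = clf_sync.predict(X_sync[300:])
and_fused = d1 & d2          # conservative: fewer false alarms, longer latency
or_fused = d1 | d2           # sensitive: lower latency, more false alarms
print("AND positives:", and_fused.sum(), "OR positives:", or_fused.sum())
```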
Procedia PDF Downloads 478
962 Resurgence of Influenza A (H1N1) Pdm09 during November 2015 - February 2016, Pakistan
Authors: Nazish Badar
Abstract:
Background: To investigate the resurgent epidemic wave of influenza A(H1N1)pdm09 infections during the 2015-16 influenza season (November 2015 - February 2016), we compared the epidemiological features of influenza A(H1N1)pdm09-associated hospitalizations and deaths during this period in Pakistan. Methods: Respiratory samples were tested using CDC real-time RT-PCR protocols. Demographic and epidemiological data were analyzed using SPSS. Risk ratios were calculated between age groups to compare patients who were hospitalized or died due to influenza A(H1N1)pdm09 during this period. Results: A total of 1970 specimens were analyzed; influenza virus was detected in 494 (25%) samples, including 458 (93%) influenza type A and 36 (7%) influenza type B viruses. Among the influenza A viruses, 351 (77%) were A(H1N1)pdm09 and 107 (23%) were A/H3N2. Influenza A(H1N1)pdm09 peaked in January 2016, when 250 (54%) of the tested patients were positive. The resurgent wave increased hospitalizations due to A(H1N1)pdm09 compared with the rest of the year. Overall, 267 (76%) A(H1N1)pdm09 cases were hospitalized. Adults ≥18 years showed the highest relative risk of hospitalization (1.2). The median interval between symptom onset and hospitalization was five days for all age groups. During this period, a total of 34 laboratory-confirmed deaths associated with pandemic influenza A(H1N1) were reported out of the 1970 cases; the case fatality rate was 1.72%. The male-to-female ratio among reported deaths was 2:1. The majority of the deaths during this period occurred in adults ≥18 years of age. The overall median age of the fatal cases was 42.8 years, with underlying medical conditions. The median interval from symptom onset was two days. The diagnosis upon admission in influenza-associated fatal cases was pneumonia in 53% and Acute Respiratory Distress Syndrome in 9 (26%), eight of whom (88%) required mechanical ventilation. Conclusions: The present resurgence of the pandemic virus cannot be attributed to a single factor. The prolonged cold and dry weather, the possibility of drift in the virus, and the absence of annual flu vaccination may have played an integrated role in the resurfacing of the pandemic virus.
Keywords: influenza A (H1N1)pdm 09, resurgence, epidemiology, Pakistan
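As a reminder of how the reported risk ratios are obtained, the sketch below computes a relative risk from a 2x2 table. The counts are invented for illustration only and are not the study's data.

```python
# Sketch: relative risk (risk ratio) from a 2x2 table (invented counts).
def relative_risk(exposed_events, exposed_total, unexposed_events, unexposed_total):
    risk_exposed = exposed_events / exposed_total       # risk in the exposed group
    risk_unexposed = unexposed_events / unexposed_total  # risk in the comparison group
    return risk_exposed / risk_unexposed

# e.g., hospitalization risk in adults >=18 years vs. children (hypothetical counts)
print(round(relative_risk(200, 260, 67, 104), 2))
```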
Procedia PDF Downloads 197
961 Theorising Chinese as a Foreign Language Curriculum Justice in the Australian School Context
Authors: Wen Xu
Abstract:
The expansion of Confucius Institutes and Chinese as a Foreign Language (CFL) education is often considered, in Western public opinion, a cultural invasion and part of a much bigger, if not ambitious, agenda of the Chinese central government. The CFL knowledge and teaching practices inherent in textbooks are also harshly critiqued as failing to align with Western educational principles. This paper takes up these concerns and attempts to articulate that Confucius's idea of 'education without discrimination' appears to have become synonymous with the social justice touted in contemporary Australian education and policy discourses. To do so, it capitalises on Bernstein's conceptualisation of classification and pedagogic rights to articulate the CFL curriculum's potential for drawing in and drawing out curriculum boundaries to achieve educational justice. In this way, the potentially useful knowledge of CFL constitutes a worthwhile tool for engaging with the education issues of a peripheral Western country, as well as for including disenfranchised students in multicultural Australian society. The paper opens spaces for critically theorising CFL curricular justice in Australian educational contexts and makes an original contribution to the scholarly argument that the CFL curriculum has the potential to include socially and economically disenfranchised students in schooling.
Keywords: curriculum justice, Chinese as a Foreign Language curriculum, Bernstein, equity
Procedia PDF Downloads 144
960 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies
Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan
Abstract:
The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, especially if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges tourists face in developing countries when trying to make the best decision about the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search for places of interest based on their requested travel time. The service was designed with a three-tier architecture consisting of data, logical processing, and presentation tiers. For implementation, open-source software, client- and server-side programming languages (such as OpenLayers 2, AJAX, and PHP), GeoServer as the map server, and the Web Feature Service (WFS) standard have been used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. The local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place by province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology means that current tourists participate in a free data collection and sharing process for future tourists, data are shared and accessed by all in real time, a blind selection of travel destination is avoided, and, significantly, the cost of providing such services decreases.
Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping
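To make the WFS layer concrete, the sketch below issues a standard WFS GetFeature request from Python. The GeoServer URL, layer name, and attribute names are placeholders, since the paper's actual service details are not given.

```python
# Sketch: querying volunteer-submitted features from GeoServer through the
# standard WFS GetFeature operation (placeholder URL, layer, and attributes).
import requests

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "tourism:volunteer_places",          # hypothetical layer name
    "outputFormat": "application/json",
    "CQL_FILTER": "province='ExampleProvince' AND month=7",  # hypothetical attributes
}
resp = requests.get("http://example.org/geoserver/wfs", params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    print(feature["id"], feature["properties"])
```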
Procedia PDF Downloads 98
959 The Initiation of Privatization, Market Structure, and Free Entry with Vertically Related Markets
Authors: Hung-Yi Chen, Shih-Jye Wu
Abstract:
The existing literature provides little discussion of why a public monopolist gives up its dominant market position and allows private firms to enter the market. We argue that the privatization of a public monopolist in a vertically related market may induce the entry of private firms. We develop a mixed-oligopoly model with vertically related markets to explain the change from a public monopoly to a mixed oligopoly and examine the privatization of the downstream public enterprise in both the short run and the long run. We first show that the welfare-maximizing public monopoly is suboptimal in the vertically related markets, because privatization reduces the input price charged by the upstream foreign monopolist. Second, privatization induces the entry of private firms, since the input price decreases after privatization. Third, we demonstrate that complete privatization of the public firm becomes a possible solution if the entry cost of private firms is low. Finally, we indicate that the public firm should be partially privatized if free entry of private firms is allowed. JEL classification: F12, F14, L32, L33
Keywords: free entry, mixed oligopoly, public monopoly, the initiation of privatization, vertically related markets
Procedia PDF Downloads 137
958 A New DIDS Design Based on a Combination Feature Selection Approach
Authors: Adel Sabry Eesa, Adnan Mohsin Abdulazeez Brifcani, Zeynep Orman
Abstract:
Feature selection has been used in many fields, such as classification, data mining, and object recognition, and has proven effective for removing irrelevant and redundant features from the original data set. In this paper, a new design of a distributed intrusion detection system is proposed, using a combination feature selection model based on the bees algorithm and a decision tree. The bees algorithm is used as the search strategy to find the optimal subset of features, whereas the decision tree is used to judge the selected features. Both the produced features and the generated rules are used by a Decision-Making Mobile Agent to decide whether or not there is an attack in the network. The Decision-Making Mobile Agent migrates through the network, moving from one node to another; if it finds an attack on one of the nodes, it alerts the user through the User Interface Agent or takes action through the Action Mobile Agent. The KDD Cup 99 data set is used to test the effectiveness of the proposed system. The results show that even when only four features are used, the proposed system performs better than when all 41 features are used.
Keywords: distributed intrusion detection system, mobile agent, feature selection, bees algorithm, decision tree
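The wrapper idea can be sketched as follows: candidate feature subsets are scored by a decision tree, and a bees-algorithm-style search keeps the best subsets while exploring their neighborhoods. This is a heavily simplified stand-in for the paper's method, with synthetic data in place of KDD Cup 99.

```python
# Simplified bees-algorithm-style wrapper feature selection scored by a
# decision tree (synthetic data stands in for KDD Cup 99).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
X, y = make_classification(n_samples=600, n_features=41, n_informative=6, random_state=0)

def score(subset):
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, list(subset)], y, cv=3).mean()

def random_subset(k=4):
    return tuple(sorted(rng.choice(41, size=k, replace=False)))

def neighbor(subset):
    s = list(subset)
    s[rng.integers(len(s))] = rng.integers(41)   # swap one feature for a random one
    return tuple(sorted(set(s)))

# Scout bees explore random subsets, then forager bees search around the best sites.
sites = sorted({random_subset() for _ in range(20)}, key=score, reverse=True)[:5]
best = sites[0]
for _ in range(10):                              # illustrative number of iterations
    for site in sites:
        cand = neighbor(site)
        if score(cand) > score(best):
            best = cand
print("selected features:", best, "CV accuracy:", round(score(best), 3))
```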
Procedia PDF Downloads 408
957 Integrating Insulated Concrete Form (ICF) with Solar-Driven Reverse Osmosis Desalination for Building Integrated Energy Storage in Cold Climates
Authors: Amirhossein Eisapour, Mohammad Emamjome Kashan, Alan S. Fung
Abstract:
This research addresses the pressing global challenges of clean energy and water supply, emphasizing the need for sustainable solutions in the building sector. The research centers on integrating Reverse Osmosis (RO) systems with building energy systems, incorporating Solar Thermal Collectors (STC)/Photovoltaic Thermal (PVT) collectors, water-to-water heat pumps, and an Insulated Concrete Form (ICF) based building foundation wall as thermal energy storage. The study explores the effectiveness of an innovative configuration in meeting water and heating demands from clean energy sources while addressing the challenge that ICF-based thermal storage can overheat in the cooling season. Analyzing four configurations (STC-ICF, STC-ICF-RO, PVT-ICF, and PVT-ICF-RO), the study conducts a sensitivity analysis on collector area (25% and 50% increases) and weather data (evaluating five Canadian cities: Winnipeg, Toronto, Edmonton, Halifax, and Vancouver). Key outcomes highlight the benefits of the integrated RO scenarios, showing reduced ICF wall temperature, diminished unwanted heat in the cooling season, reduced RO pump consumption, and enhanced solar energy production. The STC-ICF-RO and PVT-ICF-RO systems achieved energy savings of 653 kWh and 131 kWh, respectively, in comparison to their non-integrated RO counterparts. Additionally, both systems successfully contributed to lowering the CO2 emissions of the energy system. The calculated payback period of the STC-ICF-RO (2 years) affirms the proposed systems' economic viability. Compared to the base system, which does not benefit from ICF and RO integration with the building energy system, the STC-ICF-RO and PVT-ICF-RO demonstrate dramatic energy consumption reductions of 20% and 32%, respectively. The sensitivity analysis suggests potential system improvements under specific conditions, especially when implementing the introduced energy system in communities of buildings.
Keywords: insulated concrete form, thermal energy storage, reverse osmosis, building energy systems, solar thermal collector, photovoltaic thermal, heat pump
Procedia PDF Downloads 54
956 The Determinants of Country Corruption: Unobserved Heterogeneity and Individual Choice- An empirical Application with Finite Mixture Models
Authors: Alessandra Marcelletti, Giovanni Trovato
Abstract:
Corruption in public office is found to be a reflection of country-specific features; however, the exact magnitude and statistical significance of the effects of its determinants have not yet been identified. This paper proposes an estimation method to measure the impact of country fundamentals on corruption, showing that covariates can affect the extent of corruption differently across countries. We exploit a model able to take into account the different factors affecting the incentive to ask for, or to be asked for, a bribe, consistent with the use of the Corruption Perception Index. We assume that the discordant results in the literature may be explained by omitted hidden factors affecting the agents' decision process. Moreover, assuming a homogeneous effect of the covariates may lead to unreliable conclusions, since the country-specific environment is not accounted for. We apply a Finite Mixture Model with concomitant variables to 129 countries from 1995 to 2006, accounting for the impact of the initial conditions of the socio-economic structure on corruption patterns. Our findings confirm the hypothesis that the decision process of accepting or asking for a bribe varies with specific country fundamentals.
Keywords: corruption, finite mixture models, concomitant variables, countries classification
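As a loose illustration of the mixture idea only, the sketch below groups countries into latent classes from a few indicators with a Gaussian mixture; the concomitant-variable extension used in the paper is not supported by scikit-learn's GaussianMixture, and the indicator names and data are invented.

```python
# Loose illustration of a finite mixture (latent class) model over country
# indicators; concomitant variables are NOT handled here, and the data are
# invented, so this is only a conceptual stand-in for the paper's model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Hypothetical columns: log GDP per capita, schooling years, corruption score.
countries = np.vstack([
    rng.normal([8.0, 5.0, 3.0], 0.5, size=(60, 3)),    # one latent group
    rng.normal([10.5, 11.0, 7.0], 0.5, size=(69, 3)),  # another latent group
])
gmm = GaussianMixture(n_components=2, random_state=0).fit(countries)
labels = gmm.predict(countries)
print("countries per latent class:", np.bincount(labels))
```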
Procedia PDF Downloads 264
955 Accounting for Cryptocurrency: Urgent Need for an Accounting Standard
Authors: Fatima Ali Abbass, Hassan Ibrahim Rkein
Abstract:
The number of entities worldwide that accept digital currency as payment is increasing; however, digital currency is still not widely accepted as a medium of exchange, nor does it represent legal tender. As a result, accounting for cryptocurrency as cash (currency) is not possible under IAS 7 and IAS 32, and cryptocurrency also cannot be accounted for as a financial asset at fair value through profit or loss under IFRS 9. Therefore, this paper studies the possible ways to account for cryptocurrency, since, as of today, there is no accounting standard that deals with it. Requests for a specific accounting standard are increasing from top accounting firms and professional accounting bodies. This study uses a mixture of qualitative and quantitative analysis in its quest to explore the best possible way to account for cryptocurrency. Interviews and surveys were conducted targeting accounting professionals. The study highlights the deficiencies of the current practice of accounting for cryptocurrency as an intangible asset with an indefinite life. The deficiency is clearly visible because the asset then becomes subject to impairment: under GAAP, only declines in the value of the intangible asset are recognized, while appreciation in its value is ignored, which prevents the reporting entity from showing the true value of the cryptocurrency asset. This research highlights the gap that arises from using accounting standards that are not specific to cryptocurrency and confirms that there is an urgent need to call upon the accounting standard setters (IASB and FASB) to issue accounting standards specifically for cryptocurrency.
Keywords: cryptocurrency, accounting, IFRS, GAAP, classification, measurement
Procedia PDF Downloads 96
954 Adaptation Nature-Based Solutions: CBA of Woodlands for Flood Risk Management in the Aire Catchment, UK
Authors: Olivia R. Rendon
Abstract:
More than half of the world's population lives in cities; in the UK, for example, 82% of the population was urban by 2013. Cities concentrate valuable and extensive infrastructure and key sectors of national economies, and they are particularly vulnerable to climate change, which will lead to higher damage costs in the future. There is thus a need to develop and invest in adaptation measures for cities to reduce the impact of flooding and other extreme weather events. Recent flood episodes present a significant and growing challenge to the UK, and the estimated cost of urban flood damage is 270 million a year for England and Wales. This study aims to carry out a cost-benefit analysis (CBA) of a nature-based approach to flood risk management in cities, focusing on the city of Leeds and the wider Aire catchment as a case study. Leeds was chosen because it is one of the most flood-vulnerable cities in the UK. In Leeds, over 4,500 properties are currently vulnerable to flooding, and approximately £450 million of direct damage is estimated for a potential major flood from the River Aire. Leeds is also the second largest Metropolitan District in England, with a projected population of 770,000 for 2014. So far, the city council has mainly focused its flood risk management efforts on hard infrastructure solutions for the city centre. However, the wider Leeds district is at significant flood risk and could benefit from greener adaptation measures. This study presents estimates for a nature-based adaptation approach to flood risk management in Leeds. The land-use management estimate is based on costings generated from primary and secondary data. This research contributes findings on the costs of different adaptation measures for flood risk management in a UK city, including the trade-offs and challenges of utilising nature-based solutions. The results also explore the potential implementation of the adaptation measures in the case study and the challenges of data collection and analysis for adaptation in flood risk management.
Keywords: green infrastructure, ecosystem services, woodland, adaptation, flood risk
Procedia PDF Downloads 288
953 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia
Authors: Nguyen-Thanh Son
Abstract:
Floods are among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of the 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs and reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The flood mapping results were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by the close agreement between the flood-mapped area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including cloud contamination in the imagery, mixed-pixel issues, and low-resolution bias between the mapping results and the ground reference data, our methods produced satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
Keywords: MODIS, flood, mapping, Cambodia
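For reference, the sketch below reproduces the standard overall-accuracy and Cohen's Kappa computation from a confusion matrix; the matrix values are invented and are not the study's validation counts.

```python
# Sketch: overall accuracy and Cohen's Kappa from a confusion matrix
# (invented counts; classes are flood vs. non-flood).
import numpy as np

cm = np.array([[820, 60],    # rows: reference classes
               [45, 400]])   # columns: mapped classes
n = cm.sum()
po = np.trace(cm) / n                                  # observed agreement
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2    # chance agreement
kappa = (po - pe) / (1 - pe)
print(f"overall accuracy = {po:.3f}, kappa = {kappa:.3f}")
```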
Procedia PDF Downloads 126
952 Psoriasis Diagnostic Test Development: Exploratory Study
Authors: Salam N. Abdo, Orien L. Tulp, George P. Einstein
Abstract:
The purpose of this exploratory study was to gather insights into psoriasis etiology, treatment, and patient experience in order to develop a psoriasis and psoriatic arthritis diagnostic test. Data collection consisted of a comprehensive meta-analysis of relevant studies and a psoriasis patient survey. Established meta-analysis guidelines were used for the selection and qualitative comparative analysis of psoriasis and psoriatic arthritis research studies. Only studies that clearly discussed psoriasis etiology, treatment, and patient experience were reviewed and analyzed to establish a qualitative database for the study. Using the insights gained from the meta-analysis, an existing psoriasis patient survey was modified and administered to collect additional data and to triangulate the results. The hypothesis is that specific types of psoriatic disease have a specific etiology and pathophysiologic pattern. The following etiology categories were identified: bacterial, environmental/microbial, genetic, immune, infectious, trauma/stress, and viral. Additional results, obtained from the meta-analysis and confirmed by the patient survey, were the common age of onset (early to mid-20s) and type of psoriasis (plaque; mild; symmetrical; scalp, chest, and extremities, specifically elbows and knees). Almost 70% of patients reported no prescription drug use, owing to severe side effects and prohibitive cost. These results will guide the development of a psoriasis and psoriatic arthritis diagnostic test. A significant number of medical publications classify psoriatic arthritis as an inflammatory disease of unknown etiology; as a consequence, numerous meta-analyses struggle to report meaningful conclusions, since no definitive results have been reported to date. A return to the basics is therefore an essential step toward any future meaningful results. To date, the medical literature supports the view that psoriatic disease in its current classification may be misidentifying subcategories, which in turn hinders the success of the studies conducted so far. Moreover, there has been enormous commercial support for pursuing various immune-modulation therapies, following a narrow hypothesis of the mechanism of action that has yet to yield resolution of the disease state. Recurrence and complications may be considered unacceptable in a significant number of these studies. The aim of the ongoing study is to focus on a narrow subgroup of the patient population, as identified by this exploratory study via meta-analysis and patient survey, and to conduct an exhaustive work-up aimed at the mechanism of action and causality before proposing a cure or therapeutic modality. Remission in psoriasis has been achieved and documented in the medical literature through approaches such as immune modulation, phototherapy, and various over-the-counter agents, including salts and tar. However, there is, to date, no psoriasis and psoriatic arthritis diagnostic test to guide the diagnosis and treatment of this debilitating and, thus far, incurable disease. Because psoriasis affects approximately 2% of the population, the results of this study may affect the treatment and improve the quality of life of a significant number of psoriasis patients, potentially millions of patients in the United States alone and many more millions worldwide.
Keywords: biologics, early diagnosis, etiology, immune disease, immune modulation therapy, inflammation skin disorder, phototherapy, plaque psoriasis, psoriasis, psoriasis classification, psoriasis disease marker, psoriasis diagnostic test, psoriasis marker, psoriasis mechanism of action, psoriasis treatment, psoriatic arthritis, psoriatic disease, psoriatic disease marker, psoriatic patient experience, psoriatic patient quality of life, remission, salt therapy, targeted immune therapy
Procedia PDF Downloads 118
951 A Comparative Study of the Techno-Economic Performance of the Linear Fresnel Reflector Using Direct and Indirect Steam Generation: A Case Study under High Direct Normal Irradiance
Authors: Ahmed Aljudaya, Derek Ingham, Lin Ma, Kevin Hughes, Mohammed Pourkashanian
Abstract:
Researchers, power companies, and state politicians have given concentrated solar power (CSP) much attention because of its capacity to generate large amounts of electricity while overcoming the intermittent nature of solar resources. The Linear Fresnel Reflector (LFR) is a well-known CSP technology, recognized for being inexpensive, having a low land-use factor, and suffering from low optical efficiency. The LFR has been considered a cost-effective alternative to the Parabolic Trough Collector (PTC) because of its simple design, which often outweighs its lower efficiency. The LFR has been found to be a promising option for producing steam directly for a thermal cycle to generate low-cost electricity, and it has also been shown to be promising for indirect steam generation. The purpose of this analysis is to compare the annual performance of Direct Steam Generation (DSG) and Indirect Steam Generation (ISG) LFR power plants using molten salt and other Heat Transfer Fluids (HTF), in order to investigate their technical and economic effects. A 50 MWe solar-only system is examined as a case study for both steam production methods under extreme weather conditions. In addition, a parametric analysis is carried out to determine the optimal solar field size that provides the lowest Levelized Cost of Electricity (LCOE) while achieving the highest technical performance. After optimizing the solar field size, the solar multiple (SM) is found to be between 1.2 and 1.5 in order to achieve an LCOE as low as 9 cents/kWh for the direct steam generation LFR. In addition, the power plant is capable of producing around 141 GWh annually with a capacity factor of up to 36%, whereas the ISG plant produces less energy at a higher cost. The optimization results show that the DSG plant outperforms the ISG plant, producing around 3% more annual energy at a 2% lower LCOE and with a 28% lower capital cost.
Keywords: concentrated solar power, levelized cost of electricity, linear Fresnel reflectors, steam generation
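The LCOE metric used for the comparison can be written as the ratio of discounted lifetime costs to discounted lifetime generation; the sketch below computes it with illustrative cost and generation figures, not the paper's inputs.

```python
# Sketch: levelized cost of electricity (LCOE) with illustrative inputs only.
def lcoe(capex, annual_opex, annual_energy_kwh, discount_rate, lifetime_years):
    disc_costs = capex
    disc_energy = 0.0
    for t in range(1, lifetime_years + 1):
        factor = (1 + discount_rate) ** t
        disc_costs += annual_opex / factor       # discounted operating costs
        disc_energy += annual_energy_kwh / factor  # discounted generation
    return disc_costs / disc_energy              # cost per kWh

# Hypothetical 50 MWe plant: capex 200M, opex 4M/yr, 141 GWh/yr, 7% discount, 25 years.
print(round(lcoe(200e6, 4e6, 141e6, 0.07, 25), 3))
```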
Procedia PDF Downloads 111
950 Image Recognition and Anomaly Detection Powered by GANs: A Systematic Review
Authors: Agastya Pratap Singh
Abstract:
Generative Adversarial Networks (GANs) have emerged as powerful tools in the fields of image recognition and anomaly detection due to their ability to model complex data distributions and generate realistic images. This systematic review explores recent advancements and applications of GANs in both image recognition and anomaly detection tasks. We discuss various GAN architectures, such as DCGAN, CycleGAN, and StyleGAN, which have been tailored to improve accuracy, robustness, and efficiency in visual data analysis. In image recognition, GANs have been used to enhance data augmentation, improve classification models, and generate high-quality synthetic images. In anomaly detection, GANs have proven effective in identifying rare and subtle abnormalities across various domains, including medical imaging, cybersecurity, and industrial inspection. The review also highlights the challenges and limitations associated with GAN-based methods, such as instability during training and mode collapse, and suggests future research directions to overcome these issues. Through this review, we aim to provide researchers with a comprehensive understanding of the capabilities and potential of GANs in transforming image recognition and anomaly detection practices.
Keywords: generative adversarial networks, image recognition, anomaly detection, DCGAN, CycleGAN, StyleGAN, data augmentation
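To ground the terminology, the sketch below defines a minimal generator/discriminator pair and a few adversarial training steps in PyTorch on random vectors. This is only a schematic: real DCGAN/StyleGAN-scale models, image data, and anomaly-scoring schemes are far larger.

```python
# Minimal GAN sketch in PyTorch: generator vs. discriminator on toy vectors.
# Architectures and data are schematic, not a DCGAN/StyleGAN implementation.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(32, data_dim)              # stand-in for real samples
for _ in range(5):                            # a few illustrative steps
    # Discriminator step: push real samples toward 1, generated samples toward 0.
    fake = G(torch.randn(32, latent_dim)).detach()
    loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator step: try to make the discriminator label fakes as real.
    fake = G(torch.randn(32, latent_dim))
    loss_g = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
print(f"final losses: D={loss_d.item():.3f}, G={loss_g.item():.3f}")
```

In GAN-based anomaly detection, a discriminator score or reconstruction error of this kind is typically reused as the anomaly score for unseen samples.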
Procedia PDF Downloads 20
949 Prediction of Survival Rate after Gastrointestinal Surgery Based on The New Japanese Association for Acute Medicine (JAAM Score) With Neural Network Classification Method
Authors: Ayu Nabila Kusuma Pradana, Aprinaldi Jasa Mantau, Tomohiko Akahoshi
Abstract:
Disseminated intravascular coagulation (DIC) following gastrointestinal surgery has a poor prognosis. Therefore, it is important to determine the factors that can predict the prognosis of DIC. This study investigates the factors that may influence the outcome of DIC in patients after gastrointestinal surgery. Eighty-one patients were admitted to the intensive care unit after gastrointestinal surgery at Kyushu University Hospital from 2003 to 2021. Acute DIC scores were estimated using the new Japanese Association for Acute Medicine (JAAM) score before surgery and after surgery on days 1, 3, and 7. The acute DIC scores will be compared with the Sequential Organ Failure Assessment (SOFA) score, platelet count, lactate level, and a variety of biochemical parameters. This study applies machine learning algorithms to predict the prognosis of DIC after gastrointestinal surgery. The results of this study are expected to serve as an indicator for evaluating patient prognosis, so that life expectancy can be increased and mortality reduced among DIC patients after gastrointestinal surgery.
Keywords: survival rate, gastrointestinal surgery, JAAM score, neural network, machine learning, disseminated intravascular coagulation (DIC)
Procedia PDF Downloads 260
948 The Concentration of Selected Cosmogenic and Anthropogenic Radionuclides in the Ground Layer of the Atmosphere (Polar and Mid-Latitudes Regions)
Authors: A. Burakowska, M. Piotrowski, M. Kubicki, H. Trzaskowska, R. Sosnowiec, B. Myslek-Laurikainen
Abstract:
The most important source of atmospheric radioactivity is the radionuclides generated by the interaction of primary and secondary cosmic radiation with the nuclei of nitrogen, oxygen, and carbon in the upper troposphere and lower stratosphere. This creates about thirty radioisotopes of more than twenty elements. For organisms, four of them are the most important: ³H, ⁷Be, ²²Na, and ¹⁴C. Natural radionuclides present in the Earth's crust also settle on dust and water vapor particles; by this means, derivatives of uranium and thorium, as well as long-lived ⁴⁰K, get into the air. ¹³⁷Cs is the most widespread isotope introduced into the environment by humans. To determine the concentration of radionuclides in the atmosphere, high-volume air samplers were used, with aerosol collection taking place on a special filter fabric (Petrianov filter tissue FPP-15-1.5). In 2002, the high-volume air sampler AZA-1000, designed to operate in all weather conditions of the cold polar region, was installed at the Polish Polar Observatory of the Polish Academy of Sciences in Hornsund, Spitsbergen (77°00’N, 15°33’E). Since 1991 (with short breaks), the ASS-500 air sampler has been operating in Swider at the Kalinowski Geophysical Observatory of the Institute of Geophysics of the Polish Academy of Sciences (52°07’N, 21°15’E). Concentrations of the following radionuclides were obtained from both stations using gamma spectroscopy: ⁷Be, ¹³⁷Cs, ¹³⁴Cs, ²¹⁰Pb, and ⁴⁰K. HPGe (High-Purity Germanium) detectors were used for the gamma spectroscopy analysis, and the data from the two stations were compared with each other. The preliminary results gave evidence that the radioactivity measured in aerosols is not proportional to the amount of dust in either of the studied regions. Furthermore, the results indicate annual variability (seasonal fluctuations) as well as a decrease in the average activity of ⁷Be with increasing latitude. The ⁷Be content in surface air also shows a relationship with the solar activity cycles.
Keywords: aerosols, air filters, atmospheric beryllium, environmental radionuclides, gamma spectroscopy, mid-latitude regions radionuclides, polar regions radionuclides, solar cycles
Procedia PDF Downloads 141
947 A Study of Transferable Strategies in Multilanguage Learning
Authors: Zixi You
Abstract:
With the demand for multilingual speakers increasing in the job market, multi-language learning programs have become more and more popular among undergraduate students. A study on multi-language learning strategies is therefore in high demand on both practical and theoretical levels. Based on previous classifications of learning strategies in second language acquisition (SLA) and an investigation of BA Modern Languages program students (with post-A-level L2 and ab initio L3 learning experience from year one), this study explores and compares the different types of learning strategies used by multi-language speakers and learners, the learning strategies that are transferable between L2 and L3, and the factors affecting the transfer. The results indicate that all 23 types of L2 learning strategies are employed when learning an L3 from ab initio level, although with different tendencies. Learning strategy transfer from L2 to L3 (i.e., learners attribute their use of these L3 learning strategies directly to their L2 learning experience) is observed for all 23 types of learning strategies. Comparatively, six types of 'cognitive strategies' show a higher transfer tendency than the others. With regard to the failure of some particular L2 strategies to transfer and the development of independent L3 strategies by individual learners, factors such as language proficiency, language typology, and learning environment have played important roles, among others. The presentation of this study will provide audiences with detailed data, insightful analysis, and discussion on both the theoretical and practical aspects of multi-language learning that will benefit both students and educators.
Keywords: learning strategy, multi-language acquisition, second language acquisition, strategy transfer
Procedia PDF Downloads 575
946 Revising Our Ideas on Revisions: Non-Contact Bridging Plate Fixation of Vancouver B1 and B2 Periprosthetic Femoral Fractures
Authors: S. Ayeko, J. Milton, C. Hughes, K. Anderson, R. G. Middleton
Abstract:
Background: Periprosthetic femoral fractures (PFF) in association with hip hemiarthroplasty or total hip arthroplasty are a common and serious complication. In the Vancouver classification algorithm, B1 fractures should be treated with Open Reduction and Internal Fixation (ORIF), while B2 and B3 fractures should preferentially be revised in combination with ORIF. This study aims to assess patient outcomes after plate osteosynthesis alone for Vancouver B1 and B2 fractures. The main outcome is the 1-year re-revision rate, and the secondary outcomes are 30-day and 1-year mortality. Method: This is a retrospective, single-centre case-series review from January 2016 to June 2021. Vancouver B1 and B2 non-malignancy fractures in adults over 18 years of age treated with polyaxial Non-Contact Bridging plate osteosynthesis were included. Outcomes were gathered from electronic notes and radiographs. Results: There were 50 B1 and 64 B2 fractures. Of the B2 fractures, 26 were managed with ORIF and revision and 39 with ORIF alone. In the revision group, one patient died within 30 days (3.8%), one within one year (3.8%), and two were revised within one year (7.7%). In the B2 ORIF-alone group, three patients died within 30 days (7.96%), eight within one year (21.1%), and none were revised within one year. Conclusion: This study has demonstrated that satisfactory outcomes can be achieved with ORIF alone, without revision, in the management of B2 fractures.
Keywords: arthroplasty, bridging plate, periprosthetic fracture, revision surgery
Procedia PDF Downloads 101
945 Artificial Neural Network Based Model for Detecting Attacks in Smart Grid Cloud
Authors: Sandeep Mehmi, Harsh Verma, A. L. Sangal
Abstract:
Ever since the idea was floated of using computing services as a commodity that can be delivered like other utilities, e.g., electricity and telephone, the scientific community has directed its research towards a new area called utility computing. New paradigms like cluster computing and grid computing came into existence while edging closer to utility computing. With the advent of the internet, the demand for anytime, anywhere access to resources that could be provisioned dynamically as a service gave rise to the next-generation computing paradigm known as cloud computing. Today, cloud computing has become one of the most aggressively growing computing paradigms, resulting in a growing rate of applications in the area of IT outsourcing. Besides catering to computational and storage demands, cloud computing has economically benefited almost all fields: education, research, entertainment, medicine, banking, military operations, weather forecasting, business, and finance, to name a few. The smart grid is another discipline that urgently needs to benefit from the advantages of cloud computing. The smart grid is a new technology that has revolutionized the power sector by automating the transmission and distribution system and integrating smart devices. A cloud-based smart grid can fulfill the storage requirements of the unstructured and uncorrelated data generated by smart sensors, as well as the computational needs of self-healing, load-balancing, and demand-response features. However, security issues such as confidentiality, integrity, availability, accountability, and privacy need to be resolved for the development of the smart grid cloud. In recent years, a number of intrusion prevention techniques have been proposed for the cloud, but hackers and intruders still manage to bypass its security. Therefore, precise intrusion detection systems need to be developed in order to secure critical information infrastructure like the smart grid cloud. Considering the success of artificial neural networks in building robust intrusion detection systems, this research proposes an artificial neural network-based model for detecting attacks in the smart grid cloud.
Keywords: artificial neural networks, cloud computing, intrusion detection systems, security issues, smart grid
Procedia PDF Downloads 318
944 Single Imputation for Audiograms
Authors: Sarah Beaver, Renee Bryce
Abstract:
Audiograms detect hearing impairment, but missing values pose problems. This work explores imputation in an attempt to improve accuracy. It implements Linear Regression, Lasso, Linear Support Vector Regression, Bayesian Ridge, K Nearest Neighbors (KNN), and Random Forest machine learning techniques to impute audiogram frequencies ranging from 125 Hz to 8000 Hz. The data contain patients who had, or were candidates for, cochlear implants. Accuracy is compared across two different nested cross-validation k values. Over 4000 audiograms from 800 unique patients were used. Additionally, training on combined left- and right-ear audiograms is compared with training on single-ear audiograms. The Root Mean Square Error (RMSE) values for the best Random Forest models range from 4.74 to 6.37, and the corresponding R² values range from 0.91 to 0.96. The RMSE values for the best KNN models range from 5.00 to 7.72, with R² values from 0.89 to 0.95. The best imputation models achieved R² between 0.89 and 0.96 and RMSE values of less than 8 dB. We also show that classification models perform better, by a two percent increase in accuracy, with our best imputation models than with constant imputation.
Keywords: machine learning, audiograms, data imputations, single imputations
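A minimal version of the KNN-based imputation evaluated in the paper can be sketched with scikit-learn's KNNImputer. The audiometric frequencies are standard test frequencies, but the threshold values and the masking scheme here are synthetic.

```python
# Sketch: KNN imputation of missing audiogram thresholds (synthetic values).
import numpy as np
from sklearn.impute import KNNImputer

freqs = [125, 250, 500, 1000, 2000, 4000, 8000]  # Hz
rng = np.random.default_rng(6)
audiograms = rng.normal(50, 15, size=(200, len(freqs)))   # thresholds in dB HL
mask = rng.random(audiograms.shape) < 0.15                # 15% missing at random
incomplete = audiograms.copy()
incomplete[mask] = np.nan

imputer = KNNImputer(n_neighbors=5)
filled = imputer.fit_transform(incomplete)
rmse = np.sqrt(np.mean((filled[mask] - audiograms[mask]) ** 2))
print(f"RMSE on masked entries: {rmse:.2f} dB")
```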
Procedia PDF Downloads 82
943 Convolutional Neural Network and LSTM Applied to Abnormal Behaviour Detection from Highway Footage
Authors: Rafael Marinho de Andrade, Elcio Hideti Shiguemori, Rafael Duarte Coelho dos Santos
Abstract:
Relying on computer vision, many clever things are possible to make the world safer and to optimize resource management, especially when time and attention are considered manageable resources, since the modern world abounds in cameras, from inside our pockets to above our heads as we cross the street. Thus, automated solutions based on computer vision techniques to detect, react to, or even prevent relevant events such as robbery, car crashes, and traffic jams can be developed and implemented for the sake of both logistical and surveillance improvements. In this paper, we present an approach for detecting abnormal vehicle behaviour from highway footage, in which vectorial data describing the vehicles' displacement are extracted directly from surveillance camera footage through object detection and tracking with a deep convolutional neural network and fed into a long short-term memory (LSTM) neural network for behaviour classification. The results show that the behaviour classifications are consistent and that the same principles may be applied to other trackable objects and scenarios as well.
Keywords: artificial intelligence, behavior detection, computer vision, convolutional neural networks, LSTM, highway footage
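The classification stage described above can be sketched as an LSTM over per-frame displacement vectors. The sequence length, feature layout (dx, dy, speed), two-class output, and synthetic data are assumptions standing in for the paper's tracker output.

```python
# Sketch: LSTM classifier over vehicle displacement sequences (synthetic data;
# per-frame features assumed to be [dx, dy, speed]).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 60, 3))            # 500 tracks, 60 frames, 3 features
y = rng.integers(0, 2, size=500)             # 1 = abnormal behaviour (synthetic)

model = models.Sequential([
    layers.Input(shape=(60, 3)),
    layers.LSTM(64),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))
```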
Procedia PDF Downloads 166
942 Recovery of Metals from Electronic Waste by Physical and Chemical Recycling Processes
Authors: Muammer Kaya
Abstract:
The main purpose of this article is to provide a comprehensive review of the various physical and chemical processes for electronic waste (e-waste) recycling, and of their advantages and shortfalls in achieving a cleaner process of waste utilization, with special attention to the extraction of metallic values. The current status and future perspectives of waste printed circuit board (PCB) recycling are described. E-waste characterization, dismantling/disassembly methods, liberation and classification processes, and composition determination techniques are covered. Manual selective dismantling and metal-nonmetal liberation at -150 µm with two-step crushing are found to be the best. The physical separation/concentration processes applied after size reduction, employing gravity, electrostatic, and magnetic separators, froth flotation, etc., which are commonly used in mineral processing, are critically reviewed here for the separation of metals and non-metals, along with useful applications of the non-metallic materials. The recovery of metals from e-waste after physical separation, through pyrometallurgical, hydrometallurgical, or biohydrometallurgical routes, is also discussed, along with purification and refining, and some suitable flowsheets are given. It seems that the hydrometallurgical route will be a key player in the recovery of base and precious metals from e-waste. E-waste recycling will be a very important sector in the near future from both economic and environmental perspectives.
Keywords: e-waste, WEEE, recycling, metal recovery, hydrometallurgy, pyrometallurgy, biometallurgy
Procedia PDF Downloads 356
941 Evaluating Radiative Feedback Mechanisms in Coastal West Africa Using Regional Climate Models
Authors: Akinnubi Rufus Temidayo
Abstract:
Coastal West Africa is highly sensitive to climate variability, driven by complex ocean-atmosphere interactions that shape temperature, precipitation, and extreme weather. Radiative feedback mechanisms, such as water vapor feedback, cloud-radiation interactions, and surface albedo, play a critical role in modulating these patterns. Yet limited research addresses these feedbacks in climate models specific to West Africa's coastal zones, creating challenges for accurate climate projections and adaptive planning. This study aims to evaluate the influence of radiative feedbacks on the coastal climate of West Africa by quantifying the effects of water vapor, cloud cover, and sea surface temperature (SST) on the region's radiative balance. The study uses a regional climate model (RCM) to simulate feedbacks over a 20-year period (2005-2025) with high-resolution data from CORDEX and satellite observations. The key mechanisms investigated are (1) water vapor feedback, the amplifying effect of humidity on warming; (2) cloud-radiation interactions, the impact of cloud cover on the radiation balance, especially during the West African Monsoon; and (3) surface albedo and land-use changes, the effects of urbanization and vegetation on the radiation budget. Preliminary results indicate that radiative feedbacks strongly influence seasonal climate variability in coastal West Africa. Water vapor feedback amplifies dry-season warming, cloud-radiation interactions moderate surface temperatures during the monsoon season, and SST variations in the Atlantic affect the frequency and intensity of extreme rainfall events. The findings suggest that incorporating these feedbacks into climate planning can strengthen resilience to climate impacts in West African coastal communities. Further research should refine regional models to capture anthropogenic influences, such as greenhouse gas emissions, guiding sustainable urban and resource planning to mitigate climate risks.
Keywords: West Africa, radiative feedback, climate, resilience, anthropogenic
Procedia PDF Downloads 10
940 Combination of Artificial Neural Network Model and Geographic Information System for Prediction Water Quality
Authors: Sirilak Areerachakul
Abstract:
Water quality has prompted serious management efforts in many countries. Artificial Neural Network (ANN) models have been developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to classify water quality automatically. The water quality classes are evaluated using six factor indices: pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3N), Ammonia Nitrogen (NH3N), and Total Coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data consist of 11 sites along the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, for 2007-2011. The multilayer perceptron neural network achieves a high classification accuracy of 94.23% for the water quality of the Saen Saep canal in Bangkok. Subsequently, this encouraging result, combined with GIS data, could improve the classification accuracy significantly.
Keywords: artificial neural network, geographic information system, water quality, computer science
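A compact sketch of the MLP classification step with the six indices named above is given below; the synthetic measurements and the number of quality classes are assumptions, not the Saen Saep canal data.

```python
# Sketch: MLP classification of water quality from the six indices
# (synthetic measurements; the four quality classes are assumed).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

features = ["pH", "DO", "BOD", "NO3N", "NH3N", "T-Coliform"]
rng = np.random.default_rng(8)
X = rng.normal(size=(1000, len(features)))
y = rng.integers(0, 4, size=1000)             # assumed 4 water-quality classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                                  random_state=0))
clf.fit(X_tr, y_tr)
print("test accuracy:", round(clf.score(X_te, y_te), 3))
```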
Procedia PDF Downloads 343