Search results for: estimation algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3747

1017 The Effect Analysis of Monetary Instruments through Islamic Banking Financing Channel toward Economic Growth in Indonesia, Period January 2008-December 2015

Authors: Sobar M. Johari, Ida Putri Anjarsari

Abstract:

In the transmission of monetary instruments to the real sector of the economy, Bank Indonesia, as the monetary authority, has developed the Islamic Bank Indonesia Certificate (SBIS) as an instrument in Islamic open market operations. One monetary transmission channel operates through the financing channel, in which these funds serve as a source of banking financing. This study aims to analyse the impact of Islamic monetary instruments on output, i.e., economic growth. The data used in this research are taken from Bank Indonesia and the Central Board of Statistics for the period January 2008 to December 2015. The study employs the Granger causality test, a Vector Error Correction Model (VECM), Impulse Response Function (IRF) analysis, and Forecast Error Variance Decomposition (FEVD) as its analytical methods. The results show that, first, the transmission mechanism of the banking financing channel is not linked to output. Second, the VECM estimates show that SBIS, PUAS, and FIN have a significant long-term impact on output. When a monetary shock occurs, output recovers and stabilizes in the short term. The FEVD results show that Islamic banking financing contributes 1.33 percent to economic growth.
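
As a rough illustration of this workflow, the sketch below chains the Granger causality test, cointegration-rank selection, VECM estimation, and IRF analysis using statsmodels. The file name and the column labels (OUTPUT, SBIS, PUAS, FIN) are placeholders, not the study's actual data layout.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Monthly series, Jan 2008 - Dec 2015; names below are illustrative only.
df = pd.read_csv("islamic_monetary.csv", index_col=0, parse_dates=True)
data = df[["OUTPUT", "SBIS", "PUAS", "FIN"]]

# Does Islamic banking financing Granger-cause output?
grangercausalitytests(data[["OUTPUT", "FIN"]], maxlag=4)

# Select the cointegration rank, then estimate the VECM.
rank = select_coint_rank(data, det_order=0, k_ar_diff=2, method="trace")
res = VECM(data, k_ar_diff=2, coint_rank=rank.rank, deterministic="ci").fit()

print(res.summary())      # long-run (beta) and adjustment (alpha) coefficients
res.irf(24).plot()        # impulse responses to a monetary shock, 24 months
```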

Keywords: Islamic monetary instrument, Islamic banking financing channel, economic growth, Vector Error Correction Model (VECM)

Procedia PDF Downloads 262
1016 Performance Evaluation of a Small Microturbine Cogeneration Functional Model

Authors: Jeni A. Popescu, Sorin G. Tomescu, Valeriu A. Vilag

Abstract:

The paper focuses on potential methods of increasing the performance of a microturbine by combining additional elements available for utilization in a cogeneration plant. The activity is carried out within the framework of a project aiming to develop, manufacture, and test a microturbine functional model with high potential for utilization in the energy industry. The main goal of the analysis is to determine the parameters of the fluid flow passing through each section of the turbine, based on the limited data available in the literature for the target output power range or provided by experimental studies. Starting from a reference cycle, different cycle options are considered, including simple, intercooled, and recuperated configurations, in order to optimize the operation of a small cogeneration plant. The studied configurations operate under the same initial thermodynamic conditions and are based on a series of assumptions regarding the individual performance of the components, pressure/velocity losses, compression ratios, and efficiencies. The thermodynamic analysis evaluates the expected performance of the microturbine cycle, while providing a series of input data and limitations to be included in the development of the experimental plan. To simplify the calculations and to allow a clear estimation of the effect of heat transfer between fluids, the working fluid for all the thermodynamic evolutions is, initially, air, with combustion modelled as simple heat addition to the system. The theoretical results, along with preliminary experimental results, are presented, aiming for a correlation in terms of microturbine performance.

Keywords: cogeneration, microturbine, performance, thermodynamic analysis

Procedia PDF Downloads 158
1015 Hardware in the Loop Platform for Virtual Commissioning: Case Study of a Hydraulic-Press Model Simulated in Real-Time

Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Ana Maria Macarulla

Abstract:

Hydraulic-press commissioning consumes a great number of man-hours because it takes place several miles away from where the press was designed. This problem is exacerbated by control designers' lack of knowledge of the final controller gains before they start working with the machine. Virtual commissioning has been postulated as an optimal solution to deal with this lack of knowledge. Here, a case study is presented in which a controller is set up against a real-time model of a hydraulic press. The press model is designed following manufacturer specifications and is embedded in a real-time simulator. This methodology ensures that the model achieves responses similar to those of the real machine deployed in industry. A deterministic communication protocol is in charge of the bidirectional information transmission between the real-time model and the controller. This platform allows the engineer to test and verify the final control responses with exactly the same hardware that is going to be installed in the hydraulic press; in other words, to perform a virtual commissioning of the electro-hydraulic actuator. The Hardware-in-the-Loop (HiL) platform validates the designed control algorithms under laboratory conditions, without risk to the machine, so that they can afterwards be embedded in the industrial environment without further modifications.

Keywords: deterministic communication protocol, electro-hydraulic actuator, hardware in the loop, real-time, virtual commissioning

Procedia PDF Downloads 131
1014 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate the continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming there is no considerable change during that period, and then compares it with the multispectral image obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region, and the generalized likelihood ratio test (GLRT) detects changed pixels from the estimated probabilistic model of the corresponding pixel. Changed pixels are detected under the assumption that the images have been co-registered prior to estimation; to minimize errors due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. The method faces several challenges. The first and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modelling, since a large number of images are always discarded due to cloud coverage; in addition, imperfect modelling raises the probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
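
A minimal sketch of the per-pixel Gaussian GLRT is given below (the 8-neighborhood handling and cloud masking are omitted). Under a multivariate normal no-change model, the GLR statistic reduces to the squared Mahalanobis distance of the new spectrum from the pixel's reference distribution, thresholded with a chi-squared quantile; the array shapes and the threshold level are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

def glrt_change_map(ref_stack, new_img, q=0.999):
    """ref_stack: (T, H, W, B) co-registered, assumed change-free images;
    new_img: (H, W, B) later acquisition. Returns an (H, W) boolean map."""
    T, H, W, B = ref_stack.shape
    mu = ref_stack.mean(axis=0)                      # per-pixel mean spectrum
    d0 = ref_stack - mu
    cov = np.einsum('thwi,thwj->hwij', d0, d0) / (T - 1)
    cov += 1e-6 * np.eye(B)                          # regularize small samples
    d = new_img - mu
    # GLR statistic = squared Mahalanobis distance under the Gaussian model
    stat = np.einsum('hwi,hwij,hwj->hw', d, np.linalg.inv(cov), d)
    return stat > chi2.ppf(q, df=B)                  # chi-squared threshold
```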

Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection

Procedia PDF Downloads 124
1013 Production Planning for Animal Food Industry under Demand Uncertainty

Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut

Abstract:

This research investigates the distribution of demand for animal food and the production quantity that minimizes total cost. The data consist of customer purchase orders for laying-hen food, the price of that food, the unit inventory holding cost, and the costs incurred when the food is out of stock, such as fines, overtime, and urgent material purchases. They were collected from January 1990 to December 2013 at a factory in Nakhon Ratchasima province. The collected data are analyzed to characterize the distribution of monthly food demand for laying hens and the per-unit inventory rate. The results are used in a stochastic linear programming model for aggregate planning, from which the optimal production plan, i.e., the minimum cost, is obtained. Programs written in MATLAB using the linprog tool are used to compute the solution; the fitted demand distribution and random numbers drawn from it feed the model. The study shows that monthly food demand for laying hens follows a normal distribution, yielding the average monthly production amounts (unit: 30 kg) from January to December. The minimum average total cost over 12 months is 62,329,181.77 Baht; the production plan therefore reduces cost by 14.64% relative to the actual cost.
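
The aggregate-planning LP can be sketched as follows with SciPy's linprog (the Python counterpart of MATLAB's linprog). Monthly demands are drawn from a fitted normal distribution; the cost coefficients and demand parameters here are illustrative stand-ins, not the study's figures.

```python
import numpy as np
from scipy.optimize import linprog

T = 12
rng = np.random.default_rng(0)
demand = rng.normal(1000, 150, T)            # sampled monthly demand (normal)
c_prod, c_hold, c_short = 50.0, 2.0, 20.0    # unit production/holding/shortage costs

# Decision variables per month: production p_t, inventory I_t, shortage S_t.
c = np.concatenate([np.full(T, c_prod), np.full(T, c_hold), np.full(T, c_short)])

# Inventory balance: I_t - S_t = I_{t-1} - S_{t-1} + p_t - d_t  (I_0 = S_0 = 0)
A_eq = np.zeros((T, 3 * T))
for t in range(T):
    A_eq[t, t] = -1.0                                    # -p_t
    A_eq[t, T + t], A_eq[t, 2 * T + t] = 1.0, -1.0       # +I_t, -S_t
    if t:
        A_eq[t, T + t - 1], A_eq[t, 2 * T + t - 1] = -1.0, 1.0
res = linprog(c, A_eq=A_eq, b_eq=-demand, bounds=[(0, None)] * (3 * T))
print(res.fun, res.x[:T].round(1))           # minimum cost, monthly production
```

A full stochastic program would optimize the expected cost over many sampled demand scenarios; a single draw is shown here for brevity.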

Keywords: animal food, stochastic linear programming, aggregate planning, production planning, demand uncertainty

Procedia PDF Downloads 367
1012 BeamGA Median: A Hybrid Heuristic Search Approach

Authors: Ghada Badr, Manar Hosny, Nuha Bintayyash, Eman Albilali, Souad Larabi Marie-Sainte

Abstract:

The median problem is widely applied to derive the most plausible rearrangement phylogenetic tree for many species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and a set of three signed permutations. Genomes with an equal number of genes but a different gene order can be represented as permutations. In this paper, an algorithm, BeamGA median, is proposed that combines a heuristic search approach (local beam search) as an initialization step to generate a number of solutions, followed by a Genetic Algorithm (GA) that refines the solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. Any genome rearrangement distance can be applied in this approach; in this paper, we use the reversal distance. To the best of our knowledge, the proposed approach has not previously been applied to the median problem. Our approach reflects a realistic biological evolution scenario by applying the concept of common intervals during the GA optimization process, which allows us to imitate true biological behavior and accelerate the genetic approach's convergence. We were able to handle permutations with a large number of genes within acceptable running time and with the same or better accuracy compared to existing algorithms.

Keywords: median problem, phylogenetic tree, permutation, genetic algorithm, beam search, genome rearrangement distance

Procedia PDF Downloads 256
1011 Uniqueness and Repeatability Analysis for Slim Tube Determined Minimum Miscibility Pressure

Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi

Abstract:

Miscible gas injection processes, as secondary recovery methods, can be applied to a huge number of mature reservoirs to improve the displacement of trapped oil. Successful miscible gas injection requires an accurate estimate of the minimum miscibility pressure (MMP) to make the injection process feasible, economical, and effective. Several methods of MMP determination exist, such as the slim tube approach, vanishing interfacial tension, and the rising bubble apparatus; the slim tube is the experimental technique deployed in this study. The slim tube method is not standardized for MMP determination with respect to either operating procedure or design. Therefore, 25 slim tube runs were conducted with three different coil lengths (12, 18 and 24 m) of constant diameter, using three different injection rates (0.08, 0.1 and 0.15 cc/min), to evaluate the uniqueness and repeatability of the determined MMP. MMP was found to decrease with increasing coil length, while no consistent trend was found between MMP and injection rate. The lowest MMP and highest recovery were observed with the longest coil and the lowest injection rate. This shows that slim-tube-measured MMP does not depend solely on the characteristics of the interacting fluids but is also affected by the coil selection and the choice of injection rate. Therefore, both slim tube design and procedure need to be standardized. For accurate and reliable MMP determination, it is recommended to use the lowest practicable injection rate and a coil length estimated from the distance between the injection and producing wells.

Keywords: coil length, injection rate, minimum miscibility pressure, multiple contacts miscibility

Procedia PDF Downloads 242
1010 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones require a very large investment of time and money, and, additionally, their results are not current. Nowadays, however, in many countries crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion they cause. In this article, we identify the zones, roads, and specific times in CDMX (Mexico City) in which the largest number of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) to discover patterns in the accident reports, applying data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the QGIS Geographic Information System.
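
The cluster-number selection and grouping steps can be approximated in scikit-learn as below. The paper used Weka's EM, which selects the number of clusters by cross-validated log-likelihood, so the BIC criterion here is a stand-in; the file name and feature columns are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Accident reports as feature vectors, e.g. [latitude, longitude, hour].
X = np.loadtxt("waze_accidents_2016.csv", delimiter=",", skiprows=1)

# EM (Gaussian mixture) fitted for a range of k; lowest BIC picks k.
bic = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
       for k in range(2, 12)}
best_k = min(bic, key=bic.get)

# k-means grouping with the chosen k; labels can then be exported to QGIS.
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
```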

Keywords: data mining, k-means, road traffic accidents, Waze, Weka

Procedia PDF Downloads 398
1009 Charting Sentiments with Naive Bayes and Logistic Regression

Authors: Jummalla Aashrith, N. L. Shiva Sai, K. Bhavya Sri

Abstract:

The swift progress of web technology has not only amassed a vast reservoir of internet data but also triggered a substantial surge in data generation. The internet has metamorphosed into one of the dynamic hubs for online education, idea dissemination, and opinion-sharing. Notably, the widely utilized social networking platform Twitter is experiencing considerable expansion, providing users with the ability to share viewpoints, participate in discussions spanning diverse communities, and broadcast messages on a global scale. This upswing in online engagement has sparked significant interest in subjective analysis, particularly of Twitter data. This research is committed to sentiment analysis, focusing specifically on Twitter. It aims to offer valuable insights into deciphering the information within tweets, where opinions manifest in a highly unstructured and diverse manner, spanning a spectrum from positive to negative, occasionally punctuated by expressions of neutrality. Within this document, we offer a comprehensive exploration and comparative assessment of modern approaches to opinion mining. Employing machine learning algorithms such as Naive Bayes and Logistic Regression, our investigation examines Twitter data streams together with the overarching challenges and applications inherent in subjectivity analysis over Twitter.
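
A compact baseline for the two classifiers compared here uses TF-IDF features in scikit-learn, as sketched below; the tweets and labels variables are assumed to hold the collected corpus and its sentiment annotations.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# tweets: list of raw tweet texts; labels: positive / negative / neutral
X_tr, X_te, y_tr, y_te = train_test_split(tweets, labels, test_size=0.2,
                                          stratify=labels, random_state=42)
for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2), clf)
    model.fit(X_tr, y_tr)
    print(type(clf).__name__)
    print(classification_report(y_te, model.predict(X_te)))
```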

Keywords: machine learning, sentiment analysis, visualisation, python

Procedia PDF Downloads 42
1008 Infectivity of Hyalomma Ticks for Theileria annulata Using 18S rRNA PCR

Authors: Muhammad S. Sajid, A. Iqbal, A. Kausar, M. Jawad-ul-Hassan, Z. Iqbal, Hafiz M. Rizwan, M. Saqib

Abstract:

Among the ixodid ticks, species of the genus Hyalomma are of prime importance, as they can survive harsh conditions better than other species. Similarly, among the various tick-borne pathogens, Theileria (T.) annulata, the causative agent of tropical theileriosis in large ruminants, is responsible for reduced productivity and ultimately substantial economic losses due to morbidity and mortality. The present study was planned to screen vector ticks using molecular techniques to determine tick-borne theileriosis in district Toba Tek Singh (T. T. Singh), Punjab, Pakistan. For this purpose, of the ticks collected from livestock and their microclimate (n = 2252), Hyalomma spp. were dissected to procure the salivary glands (SGs), and acini were pooled (an average of 8 acini per pool). Each pool of acini was used for DNA extraction, quantification, and primer-specific amplification of the 18S rRNA of Theileria (T.) annulata. The amplicons were electrophoresed on a 1.8% agarose gel, followed by imaging to identify the band specific for T. annulata. For confirmation, the positive amplicons were subjected to sequencing, BLAST analysis, and a homology search using NCBI software. The number of Theileria-infected acini was significantly higher (P < 0.05) in female vs male ticks, in infesting vs questing ticks, and in ticks collected from riverine vs non-riverine areas. The data provide the first attempt to quantify the vectorial capacity of ixodid ticks in Pakistan for T. annulata, which can be helpful in estimating the risk of theileriosis to the domestic livestock population of the country.

Keywords: Hyalomma anatolicum, ixodids, PCR, Theileria annulata

Procedia PDF Downloads 276
1007 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume

Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto

Abstract:

Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices for evaluating anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, the scores are calculated from a self-reported questionnaire, and an objective evaluation is required because the subject may intentionally change his/her answers if multiple tests are carried out. In this article, we present a modified index called the "multi-channel Laterality Index at Rest (mc-LIR)" obtained by recording brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index measures multiple positions near Fpz, as defined by the international 10-20 positioning system. Using 24 subjects, we report how the mc-LIR and its correlation coefficients with the STAI scores depend on the number of measuring points used. Furthermore, a simple linear regression was performed to estimate the STAI scores from the mc-LIR, and the cross-validation error is also reported. The experimental results show that using multiple positions near Fpz improves the correlation coefficients and the estimation compared with using only two positions.
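
The regression and its cross-validation error can be reproduced along these lines with scikit-learn; the feature matrix X (mc-LIR values per subject), the target y (STAI scores), and the file names are assumed inputs.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# X: (24, n_points) mc-LIR features per subject; y: (24,) STAI scores.
X, y = np.load("mc_lir.npy"), np.load("stai.npy")

# Leave-one-out cross-validation error of the STAI estimate.
mse = -cross_val_score(LinearRegression(), X, y, cv=LeaveOneOut(),
                       scoring="neg_mean_squared_error").mean()
# In-sample correlation between estimated and reported scores.
r = np.corrcoef(LinearRegression().fit(X, y).predict(X), y)[0, 1]
print(f"LOO-CV MSE: {mse:.2f}, correlation: {r:.2f}")
```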

Keywords: frontal lobe, functional near-infrared spectroscopy, state-trait anxiety inventory score, stress

Procedia PDF Downloads 240
1006 Bayesian Local Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia

Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui

Abstract:

This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of visceral leishmaniasis (VL) infection risk in northern and central Tunisia. The response for each region is the number of affected children under five years of age, recorded from 1996 through 2006 in Tunisian pediatric departments and treated as Poisson count data at the regional level. The model includes climatic factors, namely average annual rainfall and extreme values of low winter and high summer temperatures to characterize each region's climate via a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall; on the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. Compared with the original GLSM, the Bayesian local modeling is an improvement and gives a better approximation of Tunisian VL risk. For Bayesian inference, we use vague priors for all model parameters and the Markov Chain Monte Carlo method.

Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia

Procedia PDF Downloads 388
1005 A Clustering Algorithm for Massive Texts

Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen

Abstract:

Internet users face a massive amount of textual data every day, and organizing texts into categories can help them dig useful information out of large-scale text collections. Clustering is in fact one of the most promising tools for categorizing texts due to its unsupervised nature. Unfortunately, most traditional clustering algorithms lose their quality on large-scale text collections, mainly because of the high-dimensional vectors generated from texts. To cluster large-scale text collections effectively and efficiently, this paper proposes a vector-reconstruction-based clustering algorithm in which only the features that can represent the cluster are preserved in the cluster's representative vector. The algorithm alternates between two sub-processes until it converges. One is the partial tuning sub-process, in which feature weights are fine-tuned iteratively; to accelerate clustering, an intersection-based similarity measure and its corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, in which features are reallocated among different clusters and the features useless for representing a cluster are removed from its representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that our algorithm obtains high quality on both small-scale and large-scale text collections.

Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process

Procedia PDF Downloads 417
1004 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior

Authors: Juliana A. Knocikova

Abstract:

Methods of nonlinear signal analysis are based on the finding that seemingly random behavior can arise in deterministic nonlinear systems with only a few degrees of freedom. In dynamical systems, entropy is usually understood as the rate of information production, so changes in the temporal dynamics of physiological data indicate how the system evolves in time and thus the rate at which new signal patterns are generated. During the last decades, many algorithms were introduced to assess patterns of physiological responses to an external stimulus. However, reflex responses are usually confined to short periods of time, which is a serious limitation for the usual methods of nonlinear analysis. To address the problem of short recordings, the approximate entropy parameter was introduced as a measure of system complexity. A low value of this parameter reflects regularity and predictability in the analyzed time series, whereas a higher value means unpredictability and random-like behavior, hence higher system complexity. Reduced complexity of neurophysiological data has been observed repeatedly when analyzing electroneurogram and electromyogram activities during defence reflex responses, and quantitative phrenic neurogram changes are also obvious during severe hypoxia, as well as during airway reflex episodes. In conclusion, the approximate entropy parameter serves as a convenient tool for the analysis of reflex behavior characterized by short-lasting time series.
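
Approximate entropy has a short standard definition (Pincus's ApEn), sketched below; the embedding dimension m = 2 and tolerance r = 0.2·SD are the conventional defaults, assumed here rather than taken from the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) of a 1-D series; r is scaled to the standard deviation."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])        # embedded vectors
        # fraction of vectors within tolerance r (Chebyshev distance)
        c = [(np.abs(emb - v).max(axis=1) <= r).mean() for v in emb]
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

# A regular signal scores lower than a noisy one:
t = np.linspace(0, 10, 500)
print(approximate_entropy(np.sin(2 * np.pi * t)))                       # low
print(approximate_entropy(np.random.default_rng(0).normal(size=500)))   # high
```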

Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex

Procedia PDF Downloads 293
1003 Factors Affecting Internet Behavior and Life Satisfaction of Older Adult Learners with Use of Smartphone

Authors: Horng-Ji Lai

Abstract:

The intuitive design features and friendly interface of the smartphone attract older adults. In Taiwan, many senior education institutes offer smartphone training courses for older adult learners who are interested in learning this innovative technology. It is expected that the training courses can help them enjoy the benefits of using smartphones and increase their life satisfaction, so it is important to investigate the factors that influence older adults' smartphone use. The purpose of this research was to develop and test a model of the factors (self-efficacy, social connection, the need to seek health information, and the need to seek financial information) affecting older adult learners' Internet behaviour and their life satisfaction with the use of smartphones, and to identify the relationships between the proposed variables. A survey method was used to collect the research data, and Structural Equation Modeling was performed using Partial Least Squares (PLS) regression for data exploration and model estimation. The participants were 394 older adult learners from smartphone training courses in active aging learning centers located in central Taiwan. The results revealed that self-efficacy significantly affected older adult learners' social connection, need to seek health information, and need to seek financial information, and that social connection had a positive influence on respondents' life satisfaction. The implications of these results for practice and future research are also discussed.

Keywords: older adults, smartphone, internet behaviour, life satisfaction

Procedia PDF Downloads 178
1002 Seismic Vulnerability Assessment of Masonry Buildings in Seismic Prone Regions: The Case of Annaba City, Algeria

Authors: Allaeddine Athmani, Abdelhacine Gouasmia, Tiago Ferreira, Romeu Vicente

Abstract:

Seismic vulnerability assessment of masonry buildings is a fundamental issue even in regions of moderate to low seismic hazard. This is all the more important when dealing with old structures such as those located in Annaba city (Algeria), the majority of which date back to the French colonial era from 1830. This category of buildings is at high risk due to their highly degraded state, heterogeneous materials, and intrusive modifications to structural and non-structural elements. Furthermore, they usually shelter a dense population, which is exposed to such risk. In order to undertake suitable seismic risk mitigation strategies and a reinforcement process for such structures, it is essential to estimate their seismic resistance capacity at a large scale. To this end, two seismic vulnerability index and damage estimation methods have been adapted and applied to a pilot-scale building area located in the moderate seismic hazard region of Annaba city: the first based on the EMS-98 building typologies, and the second derived from the Italian GNDT approach. To perform this task, the authors took advantage of an existing data survey previously performed for other purposes. The results obtained from the two methods were integrated and compared using a geographic information system (GIS) tool, with the ultimate goal of supporting the city council of Annaba in the implementation of risk mitigation and emergency planning strategies.

Keywords: Annaba city, EMS98 concept, GNDT method, old city center, seismic vulnerability index, unreinforced masonry buildings

Procedia PDF Downloads 611
1001 Peculiarities of the Clinical Course of Osteoarthritis in Shift-Workers: Analysis of Clinical Data and Questionnaires

Authors: Oksana Mykytyuk

Abstract:

Chronic desynchronosis is an important factor in the progression of osteoarthritis in shift workers. 80 patients with primary osteoarthritis (female:male ratio = 3:1, average age: 57.6 years, average disease duration: 6.4 years, radiological stage: II-III) were examined; 42% reported systematic night shift work for more than two years. A full clinical examination was performed, and all patients filled in the SF-36 and WOMAC questionnaires and marked visual analog scales estimating pain intensity and general well-being. Patients who had been exposed to night work had a significantly worse clinical course of osteoarthritis, marked by a more extensive pain syndrome (by 27.5%, p < 0.05), especially at night (10.00 pm-2.00 am), and rated their quality of life as poorer compared with those working in the daytime. Osteoarthritis also started at an earlier age in them than in those working a non-shifted regimen, and they showed a trend towards generalized involvement of a greater number of joint groups as well as a higher frequency of synovitis. Shift workers took higher doses of non-steroidal anti-inflammatory drugs (NSAIDs) and rated their effect as lower (39.6% average daily relief vs 62.5% in non-shift workers after 10 days of regular therapy), and the frequency of chronic NSAID-induced gastropathy was 25% higher among night workers. Shift workers are thus predisposed to a worse course of osteoarthritis with marked clinical symptoms, requiring higher doses of NSAIDs and inclined towards a greater frequency of complications, which should be kept in mind when developing individual treatment and secondary prophylaxis strategies.

Keywords: desynchronosis, osteoarthritis, questionnaires, shift-work

Procedia PDF Downloads 119
1000 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values are a common problem in real-world datasets. Many algorithms have been developed to deal with this problem, most of which replace the missing values with a fixed value computed from the observed values. In our work, we use a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed from the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured, and since the collected data contain missing values, a distance function between incomplete user profiles had to be developed. To evaluate how accurately the proposed distance function reflects the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k-nearest-neighbors (kNN) classifier, since its computation is based solely on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
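
A simplified, diagonal-covariance version of this idea can be plugged into scikit-learn's kNN as a callable metric, as sketched below. The fallback terms for missing coordinates are illustrative stand-ins for the paper's Bhattacharyya-based construction; a sentinel value is used because scikit-learn rejects NaN inputs, and X_train/X_test/y_train/y_test are assumed to hold sentinel-coded UCI data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

MISSING = -999.0  # sentinel for a missing entry

def incomplete_distance(u, v, col_mean=None, col_var=None):
    """Distance between possibly-incomplete profiles (simplified sketch)."""
    d2 = 0.0
    for j in range(len(u)):
        u_miss, v_miss = u[j] == MISSING, v[j] == MISSING
        if u_miss and v_miss:
            d2 += 2.0                                  # both unknown: fixed penalty
        elif u_miss or v_miss:
            known = v[j] if u_miss else u[j]
            # compare the known value against the coordinate's distribution
            d2 += (known - col_mean[j]) ** 2 / col_var[j] + 1.0
        else:
            d2 += (u[j] - v[j]) ** 2 / col_var[j]      # diagonal Mahalanobis term
    return np.sqrt(d2)

obs = np.where(X_train == MISSING, np.nan, X_train)
params = {"col_mean": np.nanmean(obs, axis=0),
          "col_var": np.nanvar(obs, axis=0) + 1e-9}
knn = KNeighborsClassifier(n_neighbors=5, algorithm="brute",
                           metric=incomplete_distance, metric_params=params)
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))
```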

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 208
999 Infodemic Detection on Social Media with a Multi-Dimensional Deep Learning Framework

Authors: Raymond Xu, Cindy Jingru Wang

Abstract:

Social media has become a globally connected and influential platform. Social media data, such as tweets, can help predict the spread of pandemics and provide individuals and healthcare providers with early warnings. Public psychological reactions and opinions can be efficiently monitored by AI models tracking the progression of dominant topics on Twitter. However, statistics show that as the coronavirus spreads, so does an infodemic of misinformation, driven by pandemic-related factors such as unemployment and lockdowns. Social media algorithms are often biased toward outrage, promoting content that people react to emotionally and are likely to engage with, which can influence users' attitudes and cause confusion. Social media is therefore a double-edged sword, and combating fake news and biased content has become an essential task. This research analyzes the variety of methods used for fake news detection, covering random forests, logistic regression, support vector machines, decision trees, naive Bayes, BoW, TF-IDF, LDA, CNN, RNN, LSTM, DeepFake, and hierarchical attention networks, and the performance of each method is analyzed. Based on these models' achievements and limitations, a multi-dimensional AI framework is proposed to achieve higher accuracy in infodemic detection, especially for pandemic-related news. The model is trained on contextual content, images, and news metadata.

Keywords: artificial intelligence, fake news detection, infodemic detection, image recognition, sentiment analysis

Procedia PDF Downloads 230
998 Workforce Optimization: Fair Workload Balance and Near-Optimal Task Execution Order

Authors: Alvaro Javier Ortega

Abstract:

A large number of companies face the challenge of matching highly skilled professionals to high-end positions, a task carried out by human resource deployment professionals. However, when the lists of professionals and tasks to be matched grow beyond a few dozen, the result of this manual process is far from optimal and takes a long time to produce. An automated assignment algorithm for this workforce management problem is therefore needed. The majority of companies are divided into several sectors or departments, where trained employees with different experience levels deal with a large number of tasks daily. The execution order of all tasks also matters greatly, because some tasks can only run once the result of another task is available; a wrong execution order thus leads to large waiting times between consecutive tasks. The desired goal is, therefore, to create accurate matches and a near-optimal execution order that maximizes the number of tasks performed and minimizes the idle time of the expensive skilled employees. The problem described above can be modeled as a mixed-integer non-linear program (MINLP), as shown in detail in this paper. A large number of MINLP algorithms have been proposed in the literature; here, genetic algorithm solutions are considered and a comparison between two different mutation approaches is presented. Simulation results for different complexity levels of assignment decisions show the appropriateness of the proposed model.

Keywords: employees, genetic algorithm, industry management, workforce

Procedia PDF Downloads 158
997 Estimation of Effective Radiation Dose Following Computed Tomography Urography at Aminu Kano Teaching Hospital, Kano Nigeria

Authors: Idris Garba, Aisha Rabiu Abdullahi, Mansur Yahuza, Akintade Dare

Abstract:

Background: CT urography (CTU) is an efficient radiological examination for the evaluation of urinary system disorders. However, patients are exposed to a significant radiation dose, which is associated with increased cancer risks. Objectives: To determine the Computed Tomography Dose Index following CTU and to evaluate organ equivalent doses. Materials and Methods: A prospective cohort study was carried out at a tertiary institution located in Kano, northwestern Nigeria. Ethical clearance was sought and obtained from the research ethics board of the institution. Demographic data, scan parameters, and CT radiation dose data were obtained from patients who had the CTU procedure. Effective dose, organ equivalent doses, and cancer risks were estimated using SPSS statistical software version 16 and CT dose calculator software. Results: A total of 56 patients were included in the study, consisting of 29 males and 27 females. The most common indication for CTU examination was found to be renal cysts, seen mostly among young adults (15-44 yrs). The CT radiation dose values for CTU were a DLP of 2320 mGy·cm, a CTDIw of 9.67 mGy, and an effective dose of 35.04 mSv. The probability of cancer was estimated at 600 per million CTU examinations. Conclusion: In this study, the radiation dose for CTU is considered significantly high, with an increased probability of cancer risk. The wide variations between patient doses suggest that optimization has not yet been achieved. Patient radiation dose estimates should be taken into consideration when imaging protocols are established for CT urography.
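
For orientation, effective dose is commonly approximated from the dose-length product as E ≈ k · DLP, with a region-specific conversion coefficient k. The worked check below uses the widely tabulated adult abdomen-pelvis value k = 0.015 mSv/(mGy·cm), which is an assumption, not a figure from the paper.

```python
# Effective dose from dose-length product: E ≈ k * DLP.
dlp = 2320.0   # mGy·cm, mean DLP reported for CTU in this study
k = 0.015      # mSv/(mGy·cm), assumed abdomen-pelvis conversion coefficient
print(f"E ≈ {k * dlp:.1f} mSv")   # ≈ 34.8 mSv, consistent with the reported 35.04 mSv
```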

Keywords: CT urography, cancer risks, effective dose, radiation exposure

Procedia PDF Downloads 329
996 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning

Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan

Abstract:

We propose to record the Activities of Daily Living (ADLs) of elderly people using a vision-based system so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with the help of non-elderly subjects in laboratory environments, where the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets for the application, we recorded ADLs for the elderly with data collected from a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we chose to collect data in the room of an elderly person. Specifically, we installed a Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the elderly person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms, and we propose the use of a transfer learning algorithm for HAR. We compared performance in terms of accuracy and training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes, such as evidence-based diagnosis and treatment.

Keywords: daily activity recognition, healthcare, IoT sensors, transfer learning

Procedia PDF Downloads 123
995 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attack and anomaly detection on Internet of Things (IoT) infrastructure is an emerging concern in the domain of data-driven intrusion detection, and rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data-type probing, malicious operation, DDoS, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system to fail. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction, and IoT devices expose a wide variety of new cyber-security attack vectors in network traffic. For IoT development, and mainly for smart IoT applications, there is a need for intelligent processing and analysis of data, so our approach is to secure it. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, using ANOVA-based feature selection so that fewer features are needed to evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, decision tree (DT), and random forest (RF), selected for satisfactory test accuracy with fast detection. The evaluated ML metrics include precision, recall, F1 score, FPR, NPV, geometric mean (GM), MCC, and AUC-ROC. The random forest algorithm achieved the best results, with the shortest prediction time and an accuracy of 99.98%.
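
A minimal version of the ANOVA-plus-classifier pipeline in scikit-learn is sketched below; the feature matrix X, labels y, and the number of selected features k are assumptions for illustration.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# X: flow-level traffic features; y: attack/anomaly labels (DoS, scan, ...).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
model = make_pipeline(
    SelectKBest(score_func=f_classif, k=10),     # ANOVA F-test feature selection
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```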

Keywords: machine learning, analysis of variance, Internet of Thing, network security, intrusion detection

Procedia PDF Downloads 107
994 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. These models are based on the Dirichlet Process Mixture of Generalized Linear Models (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at 15-minute intervals from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. The DML results revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states; its influence on the free-flow and congested states was estimated to be higher than on the transitional condition in both the morning and evening peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph separate the congested and transitional traffic conditions during the morning and evening peak hours, respectively, while the free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph. The proposed approaches will facilitate accurate detection and prediction of traffic congestion and the development of effective countermeasures.

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 282
993 Wood Energy in Bangladesh: An Overview of Status, Challenges and Development

Authors: Md. Kamrul Hassan, Ari Pappinen

Abstract:

Wood energy is the single most important form of renewable energy in many parts of the world, especially in the least developed countries of South Asia such as Bangladesh, where a large portion of the national population depends on wood energy for its daily primary energy needs. This paper estimates wood fuel use at the current level and identifies the challenges and strategies related to the development of this resource. Desk research, interactive research, and a field survey were conducted to gather and analyze the data for this study. The study revealed that wood fuel plays a significant role in the total primary energy supply of Bangladesh, contributing about 24% of final energy consumption in 2013. Trees in homestead areas, secondary plantations on off-forest lands, and forests are the main sources of wood fuel in the country. An insufficient supply of wood fuel against strongly rising demand is the main cause of concern for sustainable consumption, as it eventually leads to the deterioration and depletion of the resource. Inadequate afforestation programmes, a lack of initiatives towards utilizing set-aside lands for wood energy plantations, and inefficient management of the existing resources have been identified as the major impediments to the development of wood energy in Bangladesh. The study argues that enhancing public-private-partnership afforestation programmes, planting waste and marginal lands intensively with short-rotation tree species, and formulating biomass-based rural energy strategies at the regional level are relevant to the promotion of sustainable wood energy in the country.

Keywords: Bangladesh, challenge, supply, wood energy

Procedia PDF Downloads 179
992 Using Interval Type-2 Fuzzy Controller for Diabetes Mellitus

Authors: Nafiseh Mollaei, Reihaneh Kardehi Moghaddam

Abstract:

In diabetes mellitus, the control of insulin is very difficult. This incurable disease affects millions of people worldwide. Glucose is a sugar which provides energy to the cells, and insulin is a hormone which supports the absorption of glucose. A fuzzy control strategy is attractive for glucose control because it mimics the first- and second-phase responses that the pancreatic beta cells use to control glucose. We propose two control algorithms for insulin infusion: a type-1 fuzzy controller and an interval type-2 fuzzy method. The closed-loop system has been simulated for different patients with different parameters, in the presence of a food intake disturbance, and it has been shown that the blood glucose concentration reaches a normoglycemic level of 110 mg/dl in a reasonable amount of time. This paper treats type 1 diabetes as a nonlinear model, simulated in the MATLAB-Simulink environment; the novel model used in the simulations is termed the Augmented Minimal Model. There are some uncertainties in this model due to factors such as blood glucose, daily meals, or sudden stress, and different control methods may be utilized to eliminate the effects of this uncertainty. In this article, fuzzy controller performance was assessed in terms of its ability to track a normoglycemic set point (110 mg/dl) in response to a [0-10] g meal disturbance. Finally, the development reported in this paper is intended to simplify insulin delivery, thereby increasing the patient's quality of life.

Keywords: interval type-2, fuzzy controller, minimal augmented model, uncertainty

Procedia PDF Downloads 419
991 Optimization of a Hand-Fan Shaped Microstrip Patch Antenna by Means of Orthogonal Design Method of Design of Experiments for L-Band and S-Band Applications

Authors: Jaswinder Kaur, Nitika, Navneet Kaur, Rajesh Khanna

Abstract:

A hand-fan shaped microstrip patch antenna (MPA) for L-band and S-band applications is designed, and its characteristics are investigated. The proposed microstrip patch antenna with a double U-slot defected ground structure (DGS) is fabricated on an FR4 substrate, a readily available and inexpensive material. The suggested antenna is optimized using the Orthogonal Design Method (ODM) of Design of Experiments (DOE) to cover the frequency range 0.91-2.82 GHz for L-band and S-band applications. The L-band covers the frequency range of 1-2 GHz, which is allocated to telemetry, aeronautical, and military systems for passive satellite sensors, weather radars, radio astronomy, and mobile communication. The S-band covers the frequency range of 2-3 GHz, which is used by weather radars, surface ship radars, and communication satellites, and is also reserved for various wireless applications such as Worldwide Interoperability for Microwave Access (Wi-MAX), super-high-frequency radio frequency identification (SHF RFID), the industrial, scientific and medical (ISM) bands, Bluetooth, wireless broadband (Wi-Bro), and wireless local area networks (WLAN). The proposed optimization method is very time-efficient and accurate compared with conventional evolutionary algorithms due to its statistical strategy. Moreover, the antenna is tested, and the simulated and measured results are compared.

Keywords: design of experiments, hand fan shaped MPA, L-Band, orthogonal design method, S-Band

Procedia PDF Downloads 122
990 A Palmprint Identification System Based on a Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used in human identification systems based on biological traits such as fingerprints and iris scans. Biometric identification systems show great efficiency and accuracy in such human identification applications. However, these systems have so far been based on image processing techniques alone, which may limit their efficiency. This paper therefore aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn via the backpropagation algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase feeds the processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it is tested with the same images: the same 100 images are used for testing, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves 100% accuracy and can be implemented in real-life applications.
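
The preprocessing chain and classifier can be assembled along these lines with scikit-image and scikit-learn; the binarization threshold, network size, and the images/ids variables holding the CASIA data are assumptions.

```python
import numpy as np
from skimage.feature import canny
from skimage.filters import median
from skimage.morphology import skeletonize
from sklearn.neural_network import MLPClassifier

def extract_features(img):
    """Median filtering -> skeletonizing -> Canny edges, then flatten."""
    img = median(img)                        # remove salt-and-pepper noise
    skel = skeletonize(img > img.mean())     # binarize, then skeletonize
    edges = canny(skel.astype(float))        # edge map as the feature image
    return edges.ravel().astype(float)

# images: grayscale palmprints (e.g. CASIA); ids: one class per person
X = np.array([extract_features(im) for im in images])
mlp = MLPClassifier(hidden_layer_sizes=(128,), solver="adam", max_iter=500)
mlp.fit(X, ids)                              # backpropagation training
print("training accuracy:", mlp.score(X, ids))  # same images used for testing, per the paper
```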

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 361
989 Opacity Synthesis with Orwellian Observers

Authors: Moez Yeddes

Abstract:

The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic scheme in which many security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from observations of the system. Instead of considering static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed provided that downgrading events never occur in the future of the trace. Orwellian partial observability is needed to model intransitive information flow; this form of observation is known as the ipurge function. In previous work, we showed that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while the problem has been proved undecidable even for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists of computing some minimal regular super-language M of L, if it exists, such that ϕ is opaque for M; the second consists of computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.

Keywords: security policies, opacity, formal verification, Orwellian observation

Procedia PDF Downloads 215
988 Grey Wolf Optimization Technique for Predictive Analysis of Products in E-Commerce: An Adaptive Approach

Authors: Shital Suresh Borse, Vijayalaxmi Kadroli

Abstract:

E-commerce companies nowadays implement the latest AI and ML techniques to improve their performance and prediction accuracy, which helps them earn large profits in the online market. Ant Colony Optimization, Genetic Algorithms, Particle Swarm Optimization, Neural Networks, and GWO help many e-commerce companies upgrade their predictive performance. These algorithms provide optimal results in various applications, such as stock price prediction, prediction of drug-target interactions, and prediction of user ratings of similar products on e-commerce sites. In this study, customer reviews play an important role in the prediction analysis: people show great interest in buying services and products suggested by other customers, which ultimately increases net profit. In this work, a convolutional neural network (CNN) is proposed to optimize the prediction accuracy of an e-commerce website, with the GWO algorithm tuning the CNN's hyperparameters using an appropriate coding scheme. The model's results are verified by comparing them, on Amazon's customer review dataset, against a counterpart whose hyperparameters have been optimized by PSO. The experimental outcome shows that the proposed GWO-based system achieves superior performance in terms of accuracy, precision, recall, etc., in prediction analysis compared with existing systems.
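
A minimal sketch of the Grey Wolf Optimizer itself is given below, minimizing a toy function; in the hyperparameter-tuning setting described above, the fitness call would instead train a small CNN and return its validation error. The constants follow the standard GWO formulation, and the dimensions and bounds are illustrative.

```python
import numpy as np

def gwo(fitness, dim, lb, ub, n_wolves=20, n_iter=100, seed=0):
    """Minimal Grey Wolf Optimizer: wolves move toward the three best
    solutions (alpha, beta, delta) with a shrinking exploration factor."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(n_iter):
        scores = np.array([fitness(x) for x in X])
        alpha, beta, delta = X[np.argsort(scores)[:3]]
        a = 2.0 * (1 - t / n_iter)               # decreases linearly from 2 to 0
        new_X = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A, C = 2 * a * r1 - a, 2 * r2
            new_X += leader - A * np.abs(C * leader - X)   # encircling step
        X = np.clip(new_X / 3.0, lb, ub)
    scores = np.array([fitness(x) for x in X])
    return X[np.argmin(scores)], scores.min()

# Toy use: minimize the sphere function.
best, val = gwo(lambda x: float(np.sum(x ** 2)), dim=5, lb=-10, ub=10)
print(best, val)
```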

Keywords: prediction analysis, e-commerce, machine learning, grey wolf optimization, particle swarm optimization, CNN

Procedia PDF Downloads 101