Search results for: missing data estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26521


25081 Combined Orthodontic and Restorative Management of Complex Cases: Concepts and Case Reports

Authors: Awais Ali, Hesham Ali

Abstract:

The absence of teeth through either premature loss or developmental absence is a common condition with potentially severe impact on affected individuals. Management of these cases presents a clinical challenge which may be difficult to resolve given the effects of tooth loss or hypodontia over the course of a patient’s lifetime. Treatment of such cases is often best provided by a multi-disciplinary team, where the patient’s expectations and care delivery can be optimally managed. Orthodontic treatment is often used to prepare the dentition in advance of restorative replacement of missing teeth. Conversely, the placement of implants may precede the delivery of orthodontic treatment and indeed may function as an adjunctive orthodontic procedure. We discuss the use of both approaches here and illustrate their clinical implementation with two case reports. The first case demonstrates the use of fixed appliances to prepare the mouth for an opposing implant-retained complete denture. A second case demonstrates the use of implant-retained crowns to provide orthodontic anchorage in a partially dentate patient. We propose that complex cases such as these should always be planned and treated by a multi-disciplinary team in order to optimise the delivery of care, patient experience, and treatment outcome. The presented cases add to the body of evidence in this area.

Keywords: orthodontics, dental implantology, hypodontia, multi-disciplinary

Procedia PDF Downloads 131
25080 A Semi-Markov Chain-Based Model for the Prediction of Deterioration of Concrete Bridges in Quebec

Authors: Eslam Mohammed Abdelkader, Mohamed Marzouk, Tarek Zayed

Abstract:

Infrastructure systems are crucial to every aspect of life on Earth. Existing infrastructure is subjected to degradation, while the demands are growing for a better infrastructure system in response to high standards of safety, health, population growth, and environmental protection. Bridges play a crucial role in urban transportation networks. Moreover, they are subjected to a high level of deterioration because of variable traffic loading, extreme weather conditions, cycles of freeze and thaw, etc. The development of Bridge Management Systems (BMSs) has become a fundamental imperative nowadays, especially in large transportation networks, due to the huge variance between the need for maintenance actions and the funds available to perform such actions. Deterioration models represent a very important aspect for the effective use of BMSs. This paper presents a probabilistic time-based model that is capable of predicting the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled using a semi-Markov process. One of the main challenges of the Markov Chain Decision Process (MCDP) is the construction of the transition probability matrix. The proposed model overcomes this issue by modeling the sojourn times with probability density functions. The sojourn times of each condition state are fitted to probability density functions based on goodness-of-fit tests such as the Kolmogorov-Smirnov test, the Anderson-Darling test, and the chi-squared test. The parameters of the probability density functions are obtained using maximum likelihood estimation (MLE). The condition ratings obtained from the Ministry of Transportation in Quebec (MTQ) are utilized as a database to construct the deterioration model. Finally, a comparison is conducted between the Markov chain and the semi-Markov chain to select the most feasible prediction model.
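The sojourn-time fitting step described above can be sketched as follows. This is an illustrative example only, assuming exponentially distributed sojourn times and hypothetical data; the paper fits several candidate distributions and applies additional goodness-of-fit tests:

```python
import math

def fit_exponential_mle(sojourn_times):
    """MLE for an exponential sojourn-time distribution: rate = 1 / sample mean."""
    return len(sojourn_times) / sum(sojourn_times)

def ks_statistic_exponential(sojourn_times, rate):
    """One-sample Kolmogorov-Smirnov statistic against Exp(rate)."""
    xs = sorted(sojourn_times)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)
        d = max(d, abs((i + 1) / n - cdf), abs(cdf - i / n))
    return d

# Hypothetical sojourn times (years spent in one bridge-deck condition state)
times = [2.1, 3.4, 1.8, 5.0, 2.7, 4.2, 3.1, 2.5]
rate = fit_exponential_mle(times)
d_stat = ks_statistic_exponential(times, rate)
print(f"MLE rate: {rate:.3f}, KS statistic: {d_stat:.3f}")
```

In practice the KS statistic would be compared across candidate distributions (exponential, Weibull, gamma, ...) and the best-fitting one kept per condition state.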

Keywords: bridge management system, bridge decks, deterioration model, Semi-Markov chain, sojourn times, maximum likelihood estimation

Procedia PDF Downloads 216
25079 Improving the Efficiency of Fitness Club Operations

Authors: E. V. Kuzmicheva

Abstract:

Attention is concentrated on the estimation of the service quality of sport services. A typical mathematical model was developed on the basis of the general economic theory of mass service, accounting for the pedagogical requirements of fitness services. It also took into account the dependence of the number of club members on the area of the sport facilities. The final recommendations, applied to a fitness club, resulted in some improvement in the quality of the sport service, an increase in revenue from club members, and higher club profit.

Keywords: fitness club, efficiency of operation, facilities, service quality, mass service

Procedia PDF Downloads 511
25078 Correlation Between Different Radiological Findings and Histopathological Diagnosis of Breast Diseases: Retrospective Review Conducted Over Six Years in King Fahad University Hospital in Eastern Province, Saudi Arabia

Authors: Sadeem Aljamaan, Reem Hariri, Rahaf Alghamdi, Batool Alotaibi, Batool Alsenan, Lama Althunayyan, Areej Alnemer

Abstract:

The aim of this study is to correlate radiological findings with histopathological results in regard to the breast imaging-reporting and data system scores, size of breast masses, molecular subtypes, and suspicious radiological features, as well as to assess the concordance rate in histological grade between core biopsy and surgical excision among breast cancer patients, followed by analyzing the change in concordance rate in relation to neoadjuvant chemotherapy in a Saudi population. A retrospective review was conducted over a 6-year period (2017-2022) on all breast core biopsies of women preceded by radiological investigation. The chi-squared test (χ2) was performed on qualitative data, the Mann-Whitney test for quantitative non-parametric variables, and the Kappa test for grade agreement. A total of 641 cases were included. Ultrasound, mammography, and magnetic resonance imaging demonstrated diagnostic accuracies of 85%, 77.9%, and 86.9%, respectively. Magnetic resonance imaging manifested the highest sensitivity (72.2%), and the lowest was for ultrasound (61%). Concordance in tumor size with final excisions was best for magnetic resonance imaging, while mammography demonstrated a higher tendency of overestimation (41.9%), and ultrasound showed the highest underestimation (67.7%). The association between basal-like molecular subtypes and the breast imaging-reporting and data system score 5 classification was statistically significant only for magnetic resonance imaging (p=0.04). Luminal subtypes demonstrated a significantly higher percentage of spiculation in mammography. Breast imaging-reporting and data system score 4 manifested a substantial number of benign pathologies in all three modalities. A fair concordance rate (k=0.212 and 0.379) was demonstrated between excision and the preceding core biopsy grading with and without neoadjuvant therapy, respectively. The results demonstrated down-grading in cases post-neoadjuvant therapy. In cases that did not receive neoadjuvant therapy, underestimation of tumor grade in biopsy was evident. In summary, magnetic resonance imaging had the highest sensitivity, specificity, positive predictive value, and accuracy for both diagnosis and estimation of tumor size. Mammography demonstrated better sensitivity than ultrasound and had the highest negative predictive value, but ultrasound had better specificity, positive predictive value, and accuracy. Therefore, the combination of different modalities is advantageous. The concordance rate of core biopsy grading with excision was not impacted by neoadjuvant therapy.
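The diagnostic accuracy figures quoted above derive from standard 2x2 confusion-table metrics against histopathology as the ground truth. A minimal sketch with hypothetical counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening metrics from a 2x2 confusion table
    (radiology call vs. histopathology as ground truth)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for one imaging modality
m = diagnostic_metrics(tp=130, fp=25, tn=420, fn=50)
print({k: round(v, 3) for k, v in m.items()})
```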

Keywords: breast cancer, mammography, MRI, neoadjuvant, pathology, US

Procedia PDF Downloads 82
25077 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization technique and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the used data structure and data management, which enables real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer without loss of quality. The filtering process is done in two steps and is entirely executed on the GPU and implemented using programmable shaders.

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 313
25076 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trials

Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs

Abstract:

Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of using a cross-over design over a conventional parallel design is its flexibility: every subject becomes his or her own control, thereby reducing confounding effects. Jones & Kenward discuss in detail more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model, proposed by Mwangi et al. (in press), for its first application in the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models, namely (1) the Grizzle model and (2) the Jones & Kenward model, used in estimating the treatment effect in the analysis of randomized cross-over trials. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probabilities were the highest compared to the Grizzle and Jones & Kenward models for both small and large sample sizes. A piecewise linear mixed-effects model is a better estimator of the treatment effect than its two competitors (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-Treatments x 2-Periods cross-over design. Its application is extendible to more complex cross-over designs with multiple treatments and periods. In addition, it is important to note that, even for single-response models, adding more random effects increases the complexity of the model and thus may be difficult or impossible to fit in some cases.
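The two performance measures used in the comparison, MSE and coverage probability, can be illustrated with a simplified simulation. This sketch uses a plain sample-mean estimator with a normal-approximation confidence interval on hypothetical data, not the fitted mixed-effects models:

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 1.5   # hypothetical treatment effect
N_SIM = 2000

def simulate_estimate(n_obs, sd=2.0):
    """One simulated trial: sample-mean estimate of the treatment effect
    and a normal-approximation 95% confidence interval."""
    xs = [random.gauss(TRUE_EFFECT, sd) for _ in range(n_obs)]
    est = statistics.fmean(xs)
    se = statistics.stdev(xs) / n_obs ** 0.5
    return est, (est - 1.96 * se, est + 1.96 * se)

results = [simulate_estimate(20) for _ in range(N_SIM)]
# MSE: mean squared deviation of the estimate from the true effect
mse = statistics.fmean((est - TRUE_EFFECT) ** 2 for est, _ in results)
# Coverage: fraction of simulated CIs that contain the true effect
coverage = statistics.fmean(lo <= TRUE_EFFECT <= hi for _, (lo, hi) in results)
print(f"MSE: {mse:.4f}, 95% CI coverage: {coverage:.3f}")
```

A better estimator yields lower MSE and coverage closer to the nominal 95%, which is exactly the comparison made between the three models.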

Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation

Procedia PDF Downloads 124
25075 Drivers of Liking: Probiotic Petit Suisse Cheese

Authors: Helena Bolini, Erick Esmerino, Adriano Cruz, Juliana Paixao

Abstract:

The current concern for health has increased the demand for low-calorie ingredients and functional foods such as probiotics. Understanding the reasons that bear on food choice, besides being a challenging task, is an important step in the development and/or reformulation of existing food products. The use of appropriate multivariate statistical techniques, such as the External Preference Map (PrefMap), associated with regression by Partial Least Squares (PLS), can help in determining those factors. Thus, this study aimed to determine, through PLS regression analysis, the sensory attributes considered drivers of liking in strawberry-flavored probiotic petit suisse cheeses sweetened with different sweeteners. Five samples at equivalent sweetness: PROB1 (Sucralose 0.0243%), PROB2 (Stevia 0.1520%), PROB3 (Aspartame 0.0877%), PROB4 (Neotame 0.0025%) and PROB5 (Sucrose 15.2%), determined by just-about-right and magnitude estimation methods, and three commercial samples, COM1, COM2 and COM3, were studied. Analysis was done on data from QDA, performed by 12 expert (highly trained) assessors on 20 descriptor terms, correlated with overall liking data from an acceptance test carried out by 125 consumers on all samples. Subsequently, results were submitted to PLS regression using XLSTAT software from Byossistemes. As the results show, three sensory descriptor terms could be considered drivers of liking of the probiotic petit suisse cheese samples sweetened with sweeteners (p<0.05). Milk flavor was noticed as a sensory characteristic with a positive impact on acceptance, while the descriptors bitter taste and sweet aftertaste were perceived as terms with a negative impact on the acceptance of the probiotic petit suisse cheeses. It was possible to conclude that PLS regression analysis is a practical and useful tool for determining drivers of liking of probiotic petit suisse cheeses sweetened with artificial and natural sweeteners, allowing the food industry to understand and improve their formulations, maximizing the acceptability of their products.
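As a rough illustration of how PLS links descriptor intensities to liking, the sketch below implements a single-component PLS1 fit on hypothetical data; the study itself used XLSTAT, and the variable names and values here are illustrative only:

```python
import numpy as np

def pls1_one_component(X, y):
    """Fit a single-component PLS1 regression (NIPALS-style)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector: descriptor importance
    t = Xc @ w                      # latent scores
    q = (yc @ t) / (t @ t)          # regression of liking on the score
    def predict(X_new):
        return q * ((X_new - x_mean) @ w) + y_mean
    return predict, w

# Hypothetical descriptor intensities (rows: samples; cols: e.g. milk flavor,
# bitter taste, sweet aftertaste) driven by one latent direction,
# and overall liking scores
latent = np.array([1.0, 2.0, 3.0, 4.0])
loadings = np.array([1.0, 0.5, 0.25])
X = np.outer(latent, loadings)
y = 2.0 * latent
predict, w = pls1_one_component(X, y)
print(np.round(predict(X), 6))
```

The sign and magnitude of the entries of `w` indicate which descriptors drive liking positively or negatively, which is the "drivers of liking" reading of a PrefMap/PLS analysis.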

Keywords: acceptance, consumer, quantitative descriptive analysis, sweetener

Procedia PDF Downloads 446
25074 Distributional and Dynamic Impact of Energy Subsidy Reform

Authors: Ali Hojati Najafabadi, Mohamad Hosein Rahmati, Seyed Ali Madanizadeh

Abstract:

Governments execute energy subsidy reforms by either increasing energy prices or reducing energy price dispersion. These policies reduce energy use per plant (intensive margin), vary the total number of firms (extensive margin), promote technological progress (technology channel), and free additional resources to redistribute (resource channel). We estimate a structural dynamic firm model with endogenous technology adaptation using data from manufacturing firms in Iran, a country ranked by the IMF as having the second-largest energy subsidy plan. The findings show significant dynamic and distributional effects of an energy reform plan. The price elasticity of energy consumption in the industrial sector is about -2.34, while it is -3.98 for large firms. The dispersion elasticity, defined as the change in energy consumption induced by a one-percent reduction in the standard error of the energy price distribution, is about 1.43, suggesting significant room for a distributional policy. We show that the intensive margin is the main driver of energy price elasticity, whereas the other channels mostly offset it. In contrast, the labor response is mainly through the extensive margin. Total factor productivity slightly improves in light of the reduction in energy consumption if, at the same time, the redistribution policy boosts aggregate demand.
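The reported price elasticity is a structural estimate; as a much simpler illustration of the concept, a constant elasticity can be recovered as the slope of a log-log regression of quantity on price. The data below are hypothetical:

```python
import math

def ols_slope(xs, ys):
    """Slope of a simple OLS regression of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical plant-level energy prices and consumption; with a true
# constant elasticity of -2.34 the log-log slope recovers it exactly.
prices = [1.0, 1.5, 2.0, 2.5, 3.0]
quantities = [100.0 * p ** -2.34 for p in prices]
elasticity = ols_slope([math.log(p) for p in prices],
                       [math.log(q) for q in quantities])
print(f"estimated elasticity: {elasticity:.2f}")
```

The paper's structural model goes far beyond this reduced-form sketch, but the elasticity it reports has the same interpretation: percent change in energy use per percent change in price.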

Keywords: energy reform, firm dynamics, structural estimation, subsidy policy

Procedia PDF Downloads 96
25073 Estimation of Carbon Sequestration and Air Quality of Terrestrial Ecosystems Using Remote Sensing Techniques

Authors: Kanwal Javid, Shazia Pervaiz, Maria Mumtaz, Muhammad Ameer Nawaz Akram

Abstract:

Forest and grassland ecosystems play an important role in the global carbon cycle. Land management activities influence both ecosystems and enable them to absorb and sequester carbon dioxide (CO2). Similarly, in Pakistan, these terrestrial ecosystems are well known to mitigate carbon emissions and are a great source of a variety of services such as clean air and water, biodiversity, wood products, wildlife habitat, food, recreation, and carbon sequestration. Carbon sequestration is a main agenda item of developed and developing nations seeking to reduce the impacts of global warming. However, the amount of carbon storage within these ecosystems can be affected by many factors related to air quality, such as land management, land-use change, deforestation, overgrazing, and natural calamities. Moreover, the long-term capacity of forests and grasslands to absorb and sequester CO2 depends on their health, productivity, resilience, and ability to adapt to changing conditions. Thus, the main rationale of this study is to monitor the difference in carbon amounts of the forests and grasslands of Northern Pakistan using MODIS data sets and to map the results using a Geographic Information System. The results of the study show that forest ecosystems are more effective in reducing CO2 levels and play a key role in improving air quality.

Keywords: carbon sequestration, grasslands, global warming, climate change

Procedia PDF Downloads 188
25072 A Dynamic Cardiac Single Photon Emission Computer Tomography Using Conventional Gamma Camera to Estimate Coronary Flow Reserve

Authors: Maria Sciammarella, Uttam M. Shrestha, Youngho Seo, Grant T. Gullberg, Elias H. Botvinick

Abstract:

Background: Myocardial perfusion imaging (MPI) is typically performed with static imaging protocols and visually assessed for perfusion defects based on the relative intensity distribution. Dynamic cardiac SPECT, on the other hand, is a new imaging technique that is based on time-varying information of radiotracer distribution, which permits quantification of myocardial blood flow (MBF). In this abstract, we report the progress and current status of dynamic cardiac SPECT using a conventional gamma camera (Infinia Hawkeye 4, GE Healthcare) for estimation of myocardial blood flow and coronary flow reserve. Methods: A group of patients at high risk of coronary artery disease was enrolled to evaluate our methodology. A low-dose/high-dose rest/pharmacologic-induced-stress protocol was implemented. A standard rest and a standard stress radionuclide dose of ⁹⁹ᵐTc-tetrofosmin (140 keV) was administered. The dynamic SPECT data for each patient were reconstructed using the standard 4-dimensional maximum likelihood expectation maximization (ML-EM) algorithm. The acquired data were used to estimate the myocardial blood flow (MBF). The correspondence between flow values in the main coronary vasculature and the myocardial segments defined by the standardized myocardial segmentation and nomenclature was derived. The coronary flow reserve, CFR, was defined as the ratio of stress to rest MBF values. CFR values estimated with SPECT were also validated with dynamic PET. Results: The range of territorial MBF in the LAD, RCA, and LCX was 0.44 ml/min/g to 3.81 ml/min/g. The MBF estimates from PET and SPECT in an independent cohort of 7 patients showed a statistically significant correlation, r = 0.71 (p < 0.001), but the corresponding CFR correlation was moderate, r = 0.39, yet statistically significant (p = 0.037). The mean stress MBF value was significantly lower for angiographically abnormal than for normal territories (normal mean MBF = 2.49 ± 0.61, abnormal mean MBF = 1.43 ± 0.62, p < 0.001). Conclusions: The visually assessed image findings in clinical SPECT are subjective and may not reflect direct physiologic measures of a coronary lesion. The MBF and CFR measured with dynamic SPECT are fully objective and available only with the data generated by the dynamic SPECT method. A quantitative approach such as measuring CFR using dynamic SPECT imaging is a better mode of diagnosing CAD than visual assessment of stress and rest images from static SPECT.
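The CFR definition and the reported correlation rest on simple, reproducible arithmetic. A minimal sketch with hypothetical per-territory MBF values (not patient data):

```python
def coronary_flow_reserve(stress_mbf, rest_mbf):
    """CFR is the ratio of stress to rest myocardial blood flow."""
    return [s / r for s, r in zip(stress_mbf, rest_mbf)]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical per-territory MBF values (ml/min/g)
rest = [0.8, 0.9, 1.1, 0.7, 1.0]
stress = [2.4, 1.6, 3.1, 1.0, 2.8]
cfr_spect = coronary_flow_reserve(stress, rest)
cfr_pet = [3.1, 1.9, 2.7, 1.5, 2.9]   # hypothetical reference values
print([round(c, 2) for c in cfr_spect])
print(f"r = {pearson_r(cfr_spect, cfr_pet):.2f}")
```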

Keywords: dynamic SPECT, clinical SPECT/CT, selective coronary angiograph, ⁹⁹ᵐTc-Tetrofosmin

Procedia PDF Downloads 152
25071 Transient and Persistent Efficiency Estimation for Electric Grid Utilities Based on Meta-Frontier: Comparative Analysis of China and Japan

Authors: Bai-Chen Xie, Biao Li

Abstract:

With the deepening of international exchanges and investment, the international comparison of power grid firms has become a focus of regulatory authorities. Ignoring the differences in the economic environment, resource endowment, technology, and other aspects of different countries or regions may lead to efficiency bias. Based on the meta-frontier model, this paper divides China and Japan into two groups using data on both countries from 2006 to 2020. While preserving the differences between the two countries, it analyzes and compares the efficiency of their transmission and distribution industries. Combined with the four-component stochastic frontier model, efficiency is divided into transient and persistent efficiency. We found obvious differences between the transmission and distribution sectors in China and Japan. On the one hand, the inefficiency in both countries is mostly caused by long-term and structural problems, so the key to improving efficiency is to focus on solving those problems. On the other hand, the long-term and structural problems that cause the inefficiency are not the same in the two countries. Quality factors have different effects on the efficiency of the two countries, and this differing effect is captured by the common frontier model but is offset in the overall model. Based on these findings, this paper proposes some targeted policy recommendations.

Keywords: transmission and distribution industries, transient efficiency, persistent efficiency, meta-frontier, international comparison

Procedia PDF Downloads 103
25070 Tuning of Kalman Filter Using Genetic Algorithm

Authors: Hesham Abdin, Mohamed Zakaria, Talaat Abd-Elmonaem, Alaa El-Din Sayed Hafez

Abstract:

The Kalman filter algorithm is an estimator known as the workhorse of estimation. It has an important application in missile guidance, especially when accurate data on the target are lacking due to noise or uncertainty. In this paper, a Kalman filter is used as a tracking filter in a simulated target-interceptor scenario with noise. It estimates the position, velocity, and acceleration of the target in the presence of noise. These estimates are needed for both proportional navigation and differential geometry guidance laws. A Kalman filter performs well at low noise, but large noise causes considerable errors that lead to performance degradation. Therefore, a new technique is required to overcome this defect, using tuning factors to adapt the Kalman filter to increasing noise. The values of the tuning factors are between 0.8 and 1.2; they take one value for the first half of the range and a different value for the second half, and they are multiplied by the estimated values. These factors have optimum values and are altered with the change of the target heading. A genetic algorithm updates these selections to increase the maximum effective range, which was previously reduced by noise. The results show that the selected factors have other benefits, such as decreasing the minimum effective range that was increased earlier due to noise. In addition, the selected factors decrease the miss distance for all ranges in this direction of the target and expand the effective range, which increases the probability of kill.
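A minimal one-dimensional constant-velocity Kalman filter illustrates the estimator being tuned; the `tuning` parameter mimics the paper's 0.8-1.2 multipliers on the estimated values (1.0 recovers the standard filter). The scenario and values are illustrative, not the missile-guidance setup:

```python
def kalman_1d(measurements, dt=1.0, q=0.01, r=4.0, tuning=1.0):
    """Minimal 1-D constant-velocity Kalman filter tracking position.
    `tuning` multiplies the updated state estimate, mimicking the
    paper's adaptation factors (1.0 = standard filter)."""
    x = [measurements[0], 0.0]            # state: [position, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    out = []
    for z in measurements:
        # predict step (constant-velocity model, process noise q)
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # update step with position measurement z (measurement noise r)
        s = P[0][0] + r
        k = [P[0][0] / s, P[1][0] / s]    # Kalman gain
        innov = z - x[0]
        x = [tuning * (x[0] + k[0] * innov), tuning * (x[1] + k[1] * innov)]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        out.append(x[0])
    return out

# Noise-free track moving at 2 units/step: the filter should converge
zs = [2.0 * t for t in range(30)]
est = kalman_1d(zs)
print(f"final estimate: {est[-1]:.2f} (true {zs[-1]:.2f})")
```

In the paper's scheme the tuning factors are not fixed at 1.0 but selected by a genetic algorithm per range segment and target heading.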

Keywords: proportional navigation, differential geometry, Kalman filter, genetic algorithm

Procedia PDF Downloads 512
25069 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4

Authors: Jae Won Shin

Abstract:

We develop an evaluated nuclear data based photonuclear reaction model of GEANT4 for a more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input. Incident photon energies up to 140 MeV, which is the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are found to be in good agreement with the experimental data for (γ,xn) reactions.

Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction

Procedia PDF Downloads 275
25068 LncRNA NEAT1 Promotes NSCLC Progression through Acting as a ceRNA of miR-377-3p

Authors: Chengcao Sun, Shujun Li, Cuili Yang, Yongyong Xi, Liang Wang, Feng Zhang, Dejia Li

Abstract:

Recently, the long non-coding RNA (lncRNA) NEAT1 has been identified as an oncogenic gene in multiple cancer types, and elevated expression of NEAT1 is tightly linked to tumorigenesis and cancer progression. However, the molecular basis for this observation has not been characterized in the progression of non-small cell lung cancer (NSCLC). In our studies, we found that NEAT1 was highly expressed in NSCLC patients and was a novel regulator of NSCLC progression. Patients whose tumors had high NEAT1 expression had shorter overall survival than patients whose tumors had low NEAT1 expression. Further, NEAT1 significantly accelerates NSCLC cell growth and metastasis in vitro and tumor growth in vivo. Additionally, using bioinformatics analyses and RNA pull-down combined with luciferase reporter assays, we demonstrated that NEAT1 functions as a competing endogenous RNA (ceRNA) for hsa-miR-377-3p, antagonizes its functions, and leads to the de-repression of its endogenous target E2F3, a core oncogene in promoting NSCLC progression. Taken together, these observations imply that NEAT1 modulates the expression of the E2F3 gene by acting as a competing endogenous RNA, which may supply the missing link between the regulatory miRNA network and NSCLC progression.

Keywords: long non-coding RNA NEAT1, hsa-miRNA-377-3p, E2F3, non-small cell lung cancer, tumorigenesis

Procedia PDF Downloads 370
25067 Estimation of Forces Applied to Forearm Using EMG Signal Features to Control of Powered Human Arm Prostheses

Authors: Faruk Ortes, Derya Karabulut, Yunus Ziya Arslan

Abstract:

Myoelectric features gathered from the musculature are given preferential consideration for perceiving muscle activation and controlling human arm prostheses, according to recent experimental research. EMG (electromyography) signal based human arm prostheses have shown promising performance in recent years in terms of providing the basic functional requirements of motion for amputated people. However, these assistive devices for neurorehabilitation still have important limitations in enabling amputated people to perform sophisticated or functional movements. The surface electromyogram (EMG) is used as the control signal to command such devices. This kind of control consists of activating a motion in the prosthetic arm using the muscle activation for that same motion. Extraction of clear and certain neural information from EMG signals plays a major role, especially in the fine control of hand prosthesis movements. Many signal processing methods have been utilized for feature extraction from EMG signals. The specific objective of this study was to compare widely used time-domain features of the EMG signal, including integrated EMG (IEMG), root mean square (RMS), and waveform length (WL), for the prediction of forces externally applied to human hands. The obtained features were classified using artificial neural networks (ANN) to predict the forces. The EMG signals supplied to this process were recorded during isometric and isotonic muscle contractions. Experiments were performed by three healthy right-handed subjects aged 25-35 years. EMG signals were collected from muscles of the proximal part of the upper body: biceps brachii, triceps brachii, pectoralis major, and trapezius. The force prediction results obtained from the ANN were statistically analyzed, and the merits and pitfalls of the extracted features were discussed in detail. The obtained results are anticipated to contribute to the classification of EMG signals and the motion control of powered human arm prostheses.
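The three time-domain features compared in the study have standard definitions that can be computed directly from a signal window. A sketch on a hypothetical window (not recorded EMG):

```python
def iemg(signal):
    """Integrated EMG: sum of absolute amplitudes over the window."""
    return sum(abs(x) for x in signal)

def rms(signal):
    """Root mean square amplitude of the window."""
    return (sum(x * x for x in signal) / len(signal)) ** 0.5

def waveform_length(signal):
    """Cumulative length of the waveform: sum of successive differences."""
    return sum(abs(b - a) for a, b in zip(signal, signal[1:]))

# Hypothetical EMG window (mV), not recorded data
window = [0.1, -0.3, 0.2, 0.5, -0.4, 0.1]
print(iemg(window), round(rms(window), 4), round(waveform_length(window), 2))
```

In the study, vectors of such features (computed per channel and window) form the input to the ANN force predictor.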

Keywords: assistive devices for neurorehabilitation, electromyography, feature extraction, force estimation, human arm prosthesis

Procedia PDF Downloads 368
25066 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams

Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem

Abstract:

In this 'Information Explosion Era', analyzing data, a critical commodity, and mining knowledge from vertically distributed data streams incur huge communication costs. However, efforts to decrease communication in a distributed environment can adversely influence classification accuracy; therefore, a research challenge lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.

Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous distributed data

Procedia PDF Downloads 161
25065 Trajectory Tracking of a Redundant Hybrid Manipulator Using a Switching Control Method

Authors: Atilla Bayram

Abstract:

This paper presents the trajectory tracking control of a spatial redundant hybrid manipulator. This manipulator consists of two parallel manipulators, each of which is a variable geometry truss (VGT) module. In fact, each VGT module with 3 degrees of freedom (DOF) is a planar parallel manipulator, and the operational planes of these VGT modules are arranged orthogonal to each other. The manipulator also contains a twist-motion part attached to the top of the second VGT module to supply the missing orientation of the end-effector. These three modules constitute a 7-DOF hybrid (parallel-parallel) redundant spatial manipulator. The forward kinematics equations of this manipulator are obtained; then, according to these equations, the inverse kinematics is solved based on an optimization with joint limit avoidance. The dynamic equations are formed using the virtual work method. In order to test the performance of the redundant manipulator and the controllers presented, two different desired trajectories are followed using the computed force control method and a switching control method. The switching control method combines the computed force control method with a genetic algorithm; in it, the genetic algorithm is used only for fine tuning in the compensation of the trajectory tracking errors.
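The inverse kinematics of the 7-DOF hybrid manipulator requires optimization with joint-limit avoidance, but the underlying forward/inverse kinematics relationship can be illustrated on a planar 2-link arm with a closed-form solution. This is a didactic sketch, not the authors' manipulator:

```python
import math

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar 2-link arm
    (elbow-down solution); raises if the target is out of reach."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics: end-effector position from joint angles."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

q1, q2 = ik_2link(1.0, 1.0)
print(fk_2link(q1, q2))   # should reproduce the target (1.0, 1.0)
```

For a redundant manipulator such as the one in the paper, infinitely many joint solutions reach a given pose, which is why the inverse kinematics is posed as an optimization (e.g., penalizing proximity to joint limits) rather than solved in closed form.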

Keywords: computed force method, genetic algorithm, hybrid manipulator, inverse kinematics of redundant manipulators, variable geometry truss

Procedia PDF Downloads 349
25064 Automated Adaptions of Semantic User- and Service Profile Representations by Learning the User Context

Authors: Nicole Merkle, Stefan Zander

Abstract:

Ambient Assisted Living (AAL) describes a technological and methodological stack of (e.g. formal model-theoretic semantics, rule-based reasoning and machine learning), different aspects regarding the behavior, activities and characteristics of humans. Hence, a semantic representation of the user environment and its relevant elements are required in order to allow assistive agents to recognize situations and deduce appropriate actions. Furthermore, the user and his/her characteristics (e.g. physical, cognitive, preferences) need to be represented with a high degree of expressiveness in order to allow software agents a precise evaluation of the users’ context models. The correct interpretation of these context models highly depends on temporal, spatial circumstances as well as individual user preferences. In most AAL approaches, model representations of real world situations represent the current state of a universe of discourse at a given point in time by neglecting transitions between a set of states. However, the AAL domain currently lacks sufficient approaches that contemplate on the dynamic adaptions of context-related representations. Semantic representations of relevant real-world excerpts (e.g. user activities) help cognitive, rule-based agents to reason and make decisions in order to help users in appropriate tasks and situations. Furthermore, rules and reasoning on semantic models are not sufficient for handling uncertainty and fuzzy situations. A certain situation can require different (re-)actions in order to achieve the best results with respect to the user and his/her needs. But what is the best result? To answer this question, we need to consider that every smart agent requires to achieve an objective, but this objective is mostly defined by domain experts who can also fail in their estimation of what is desired by the user and what not. 
Hence, a smart agent has to be able to learn from context history data and estimate or predict what is most likely in certain contexts. Furthermore, different agents with contrary objectives can cause collisions, as their actions influence the user’s context and its constituting conditions in unintended or uncontrolled ways. We present an approach for dynamically updating a semantic model with respect to the current user context that allows flexibility of the software agents and enhances their conformance in order to improve the user experience. The presented approach adapts rules by learning from sensor evidence and user actions using probabilistic reasoning, based on given expert knowledge. The semantic domain model consists basically of device, service, and user profile representations. In this paper, we show how this semantic domain model can be used to compute the probability of matching rules and actions. We apply this probability estimation to compare the current domain model representation with the computed one in order to adapt the formal semantic representation. Our approach aims at minimizing the likelihood of unintended interferences in order to eliminate conflicts and unpredictable side effects by updating pre-defined expert knowledge according to the most probable context representation. This enables agents to adapt to dynamic changes in the environment, which enhances the provision of adequate assistance and positively affects user satisfaction.
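As a rough illustration of this kind of probabilistic rule adaptation, the sketch below updates a rule's confidence from observed user feedback using a Beta-Bernoulli model; the class name, rule name, and uniform prior are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: a rule's match probability is learned from evidence
# with a Beta-Bernoulli model. Names and priors are illustrative only.

class AdaptiveRule:
    """A rule whose confidence is updated from sensor/user evidence."""

    def __init__(self, name, prior_success=1.0, prior_failure=1.0):
        self.name = name
        # The Beta prior encodes initial expert knowledge about the rule.
        self.alpha = prior_success
        self.beta = prior_failure

    def observe(self, rule_fired_correctly):
        # Each observation (did the user accept the triggered action?)
        # updates the Beta posterior counts.
        if rule_fired_correctly:
            self.alpha += 1
        else:
            self.beta += 1

    def probability(self):
        # Posterior mean: estimated probability that the rule matches
        # the user's actual context.
        return self.alpha / (self.alpha + self.beta)


rule = AdaptiveRule("turn_on_light_when_dark")
for accepted in [True, True, False, True]:
    rule.observe(accepted)
print(round(rule.probability(), 3))  # 4/6 with a uniform prior
```

The posterior mean can then be compared against the rule's pre-defined expert confidence to decide whether the semantic representation should be adapted.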

Keywords: ambient intelligence, machine learning, semantic web, software agents

Procedia PDF Downloads 282
25063 A Selection Approach: Discriminative Model for Nominal Attributes-Based Distance Measures

Authors: Fang Gong

Abstract:

Distance measures are an indispensable part of many instance-based learning (IBL) and machine learning (ML) algorithms. The value difference metric (VDM) and the inverted specific-class distance measure (ISCDM) are among the top-performing distance measures that address nominal attributes. VDM performs well in some domains owing to its simplicity but poorly in others that contain missing values and non-class attribute noise. ISCDM, however, typically works better than VDM on such domains. To maximize their advantages and avoid their disadvantages, this paper proposes a selection approach: a discriminative model for nominal attributes-based distance measures. More concretely, VDM and ISCDM are built independently on a training dataset at the training stage, and the more credible of the two is recorded for each training instance. At the test stage, the nearest neighbor of each test instance is first found by either VDM or ISCDM, and the more reliable model recorded for that nearest neighbor is then chosen to predict the test instance's class label. This is simply denoted as a discriminative distance measure (DDM). Experiments are conducted on 34 University of California at Irvine (UCI) machine learning repository datasets, and the results show that DDM retains the interpretability and simplicity of VDM and ISCDM but significantly outperforms the original VDM and ISCDM as well as other state-of-the-art competitors in terms of accuracy.
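For readers unfamiliar with VDM, the following is a minimal sketch of the metric for a single nominal attribute, following common formulations (with exponent q = 2); the toy data and helper names are our own illustration, not the paper's implementation.

```python
from collections import defaultdict

# Illustrative sketch of the Value Difference Metric (VDM) for one
# nominal attribute: two values are close when they induce similar
# class-conditional probability distributions.

def vdm_tables(values, labels):
    """Count occurrences and class-conditional occurrences of each value."""
    value_counts = defaultdict(int)
    value_class_counts = defaultdict(int)
    for v, c in zip(values, labels):
        value_counts[v] += 1
        value_class_counts[(v, c)] += 1
    return value_counts, value_class_counts, set(labels)

def vdm_distance(a, b, value_counts, value_class_counts, classes, q=2):
    """VDM distance between two nominal values a and b."""
    d = 0.0
    for c in classes:
        pa = value_class_counts[(a, c)] / value_counts[a]
        pb = value_class_counts[(b, c)] / value_counts[b]
        d += abs(pa - pb) ** q
    return d

values = ["red", "red", "blue", "blue", "green"]
labels = ["pos", "pos", "neg", "neg", "pos"]
vc, vcc, cls = vdm_tables(values, labels)
# "red" and "green" always co-occur with class "pos": distance 0.
print(vdm_distance("red", "green", vc, vcc, cls))
print(vdm_distance("red", "blue", vc, vcc, cls))
```

The DDM approach described above would pair such a metric with ISCDM and select, per neighbor, whichever model proved more credible on the training data.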

Keywords: distance measure, discriminative model, nominal attributes, nearest neighbor

Procedia PDF Downloads 115
25062 Data Privacy: Stakeholders’ Conflicts in Medical Internet of Things

Authors: Benny Sand, Yotam Lurie, Shlomo Mark

Abstract:

Medical Internet of Things (MIoT), AI, and data privacy are linked forever in a Gordian knot. This paper explores the conflicts of interest between stakeholders regarding data privacy in the MIoT arena. For patients hospitalized at home, MIoT can play a significant role in improving the health of large parts of the population by providing medical teams with tools for collecting data, monitoring patients’ health parameters, and even enabling remote treatment. As the amount of data handled by MIoT devices grows exponentially, different stakeholders have conflicting understandings of and concerns about this data. The findings of the research indicate that medical teams are not concerned about violations of patients’ data privacy rights in in-home healthcare, while patients are more troubled and, in many cases, are unaware that their data is being used without their consent. MIoT technology is in its early phases, and hence a mixed qualitative and quantitative research approach will be used, including case studies and questionnaires, in order to explore this issue and provide alternative solutions.

Keywords: MIoT, data privacy, stakeholders, home healthcare, information privacy, AI

Procedia PDF Downloads 102
25061 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This paper highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce by reskilling and upskilling employees and establishing robust data management training programs is an integral part of this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 71
25060 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal

Abstract:

Regulatory bodies have proposed limits on particulate matter (PM) concentrations in air; however, these limits do not explicitly incorporate the effects of the toxicities of PM constituents. This study aimed to provide a structured approach to incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents; information on the size and shape of PM; fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine limiting PM values. The identified data gaps were: (1) the concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship between the toxicity of PM and its components.
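To make the risk-estimation step concrete, the sketch below computes a hazard quotient for an inhaled PM constituent; the inhalation rate, body weight, concentration, and reference dose are placeholder values for illustration, not data from the study.

```python
# Illustrative hazard-quotient calculation for the risk-estimation step
# of the four-step framework. All numeric inputs are hypothetical.

def average_daily_dose(concentration_mg_m3, inhalation_m3_day, body_weight_kg):
    """Average daily inhaled dose in mg/kg-day."""
    return concentration_mg_m3 * inhalation_m3_day / body_weight_kg

def hazard_quotient(dose, reference_dose):
    """HQ = dose / reference dose; HQ > 1 flags potential
    non-carcinogenic health effects."""
    return dose / reference_dose

# Hypothetical metal constituent of PM2.5 in ambient air.
dose = average_daily_dose(concentration_mg_m3=2e-5,
                          inhalation_m3_day=20,
                          body_weight_kg=70)
hq = hazard_quotient(dose, reference_dose=1e-5)
print(round(hq, 3))
```

Inverting this calculation (solving for the concentration at which HQ = 1 for the most toxic constituent) is one way a component-specific PM limit could be derived.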

Keywords: air, component-specific toxicity, human health risks, particulate matter

Procedia PDF Downloads 311
25059 Comparative Analysis of Feature Extraction and Classification Techniques

Authors: R. L. Ujjwal, Abhishek Jain

Abstract:

In the field of computer vision, most facial variations, such as identity, expression, emotion, and gender, have been extensively studied, while automatic age estimation has rarely been explored. As a human ages, the features of the face change. This paper provides a new comparative study of feature extraction algorithms [hybrid features using HAAR cascades and HOG features] and classification algorithms [KNN and SVM] on a training dataset. Using these algorithms, we aim to identify the best-performing classification algorithm. We do the same for feature selection, extracting features using HAAR cascades and HOG. This work is carried out in the context of an age-group classification model.
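The classification stage of such a pipeline can be sketched as follows with a minimal KNN classifier; real inputs would be HAAR/HOG feature vectors extracted from face images (e.g. via OpenCV), while the 2-D vectors and age-group labels below are synthetic placeholders.

```python
import math
from collections import Counter

# Minimal sketch of the KNN classification stage for age-group
# prediction. Feature extraction (HAAR cascade / HOG) is omitted;
# the toy feature vectors below stand in for extracted descriptors.

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbors."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    top_k = [label for _, label in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Toy "feature vectors" for two age groups.
train_X = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25), (0.9, 0.8), (0.8, 0.9)]
train_y = ["child", "child", "child", "adult", "adult"]
print(knn_predict(train_X, train_y, (0.12, 0.18)))
```

An SVM would replace the vote with a learned decision boundary over the same feature vectors, which is the comparison the paper sets out to make.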

Keywords: computer vision, age group, face detection

Procedia PDF Downloads 370
25058 Pre-Operative Tool for Facial-Post-Surgical Estimation and Detection

Authors: Ayat E. Ali, Christeen R. Aziz, Merna A. Helmy, Mohammed M. Malek, Sherif H. El-Gohary

Abstract:

Goal: The purpose of the project was to make a plastic surgery prediction using pre-operative images of plastic surgery patients and to show this prediction on a screen so that the current appearance can be compared with the predicted post-surgical appearance. Methods: To this aim, we implemented software that uses data from the internet on facial skin diseases, skin burns, and pre- and post-operative images of plastic surgeries; the post-surgical prediction is then made using the K-nearest neighbor (KNN) algorithm. We also designed and fabricated a smart mirror divided into two parts, a screen and a reflective mirror, so that the patient's pre- and post-operative appearance can be shown at the same time. Results: We worked on skin conditions such as vitiligo, skin burns, and wrinkles. We classified the three degrees of burns using a KNN classifier with 60% accuracy, and we also succeeded in segmenting the vitiligo-affected area. Our future work will include covering more skin diseases, classifying them, and predicting the post-surgical appearance; we will also go deeper into facial deformities and plastic surgeries such as nose reshaping and face slimming. Conclusion: Our project will give a prediction that relates strongly to the real post-surgical appearance and will reduce diagnostic disagreement among doctors. Significance: The mirror may have broad societal appeal, as it will narrow the gap between patient satisfaction and medical standards.
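The segmentation step can be illustrated with a simple intensity-threshold sketch: depigmented (vitiligo) skin appears brighter than the surrounding region, so pixels above a threshold are marked as affected. The 4x4 "patch" and threshold value are illustrative assumptions, not the project's actual method or data.

```python
# Simplified sketch of the vitiligo-area segmentation step:
# thresholding a grayscale patch to isolate bright (depigmented)
# pixels. Values are hypothetical.

def segment_bright_region(image, threshold=200):
    """Return a binary mask of pixels strictly above `threshold`."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def region_fraction(mask):
    """Fraction of the patch covered by the segmented region."""
    total = sum(len(row) for row in mask)
    return sum(map(sum, mask)) / total

patch = [
    [ 90, 100, 210, 220],
    [ 95, 105, 215, 230],
    [ 80,  90, 205, 225],
    [ 85,  95, 200, 240],
]
mask = segment_bright_region(patch)
print(region_fraction(mask))
```

A production system would of course operate on real images with adaptive thresholds or learned segmentation, but the area fraction computed this way is the kind of quantity a prediction model could track before and after treatment.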

Keywords: k-nearest neighbor (knn), face detection, vitiligo, bone deformity

Procedia PDF Downloads 167
25057 Spatio-Temporal Pest Risk Analysis with ‘BioClass’

Authors: Vladimir A. Todiras

Abstract:

Spatio-temporal models provide new possibilities for real-time action in pest risk analysis. It should be noted that estimating the possibility and probability of the introduction of a pest, and its economic consequences, involves many uncertainties. We present a new mapping technique that assesses pest invasion risk using the online BioClass software. BioClass is a GIS tool designed to solve multiple-criteria classification and optimization problems based on fuzzy logic and level set methods. This research describes a method for predicting the potential establishment and spread of a plant pest into new areas using three case studies: corn rootworm (Diabrotica spp.), tomato leaf miner (Tuta absoluta), and plum fruit moth (Grapholita funebrana). Our study demonstrated that in BioClass we can combine fuzzy logic and geographic information systems with knowledge of pest biology and environmental data to derive new information for decision making. Pests are sensitive to a warming climate, as temperature greatly affects their survival and their reproductive rate and capacity. Changes have been observed in the distribution, frequency, and severity of outbreaks of Helicoverpa armigera on tomato. BioClass has proven to be a powerful tool for applying dynamic models and mapping the potential future distribution of a species, enabling resource managers to make decisions about the management and control of dangerous and invasive species.
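The fuzzy-logic idea behind this kind of risk classification can be sketched with a trapezoidal membership function for temperature suitability, a common building block in fuzzy classification; the breakpoints below are illustrative assumptions, not BioClass's actual parameters.

```python
# Hedged sketch of a fuzzy membership function for pest risk mapping:
# suitability rises from 0 to 1 between a and b, stays 1 on [b, c],
# and falls back to 0 between c and d. Breakpoints are hypothetical.

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership of x."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Suitability of mean temperature (deg C) for a hypothetical pest.
def temperature_suitability(t):
    return trapezoid(t, 10, 18, 28, 35)

print(temperature_suitability(22))  # inside the optimal plateau
print(temperature_suitability(14))  # on the rising edge
```

In a GIS setting, memberships like this would be evaluated per grid cell over temperature, humidity, and host-availability layers and then combined by fuzzy rules to yield a risk map.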

Keywords: classification, model, pest, risk

Procedia PDF Downloads 282
25056 Long Run Estimates of Population, Consumption and Economic Development of India: An ARDL Bounds Testing Approach of Cointegration

Authors: Sanjay Kumar, Arumugam Sankaran, Arjun K., Mousumi Das

Abstract:

Domestic consumption and population growth have a positive impact on economic growth and development, as observed in the Harrod-Domar and endogenous growth models. The paper challenges the Solow growth model, which argues that population growth has a detrimental impact on per capita and steady-state growth. Unlike the Solow model, the paper observes that per capita income growth never falls to zero and remains positive. Hence, our goal is to investigate the relationship among the population, domestic consumption, and economic growth of India. For this estimation, annual data from 1980-2016 were collected from the World Development Indicators and the Reserve Bank of India. To capture both the long-run and short-run dynamics among the variables, we employed the ARDL bounds testing approach to cointegration, followed by a modified Wald causality test to determine the direction of causality. The cointegration and ARDL estimates reveal a positive and statistically significant long-run relationship among the variables under study, and the causality test shows that a causal relationship exists among them. This calls for policies with a long-run perspective on strengthening the capabilities and entitlements of people and stabilizing domestic demand, so as to serve the long-run and short-run growth and stability of the economy.
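The core of an ARDL estimation can be sketched as an OLS regression on lagged variables; the simulated series below stands in for the actual consumption and growth data, and a full bounds test (e.g. via statsmodels) would add the critical-value comparison on top of this.

```python
import numpy as np

# Minimal sketch of an ARDL(1,1) regression fitted by OLS:
#   y_t = c + a*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t
# with the implied long-run multiplier (b0 + b1) / (1 - a).
# Data are simulated; they are not the paper's series.

rng = np.random.default_rng(0)
n = 200
x = np.cumsum(rng.normal(size=n))  # integrated (I(1)) regressor
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    # True process: long-run multiplier (0.8 - 0.3) / (1 - 0.5) = 1.0
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t] - 0.3 * x[t - 1] \
        + rng.normal(scale=0.1)

# Design matrix: constant, y_{t-1}, x_t, x_{t-1}
X = np.column_stack([np.ones(n - 1), y[:-1], x[1:], x[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
const, ar1, b0, b1 = beta

# Long-run multiplier of x on y implied by the ARDL coefficients.
long_run = (b0 + b1) / (1 - ar1)
print(long_run)
```

In the bounds testing approach, the same equation is re-parameterized in error-correction form and an F-test on the lagged levels is compared against Pesaran-Shin-Smith critical bounds to decide on cointegration.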

Keywords: cointegration, consumption, economic development, population growth

Procedia PDF Downloads 160
25055 Big Data Strategy for Telco: Network Transformation

Authors: F. Amin, S. Feizi

Abstract:

Big data has the potential to improve the quality of services; enable the infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, extend far beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information, while being mindful of legal and privacy issues. As data assets are transformed into new revenue streams, they will become integral to high performance.

Keywords: big data, next generation networks, network transformation, strategy

Procedia PDF Downloads 361
25054 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets

Authors: Apkar Salatian

Abstract:

To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored to particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies, or noise; and Compression, which takes the filtered data and derives trends in it. In this seminal article, we also show how REDUCER has been successfully applied to three different case studies.
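The two-stage pipeline can be sketched as follows; the concrete stages (a sliding median filter for Filter, piecewise-constant trend segments for Compression) are our own illustrative choices, since the pattern deliberately leaves them open for tailoring.

```python
from statistics import median

# Minimal sketch of the REDUCER pattern: Filter removes noise,
# Compression derives trends. Both stage implementations here are
# illustrative stand-ins.

def filter_stage(data, window=3):
    """Suppress spikes with a sliding median filter."""
    half = window // 2
    return [
        median(data[max(0, i - half): i + half + 1])
        for i in range(len(data))
    ]

def compression_stage(data, tolerance=0.5):
    """Merge consecutive values into (start, end, mean) trend segments."""
    segments, start = [], 0
    for i in range(1, len(data) + 1):
        if i == len(data) or abs(data[i] - data[start]) > tolerance:
            chunk = data[start:i]
            segments.append((start, i - 1, sum(chunk) / len(chunk)))
            start = i
    return segments

raw = [1.0, 1.1, 9.0, 1.0, 1.2, 5.0, 5.1, 4.9, 5.0]  # 9.0 is a spike
trends = compression_stage(filter_stage(raw))
print(trends)  # two trend segments: a low plateau, then a high one
```

Because the stages are composed sequentially, either one can be swapped (e.g. a Kalman filter, or linear-trend segmentation) without changing the overall pattern, which is the point of making REDUCER an architectural design pattern.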

Keywords: design pattern, filtering, compression, architectural design

Procedia PDF Downloads 213
25053 Fuzzy Expert Systems Applied to Intelligent Design of Data Centers

Authors: Mario M. Figueroa de la Cruz, Claudia I. Solorzano, Raul Acosta, Ignacio Funes

Abstract:

This technological development project seeks to create a tool that allows companies that need to implement a data center to intelligently determine the factors for allocating cooling and power supply (UPS) resources during its conception. The results should clearly show the speed, robustness, and reliability of a system designed for deployment in environments where large volumes of data must be managed and protected.

Keywords: telecommunications, data center, fuzzy logic, expert systems

Procedia PDF Downloads 345
25052 A Goal-Oriented Approach for Supporting Input/Output Factor Determination in the Regulation of Brazilian Electricity Transmission

Authors: Bruno de Almeida Vilela, Heinz Ahn, Ana Lúcia Miranda Lopes, Marcelo Azevedo Costa

Abstract:

Benchmarking public utilities such as transmission system operators (TSOs) is one of the main strategies employed by regulators in order to fix the revenues of monopolistic companies. Since 2007, the Brazilian regulator has used Data Envelopment Analysis (DEA) to benchmark TSOs. Despite the application of DEA to improve the transmission sector’s efficiency, some problems can be pointed out, such as the high price of electricity in Brazil; the limitation of the benchmarking to operational expenses (OPEX) only; the absence of variables that represent the outcomes of the transmission service; and the presence of extremely low and extremely high efficiencies. As an alternative to the benchmarking concept currently used by the Brazilian regulator, we propose a goal-oriented approach. Our proposal supports input/output selection by taking traditional organizational goals and measures as the basis for selecting factors for benchmarking purposes. As its main advantage, it resolves the classical DEA problems of input/output selection and of undesirable and dual-role factors. We also provide a demonstration of our goal-oriented concept with regard to service quality. As a result, the efficiencies of most TSOs in Brazil might improve when quality is considered important in their efficiency estimation.
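As background for readers new to DEA, the sketch below computes CCR efficiency scores in the didactic single-input/single-output case, where the model reduces to each unit's output/input ratio divided by the best observed ratio (general DEA requires solving a linear program per unit). The TSO figures are made up for illustration.

```python
# Didactic sketch of DEA efficiency for one input and one output:
# each DMU's score is its output/input ratio relative to the best
# ratio in the sample. Input data are hypothetical.

def dea_single_ratio(inputs, outputs):
    """CCR efficiency scores for single-input/single-output DMUs."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical TSOs: OPEX as input vs. a service-quality output.
opex    = [100.0, 150.0, 120.0]
quality = [ 80.0,  90.0, 120.0]
scores = dea_single_ratio(opex, quality)
print([round(s, 3) for s in scores])
```

With multiple inputs and outputs, as in actual TSO regulation, the weights on each factor become decision variables of a linear program, which is precisely where the input/output selection problem addressed by the goal-oriented approach arises.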

Keywords: decision making, goal-oriented benchmarking, input/output factor determination, TSO regulation

Procedia PDF Downloads 198