Search results for: improvement of model accuracy and reliability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22820

22610 Improving Similarity Search Using Clustered Data

Authors: Deokho Kim, Wonwoo Lee, Jaewoong Lee, Teresa Ng, Gun-Ill Lee, Jiwon Jeong

Abstract:

This paper presents a method for improving object search accuracy using a deep learning model. A major limitation in providing accurate similarity with deep learning is the huge amount of data required to train pairwise similarity scores (metrics), which is impractical to collect. Thus, similarity scores are usually trained with a relatively small dataset from a different domain, resulting in limited accuracy when measuring similarity. For this reason, this paper proposes a deep learning model that can be trained with a significantly smaller amount of data: clustered data, in which each cluster contains a set of visually similar images. To measure similarity distance with the proposed method, visual features of two images are extracted from intermediate layers of a convolutional neural network with various pooling methods, and the network is trained with pairwise similarity scores that are defined as zero for images in the same cluster. The proposed method outperforms state-of-the-art object similarity scoring techniques on an evaluation of finding exact items, achieving 86.5% accuracy compared to 59.9% for the state-of-the-art technique. That is, an exact item can be found among four retrieved images with 86.5% accuracy, and the remaining retrieved images are likely to be similar products. Therefore, the proposed method reduces the amount of training data by an order of magnitude while providing a reliable similarity metric.
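
As an illustration of the kind of pairwise training the abstract describes, the sketch below trains a CNN embedding so that images from the same cluster are pulled toward zero distance. The backbone, pooling choice, margin loss, and all names are assumptions for illustration, not the authors' architecture:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class SimilarityNet(nn.Module):
    """Embeds images using an intermediate CNN feature map with pooling,
    as the abstract describes (backbone randomly initialized here to stay
    self-contained; a pretrained one would be used in practice)."""
    def __init__(self, dim=128):
        super().__init__()
        backbone = models.resnet18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)   # one of several pooling choices
        self.head = nn.Linear(512, dim)

    def forward(self, x):
        f = self.pool(self.features(x)).flatten(1)
        return F.normalize(self.head(f), dim=1)

def pairwise_loss(z1, z2, same_cluster, margin=1.0):
    """Target distance is zero for images in the same cluster; images from
    different clusters are pushed at least `margin` apart."""
    d = (z1 - z2).norm(dim=1)
    pos = same_cluster * d.pow(2)
    neg = (1 - same_cluster) * F.relu(margin - d).pow(2)
    return (pos + neg).mean()

# toy batch: 8 image pairs with binary same-cluster labels
net = SimilarityNet()
x1, x2 = torch.randn(8, 3, 224, 224), torch.randn(8, 3, 224, 224)
same = torch.randint(0, 2, (8,)).float()
loss = pairwise_loss(net(x1), net(x2), same)
loss.backward()
```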

Keywords: visual search, deep learning, convolutional neural network, machine learning

Procedia PDF Downloads 189
22609 Epileptic Seizure Prediction by Exploiting Signal Transitions Phenomena

Authors: Mohammad Zavid Parvez, Manoranjan Paul

Abstract:

A seizure prediction method is proposed that extracts global features, using phase correlation between adjacent epochs to detect relative changes, and local features, using fluctuation/deviation within an epoch to determine fine changes in different EEG signals. A classifier and a regularization technique are applied to reduce false alarms and improve overall prediction accuracy. The experiments show that the proposed method outperforms state-of-the-art methods and provides high prediction accuracy (i.e., 97.70%) with a low false-alarm rate, using EEG signals from different brain locations in a benchmark data set.
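
As a sketch of the global-feature step, phase correlation between two adjacent epochs can be computed from the normalized cross-power spectrum; the epoch length and sampling rate below are illustrative assumptions:

```python
import numpy as np

def phase_correlation(epoch_a, epoch_b):
    """Normalized cross-power spectrum of two equal-length epochs; the peak
    of the inverse transform measures how well the epochs align."""
    A, B = np.fft.fft(epoch_a), np.fft.fft(epoch_b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12        # keep only the phase information
    return np.real(np.fft.ifft(cross))

rng = np.random.default_rng(0)
epoch1 = rng.standard_normal(256)                  # one epoch, e.g. 1 s at 256 Hz
epoch2 = np.roll(epoch1, 5) + 0.1 * rng.standard_normal(256)
pc = phase_correlation(epoch1, epoch2)
print(pc.argmax(), pc.max())   # peak location ~ relative shift, height ~ similarity
```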

Keywords: epilepsy, seizure, phase correlation, fluctuation, deviation

Procedia PDF Downloads 442
22608 Reliability of Cores Test Result at Elevated Temperature in Case of High Strength Concrete (HSC)

Authors: Waqas Ali

Abstract:

Concrete is broadly used as a structural material in the construction of buildings. When concrete is exposed to elevated temperatures, evaluating its residual strength in the existing structure becomes necessary. In this study, the effect of temperature and the reliability of the core test were evaluated. For this purpose, cylindrical cores were extracted from high-strength concrete (HSC) specimens that had been exposed to temperatures ranging from 300 °C to 900 °C for a constant duration of 4 hours. The study compares standard heated cylinders with the cores taken from them after 90 days of curing. For the cylindrical control and binary mix samples versus the extracted cores, the difference was 12.19% and 12.38% at 300 °C, increasing to 12.89% and 13.03% at 500 °C, and reaching 12.99%, 13.57% and 14.40%, 14.38% at 700 °C and 900 °C, respectively. A total of four equations were developed through a regression model for predicting the strength of concrete for both standard cylinders and extracted cores, with R² values of 0.9733, 0.9627 and 0.9473, 0.9452, respectively.
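
A minimal sketch of the kind of regression model the study fits, using hypothetical temperature-strength pairs (the paper's actual data and equation forms are not given in the abstract):

```python
import numpy as np

# Hypothetical (temperature °C, residual strength MPa) pairs for one mix.
temp     = np.array([20, 300, 500, 700, 900], dtype=float)
strength = np.array([75, 62, 48, 33, 20], dtype=float)

coeffs = np.polyfit(temp, strength, deg=2)   # quadratic strength-temperature fit
pred = np.polyval(coeffs, temp)
ss_res = np.sum((strength - pred) ** 2)
ss_tot = np.sum((strength - strength.mean()) ** 2)
print("R^2 =", 1 - ss_res / ss_tot)          # analogous to the reported 0.94-0.97
```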

Keywords: high strength, temperature, core, reliability

Procedia PDF Downloads 43
22607 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya

Authors: Jamal A. Gledan, Othman A. Azzeidani

Abstract:

During the last decade, Libya established a new geodetic datum, called Libyan Geodetic Datum 2006 (LGD 2006), by using GPS, whereas the ground traversing method had been used to establish the previous Libyan datum, the Europe Libyan Datum 79 (ELD79). This paper introduces ELD79-to-LGD2006 coordinate transformation techniques and compares the accuracy of transformation between multiple regression equations and the three-parameter (Bursa-Wolf) model. The results obtained show that the overall accuracy of stepwise multiple regression equations is better than that determined by using the Bursa-Wolf transformation model.
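
For the three-parameter model, the translations between the two datums can be estimated by least squares, which reduces to the mean coordinate difference over common control points. The coordinates and shift values below are hypothetical:

```python
import numpy as np

# Hypothetical Cartesian coordinates of common control points in both datums.
eld79   = np.array([[5001234.0, 1403456.0, 3501234.0],
                    [5002345.0, 1404567.0, 3502345.0],
                    [5003456.0, 1405678.0, 3503456.0]])
lgd2006 = eld79 + np.array([-87.5, -96.8, 118.2]) \
          + np.random.default_rng(1).normal(0, 0.05, (3, 3))

# Least-squares estimate of the three translation parameters is simply the
# mean coordinate difference over the common points.
shift = (lgd2006 - eld79).mean(axis=0)
residuals = lgd2006 - (eld79 + shift)
print("dX, dY, dZ =", shift)
print("RMS residual per axis:", np.sqrt((residuals ** 2).mean(axis=0)))
```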

Keywords: geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques

Procedia PDF Downloads 275
22606 Accuracy of a 3D-Printed Polymer Model for Producing Casting Mold

Authors: Ariangelo Hauer Dias Filho, Gustavo Antoniácomi de Carvalho, Benjamim de Melo Carvalho

Abstract:

The work's purpose was to evaluate the possibility of manufacturing casting tools utilizing fused filament fabrication, a 3D printing technique, without any post-processing of the printed part. A Taguchi orthogonal array was used to evaluate the influence of extrusion temperature, bed temperature, layer height, and infill on the dimensional accuracy of a 3D-printed polymer model. A Zeiss T-SCAN CS 3D scanner was used for dimensional evaluation of the printed parts within the limit of ±0.2 mm. The mold capabilities were tested with the printed model to check how it would interact with the green sand. With minor adjustments to the 3D model, it was possible to produce rapid tools for iron casting without the need for post-processing. The results are important for reducing the time and cost of developing such tools.
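
A minimal sketch of the Taguchi analysis step, computing the smaller-is-better signal-to-noise ratio for hypothetical dimensional-deviation measurements (the factor levels and values are illustrative, not the paper's design):

```python
import numpy as np

# Hypothetical |dimensional deviation| measurements (mm) for four runs of an
# orthogonal-array experiment; smaller deviation is better.
runs = {
    "run1 (T=200C, bed=60C, layer=0.1, infill=20%)": [0.08, 0.11, 0.09],
    "run2 (T=200C, bed=80C, layer=0.2, infill=50%)": [0.15, 0.13, 0.17],
    "run3 (T=220C, bed=60C, layer=0.2, infill=80%)": [0.19, 0.22, 0.18],
    "run4 (T=220C, bed=80C, layer=0.1, infill=50%)": [0.10, 0.12, 0.09],
}

for name, y in runs.items():
    y = np.asarray(y)
    sn = -10 * np.log10(np.mean(y ** 2))    # "smaller is better" S/N ratio
    print(f"{name}: S/N = {sn:.2f} dB")     # higher S/N = more robust setting
```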

Keywords: additive manufacturing, Taguchi method, rapid tooling, fused filament fabrication, casting mold

Procedia PDF Downloads 99
22605 A Dose Distribution Approach Using Monte Carlo Simulation in Dosimetric Accuracy Calculation for Treating the Lung Tumor

Authors: Md Abdullah Al Mashud, M. Tariquzzaman, M. Jahangir Alam, Tapan Kumar Godder, M. Mahbubur Rahman

Abstract:

This paper presents Monte Carlo (MC) method-based dose distributions in a lung tumor for a 6 MV photon beam, with the aim of improving dosimetric accuracy in cancer treatment. Polystyrene, a tissue-equivalent material matching lung tumor density, is used in this research. In the empirical calculations, the TRS-398 formalism of the IAEA was used, and the setup was made according to ICRU recommendations. The research outcomes were compared with state-of-the-art experimental results; it is observed that the proposed approach provides more accurate results than the existing approaches. The average percent variation between measured and TPS-simulated values was 1.337 ± 0.531, which shows a substantial improvement over the state of the art.
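
The reported statistic is a mean and standard deviation of point-wise percent variations; a short sketch with hypothetical dose values:

```python
import numpy as np

# Hypothetical measured vs TPS-simulated dose points (Gy).
measured  = np.array([2.01, 1.52, 1.05, 0.76, 0.51])
simulated = np.array([1.98, 1.50, 1.07, 0.75, 0.52])

pct = 100 * np.abs(measured - simulated) / measured
print(f"{pct.mean():.3f} +/- {pct.std(ddof=1):.3f} %")   # cf. reported 1.337 +/- 0.531
```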

Keywords: lung tumour, Monte Carlo, polystyrene, Elekta synergy, Monaco planning system

Procedia PDF Downloads 412
22604 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. Data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess prediction accuracy. Key risk factors were identified, and the models were compared to arrive at the best prediction model. Among these models, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. The percentage of testing inputs for which at least 4 of the 5 models agreed on the diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to earlier treatment of these patients.
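
A sketch of the model-comparison workflow using scikit-learn, with synthetic stand-in data in place of the MRI/demographic dataset; the 4-of-5 agreement statistic is computed as described in the abstract:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Stand-in data with the same shape of problem (binary dementia label).
X, y = make_classification(n_samples=373, n_features=8, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

models = [LogisticRegression(max_iter=1000), KNeighborsClassifier(),
          SVC(), RandomForestClassifier(random_state=0),
          MLPClassifier(max_iter=2000, random_state=0)]
preds = np.array([m.fit(Xtr, ytr).predict(Xte) for m in models])

rf_acc = (preds[3] == yte).mean()
# share of test inputs where at least 4 of the 5 models give the same label
votes = preds.sum(axis=0)
agree4 = ((votes >= 4) | (votes <= 1)).mean()
print(f"random forest accuracy: {rf_acc:.2%}, 4-of-5 agreement: {agree4:.2%}")
```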

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 118
22603 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran

Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh

Abstract:

Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and the results of comparing the units under investigation to obtain an estimate of their performance. The Malmquist Productivity Index (MPI) is an important index in the evaluation of overall productivity, as it considers technological development and technical efficiency at the same time. This article proposes a model based on the multistage MPI that accounts for limited data via Grey's system theory. The model can evaluate the performance of units using limited and uncertain data in a multistage process. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which involves uncertain data, to evaluate the performance of its actors. Results from solving the model showed an improvement in the accuracy of the estimated future performance of the units under investigation when Grey's system theory is used. This model can be applied to any case study in which MPI is used and the data are limited or uncertain.
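
For reference, the standard geometric-mean form of the Malmquist Productivity Index between periods t and t+1, on which the multistage model builds (D denotes the distance function, here estimated with the CCR model; the multistage grey extension is the paper's contribution and is not reproduced here):

```latex
MPI^{t,t+1}
  = \left[
      \frac{D^{t}\left(x^{t+1}, y^{t+1}\right)}{D^{t}\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
```

A value greater than 1 indicates productivity growth between the two periods; the index is commonly decomposed into technical-efficiency change and technological (frontier) change.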

Keywords: Malmquist Index, Grey's Theory, CCR Model, network data envelopment analysis, Iran electricity power chain

Procedia PDF Downloads 123
22602 Reliability Modeling on Drivers’ Decision during Yellow Phase

Authors: Sabyasachi Biswas, Indrajit Ghosh

Abstract:

The random and heterogeneous behavior of vehicles in India poses a great challenge for researchers. Stop-and-go modeling at signalized intersections under heterogeneous traffic conditions has remained one of the most sought-after fields. Vehicles are often caught in the dilemma zone and are unable to decide quickly whether to stop or cross the intersection. This hampers traffic movement and may lead to accidents. The purpose of this work is to develop a stop/go prediction model that depicts drivers' decisions during the yellow time at signalized intersections. To accomplish this, certain traffic parameters were taken into account to develop a surrogate model. This research investigated the stop/go behavior of drivers by collecting data from four signalized intersections located in two major Indian cities. A model was developed to predict drivers' decision-making during the yellow phase of the traffic signal. The parameters used for modeling included distance to the stop line, time to the stop line, speed, and length of the vehicle. A Kriging-based surrogate model was developed to investigate drivers' decision-making behavior in the amber phase. The proposed approach yields a highly accurate result (97.4 percent) with the Gaussian function. The accuracy for the crossing probability was 95.45, 90.9, and 86.36 percent, respectively, as predicted by the Kriging models with Gaussian, exponential, and linear functions.
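
Kriging and Gaussian process modelling are formally equivalent, so a stand-in for the stop/go surrogate can be sketched with scikit-learn's Gaussian process classifier and an anisotropic Gaussian (RBF) kernel. The data, length scales, and labelling rule below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Hypothetical observations: [distance to stop line (m), time to stop line (s),
# speed (m/s), vehicle length (m)]; label 1 = driver crossed, 0 = stopped.
X = rng.uniform([5, 0.5, 5, 3], [80, 6, 20, 12], size=(120, 4))
y = (X[:, 1] < 3.0).astype(int)          # toy rule: close in time -> cross

gp = GaussianProcessClassifier(kernel=RBF(length_scale=[20, 2, 5, 3])).fit(X, y)
print(gp.predict_proba([[30, 2.5, 14, 4.5]]))   # [P(stop), P(cross)]
```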

Keywords: decision-making, dilemma zone, surrogate model, Kriging

Procedia PDF Downloads 285
22601 Influence of Measurement System on Negative Bias Temperature Instability Characterization: Fast BTI vs Conventional BTI vs Fast Wafer Level Reliability

Authors: Vincent King Soon Wong, Hong Seng Ng, Florinna Sim

Abstract:

Negative Bias Temperature Instability (NBTI) is one of the critical degradation mechanisms in semiconductor device reliability; it causes a shift in the threshold voltage (Vth). However, a thorough understanding of this reliability failure mechanism remains out of reach due to a recovery characteristic known as NBTI recovery. This paper demonstrates the severity of NBTI recovery as well as one of the effective methods used to mitigate it: minimizing measurement-system delays. A comparison was made between two measurement systems with significantly different measurement delays, showing how NBTI recovery causes results to deviate and how fast measurement systems can mitigate it. Another method that minimizes NBTI recovery without the influence of the measurement system, known as Fast Wafer Level Reliability (FWLR) NBTI, was also applied as a reference.

Keywords: fast vs slow BTI, fast wafer level reliability (FWLR), negative bias temperature instability (NBTI), NBTI measurement system, metal-oxide-semiconductor field-effect transistor (MOSFET), NBTI recovery, reliability

Procedia PDF Downloads 384
22600 Improvement of the Reliability and the Availability of a Production System

Authors: Lakhoua Najeh

Abstract:

Aims of the work: The aim of this paper is to improve the reliability and availability of a Packer production line of cigarettes based on two methods: the SADT method (Structured Analysis Design Technique) and the FMECA approach (Failure Mode, Effects and Criticality Analysis). The first method enables us to describe the functionality of the Packer production line, and the second enables us to establish an FMECA analysis. Methods: The methodology adopted to improve reliability and availability is based on the SADT and FMECA methods. It consists of a diagnosis of the existing state of all the equipment of a production line of a factory in order to determine the most critical machine. In fact, we use, on the one hand, a functional analysis of the production line based on the SADT method and, on the other hand, a diagnosis and classification of the line's mechanical and electrical failures through a criticality analysis based on the FMECA approach. Results: Based on the adopted methodology, the result is the creation and launch of a preventive maintenance plan. It covers the different elements of the Packer production line and lists the preventive maintenance activities and their scheduled periods. Conclusion: The diagnosis of the existing state revealed that the cigarette machine used in the Packer production line is the most critical machine in the factory. This enabled us, on the one hand, to describe the functionality of the production line with the SADT method and, on the other hand, to carry out an FMECA study of this machine in order to improve its availability and performance.
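
A minimal sketch of the FMECA criticality step: each failure mode is scored for severity (S), occurrence (O), and detection (D), and ranked by risk priority number (RPN = S x O x D). The failure modes and scores below are illustrative, not the line's actual FMECA:

```python
# Each entry: (failure mode, severity, occurrence, detection), scored 1-10.
failure_modes = [
    ("packer jaw misalignment", 8, 6, 4),
    ("foil feed motor failure", 7, 3, 5),
    ("glue nozzle clogging",    5, 7, 3),
    ("conveyor belt wear",      4, 5, 6),
]

# Rank by RPN: the highest values get maintenance priority.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:28s} RPN = {s * o * d}")
```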

Keywords: production system, diagnosis, SADT method, FMECA method

Procedia PDF Downloads 118
22599 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were applied to extract important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as a process model, and 10-fold cross-validation was used for testing. Various data preprocessing activities were performed to produce the final dataset for model building. Classification models of the selected algorithms were built with different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms, with the artificial neural network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge/insight that would otherwise not be discovered by simple statistical analysis, even though the classification accuracy achieved was mediocre.
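
A sketch of the evaluation protocol (information-gain-style attribute selection followed by 10-fold cross-validation) using scikit-learn. The iris dataset stands in for the tourism data, and scikit-learn's entropy-criterion tree only approximates J48/C4.5:

```python
from sklearn.datasets import load_iris   # stand-in for the tourism dataset
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
for name, clf in [("Random Forest", RandomForestClassifier(random_state=0)),
                  ("C4.5-style tree", DecisionTreeClassifier(criterion="entropy"))]:
    # information-gain-based attribute selection, then 10-fold cross-validation
    pipe = make_pipeline(SelectKBest(mutual_info_classif, k=2), clf)
    scores = cross_val_score(pipe, X, y, cv=10)
    print(f"{name}: mean 10-fold accuracy = {scores.mean():.2%}")
```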

Keywords: classification algorithms, data mining, knowledge discovery, tourism

Procedia PDF Downloads 268
22598 Fast Adjustable Threshold for Uniform Neural Network Quantization

Authors: Alexander Goncharenko, Andrey Denisov, Sergey Alyamkin, Evgeny Terentev

Abstract:

Neural network quantization is a highly desirable procedure to perform before running neural networks on mobile devices. Quantization without fine-tuning leads to an accuracy drop, whereas the commonly used training with quantization is done on the full set of labeled data and is therefore both time- and resource-consuming. Real-life applications require a simplified and accelerated quantization procedure that maintains the accuracy of the full-precision neural network, especially for modern mobile architectures like MobileNet-v1, MobileNet-v2, and MNAS. Here we present a method that significantly optimizes training with quantization by introducing trained scale factors for the discretization thresholds, separate for each filter. Using the proposed technique, we quantize modern mobile neural network architectures with a training set of only ∼10% of the total ImageNet 2012 sample. Such a reduction of training dataset size, together with the small number of trainable parameters, allows the network to be fine-tuned within several hours while maintaining the high accuracy of the quantized model (the accuracy drop was less than 0.5%). Ready-for-use models and code are available in the GitHub repository.
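
A minimal sketch of per-filter quantization with a trainable scale (threshold), using a straight-through estimator so that gradients reach both the weights and the scales; this is an assumption-level illustration, not the paper's exact scheme:

```python
import torch

def fake_quantize(w, scale, bits=8):
    """Per-filter uniform quantization with trainable scales (thresholds).
    Rounding uses a straight-through estimator, so gradients flow to both
    the weights and the scale factors."""
    qmax = 2 ** (bits - 1) - 1
    x = torch.clamp(w / scale, -qmax - 1, qmax)
    x = x + (x.round() - x).detach()       # forward: round, backward: identity
    return x * scale

# conv weight (out_channels, in_channels, kH, kW): one trainable scale per filter
w = torch.randn(16, 3, 3, 3)
scale = (w.abs().amax(dim=(1, 2, 3), keepdim=True) / 127).requires_grad_()

loss = fake_quantize(w, scale).pow(2).sum()
loss.backward()
print(scale.grad.shape)   # torch.Size([16, 1, 1, 1]): the thresholds are trainable
```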

Keywords: distillation, machine learning, neural networks, quantization

Procedia PDF Downloads 291
22597 Performance Assessment of Three Unit Redundant System with Environmental and Human Failure Using Copula Approach

Authors: V. V. Singh

Abstract:

We have studied the reliability measures of a system consisting of two subsystems, subsystem-1 and subsystem-2, in a series configuration under different types of failure. Subsystem-1 has three identical units in a parallel configuration, operating under a 2-out-of-3:G policy and connected to subsystem-2 in series. Each subsystem has different failure and repair rates. An important cause of system failure is unsuitable environmental conditions, such as overheating, severe weather, heavy rainfall, and storms. This environmental failure is taken into account in the proposed repairable system. The supplementary variable technique is used to study the system, and traditional measures such as availability, reliability, MTTF, and the profit function are obtained for different parameter values. Some particular cases of the failure rates are studied explicitly in the proposed model.
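
Two pieces of the model can be sketched directly: the 2-out-of-3:G reliability of subsystem-1 and the Gumbel-Hougaard copula used in this family of models to couple distributions. Treating the copula of the two subsystem reliabilities as a joint reliability is an illustrative simplification; the failure rates and theta below are assumed values:

```python
import math

def r_2oo3(r):
    """Reliability of a 2-out-of-3:G group of identical units: 3r^2 - 2r^3."""
    return 3 * r**2 - 2 * r**3

def gumbel_hougaard(u, v, theta=2.0):
    """Gumbel-Hougaard copula C(u, v); theta = 1 recovers independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1 / theta)))

r_unit = math.exp(-0.02 * 10)     # unit reliability at t = 10 for lambda = 0.02
r_sub1 = r_2oo3(r_unit)           # subsystem-1 under the 2-out-of-3:G policy
r_sub2 = math.exp(-0.01 * 10)     # subsystem-2 in series
print("series, independent:", r_sub1 * r_sub2)
print("series, coupled:    ", gumbel_hougaard(r_sub1, r_sub2, theta=1.5))
```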

Keywords: environmental failure, human failure, availability, MTTF, reliability, profit analysis, Gumbel-Hougaard family copula

Procedia PDF Downloads 323
22596 Reliability Assessment for Tie Line Capacity Assistance of Power Systems Based on Multi-Agent System

Authors: Nadheer A. Shalash, Abu Zaharin Bin Ahmad

Abstract:

Technological developments in industry have increasingly been tied to interconnected system assistance and distribution networks. This is important in order to enable an electrical load to continue receiving power in the event of its disconnection from the main power grid. This paper presents a method for the reliability assessment of interconnected power systems based on a multi-agent system consisting of four agents. The first is the generator agent, which connects the generator to the grid depending on the state of the reserve margin and the load demand. The second is a load agent, located at the load. The third, the reserve margin agent, limits the reserve margin to 0-25% depending on the load and the generator unit size. Finally, the reliability calculation agent computes the expected energy not supplied (EENS), the loss of load expectation (LOLE), and the effect of tie-line capacity in order to determine risk levels. The Roy Billinton Test System (RBTS) is used to evaluate the reliability indices by means of the developed JADE package. The estimated reliability results for the interconnected power systems are presented in this paper. The overall reliability of the power system can be improved; the system thus becomes more robust against increasing demand, with the generation units operated in accordance with the reliability indices.
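
A minimal sketch of the two reliability indices the calculation agent produces, using an enumerated capacity-outage table and a toy load profile (the unit data are illustrative, not the RBTS):

```python
import numpy as np

# Two-state generating units: (capacity MW, forced outage rate).
units = [(40, 0.05), (40, 0.05), (30, 0.04)]

# Enumerate the capacity-outage probability table.
states = [(0.0, 1.0)]
for cap, forc in units:
    states = [(c + up * cap, p * ((1 - forc) if up else forc))
              for c, p in states for up in (1, 0)]

hourly_load = np.full(24, 70.0)
hourly_load[18:22] = 95.0                  # toy evening peak
lole = eens = 0.0
for load in hourly_load:
    for cap, p in states:
        if cap < load:
            lole += p                      # expected hours/day of shortfall
            eens += p * (load - cap)       # expected MWh/day not supplied
print(f"LOLE = {lole:.3f} h/day, EENS = {eens:.2f} MWh/day")
```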

Keywords: reliability indices, load expectation, reserve margin, daily load, probability, multi-agent system

Procedia PDF Downloads 296
22595 Binary Decision Diagram Based Methods to Evaluate the Reliability of Systems Considering Failure Dependencies

Authors: Siqi Qiu, Yijian Zheng, Xin Guo Ming

Abstract:

In many reliability and risk analyses, component failures are assumed to be independent. In reality, however, ignoring failure dependencies among components may render the results of reliability and risk analysis incorrect. There are two principal ways to incorporate failure dependencies in system reliability and risk analysis: implicit and explicit methods. In the implicit method, failure dependencies can be modeled by joint probabilities, correlation values, or conditional probabilities. In the explicit method, certain types of dependencies can be modeled in a fault tree as mutually independent basic events for specific component failures. In this paper, explicit and implicit methods based on BDDs are proposed to evaluate the reliability of systems considering failure dependencies. The obtained results prove the equivalence of the proposed implicit and explicit methods. It is found that considering failure dependencies decreases the computed reliability of systems. This observation is intuitive, because more components fail due to failure dependencies. Accounting for failure dependencies helps designers reduce the dependencies between components during the design phase and thereby make the system more reliable.
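
A toy illustration of why dependency matters, comparing a redundant pair under the independence assumption with an implicit conditional-probability dependency model (the numbers are illustrative):

```python
# Redundant (parallel) pair: the system fails only if both components fail.
p_a = 0.02                  # P(A fails)
p_b = 0.03                  # P(B fails), marginal
p_b_given_a = 0.30          # dependency: B far more likely to fail once A has

p_fail_indep = p_a * p_b             # independence assumption
p_fail_dep   = p_a * p_b_given_a     # implicit conditional-probability model
print(f"independent: {p_fail_indep:.4f}  dependent: {p_fail_dep:.4f}")
# The dependency raises the failure probability tenfold here, i.e. the
# system reliability drops once dependencies are accounted for.
```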

Keywords: reliability assessment, risk assessment, failure dependencies, binary decision diagram

Procedia PDF Downloads 447
22594 Utility Assessment Model for Wireless Technology in Construction

Authors: Yassir AbdelRazig, Amine Ghanem

Abstract:

Construction projects are information-intensive in nature and involve many interrelated activities. Wireless technologies can be used to improve the accuracy and timeliness of data collected from construction sites and to share it with the appropriate parties. Nonetheless, the construction industry tends to be conservative and hesitant to adopt new technologies. A main concern for owners, contractors, or anyone in charge of a job site is the cost of the technology in question. Wireless technologies are not cheap: there are many expenses to take into consideration, and a study should be completed to make sure that the benefits and savings resulting from the technology are worth the expense. This research attempts to assess the effectiveness of using appropriate wireless technologies based on criteria such as performance, reliability, and risk. The assessment is based on a utility function model that breaks the selection problem down into alternatives and attributes. The attributes are assigned weights and single attributes are measured; finally, the single-attribute utilities are combined into one aggregate utility index for each alternative.
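
A minimal sketch of the additive aggregate utility index; the alternatives, weights, and normalized scores are illustrative assumptions:

```python
# Attributes are scored on a normalized 0-1 utility scale and combined with
# weights (additive multi-attribute utility).
weights = {"performance": 0.4, "reliability": 0.35, "risk": 0.25}
alternatives = {
    "Wi-Fi mesh": {"performance": 0.8, "reliability": 0.7, "risk": 0.6},
    "RFID":       {"performance": 0.6, "reliability": 0.9, "risk": 0.8},
    "ZigBee":     {"performance": 0.5, "reliability": 0.8, "risk": 0.9},
}

for name, scores in alternatives.items():
    u = sum(weights[a] * scores[a] for a in weights)   # aggregate utility index
    print(f"{name:10s} U = {u:.3f}")
```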

Keywords: analytic hierarchy process, decision theory, utility function, wireless technologies

Procedia PDF Downloads 313
22593 A Study of ZY3 Satellite Digital Elevation Model Verification and Refinement with Shuttle Radar Topography Mission

Authors: Bo Wang

Abstract:

As the first high-resolution civil optical satellite, ZY-3 is able to obtain high-resolution multi-view images with three linear-array sensors. These images can be used to generate digital elevation models (DEMs) through dense matching of stereo images. However, due to clouds, forest, water, and buildings covering the images, the dense matching results suffer from problems such as outliers and areas that fail to be matched (matching holes). This paper introduces an algorithm to verify the accuracy of a DEM generated by the ZY-3 satellite against the Shuttle Radar Topography Mission (SRTM). Since the accuracy of SRTM (internal accuracy: 5 m; external accuracy: 15 m) is relatively uniform worldwide, it may be used to improve the accuracy of the ZY-3 DEM. Based on the analysis of large volumes of DEM and SRTM data, the processing can be divided into two steps. First, the ZY-3 DEM and SRTM are registered using conjugate line features and area features matched between the two datasets. Then the ZY-3 DEM is refined by eliminating the matching outliers and filling the matching holes. The matching outliers are eliminated based on statistics from Local Vector Binning (LVB), and the matching holes are filled with elevations interpolated from SRTM. Accuracy statistics for the ZY-3 DEM are also computed.
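
A sketch of the hole-filling step: cells that failed to match are replaced by elevations interpolated from the co-registered SRTM grid. The grids and hole below are synthetic; real data would live on geographic grids and be registered first:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

rng = np.random.default_rng(0)
zy3_dem = rng.uniform(100, 200, (50, 50))
zy3_dem[20:25, 30:35] = np.nan                 # a matching hole (e.g. cloud cover)
srtm = rng.uniform(100, 200, (50, 50))         # stands in for co-registered SRTM

rows, cols = np.arange(50), np.arange(50)
srtm_interp = RegularGridInterpolator((rows, cols), srtm)

holes = np.argwhere(np.isnan(zy3_dem))          # row/col indices of hole cells
zy3_dem[np.isnan(zy3_dem)] = srtm_interp(holes) # fill holes from SRTM elevations
print("holes remaining:", np.isnan(zy3_dem).sum())
```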

Keywords: ZY-3 satellite imagery, DEM, SRTM, refinement

Procedia PDF Downloads 316
22592 Reliability-Based Method for Assessing Liquefaction Potential of Soils

Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty

Abstract:

This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. Current simplified methods use a deterministic safety factor to determine whether liquefaction will occur or not; however, they are unable to relate a liquefaction probability to a given safety factor. A solution to this problem can be found through reliability analysis. This paper presents a reliability analysis method built on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not by itself guarantee safety or liquefaction; to establish the probability of liquefaction, a reliability-based analysis should be used. The method uses the empirical acceleration attenuation law of the Chalos area to derive the probability density function and the statistics of the earthquake-induced cyclic stress ratio (CSR). The CSR and CRR statistics are then used with the first-order second-moment method to relate the liquefaction probability, the safety factor, and the reliability index. Based on the proposed method, the liquefaction probability corresponding to a safety factor can be easily calculated, and the influence of individual soil parameters on the liquefaction probability can be quantitatively evaluated.
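
A minimal first-order second-moment sketch relating the safety factor, the reliability index, and the liquefaction probability for lognormal CSR and CRR (the means and coefficients of variation are assumed values, not the paper's statistics):

```python
from math import log, sqrt
from scipy.stats import norm

# Load (CSR) and resistance (CRR) summarized by mean and coefficient of variation.
mu_csr, cov_csr = 0.22, 0.20
mu_crr, cov_crr = 0.28, 0.25

# Reliability index for the limit state ln(CRR/CSR) = 0 with lognormal variables.
s2 = log(1 + cov_csr**2) + log(1 + cov_crr**2)
med_ratio = (mu_crr / sqrt(1 + cov_crr**2)) / (mu_csr / sqrt(1 + cov_csr**2))
beta = log(med_ratio) / sqrt(s2)
p_liq = norm.cdf(-beta)

fs = mu_crr / mu_csr
print(f"FS = {fs:.2f}, beta = {beta:.2f}, P(liquefaction) = {p_liq:.1%}")
```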

Keywords: liquefaction, reliability analysis, Chalos area, civil and structural engineering

Procedia PDF Downloads 441
22591 Reliability Based Topology Optimization: An Efficient Method for Material Uncertainty

Authors: Mehdi Jalalpour, Mazdak Tootkaboni

Abstract:

We present a computationally efficient method for reliability-based topology optimization under material-property uncertainty, where the properties are assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved by estimating the response statistics with second-order stochastic perturbation, using these statistics to fit an appropriate distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is applied to the design of new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation.

Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization

Procedia PDF Downloads 572
22590 Reliability Analysis of Soil Liquefaction Based on Standard Penetration: A Case Study in Babol City

Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty

Abstract:

Numerous probabilistic and deterministic liquefaction evaluation procedures exist for judging whether liquefaction will occur or not. A review of these approaches reveals the need for a comprehensive procedure that accounts for the different sources of uncertainty in liquefaction evaluation; in fact, for the same set of input parameters, different methods provide different factors of safety and/or probabilities of liquefaction. To account for the different uncertainties, including both model and measurement uncertainty, reliability analysis is necessary. This paper uses information from the Standard Penetration Test (SPT) and compares several empirical approaches, such as Seed et al., the Japanese highway bridge approach to soil liquefaction, and the approach of the Overseas Coastal Area Development Institute of Japan (OCDI), together with a reliability method, to study the liquefaction potential of the soil of Babol city in northern Iran. Evaluating the liquefaction potential of Babol's soil is an important issue, since parts of the area contain sand, the region is seismic, and rising groundwater levels lead to soil saturation; therefore, one of the most important goals of this paper is to gain a sound understanding of the liquefaction potential and to find the most appropriate evaluation procedure so as to reduce the related damage.

Keywords: reliability analysis, liquefaction, Babol, civil, construction and geological engineering

Procedia PDF Downloads 480
22589 Development and Evaluation of a Psychological Adjustment and Adaptation Status Scale for Breast Cancer Survivors

Authors: Jing Chen, Jun-E Liu, Peng Yue

Abstract:

Objective: The objective of this study was to develop a psychological adjustment and adaptation status scale for breast cancer survivors and to examine its reliability and validity. Method: 37 breast cancer survivors were recruited for the qualitative research; a five-dimension theoretical framework and an item pool of 150 items were derived from the interview data. To evaluate and select items and establish preliminary validity and reliability for the original scale, suggestions from study group members, experts, and breast cancer survivors were collected, and statistical methods were applied step by step in a sample of 457 breast cancer survivors. Results: An original 24-item scale was developed. The five dimensions ("domestic affections", "interpersonal relationship", "attitude of life", "health awareness", and "self-control/self-efficacy") explained 58.053% of the total variance. Content validity was assessed by experts; the CVI was 0.92. Construct validity was examined in a sample of 264 breast cancer survivors, and the fit indices of the confirmatory factor analysis (CFA) showed a good fit for the five-dimension model. The criterion-related validity of the total scale against the PTGI was satisfactory (r=0.564, p<0.001). Internal consistency and test-retest reliability were also tested: Cronbach's alpha (0.911) showed good internal consistency, and the intraclass correlation coefficient (ICC=0.925, p<0.001) showed satisfactory test-retest reliability. Conclusions: The scale is brief and easy to understand, and is suitable for breast cancer patients whose physical strength and energy are limited.
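
Cronbach's alpha, reported as 0.911 for the scale, is computed from the item variances and the total-score variance; a sketch on simulated 24-item responses:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(457, 1))                          # one underlying trait
responses = latent + rng.normal(0.0, 1.5, size=(457, 24))   # 24 correlated items
print(f"alpha = {cronbach_alpha(responses):.3f}")            # cf. reported 0.911
```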

Keywords: breast cancer survivors, rehabilitation, psychological adaption and adjustment, development of scale

Procedia PDF Downloads 491
22588 Radial Distribution Network Reliability Improvement by Using Imperialist Competitive Algorithm

Authors: Azim Khodadadi, Sahar Sadaat Vakili, Ebrahim Babaei

Abstract:

This study presents a numerical method to optimize the failure rate and repair time of a typical radial distribution system. Failure rate and repair time are influential parameters in the customer- and energy-based reliability indices; decreasing them improves the indices and thereby boosts system stability. Penalty functions indirectly reflect the investment cost spent to improve these indices. Constraints on the customer- and energy-based indices, i.e., SAIFI, SAIDI, CAIDI, and AENS, were handled using a new method that reduces the number of control parameters of the optimization algorithm. The Imperialist Competitive Algorithm (ICA) was used as the main optimization technique, and particle swarm optimization (PSO), simulated annealing (SA), and differential evolution (DE) were applied for further investigation. These algorithms were implemented on a test system in MATLAB, and the results were compared. The optimized values of repair time and failure rate are much lower than the current values, which reduces the investment cost; moreover, ICA gives better answers than the other algorithms used.
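
The customer- and energy-based indices constrained in the optimization are simple weighted sums over load points; a sketch with illustrative feeder data:

```python
# Each load point: (customers N, failure rate lambda [1/yr],
#                   annual outage time U [h/yr], average load L [kW]).
load_points = [(250, 0.20, 0.8, 120.0),
               (100, 0.35, 1.6, 300.0),
               (400, 0.15, 0.5,  80.0)]

N = sum(n for n, lam, u, load in load_points)
saifi = sum(n * lam  for n, lam, u, load in load_points) / N  # interruptions/cust/yr
saidi = sum(n * u    for n, lam, u, load in load_points) / N  # hours/cust/yr
caidi = saidi / saifi                                         # hours/interruption
aens  = sum(load * u for n, lam, u, load in load_points) / N  # kWh/cust/yr
print(f"SAIFI={saifi:.3f}  SAIDI={saidi:.3f}  CAIDI={caidi:.3f}  AENS={aens:.2f}")
```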

Keywords: imperialist competitive algorithm, failure rate, repair time, radial distribution network

Procedia PDF Downloads 629
22587 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Authors: Morten Brøgger, Kim Wittchen

Abstract:

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as of the potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage this complexity, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting the building stock according to building type and building age is common, among other things because this information is often easily available. This segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with a loss of detail: thermal characteristics are aggregated, while other characteristics that could affect the energy efficiency of a building are disregarded. Thus, using a simplified representation of the building stock could come at the expense of model accuracy. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to accurately emulate the average energy demands of the corresponding buildings they were meant to represent. This is done for the buildings' energy demands as a whole as well as for relevant sub-demands, both evaluated in relation to the type and the age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as of the accuracy lost in specific parts of the calculation due to use of the archetype method.

Keywords: building stock energy modelling, energy-savings, archetype

Procedia PDF Downloads 130
22586 Diabetes Diagnosis Model Using Rough Set and K- Nearest Neighbor Classifier

Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho

Abstract:

Diabetes is a complex group of diseases with a variety of causes; it is a disorder of the body's metabolism in the digestion of carbohydrates. The application of machine learning in medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision-support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill, an artificial intelligence system that assists medical personnel and enhances their efficiency and accuracy in diagnosis would be an invaluable tool. This study proposes a diabetes diagnosis model using rough sets and the k-nearest neighbor classifier algorithm. The system consists of two modules, a feature extraction module and a predictor module: rough set theory is used to preprocess the attributes, while the k-nearest neighbor classifier is used to classify the given data. The dataset for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training, while the other half was used for testing the system. The proposed model achieved over 80% accuracy.
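
A sketch of the two-module pipeline in scikit-learn, with a generic feature selector standing in for the rough-set attribute reduction and synthetic records standing in for the UBTH data:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Stand-in for the hospital records: 8 clinical attributes, binary diabetes label.
X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           random_state=0)
# Half for training, half for testing, as in the study.
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.5, random_state=0)

# The paper reduces attributes with rough sets; a generic selector stands in here.
model = make_pipeline(SelectKBest(mutual_info_classif, k=5),
                      KNeighborsClassifier(n_neighbors=5))
print(f"test accuracy: {model.fit(Xtr, ytr).score(Xte, yte):.1%}")  # cf. >80%
```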

Keywords: classifier algorithm, diabetes, diagnostic model, machine learning

Procedia PDF Downloads 307
22585 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
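
For reference, the autoregressive construction that recursive CoKriging builds on expresses each fidelity level as a scaled version of the level below plus an independent Gaussian-process discrepancy (notation assumed here; the paper's gradient-enhanced recursive variant extends this):

```latex
Z_{\ell}(x) = \rho_{\ell-1}\, Z_{\ell-1}(x) + \delta_{\ell}(x),
\qquad \ell = 2, \dots, L
```

Here Z_1(x) is the Kriging model of the lowest-fidelity data, rho_{l-1} is a scaling factor, and delta_l(x) is a Gaussian process independent of Z_{l-1}(x) that models the discrepancy between consecutive fidelity levels.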

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 527
22584 Oil Producing Wells Using a Technique of Gas Lift on Prosper Software

Authors: Nikhil Yadav, Shubham Verma

Abstract:

Gas lift is a common technique used to optimize oil production in wells, and Prosper software is a powerful tool for modeling and optimizing gas lift systems. This review paper examines the effectiveness of Prosper software in optimizing gas lift systems in oil-producing wells. The literature review identified several studies that demonstrated the use of Prosper software to adjust the injection rate, injection depth, and valve characteristics in order to optimize gas lift system performance. The results showed that Prosper software can significantly improve production rates and reduce operating costs in oil-producing wells. However, the accuracy of the model depends on the accuracy of the input data, and the cost of Prosper software can be high. Therefore, further research is needed to improve the accuracy of the model and to evaluate the cost-effectiveness of using Prosper software for gas lift system optimization.

Keywords: gas lift, prosper software, injection rate, operating costs, oil-producing wells

Procedia PDF Downloads 48
22583 Predictive Analytics in Traffic Flow Management: Integrating Temporal Dynamics and Traffic Characteristics to Estimate Travel Time

Authors: Maria Ezziani, Rabie Zine, Amine Amar, Ilhame Kissani

Abstract:

This paper introduces a predictive model for urban transportation engineering, which is vital for efficient traffic management. Utilizing comprehensive datasets and advanced statistical techniques, the model accurately forecasts travel times by considering temporal variations and traffic dynamics. Machine learning algorithms, including regression trees and neural networks, are employed to capture sequential dependencies. Results indicate significant improvements in predictive accuracy, particularly during peak hours and holidays, with the incorporation of traffic flow and speed variables. Future enhancements may integrate weather conditions and traffic incidents. The model's applications range from adaptive traffic management systems to route optimization algorithms, facilitating congestion reduction and enhancing journey reliability. Overall, this research extends beyond travel time estimation, offering insights into broader transportation planning and policy-making realms, empowering stakeholders to optimize infrastructure utilization and improve network efficiency.
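
A sketch of the regression-tree branch of such a model: temporal features (hour, holiday) and traffic characteristics (flow, speed) predict travel time on synthetic data shaped like the problem described; the feature set and data generator are assumptions:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
hour    = rng.integers(0, 24, n)        # temporal feature
holiday = rng.integers(0, 2, n)         # calendar feature
flow    = rng.uniform(200, 2000, n)     # traffic flow (veh/h)
speed   = rng.uniform(20, 110, n)       # speed (km/h)
# synthetic travel time (min) on a 30 km stretch with peak-hour and flow effects
peak = ((hour >= 7) & (hour <= 9)) | ((hour >= 16) & (hour <= 19))
tt = 30 * (60 / speed) + 5 * peak + 0.004 * flow + rng.normal(0, 1.5, n)

X = np.column_stack([hour, holiday, flow, speed])
Xtr, Xte, ytr, yte = train_test_split(X, tt, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(Xtr, ytr)
print(f"MAE = {mean_absolute_error(yte, model.predict(Xte)):.2f} min")
```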

Keywords: predictive analytics, traffic flow, travel time estimation, urban transportation, machine learning, traffic management

Procedia PDF Downloads 33
22582 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm

Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu

Abstract:

Forecasting models have a great impact on prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on fuzzy-time-series-based methods to solve forecasting problems. A forecasting model's accuracy depends largely on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual forecasting model. Different hybrid models have combined fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model that handles both first-order and high-order fuzzy time series and uses particle swarm optimization to improve forecasting accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. First, an automatic clustering algorithm is applied to calculate appropriate intervals for the historical enrollments. Then particle swarm optimization and fuzzy time series are combined, which shows better forecasting accuracy than other existing forecasting models.
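
A minimal first-order fuzzy time series sketch in the style of Chen's method, using the classic University of Alabama enrollment figures shared by this literature; the interval boundaries here are uniform, whereas the paper tunes them with automatic clustering and PSO:

```python
import numpy as np

series = np.array([13055, 13563, 13867, 14696, 15460, 15311,
                   15603, 15861, 16807, 16919, 16388, 15433])

k = 5                                                # number of intervals
edges = np.linspace(series.min() - 100, series.max() + 100, k + 1)
mids = (edges[:-1] + edges[1:]) / 2
fuzz = np.clip(np.digitize(series, edges) - 1, 0, k - 1)   # fuzzify to A_i

# fuzzy logical relationship groups: A_i -> {A_j that followed A_i}
groups = {i: set() for i in range(k)}
for a, b in zip(fuzz[:-1], fuzz[1:]):
    groups[a].add(b)

def forecast(state):
    """Defuzzified forecast: average midpoint of the group of A_state."""
    rhs = groups[state] or {state}
    return np.mean([mids[j] for j in rhs])

print([round(forecast(s)) for s in fuzz[-3:]])   # next-step forecasts
```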

Keywords: fuzzy time series (fts), particle swarm optimization, clustering algorithm, hybrid forecasting model

Procedia PDF Downloads 220
22581 Reliability Analysis: A Case Study in Designing Power Distribution System of Tehran Oil Refinery

Authors: A. B. Arani, R. Shojaee

Abstract:

The electrical power distribution system is one of the vital infrastructures of an oil refinery, one that requires extensive study and planning before construction. In this paper, the power distribution reliability of Tehran Refinery's KHDS/GHDS unit is examined in order to demonstrate the importance of such studies and to evaluate the designed system. To this end, the authors chose and evaluated different configurations of the electrical power distribution, along with the existing configuration, with the aim of finding the configuration that best satisfies the conditions of minimum electrical system construction cost, minimum cost imposed by loss of load, and maximum power system reliability.

Keywords: power distribution system, oil refinery, reliability, investment cost, interruption cost

Procedia PDF Downloads 843