Search results for: weighted interval
1279 Physico-Chemical Properties of Silurian Hot Shale in Ahnet Basin, Algeria: Case Study Well ASS-1
Authors: Mohamed Mehdi Kadri
Abstract:
The prediction of the hot shale interval in the Silurian formation, in a well drilled vertically in the Ahnet basin, is carried out from logging data (resistivity, gamma ray, sonic), with the total organic carbon (TOC) calculated using the ∆ log R method. The aim of this paper is to present the physico-chemical properties of the hot shale using IR spectroscopy and gas chromatography-mass spectrometry analysis. This combination of measurement, evaluation, and characterization shows that the hot shale interval is located in the lower Silurian; the molecules adsorbed at the surface of the shale sheets are significantly different from petroleum hydrocarbons. This result is also supported by gas-liquid chromatography, which showed that the studied extract is a hydroxypropyl compound.
Keywords: physico-chemical analysis, reservoir characterization, sweet window evaluation, Silurian shale, Ahnet basin
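The ∆ log R calculation named above is commonly implemented with Passey's overlay formulation; a minimal sketch follows, in which the log readings, baselines, and maturity level (LOM) are illustrative values, not data from well ASS-1:

```python
import math

def delta_log_r(resistivity, sonic, r_baseline, sonic_baseline):
    # Passey's overlay: log-resistivity separation plus scaled sonic separation
    return math.log10(resistivity / r_baseline) + 0.02 * (sonic - sonic_baseline)

def toc_passey(dlr, lom):
    # TOC (wt%) from Delta log R and the level of organic metamorphism (LOM)
    return dlr * 10 ** (2.297 - 0.1688 * lom)

# Illustrative readings: resistivity in ohm.m, sonic transit time in us/ft
dlr = delta_log_r(resistivity=20.0, sonic=90.0, r_baseline=2.0, sonic_baseline=60.0)
print(round(toc_passey(dlr, lom=8.0), 2))
```

An organic-lean baseline zone fixes `r_baseline` and `sonic_baseline`; the separation of the two curves in organic-rich intervals then scales into TOC.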
Procedia PDF Downloads 99
1278 A Study on Exploring and Prioritizing Critical Risks in Construction Project Assessment
Authors: A. Swetha
Abstract:
This study aims to prioritize and explore critical risks in construction project assessment, employing the Weighted Average Index method and Principal Component Analysis (PCA). Through extensive literature review and expert interviews, project assessment risk factors were identified across Budget and Cost Management Risk, Schedule and Time Management Risk, Scope and Planning Risk, Safety and Regulatory Compliance Risk, Resource Management Risk, Communication and Stakeholder Management Risk, and Environmental and Sustainability Risk domains. A questionnaire was distributed to stakeholders involved in construction activities in Hyderabad, India, with 180 completed responses analyzed using the Weighted Average Index method to prioritize risk factors. Subsequently, PCA was used to understand relationships between these factors and uncover underlying patterns. Results highlighted dependencies on critical resources, inadequate risk assessment, cash flow constraints, and safety concerns as top priorities, while factors like currency exchange rate fluctuations and delayed information dissemination ranked lower but remained significant. These insights offer valuable guidance for stakeholders to mitigate risks effectively and enhance project outcomes. By adopting systematic risk assessment and management approaches, construction projects in Hyderabad and beyond can navigate challenges more efficiently, ensuring long-term viability and resilience.
Keywords: construction project assessment risk factors, risk prioritization, weighted average index, principal component analysis, project risk factors
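The Weighted Average Index step can be sketched in a few lines; the factor names, Likert response counts, and scale weights below are hypothetical, not the survey data from this study:

```python
# Weighted Average Index: Likert-scale response counts weighted by scale value,
# normalised by the number of responses (weights for a 5-point scale)
def weighted_average_index(counts, weights=(0.2, 0.4, 0.6, 0.8, 1.0)):
    total = sum(counts)
    return sum(w * c for w, c in zip(weights, counts)) / total

# Hypothetical counts of responses at levels 1..5 from 180 respondents
factors = {
    "dependency on critical resources": (5, 10, 25, 60, 80),
    "currency exchange rate fluctuation": (40, 50, 45, 30, 15),
}
ranked = sorted(factors, key=lambda f: weighted_average_index(factors[f]), reverse=True)
print(ranked[0])  # → dependency on critical resources
```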
Procedia PDF Downloads 40
1277 Effects of High-Intensity Interval Training versus Traditional Rehabilitation Exercises on Functional Outcomes in Patients with Knee Osteoarthritis: A Randomized Controlled Trial
Authors: Ahmed Torad
Abstract:
Background: Knee osteoarthritis (OA) is a prevalent musculoskeletal condition characterized by pain and functional impairment. While various rehabilitation approaches have been employed, the effectiveness of high-intensity interval training (HIIT) compared to traditional rehabilitation exercises remains unclear. Objective: This randomized controlled trial aimed to compare the effects of HIIT and traditional rehabilitation exercises on pain reduction, functional improvement, and quality of life in individuals with knee OA. Methods: A total of 120 participants diagnosed with knee OA were randomly allocated into two groups: the HIIT group (n=60) and the traditional rehabilitation group (n=60). The HIIT group participated in a 12-week supervised program consisting of high-intensity interval exercises, while the traditional rehabilitation group followed a conventional physiotherapy regimen. Outcome measures included visual analog scale (VAS) pain scores, Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), and the Short Form-36 Health Survey (SF-36) at baseline and after the intervention period. Results: Both groups showed significant improvements in pain scores, functional outcomes (WOMAC), and quality of life (SF-36) after 12 weeks of intervention. However, the HIIT group demonstrated superior pain reduction (p<0.001), functional improvement (p<0.001), and physical health-related quality of life (p=0.002) compared to the traditional rehabilitation group. No significant differences were observed in mental health-related quality of life between the two groups. Conclusion: High-intensity interval training appears to be a more effective rehabilitation approach than traditional exercises for individuals with knee osteoarthritis, resulting in greater pain reduction, improved function, and enhanced physical health-related quality of life. 
These findings suggest that HIIT may represent a promising intervention strategy for managing knee OA and enhancing the overall well-being of affected individuals.
Keywords: knee osteoarthritis, high-intensity interval training, traditional rehabilitation exercises, randomized controlled trial, pain reduction, functional improvement, quality of life
Procedia PDF Downloads 75
1276 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination
Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan
Abstract:
The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, wherein a set of assumptions and methods for a given data set will be made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR were presented so as not to get an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and Kernel Density Estimation (KDE), for this paper. The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used in computing the strength of evidence in handwriting forensics.
Keywords: confidence interval, handwriting, kernel density estimator, KDE, logistic regression, LoR, repeatability, reproducibility
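The KDE-based LR computation can be illustrated with a minimal sketch; the similarity scores and bandwidth below are invented for illustration and are not the handwriting data used in this study:

```python
import math

def gaussian_kde(samples, bandwidth):
    # Returns a density estimate built from one Gaussian kernel per sample
    def pdf(x):
        k = sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)
        return k / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    return pdf

# Hypothetical similarity scores under the prosecution (same writer)
# and defence (different writer) hypotheses
same_writer = [0.80, 0.85, 0.90, 0.88, 0.82]
diff_writer = [0.20, 0.35, 0.30, 0.25, 0.40]

p_hp = gaussian_kde(same_writer, bandwidth=0.05)
p_hd = gaussian_kde(diff_writer, bandwidth=0.05)

score = 0.84                    # score of the questioned handwriting
lr = p_hp(score) / p_hd(score)  # LR > 1 supports the prosecution hypothesis
print(lr > 1)
```

Resampling the training scores (e.g. by bootstrap) and recomputing `lr` would give the kind of confidence interval for the LR that the abstract discusses.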
Procedia PDF Downloads 124
1275 Time-Interval between Rectal Cancer Surgery and Reintervention for Anastomotic Leakage and the Effects of a Defunctioning Stoma: A Dutch Population-Based Study
Authors: Anne-Loes K. Warps, Rob A. E. M. Tollenaar, Pieter J. Tanis, Jan Willem T. Dekker
Abstract:
Anastomotic leakage after colorectal cancer surgery remains a severe complication. Early diagnosis and treatment are essential to prevent further adverse outcomes. In the literature, it has been suggested that earlier reintervention is associated with better survival, but anastomotic leakage can occur with a highly variable time interval to index surgery. This study aims to evaluate the time-interval between rectal cancer resection with primary anastomosis creation and reoperation, in relation to short-term outcomes, stratified for the use of a defunctioning stoma. Methods: Data of all primary rectal cancer patients that underwent elective resection with primary anastomosis during 2013-2019 were extracted from the Dutch ColoRectal Audit. Analyses were stratified for defunctioning stoma. Anastomotic leakage was defined as a defect of the intestinal wall or abscess at the site of the colorectal anastomosis for which a reintervention was required within 30 days. Primary outcomes were new stoma construction, mortality, ICU admission, prolonged hospital stay and readmission. The association between time to reoperation and outcome was evaluated in three ways: Per 2 days, before versus on or after postoperative day 5 and during primary versus readmission. Results: In total 10,772 rectal cancer patients underwent resection with primary anastomosis. A defunctioning stoma was made in 46.6% of patients. These patients had a lower anastomotic leakage rate (8.2% vs. 11.6%, p < 0.001) and less often underwent a reoperation (45.3% vs. 88.7%, p < 0.001). Early reoperations (< 5 days) had the highest complication and mortality rate. Thereafter the distribution of adverse outcomes was more spread over the 30-day postoperative period for patients with a defunctioning stoma. Median time-interval from primary resection to reoperation for defunctioning stoma patients was 7 days (IQR 4-14) versus 5 days (IQR 3-13 days) for no-defunctioning stoma patients. 
The mortality rates after primary resection and after reoperation were comparable (defunctioning vs. no-defunctioning stoma: 1.0% vs. 0.7%, P=0.106, and 5.0% vs. 2.3%, P=0.107, respectively). Conclusion: This study demonstrated that early reinterventions after anastomotic leakage are associated with worse outcomes (i.e., mortality). Perhaps the combination of a physiological dip in the cellular immune response and release of cytokines following surgery, as well as a release of endotoxins caused by the bacteremia originating from the leakage, leads to a more profound sepsis. Another explanation might be that early leaks are not contained to the pelvis, leading to a more profound sepsis requiring early reoperation. Leakage with or without a defunctioning stoma resulted in different types of reinterventions and a different time-interval between surgery and reoperation.
Keywords: rectal cancer surgery, defunctioning stoma, anastomotic leakage, time-interval to reoperation
Procedia PDF Downloads 138
1274 Effects of High Intensity Interval vs. Low Intensity Continuous Training on LXRβ, ABCG5 and ABCG8 Genes Expression in Male Wistar Rats
Authors: Sdiqeh Jalali, M. R. Khazdair
Abstract:
Liver X receptors (LXR) play an essential role in the regulation of cholesterol metabolism, and their activation increases ABCG5 and ABCG8 gene expression, improving cholesterol excretion from the body during reverse cholesterol transport (RCT). The aim of this study was to investigate the effects of high-intensity interval (HIT) and low-intensity continuous (LIT) training on the gene expression of these transporters after a high-fat diet in Wistar rats. Materials and Methods: Fifteen male Wistar rats were divided into 3 groups: control group (n = 5), HIT exercise group (n = 5) and LIT exercise group (n = 5). All groups were fed a high-fat diet for 13 weeks, and the HIT and LIT groups performed their specific training programs. The expression of LXRβ, ABCG5, and ABCG8 genes was measured after the training period. Findings: Data analysis showed significantly higher levels of LXRβ, ABCG5, and ABCG8 gene expression in the HIT and LIT groups compared to the control group (P ≤ 0.05). Conclusion: HIT and LIT training after a high-fat diet have beneficial effects on RCT that may help prevent heart attack. Also, HIT training may have a greater effect on cholesterol excretion during the reverse cholesterol transport mechanism than LIT.
Keywords: liver X receptor, atherosclerosis, interval training, endurance training
Procedia PDF Downloads 117
1273 Proposals of Exposure Limits for Infrasound From Wind Turbines
Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz
Abstract:
Human tolerance to infrasound is defined by the hearing threshold. Infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse health effects; recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the use of the G-weighting characteristic for the assessment of infrasound, and there is a strong correlation between G-weighted SPL and annoyance perception. The aim of this study was to propose exposure limits for infrasound from wind turbines. Only a few countries have set limits for infrasound; these limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people are able to perceive 10 Hz at around 90 dB, and it has also been found that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that older people (up to about 60 years of age) retain good hearing in the low-frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits for infrasound, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG. In contrast, infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can therefore be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as criterion curves for wind turbine infrasound.
Finally, two assessment methods and corresponding exposure limit values have been proposed for wind turbine infrasound, i.e. method I - based on G-weighted sound pressure level measurements and method II - based on frequency analysis in 1/3-octave bands in the frequency range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (Area A) and for noise-sensitive areas (Area B). In the case of Method I, infrasound limit values of 80 dBG (for areas A) and 75 dBG (for areas B) have been proposed, while in the case of Method II, criterion curves G80 and G75 have been chosen (for areas A and B, respectively).
Keywords: infrasound, exposure limit, hearing thresholds, wind turbines
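A Method I-style assessment reduces to an energy summation of G-corrected 1/3-octave band levels. A minimal sketch follows; both the band SPLs and the G-corrections below are illustrative placeholders (the actual G-weighting corrections are tabulated in ISO 7196):

```python
import math

def g_weighted_level(band_levels, g_corrections):
    # Energy summation of G-corrected 1/3-octave band levels, giving dBG
    return 10 * math.log10(sum(10 ** ((lp + g) / 10)
                               for lp, g in zip(band_levels, g_corrections)))

# 1/3-octave bands 4-20 Hz; SPLs (dB) and G-corrections (dB) are illustrative only
band_levels   = [70.0, 72.0, 74.0, 75.0, 73.0, 70.0, 66.0, 60.0]
g_corrections = [-15.0, -9.0, -4.0, 0.0, 4.0, 8.0, 9.0, 9.0]
print(round(g_weighted_level(band_levels, g_corrections), 1))
```

The resulting Lp,G would then be compared against the proposed 80 dBG (Area A) or 75 dBG (Area B) limit.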
Procedia PDF Downloads 83
1272 E-Hailing Taxi Industry Management Mode Innovation Based on the Credit Evaluation
Authors: Yuan-lin Liu, Ye Li, Tian Xia
Abstract:
There are some shortcomings in China's existing taxi management modes. This paper suggests establishing a third-party comprehensive information management platform and puts forward a credit-based evaluation model. Four indicators are used to evaluate drivers' credit: passengers' evaluation score, driving behavior evaluation, drivers' average bad-record count, and personal credit score. A weighted clustering method is used to achieve credit-level evaluation for taxi drivers. The taxi industry is then managed on the basis of credit level, with each driver's grade assigned according to their credit rating. Credit rating determines cost and income levels, market access, the validity period of the license, and the level of wages and bonuses, as well as violation fines. These methods make the credit evaluation effective. In conclusion, more credit data will help to set up a more accurate and detailed classification standard library.
Keywords: credit, mobile internet, e-hailing taxi, management mode, weighted cluster
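The weighted aggregation of the four credit indicators might be sketched as follows; the weights, normalization, and grade cut-offs are illustrative assumptions, not the model proposed by the authors:

```python
# Weighted aggregation of the four driver-credit indicators (weights are illustrative)
def driver_credit(passenger_score, driving_behavior, bad_records, personal_credit,
                  weights=(0.35, 0.30, 0.15, 0.20)):
    # All inputs normalised to a 0-100 scale; bad records count against the driver
    indicators = (passenger_score, driving_behavior, 100 - bad_records, personal_credit)
    return sum(w * v for w, v in zip(weights, indicators))

score = driver_credit(passenger_score=92, driving_behavior=85,
                      bad_records=10, personal_credit=78)
grade = "A" if score >= 85 else "B" if score >= 70 else "C"
print(round(score, 1), grade)
```

In the paper's scheme, clustering over such composite scores would define the credit levels that gate market access, wages, and fines.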
Procedia PDF Downloads 325
1271 The Moment of the Optimal Average Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables
Authors: Edokpa Idemudia Waziri, Salisu S. Umar
Abstract:
Hotelling's T² is a well-known statistic for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on T² have been widely used in statistical process control for monitoring a multivariate process. Although it is a powerful tool, the T² statistic is deficient when the shift to be detected in the mean vector of a multivariate process is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback of Hotelling's T² statistic. In this paper, the probability distribution of the Average Run Length (ARL) of the MEWMA control chart, when the quality characteristics exhibit substantial cross-correlation and when the process is in-control and out-of-control, was derived using the Markov chain algorithm. The probability functions and the moments of the run-length distribution were also derived, and they were consistent with some existing results for the in-control and out-of-control situations. By simulation, the procedure identified a class of ARLs for the MEWMA chart when the process is in-control and out-of-control. From our study, it was observed that the MEWMA scheme is quite adequate for detecting a small shift and a good way to improve the quality of goods and services in a multivariate situation. It was also observed that as the in-control average run length ARL0 or the number of variables (p) increases, the optimum value ARLopt increases asymptotically, and as the magnitude of the shift σ increases, the optimal ARLopt decreases. Finally, we use an example from the literature to illustrate our method and demonstrate its efficiency.
Keywords: average run length, Markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter
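The MEWMA statistic on which the ARL analysis is built can be sketched as follows; the smoothing parameter λ, the covariance matrix, and the shift size are illustrative, and the paper's Markov-chain ARL derivation is not reproduced here:

```python
import numpy as np

def mewma_statistics(X, sigma, lam=0.1):
    # Z_i = lam*X_i + (1-lam)*Z_{i-1};  T2_i = Z_i' Sigma_Z^{-1} Z_i,
    # with Sigma_Z = lam/(2-lam) * (1 - (1-lam)^(2i)) * Sigma
    z = np.zeros(X.shape[1])
    t2 = []
    for i, x in enumerate(X, start=1):
        z = lam * x + (1 - lam) * z
        sigma_z = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i)) * sigma
        t2.append(float(z @ np.linalg.inv(sigma_z) @ z))
    return t2

rng = np.random.default_rng(0)
sigma = np.array([[1.0, 0.5], [0.5, 1.0]])  # equally correlated variables
X = rng.multivariate_normal([0, 0], sigma, size=20)
X[10:] += 1.0                               # small sustained mean shift
t2 = mewma_statistics(X, sigma)
print(round(t2[-1], 2))
```

A signal is raised when T²ᵢ exceeds a control limit h chosen (e.g. via the Markov-chain method) to give the desired in-control ARL₀.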
Procedia PDF Downloads 422
1270 Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator
Authors: Wedad Albalawi
Abstract:
Mathematical inequalities are at the core of mathematical analysis and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood, and Polya were the first significant collection of this kind; that work presented fundamental ideas, results, and techniques, and it has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated through operators: in 1989, weighted Hardy inequalities were obtained for integration operators. Weighted estimates were then obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation; these were improved in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have since been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some inequalities involving Copson and Hardy inequalities have appeared on time scales, yielding new special versions of them. A time scale is defined as a closed subset of the real numbers. Inequalities in the time-scale setting have received a lot of attention and form a major field in both pure and applied mathematics.
There are many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on double integrals to obtain new time-scale inequalities of Copson type driven by the Steklov operator. They will be applied in the solution of the Cauchy problem for the wave equation. The proof is carried out by introducing restrictions on the operator in several cases. In addition, the obtained inequalities are derived using concepts from the time-scale calculus, such as Fubini's theorem and Hölder's inequality.
Keywords: time scales, Hardy inequality, Copson inequality, Steklov operator
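For reference, the classical Hardy integral inequality that the time-scale versions generalize can be stated as:

```latex
% Classical Hardy integral inequality (p > 1), the prototype extended on time scales:
\[
\int_0^{\infty} \left( \frac{1}{x} \int_0^{x} f(t)\, dt \right)^{p} dx
\;\le\; \left( \frac{p}{p-1} \right)^{p} \int_0^{\infty} f^{p}(x)\, dx ,
\qquad f \ge 0 ,
\]
% with the constant (p/(p-1))^p known to be sharp.
```

Time-scale analogues replace the integrals by delta (or nabla) integrals over a closed subset of the reals, recovering the integral and the discrete series versions as the two extreme cases.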
Procedia PDF Downloads 76
1269 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach
Authors: Helen L. Hein, Joachim Schwarte
Abstract:
As a pillar of sustainable development, ecology has become an important milestone in the research community, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are attributable to the energy used for building heating purposes. Sustainable construction materials for insulating purposes are therefore essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. In this context, the WALL-ACE project aims to develop an aerogel-based thermal insulating plaster that would achieve minor thermal conductivities. But as in the early development phases a lot of information is still missing or not yet accessible, the ecological performance of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To be able to predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective market, these deviations have to be considered. Therefore, a classification method is presented in this study that allows comparing the ecological performance of novel products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that computes with lower and upper bounds so as to consider all possible values in the absence of precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited.
Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
Keywords: aerogel-based, insulating material, early development phase, interval arithmetic
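The interval arithmetic underlying the calculation approach can be sketched in a few lines; the impact factors and masses below are hypothetical, not WALL-ACE data:

```python
# Minimal interval arithmetic: every uncertain quantity is carried as [lower, upper]
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        # The product bounds are the extremes over all endpoint combinations
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Hypothetical GWP data: uncertain emission factor (kg CO2e/kg) x uncertain mass (kg)
aerogel = Interval(2.0, 6.0) * Interval(10.0, 12.0)
binder = Interval(0.5, 0.8) * Interval(40.0, 45.0)
print(aerogel + binder)   # → [40.0, 108.0]
```

Wide input intervals propagate multiplicatively, which is exactly why the paper finds the resulting impact bounds so wide in early development phases.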
Procedia PDF Downloads 140
1268 Optimization of Smart Beta Allocation by Momentum Exposure
Authors: J. B. Frisch, D. Evandiloff, P. Martin, N. Ouizille, F. Pires
Abstract:
Smart Beta strategies intend to be an asset-management revolution with respect to classical cap-weighted indices. Indeed, these strategies allow better control of portfolio risk factors and an optimized asset allocation by taking into account specific risks or the wish to generate alpha by outperforming indices called 'Beta'. Among the many strategies used independently, this paper focuses on four of them: the Minimum Variance Portfolio, Equal Risk Contribution Portfolio, Maximum Diversification Portfolio, and Equal-Weighted Portfolio. Their efficiency has been proven under constraints like momentum or market phenomena, suggesting a reconsideration of cap-weighting. To further increase strategy return efficiency, it is proposed here to compare their strengths and weaknesses inside time intervals corresponding to specific identifiable market phases, in order to define adapted strategies for pre-specified situations. Results are presented as performance curves from different combinations compared to a benchmark. If a combination outperforms the applicable benchmark in well-defined actual market conditions, it will be preferred. It is mainly shown that such investment 'rules', based on both historical data and the evolution of Smart Beta strategies, and implemented according to available specific market data, provide very interesting optimal results with higher return performance and lower risk. Such combinations have not been fully exploited yet and justify the present approach, aimed at identifying the relevant elements characterizing them.
Keywords: smart beta, minimum variance portfolio, equal risk contribution portfolio, maximum diversification portfolio, equal weighted portfolio, combinations
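As one example of the four strategies, the (unconstrained, fully invested) Minimum Variance Portfolio has a closed-form weight vector; the covariance matrix below is illustrative:

```python
import numpy as np

def minimum_variance_weights(cov):
    # Fully invested minimum variance: w proportional to Sigma^{-1} 1, scaled to sum to 1
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

def equal_weights(n):
    return np.full(n, 1.0 / n)

cov = np.array([[0.04, 0.01, 0.00],   # hypothetical annualised covariance matrix
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w_mv = minimum_variance_weights(cov)
var_mv = float(w_mv @ cov @ w_mv)
var_ew = float(equal_weights(3) @ cov @ equal_weights(3))
print(w_mv.round(3), var_mv <= var_ew)  # MV variance never exceeds equal-weight variance
```

Regime-dependent combination, as studied in the paper, would blend such weight vectors according to the identified market phase.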
Procedia PDF Downloads 340
1267 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City
Authors: Liang Weichien
Abstract:
Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, with an average of 5.7 floods per year, and should seek to strengthen coherence and consensus on how cities can plan for floods and climate change. Therefore, this study aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity based on the definition of vulnerability of the Intergovernmental Panel on Climate Change, and were weighted using Principal Component Analysis. However, previous research has been based on the assumption that the composition and influence of the indicators are the same in different areas, disregarding spatial correlation, which might result in inaccurate explanations of local vulnerability. This study used Geographically Weighted Principal Component Analysis, adding a geographic weighting matrix, to capture the different main flood impact characteristics in different areas. The Cross-Validation method and the Akaike Information Criterion were used to decide the bandwidth, with a Gaussian pattern as the bandwidth weighting scheme. The ultimate outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.
Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation
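The global PCA-based weighting of indicators (the non-geographically-weighted step) can be sketched as follows, with randomly generated stand-in data in place of the Taipei indicators:

```python
import numpy as np

def pca_weights(X):
    # Standardise the indicators, then weight each one by the absolute loading
    # it receives on the first principal component, normalised to sum to 1
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    corr = np.corrcoef(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    first = np.abs(eigvecs[:, -1])       # eigh returns eigenvalues in ascending order
    return first / first.sum()

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))            # stand-in: 100 study units x 4 indicators
w = pca_weights(X)
print(w.sum())
```

GWPCA repeats this decomposition at every location with a Gaussian distance-decay weighting matrix, so the weights vary across space instead of being fixed city-wide.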
Procedia PDF Downloads 286
1266 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis
Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen
Abstract:
Hepatitis is one of the most common and dangerous diseases that affects humankind, and it exposes millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an 'indirect approach' is used for fuzzy system modeling by implementing the exponential compactness and separation index for determining the number of rules in the fuzzy clustering approach. We first proposed a Type-I fuzzy system that had an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision; thus, the imprecise knowledge was managed by using interval Type-II fuzzy logic. The results that were obtained show that interval Type-II fuzzy has the ability to diagnose hepatitis with an average accuracy of 93.94%. The classification accuracy obtained is the highest reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system has a better performance than Type-I and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.
Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection
Procedia PDF Downloads 306
1265 Electrical Cardiac Remodeling in Elite Athletes: A Comparative Study between Triathletes and Cyclists
Authors: Lingxia Li, Frédéric Schnell, Thibault Lachard, Anne-Charlotte Dupont, Shuzhe Ding, Solène Le Douairon Lahaye
Abstract:
Background: Repetitive participation in triathlon training results in significant myocardial changes; however, whether the cardiac remodeling in triathletes is related to the specificities of the sport (which consists of three disciplines) remains an open question. Methods: Elite triathletes and cyclists registered on the French ministerial lists of high-level athletes were involved. Basic information and routine electrocardiogram records were obtained, and electrocardiograms were evaluated according to clinical criteria. Results: Of the 105 athletes included in the study, 42 were from short-distance triathlon (40%) and 63 from road cycling (60%). The average age was 22.1±4.2 years. The P wave amplitude was significantly lower in triathletes than in cyclists (p=0.005), and no statistically significant difference was found in heart rate, RR interval, PR or PQ interval, QRS complex, QRS axis, QT interval, or QTc (p>0.05). All the measured parameters were within normal ranges. The most common electrical manifestations were early repolarization (60.95%) and incomplete right bundle branch block (43.81%); there was no statistical difference between the groups (p>0.05). Conclusions: Prolonged intensive endurance exercise training induces physiological cardiac remodeling in both triathletes and cyclists. The most common electrocardiogram manifestations were early repolarization and incomplete right bundle branch block.
Keywords: cardiac screening, electrocardiogram, triathlon, cycling, elite athletes
Procedia PDF Downloads 4
1264 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status
Authors: Rosa Figueroa, Christopher Flores
Abstract:
Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach. The score selected to represent the occurrence of the tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams to extract features for classification. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction.
These results were confirmed using the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best result was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm
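For readers unfamiliar with the method, the core of Smith-Waterman local alignment over word tokens can be sketched in a few lines; the scoring parameters here are illustrative assumptions, not the paper's settings:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Best local-alignment score between token sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores are floored at zero
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Documents sharing a token run score high; unrelated text scores 0.
print(smith_waterman("obese patient history".split(),
                     "patient history of obesity".split()))   # -> 4
print(smith_waterman("alpha beta".split(), "gamma delta".split()))  # -> 0
```

High pairwise alignment scores between a new document and documents of a known class are what make SW usable as a similarity-based feature, in place of counting shared N-grams directly.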
Procedia PDF Downloads 297
1263 Multi-Criteria Goal Programming Model for Sustainable Development of India
Authors: Irfan Ali, Srikant Gupta, Aquil Ahmed
Abstract:
Every country needs sustainable development (SD) for its economic growth, achieved by forming suitable policies and initiating programs for the development of the country's different sectors. This paper comprises the modeling and optimization of different sectors of India, which together form a multi-criterion model. We developed a fractional goal programming (FGP) model that helps provide an efficient allocation of resources while simultaneously achieving sustainability goals for gross domestic product (GDP), electricity consumption (EC) and greenhouse gas (GHG) emissions by the year 2030. A weighted version of the FGP model is also presented to obtain varying solutions according to the priorities set by the policy maker for future goals of GDP growth, EC, and GHG emissions. The presented models provide useful insight to decision makers for implementing strategies in different sectors.Keywords: sustainable and economic development, multi-objective fractional programming, fuzzy goal programming, weighted fuzzy goal programming
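In general form, the weighted goal programming step minimizes a weighted sum of deviations from the aspiration levels of the fractional goals; a generic sketch (not the paper's exact India model) is:

```latex
\min_{x,\, d^{\pm}} \;
\sum_{k \in \{\mathrm{GDP},\,\mathrm{EC},\,\mathrm{GHG}\}} w_k \left( d_k^{+} + d_k^{-} \right)
\quad \text{subject to} \quad
\frac{f_k(x)}{g_k(x)} + d_k^{-} - d_k^{+} = t_k,
\qquad d_k^{+},\, d_k^{-} \ge 0, \quad x \in X,
```

where $f_k/g_k$ are the fractional objectives, $t_k$ the aspiration levels set for 2030, and $w_k$ the priorities chosen by the policy maker; raising $w_k$ pushes the solution toward satisfying goal $k$ first.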
Procedia PDF Downloads 223
1262 The Possible Role of the Endoneurial Fibroblast-like Cells in Resolution of the Endoneurial Edema Following Nerve Crush Injury
Authors: Faris M. Altaf, Abdullah M. Elkeshy
Abstract:
Forty-two male albino rats aged between 30 and 40 days (weighing 200 g to 250 g) were used in the present study. The left sural nerves of 36 rats were subjected to crush injury and examined at intervals of 1 to 6 weeks, using 6 animals at each interval. The right and left sural nerves of the remaining 6 rats were used as controls. Two weeks after the crush injury, the endoneurium showed channel-like spaces lined by fibroblast-like cells and collagen bundles. These channels contained degenerated myelin and were connected with the perivascular and subperineurial spaces. Some of the flattened fibroblast-like cells were arranged in several layers in the subperineurial and perivascular spaces, forming barrier-like cellular sheets that localized the endoneurial edema in these spaces. Fibroblast-like cells also wrapped the regenerating nerve fibers with their branching cytoplasmic processes. By the end of the third week, the flattened fibroblasts formed nearly continuous sheets in the subperineurial and perivascular spaces. Macrophages were frequently noticed between these barrier-like cellular sheets and in the subperineurial and perivascular spaces. Conclusion: the endoneurial fibroblast-like cells form barrier-like cellular sheets that localize the endoneurial edema in the subperineurial and perivascular spaces and also create the endoneurial channel-like spaces containing degenerated myelin and endoneurial edema, helping the resolution of such edema.Keywords: sural nerve, endoneurial fibroblast-like cells, endoneurial edema, barrier-like and channel-like spaces
Procedia PDF Downloads 343
1261 Using Interval Type-2 Fuzzy Controller for Diabetes Mellitus
Authors: Nafiseh Mollaei, Reihaneh Kardehi Moghaddam
Abstract:
In diabetes mellitus, controlling insulin is very difficult. This illness is an incurable disease affecting millions of people worldwide. Glucose is a sugar which provides energy to the cells; insulin is a hormone which supports the absorption of glucose. A fuzzy control strategy is attractive for glucose control because it mimics the first- and second-phase responses that the pancreatic beta cells use to control glucose. We propose two control algorithms for insulin infusion: a type-1 fuzzy controller and an interval type-2 fuzzy method. The closed-loop system has been simulated for different patients with different parameters, in the presence of a food-intake disturbance, and it has been shown that blood glucose concentrations reach a normoglycemic level of 110 mg/dl in a reasonable amount of time. This paper deals with type 1 diabetes as a nonlinear model, which has been simulated in the MATLAB-SIMULINK environment. The novel model, termed the augmented minimal model, is used in the simulations. There are some uncertainties in this model due to factors such as blood glucose, daily meals, or sudden stress. To eliminate the effects of uncertainty, different control methods may be utilized. In this article, fuzzy controller performance was assessed in terms of its ability to track a normoglycemic set point (110 mg/dl) in response to a [0-10] g meal disturbance. Finally, the development reported in this paper is intended to simplify insulin delivery and so increase the patient's quality of life.Keywords: interval type-2, fuzzy controller, minimal augmented model, uncertainty
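For intuition about the type-1 half of such a scheme, a zero-order Sugeno-style fuzzy controller mapping the glucose error to an insulin infusion rate might be sketched as follows; the membership breakpoints and rule outputs are invented for illustration and are not clinical values or the authors' design:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def insulin_rate(glucose, target=110.0):
    """Zero-order Sugeno fuzzy controller: glucose error -> infusion
    rate (U/h). Breakpoints and rule consequents are hypothetical."""
    e = glucose - target
    w_neg = tri(e, -100.0, -50.0, 0.0)   # hypoglycemic error
    w_zero = tri(e, -25.0, 0.0, 25.0)    # near the set point
    w_pos = tri(e, 0.0, 50.0, 100.0)     # hyperglycemic error
    rates = (0.0, 0.5, 3.0)              # rule consequents (U/h)
    den = w_neg + w_zero + w_pos
    num = w_neg * rates[0] + w_zero * rates[1] + w_pos * rates[2]
    # Weighted average of fired rules (Sugeno defuzzification)
    return num / den if den else rates[1]

print(insulin_rate(110))   # at the 110 mg/dl set point -> basal 0.5
print(insulin_rate(160))   # strong hyperglycemia -> maximum rate 3.0
```

An interval type-2 controller would replace each crisp membership value with an interval (a footprint of uncertainty), which is what gives the method its robustness to the model uncertainties mentioned above.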
Procedia PDF Downloads 428
1260 Optimal Design for SARMA(P,Q)L Process of EWMA Control Chart
Authors: Yupaporn Areepong
Abstract:
The main goal of this paper is to study statistical process control (SPC) with the exponentially weighted moving average (EWMA) control chart when observations are serially correlated. The characteristic of a control chart is the average run length (ARL), the average number of samples taken before an action signal is given. Ideally, the ARL of an in-control process should be sufficiently large (denoted ARL0), while it should be small when the process is out of control, the so-called average delay time (ARL1), or mean time to a true alarm. We find explicit formulas of the ARL for the EWMA control chart for seasonal autoregressive moving average (SARMA) processes with exponential white noise. The ARL results obtained from the explicit formulas and from the integral equation are in good agreement. In particular, these formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the smoothing parameter (λ) and the width of the control limit (H), for designing an EWMA chart with minimum ARL1.Keywords: average run length, optimal parameters, exponentially weighted moving average (EWMA), control chart
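For intuition, an EWMA chart and its empirical run length can be sketched as follows; the parameters are illustrative, and in practice the ARL would be obtained from the paper's explicit formulas or estimated by simulation rather than from single runs:

```python
import math

def ewma_chart(x, lam=0.2, L=3.0, mu0=0.0, sigma=1.0):
    """Run an EWMA chart over samples x.

    Returns (ewma_statistics, run_length): run_length is the 1-based
    index of the first out-of-control signal, or None if none occurs.
    lam is the smoothing parameter, L the control-limit width."""
    z, stats = mu0, []
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1 - lam) * z
        stats.append(z)
        # Exact time-varying standard error of the EWMA statistic
        se = sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z - mu0) > L * se:
            return stats, t
    return stats, None

print(ewma_chart([2.0] * 10)[1])   # a sustained 2-sigma shift signals at sample 3
print(ewma_chart([0.0] * 30)[1])   # in-control data: no signal (None)
```

Averaging such run lengths over many simulated in-control (or shifted) series gives a Monte Carlo estimate of ARL0 (or ARL1), which is exactly the quantity the explicit formulas in the abstract evaluate directly.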
Procedia PDF Downloads 560
1259 Detection Method of Federated Learning Backdoor Based on Weighted K-Medoids
Authors: Xun Li, Haojie Wang
Abstract:
Federated learning is a distributed training mode with centralized aggregation, which is of great value for protecting user privacy. To address the problem that models are vulnerable to backdoor attacks in federated learning, a backdoor attack detection method based on a weighted k-medoids algorithm is proposed. First, this paper collates the update parameters of the clients to construct a vector group, then uses the principal component analysis (PCA) algorithm to extract the corresponding feature information from the vector group, and finally uses the improved k-medoids clustering algorithm to separate normal from backdoor update parameters. In the simulation experiment, a backdoor is implanted in the federated learning model through the model replacement attack method, and the update parameters from the attacker are effectively detected and removed by the proposed defense method.Keywords: federated learning, backdoor attack, PCA, k-medoids, backdoor defense
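A rough sketch of the detection pipeline (PCA on client update vectors, then k-medoids, flagging the minority cluster) might look like the following; it uses plain NumPy, a simple farthest-point-seeded k-medoids rather than the paper's improved weighted variant, and synthetic client updates:

```python
import numpy as np

def pca_project(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def k_medoids(X, k, iters=50):
    """Plain k-medoids with greedy farthest-point seeding."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = [0]
    while len(medoids) < k:                       # seed with far-apart points
        medoids.append(int(D[medoids].min(axis=0).argmax()))
    medoids = np.array(medoids)
    for _ in range(iters):
        labels = D[:, medoids].argmin(axis=1)     # assign to nearest medoid
        new = medoids.copy()
        for j in range(k):                        # re-pick each medoid
            idx = np.flatnonzero(labels == j)
            if idx.size:
                new[j] = idx[D[np.ix_(idx, idx)].sum(axis=1).argmin()]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels, medoids

# Toy federated round: 18 benign client updates near 0, 2 poisoned near 5.
rng = np.random.default_rng(1)
updates = np.vstack([rng.normal(0, 0.1, (18, 10)),
                     rng.normal(5, 0.1, (2, 10))])
labels, _ = k_medoids(pca_project(updates, 2), 2)
suspect = int(np.bincount(labels).argmin())       # minority cluster = suspect
print(np.flatnonzero(labels == suspect))          # -> [18 19]
```

Treating the smaller cluster as the backdoor candidates and excluding those updates from aggregation is the defense step the abstract describes.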
Procedia PDF Downloads 114
1258 Effect of Serum Electrolytes on a QTc Interval and Mortality in Patients admitted to Coronary Care Unit
Authors: Thoetchai Peeraphatdit, Peter A. Brady, Suraj Kapa, Samuel J. Asirvatham, Niyada Naksuk
Abstract:
Background: Serum electrolyte abnormalities are a common cause of an acquired prolonged QT syndrome, especially in the coronary care unit (CCU) setting. Optimal electrolyte ranges among CCU patients have not been sufficiently investigated. Methods: We identified 8,498 consecutive patients admitted to the CCU at Mayo Clinic, Rochester, USA, from 2004 through 2013. The association between first serum electrolytes and baseline corrected QT intervals (QTc), as well as in-hospital mortality, was tested using multivariate linear regression and logistic regression, respectively. Serum potassium 4.0- < 4.5 mEq/L, ionized calcium (iCa) 4.6-4.8 mg/dL, and magnesium 2.0- < 2.2 mg/dL were used as the reference levels. Results: There was a modest level-dependent relationship between hypokalemia ( < 4.0 mEq/L), hypocalcemia ( < 4.4 mg/dL), and a prolonged QTc interval; serum magnesium did not affect the QTc interval. The association between serum electrolytes and in-hospital mortality included a U-shaped relationship for serum potassium (adjusted odds ratio (OR) 1.53 and OR 1.91 for serum potassium 4.5- < 5.0 and ≥ 5.0 mEq/L, respectively) and an inverted J-shaped relationship for iCa (adjusted OR 2.79 and OR 2.03 for calcium < 4.4 and 4.4- < 4.6 mg/dL, respectively). For serum magnesium, mortality was greater only among patients with levels ≥ 2.4 mg/dL (adjusted OR 1.40), compared with the reference level. Findings were similar in sensitivity analyses examining the association between mean serum electrolytes and mean QTc intervals, as well as in-hospital mortality. Conclusions: Serum potassium 4.0- < 4.5 mEq/L, iCa ≥ 4.6 mg/dL, and magnesium < 2.4 mg/dL had a neutral effect on QTc intervals and were associated with the lowest in-hospital mortality among CCU patients.Keywords: calcium, electrocardiography, long-QT syndrome, magnesium, mortality, potassium
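The QTc values analyzed above come from correcting the measured QT for heart rate. The abstract does not state which correction formula the study used; as an illustration only, the widely used Bazett formula can be sketched as follows (the choice of formula is our assumption):

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett heart-rate correction: QTc = QT / sqrt(RR), RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_s)

print(qtc_bazett(400, 60))          # RR = 1 s, so QTc equals QT -> 400.0
print(round(qtc_bazett(400, 90)))   # faster rate corrects upward -> 490
```

The correction matters here because raw QT shortens at faster heart rates; without it, electrolyte effects on repolarization would be confounded by each patient's rate.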
Procedia PDF Downloads 394
1257 Investigating Climate Change Trend Based on Data Simulation and IPCC Scenario during 2010-2030 AD: Case Study of Fars Province
Authors: Leila Rashidian, Abbas Ebrahimi
Abstract:
The development of industrial activities, increased fossil fuel consumption, vehicle traffic, destruction of forests and grasslands, changes in land use, and population growth have increased the amount of greenhouse gases, especially CO2, in the atmosphere in recent decades. This has led to global warming and climate change. In the present paper, we investigate the trend of climate change by data simulation over the 2010-2030 interval in Fars province. In this research, daily climatic parameters such as maximum and minimum temperature, precipitation, and number of sunny hours were used for the 1977-2008 interval for the synoptic stations of Shiraz and Abadeh, and for 1995-2008 for the Lar station, together with the output of the HADCM3 model for the 2010-2030 interval under the A2 emission scenario. The results of the model show that the average temperature will increase by about 1 degree centigrade and the amount of precipitation will increase by 23.9% compared with the observational data. In conclusion, given the temperature increase in this province, the amount of precipitation falling as snow will be reduced and precipitation will more often occur as rain. This 1-degree centigrade increase during the growing season will reduce wheat production by 6 to 10% by shortening the growing period.Keywords: climate change, Lars WG, HADCM3, Gillan province, climatic parameters, A2 scenario
Procedia PDF Downloads 215
1256 Community-Based Reference Interval of Selected Clinical Chemistry Parameters Among Apparently Healthy Adolescents in Mekelle City, Tigrai, Northern Ethiopia
Authors: Getachew Belay Kassahun
Abstract:
Background: Locally established clinical laboratory reference intervals (RIs) are required to interpret laboratory test results for screening, diagnosis, and prognosis. The objective of this study was to establish reference intervals for clinical chemistry parameters among apparently healthy adolescents aged between 12 and 17 years in Mekelle, Tigrai, in the northern part of Ethiopia. Methods: A community-based cross-sectional study was conducted from December 2018 to March 2019 in Mekelle City among 172 males and 172 females selected by a multi-stage sampling technique. Blood samples were tested for fasting blood sugar (FBS), alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), creatinine, urea, total protein, albumin (ALB), and direct and total bilirubin (BIL.D and BIL.T) using a BioSystems A25 clinical chemistry analyzer. Results were analyzed using SPSS version 23 software, based on the Clinical and Laboratory Standards Institute (CLSI)/International Federation of Clinical Chemistry (IFCC) C28-A3 guideline, which defines the reference interval as the central 95% range between the 2.5th and 97.5th percentiles. The Mann-Whitney U test, descriptive statistics, and box-and-whisker plots were the statistical tools used for analysis. Results: This study observed statistically significant differences between males and females in the ALP, ALT, AST, urea, and creatinine reference intervals. The established reference intervals for males and females, respectively, were: ALP (U/L) 79.48-492.12 versus 63.56-253.34, ALT (U/L) 4.54-23.69 versus 5.1-20.03, AST (U/L) 15.7-39.1 versus 13.3-28.5, urea (mg/dL) 9.33-24.99 versus 7.43-23.11, and creatinine (mg/dL) 0.393-0.957 versus 0.301-0.846. The combined RIs were: total protein (g/dL) 6.08-7.85, ALB (g/dL) 4.42-5.46, FBS (mg/dL) 65-110, BIL.D (mg/dL) 0.033-0.532, and BIL.T (mg/dL) 0.106-0.812. Conclusions: The results showed marked differences between the sexes and from manufacturer-derived values for selected clinical chemistry parameters.
Thus, the use of age- and sex-specific locally established reference intervals for clinical chemistry parameters is recommended.Keywords: reference interval, adolescent, clinical chemistry, Ethiopia
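The CLSI C28-A3 nonparametric estimate described above reduces to taking the central 95% of the sorted healthy-reference values. A minimal sketch, with made-up values rather than the study's data:

```python
import numpy as np

def reference_interval(values):
    """Nonparametric 95% central reference interval (CLSI C28-A3 style):
    the 2.5th and 97.5th percentiles of a healthy-reference sample."""
    v = np.asarray(list(values), dtype=float)
    low, high = np.percentile(v, [2.5, 97.5])
    return float(low), float(high)

# Illustrative values only (1..100), not the study's measurements.
low, high = reference_interval(range(1, 101))
print(round(low, 3), round(high, 3))   # -> 3.475 97.525
```

With the study's 344 participants per analysis, the 2.5th and 97.5th percentiles are well inside the sample, which is why the guideline's nonparametric approach is workable at this sample size.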
Procedia PDF Downloads 79
1255 A Preliminary Study of the Effects of Abiotic Environmental Variables on Early Diptera Carrion Colonizers in Algiers, Algeria
Authors: M. Taleb, G. Tail, F. Z. Kara, B. Djedouani, T. Moussa
Abstract:
Necrophagous insects usually colonize cadavers within a short time after death. However, they are influenced by weather conditions, and their distribution and activity vary according to different time scales, which can affect post-mortem interval (PMI) estimation. As no data have been published on the necrophagous insects visiting corpses in Algeria, two field surveys were conducted in July 2012 and March 2013 at the National Institute for Criminalistics and Criminology (INCC) using rabbit carcasses (Oryctolagus cuniculus L.). The trials were designed to identify the necrophagous Diptera fauna of Algiers, Algeria and to examine their variation according to environmental variables. Four hundred and eighteen adult Diptera belonging to five families were captured during this study. The species previously identified on human corpses in different regions of Algeria were also observed on the rabbit carcasses. Although seasonal variations of the species were observed, their abundance did not vary significantly between the two seasons. In addition to seasonal effects, ambient temperature, wind speed, and precipitation affected the number of trapped flies. These conclusions highlight the necessity of considering environmental factors at a scene to estimate the post-mortem interval accurately. It is hoped that these findings provide basic information regarding the necrophagous Diptera fauna of Algeria.Keywords: forensic entomology, necrophagous diptera, post-mortem interval, abiotic factors, Algeria
Procedia PDF Downloads 388
1254 Evaluation of Fetal Brain Using Magnetic Resonance Imaging
Authors: Mahdi Farajzadeh Ajirlou
Abstract:
Normal fetal brain development can be assessed by in vivo magnetic resonance imaging (MRI) from the 18th gestational week (GW) to term and relies mainly on T2-weighted and diffusion-weighted (DW) sequences. The brain pathologies most commonly referred for fetal MRI evaluation are ventriculomegaly, absence of the corpus callosum, and anomalies of the posterior fossa. Brain segmentation is a crucial first step in neuroimage analysis. In the case of fetal MRI, it is particularly challenging and important due to the variable orientation of the fetus, the organs surrounding the fetal head, and irregular fetal motion. Several promising methods have been proposed but remain limited in their performance in challenging cases and in real-time segmentation. Fetal MRI is routinely performed on a 1.5-Tesla scanner without maternal or fetal sedation. The mother lies supine during the course of the examination, which typically lasts 45 to 60 minutes. The availability and continuing validation of normative fetal brain growth charts will provide important tools for the early detection of impaired fetal brain development, upon which to manage high-risk pregnancies.Keywords: brain, fetal, MRI, imaging
Procedia PDF Downloads 79
1253 Forecasting Issues in Energy Markets within a Reg-ARIMA Framework
Authors: Ilaria Lucrezia Amerise
Abstract:
Electricity markets throughout the world have undergone substantial changes. Accurate, reliable, clear and comprehensible modeling and forecasting of different variables (loads and prices in the first instance) have achieved increasing importance. In this paper, we describe the current state of the art, focusing on reg-SARMA methods, which have proven flexible enough to accommodate electricity price/load behavior satisfactorily. More specifically, we discuss: 1) the dichotomy between point and interval forecasts; 2) the difficult choice between stochastic predictors (e.g. climatic variation) and deterministic predictors (e.g. calendar variables); 3) the choice between modelling a single aggregate time series or building separate, and potentially different, models for its sub-series. The noteworthy point we would like to bring out is that prices and loads require different approaches that appear irreconcilable, even though they must be reconciled in the interests and activities of energy companies.Keywords: interval forecasts, time series, electricity prices, reg-SARIMA methods
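To make the point-versus-interval distinction concrete, here is a toy one-step-ahead forecast from a least-squares AR(1) fit with a Gaussian prediction interval; it is a deliberately simplified stand-in for the reg-SARMA models discussed in the abstract, and the load values are invented:

```python
import math

def ar1_interval_forecast(series, z=1.96):
    """Least-squares AR(1) fit giving a one-step-ahead point forecast
    and an approximate 95% Gaussian prediction interval."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sxx
    c = my - phi * mx
    resid = [b - (c + phi * a) for a, b in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    point = c + phi * series[-1]
    return point, (point - z * sigma, point + z * sigma)

loads = [50.0, 52.0, 51.0, 53.0, 52.0, 54.0, 53.0, 55.0]
point, (lo, hi) = ar1_interval_forecast(loads)
# The point forecast sits inside (lo, hi); it is the interval width,
# not the point value, that conveys the forecast uncertainty.
```

For an energy company, the interval is what prices risk: a bid based only on the point forecast ignores exactly the uncertainty the interval quantifies.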
Procedia PDF Downloads 131
1252 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method
Authors: Luh Eka Suryani, Purhadi
Abstract:
Poisson regression is a non-linear regression model for a response variable in the form of count data that follows the Poisson distribution. A pair of count variables that show high correlation can be modeled by bivariate Poisson regression. The numbers of infant deaths and maternal deaths are count data that can be analyzed in this way. Poisson regression assumes equidispersion, where the mean and variance are equal. However, actual count data often have a variance greater or less than the mean (overdispersion or underdispersion). Violations of this assumption can be overcome by applying generalized Poisson regression. The characteristics of each regency can affect the number of cases that occur; this issue can be addressed by a spatial analysis called geographically weighted regression. This study analyzes the numbers of infant and maternal deaths based on conditions in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling is done with adaptive bisquare kernel weighting, which produces 3 regency groups based on the infant mortality rate and 5 regency groups based on the maternal mortality rate. Variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, clean and healthy household behavior, and married women whose first marriage was under the age of 18.Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion
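The adaptive bisquare kernel mentioned above assigns each observation a weight that decays smoothly to zero at the bandwidth; in the adaptive variant, the bandwidth at each regression point is the distance to its k-th nearest neighbour, so densely sampled areas get smaller bandwidths. A minimal sketch of the kernel itself (distances and bandwidth are illustrative):

```python
def bisquare_weights(distances, bandwidth):
    """Bisquare kernel: w = (1 - (d/h)^2)^2 for d < h, else 0.
    In adaptive GWR, h is the distance from the regression point to
    its k-th nearest neighbour, so h varies across locations."""
    return [(1 - (d / bandwidth) ** 2) ** 2 if d < bandwidth else 0.0
            for d in distances]

print(bisquare_weights([0.0, 5.0, 10.0, 20.0], bandwidth=10.0))
# -> [1.0, 0.5625, 0.0, 0.0]
```

Each regency's local GWBGPR fit then uses these weights in its likelihood, so nearby regencies dominate the estimate and regencies beyond the bandwidth are ignored entirely.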
Procedia PDF Downloads 159
1251 Multi-Objective Variable Neighborhood Search Algorithm to Solving Scheduling Problem with Transportation Times
Authors: Majid Khalili
Abstract:
This paper deals with a bi-objective hybrid no-wait flowshop scheduling problem minimizing the makespan and total weighted tardiness, in which we consider transportation times between stages. Obtaining an optimal solution for this type of complex, large-sized problem in reasonable computational time using traditional approaches and optimization tools is extremely difficult. This paper presents a new multi-objective variable neighborhood search algorithm (MOVNS). A set of experimental instances is used to evaluate the algorithm with advanced multi-objective performance measures. The algorithm is carefully evaluated against an available algorithm by means of multi-objective performance measures and statistical tools. The results show that a variant of our proposed MOVNS provides sound performance compared with other algorithms.
Procedia PDF Downloads 418
1250 Data-Driven Performance Evaluation of Surgical Doctors Based on Fuzzy Analytic Hierarchy Processes
Authors: Yuguang Gao, Qiang Yang, Yanpeng Zhang, Mingtao Deng
Abstract:
To enhance the safety, quality, and efficiency of healthcare services provided by surgical doctors, we propose a comprehensive approach to the performance evaluation of individual doctors, incorporating insights from performance data as well as the views of different stakeholders in the hospital. Exploratory factor analysis was first performed on collective multidimensional performance data of surgical doctors, from which key factors were extracted that encompass assessment of professional experience and service performance. A two-level indicator system was then constructed, for which we developed a weighted interval-valued spherical fuzzy analytic hierarchy process to analyze the relative importance of the indicators while handling subjectivity and disparity in the decision-making of the multiple parties involved. Our analytical results reveal that, among the key factors identified as instrumental for evaluating surgical doctors' performance, clinical workload and complexity of service are valued more highly than capacity of service and professional experience, while efficiency of resource consumption ranks lowest in importance. We also provide a retrospective case study to illustrate the effectiveness and robustness of our quantitative evaluation model by assigning meaningful performance ratings to individual doctors based on the weights developed through our approach.Keywords: analytic hierarchy processes, factor analysis, fuzzy logic, performance evaluation
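As background for the fuzzy AHP step, the crisp classical AHP derives priority weights from a pairwise-comparison matrix; a common approximation uses row geometric means. This sketch shows only that crisp step, not the paper's interval-valued spherical fuzzy extension, and the comparison matrix is invented for illustration:

```python
import math

def ahp_weights(M):
    """AHP priority weights from a pairwise-comparison matrix via the
    row geometric-mean approximation to the principal eigenvector."""
    gm = [math.prod(row) ** (1 / len(row)) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical matrix: criterion 1 is 3x as important as criterion 2
# and 5x as important as criterion 3 (perfectly consistent here).
M = [[1, 3, 5],
     [1 / 3, 1, 5 / 3],
     [1 / 5, 3 / 5, 1]]
print([round(w, 3) for w in ahp_weights(M)])   # -> [0.652, 0.217, 0.13]
```

The interval-valued spherical fuzzy variant replaces each crisp comparison with membership, non-membership, and hesitancy intervals, letting the method carry the stakeholders' disagreement through to the final indicator weights instead of forcing a single consensus number at the comparison stage.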
Procedia PDF Downloads 58