Search results for: channel error correction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3368

518 Cardiac Protective Effect of Olive Oil against Ischemia Reperfusion-Induced Cardiac Arrhythmias in Isolated Diabetic Rat Hearts

Authors: Ishfaq A. Bukhari, Bassem Yousef Sheikh, Abdulrahman Almotrefi, Osama Yousaf, Amer Mahmood

Abstract:

Olive oil is the primary source of fat in the Mediterranean diet, which is associated with low cardiovascular mortality. Olive oil is rich in monounsaturated fatty acids and has been reported to exert a variety of beneficial cardiovascular effects, including blood-pressure-lowering, anti-platelet, anti-diabetic, and anti-inflammatory actions. A growing body of evidence from preclinical and clinical studies has shown that olive oil improves insulin resistance, decreases vessel stiffness, and prevents thromboembolism. We evaluated the effects of olive oil against streptozotocin-induced physiological disorders in animal models of diabetes and against ischemia/reperfusion (I/R)-induced cardiac arrhythmias. Diabetes was induced in male rats with a single intraperitoneal injection of streptozotocin (60 mg/kg), and the rats were treated for two months with olive oil (1 ml/kg p.o.). Control animals received saline. Blood glucose and body weight were monitored every 14 days. At the end of the treatment, rats were sacrificed and hearts were isolated for mounting on a Langendorff apparatus. Blood glucose and body weight did not differ significantly between the control and olive oil-treated animals. The control diabetic animals exhibited a 100% incidence of I/R-induced ventricular fibrillation, which was reduced to 0% with olive oil treatment. The duration of ventricular fibrillation fell from 98.8 ± 2.3 seconds (control) to 0 seconds in the olive oil-treated group. Diltiazem, a calcium channel blocker (1 µmol/L), showed similar results and protected against the I/R-induced cardiac disorders. Biochemical analysis of the cardiac tissues showed that diabetes and I/R produce marked pathological changes in the cardiomyocytes, including decreased glutathione (GSH) and increased oxidative stress (malondialdehyde; MDA). Pretreatment of animals with olive oil (1 ml/kg p.o.) increased GSH and reduced MDA levels. Olive oil also improved the diabetes-induced histopathological changes in the cardiomyocytes.
These findings indicate that olive oil possesses cardioprotective properties. Further studies are underway in our laboratory to explore the mechanism of the cardioprotective effect of olive oil.

Keywords: diabetes, ischemia-reperfusion, olive oil, rat heart

Procedia PDF Downloads 443
517 Comparison Approach for Wind Resource Assessment to Determine Most Precise Approach

Authors: Tasir Khan, Ishfaq Ahmad, Yejuan Wang, Muhammad Salam

Abstract:

Distribution models of wind speed data are essential for assessing potential wind energy because they reduce the uncertainty in estimating wind energy output. Therefore, before performing a detailed potential energy analysis, the distribution model that most precisely fits the wind speed data must be found. In this research, several goodness-of-fit criteria, namely the Kolmogorov–Smirnov statistic, the Anderson–Darling statistic, the chi-square statistic, the root mean square error (RMSE), AIC, and BIC, were combined to determine the best-fitted distribution for the wind speed data. The suggested method aggregates all the criteria collectively rather than relying on any single one. The method was applied to fit 14 statistical distribution models to wind speed data collected at four sites in Pakistan. The results show that this method provides a sound basis for selecting the most suitable wind speed distribution, and the graphical representations are consistent with the analytical results. This research also presents three estimation methods that can be used to fit the candidate distributions: the method of linear moments (L-moments), the method of moments (MOM), and maximum likelihood estimation (MLE). The third-order moment used in the wind energy formula is a key quantity because it makes an important contribution to the precise estimation of wind energy. To validate the suggested use of MOM, it was compared with the well-known estimation methods, i.e., the method of linear moments and maximum likelihood estimation. In the comparative analysis, judged by the several goodness-of-fit criteria, the performance of the considered techniques was evaluated on actual wind speeds recorded over different time periods. The results show that MOM provides a more precise estimate than the other familiar approaches for estimating wind energy across the fourteen distributions. Therefore, MOM can be recommended as the better technique for assessing wind energy.
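The selection procedure described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: it fits a two-parameter Weibull distribution (a common wind speed model) to a sample by the method of moments, using bisection on the shape parameter, and scores the fit with the Kolmogorov–Smirnov statistic. The sample data, bracket, and tolerance are all assumptions for demonstration.

```python
import math

def weibull_mom_fit(speeds):
    """Fit Weibull shape k and scale c by the method of moments.
    Uses the identity CV = sqrt(G(1+2/k)/G(1+1/k)**2 - 1), which
    decreases monotonically in k, so k can be found by bisection."""
    n = len(speeds)
    mean = sum(speeds) / n
    var = sum((v - mean) ** 2 for v in speeds) / (n - 1)
    cv = math.sqrt(var) / mean

    def cv_of(k):
        g1 = math.gamma(1 + 1 / k)
        g2 = math.gamma(1 + 2 / k)
        return math.sqrt(g2 / g1 ** 2 - 1)

    lo, hi = 0.1, 20.0            # assumed bracket for the shape k
    for _ in range(100):
        mid = (lo + hi) / 2
        if cv_of(mid) > cv:       # CV too large -> k must grow
            lo = mid
        else:
            hi = mid
    k = (lo + hi) / 2
    c = mean / math.gamma(1 + 1 / k)
    return k, c

def ks_statistic(speeds, k, c):
    """Kolmogorov-Smirnov distance between the empirical CDF
    and the fitted Weibull CDF F(x) = 1 - exp(-(x/c)**k)."""
    xs = sorted(speeds)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = 1 - math.exp(-((x / c) ** k))
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

# Hypothetical hourly wind speeds (m/s)
sample = [3.1, 4.2, 5.0, 5.8, 6.4, 7.1, 7.9, 8.6, 9.4, 10.3]
k, c = weibull_mom_fit(sample)
d = ks_statistic(sample, k, c)
```

In the paper's setting, the same KS score (together with AD, chi-square, RMSE, AIC, and BIC) would be computed for each of the 14 candidate distributions and the scores combined to pick the winner.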

Keywords: wind-speed modeling, goodness of fit, maximum likelihood method, linear moment

Procedia PDF Downloads 64
516 The Effect of Foundation on the Earth Fill Dam Settlement

Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh

Abstract:

Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. To measure settlement and deformation of earth dams, precision instruments combining a settlement set with an inclinometer, commonly referred to as IS instruments, are usually used. In some dams, because the alluvium is thick and its removal is not feasible (technically, economically, or in terms of performance), the lower end of the IS instrument cannot be anchored in the rock foundation. Engineers therefore have to accept installing the pipes in the weak, deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new, refined criteria for predicting settlement and deformation in earth dams. The study is based on conditions at three dams with highly deformable alluvial foundations (Agh Chai, Narmashir, and Gilan-e Gharb) in order to derive settlement criteria that account for the alluvial foundation. To achieve this goal, the settlement of the dams was simulated using the finite difference method in the FLAC3D software, and the modeling results were compared with the IS instrument readings. Finally, after calibrating the model and validating the results, regression analysis was used to reconcile the modeling parameters with the field conditions, and then, using MATLAB and its Curve Fitting Toolbox, new settlement criteria based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation were obtained.
The results of these studies show that, by using the new criteria, the settlement and deformation measured for dams with alluvial foundations can be corrected after the instrument readings are taken, and the error in the IS instrument readings can be greatly reduced.
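The curve-fitting step above can be illustrated with a much simpler stand-in. The sketch below fits a straight-line correction criterion to invented calibration data by closed-form ordinary least squares; the functional form, the variable choice (elasticity modulus only), and every number are assumptions for demonstration, since the paper's criteria involve several parameters and MATLAB's Curve Fitting Toolbox.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Invented calibration pairs: alluvium elasticity modulus (MPa) versus
# the settlement correction (mm) inferred from FLAC3D vs IS readings
modulus = [20.0, 35.0, 50.0, 65.0, 80.0]
correction = [12.4, 9.1, 6.9, 5.2, 4.0]
a, b = fit_linear(modulus, correction)
predicted = a + b * 40.0   # correction for a hypothetical 40 MPa foundation
```

The real criteria would be fitted the same way, only with more predictors (cohesion, friction angle, density) and a richer model family.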

Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting

Procedia PDF Downloads 167
515 Addressing Public Concerns about Radiation Impacts by Looking Back at Nuclear Accidents Worldwide

Authors: Du Kim, Nelson Baro

Abstract:

According to a report of the International Atomic Energy Agency (IAEA), approximately 437 nuclear power stations are currently in operation around the world to meet increasing energy demands. Indeed, nearly a third of the world's energy demand is met through nuclear power, because it is one of the most efficient and long-lasting sources of energy. However, there are also consequences when a major event takes place at a nuclear power station. Over the past decades, several major nuclear accidents have occurred around the world. According to the International Nuclear and Radiological Event Scale (INES), six nuclear accidents are considered high-level events: Fukushima Daiichi (Level 7), Chernobyl (Level 7), Kyshtym (Level 6), Three Mile Island (Level 5), Windscale (Level 5), and Chalk River (Level 5). Today, many people still have doubts about using nuclear power, and the number of people opposed to it has grown since the serious accident at the Fukushima Daiichi nuclear power plant in Japan. In other words, there are public concerns about radiation impacts, centred on linear no-threshold (LNT) issues, radiation health effects, radiation protection, and social impacts. This paper addresses these topics by looking back at the history of major nuclear accidents worldwide, based on the INES. It concludes that the major nuclear accidents were largely preventable, since most of them were caused by human error; in other words, the human factor played a decisive role in the malfunctions behind most of these events. How well a crisis is handled, and in particular whether a good radiation protection programme is in place, has a large impact on society and determines how accepting people are of nuclear power.

Keywords: linear-no-threshold (LNT) issues, radiation health effects, radiation protection, social impacts

Procedia PDF Downloads 221
514 Development of Peaceful Wellbeing in Executive Practitioners through Mindfulness-Based Practices

Authors: Narumon Jiwattanasuk, Phrakrupalad Pannavoravat, Pataraporn Sirikanchana

Abstract:

Mindfulness has become a widely adopted approach to promoting positive wellbeing. The aims of this paper are to analyze the problems faced by executive meditation practitioners at the Buddhamahametta Foundation in Thailand and to recommend a process for developing peaceful wellbeing in executive meditation practitioners by applying the principles of the four foundations of mindfulness. This study focuses on executives because little research has addressed the wellbeing development of executives, and the researcher recognizes that executives can serve as examples within their organizations, significantly influencing their employees and families to take an interest in practicing mindfulness. Such improvement can then spread from the individual to the surrounding community, including family, workplace, society, and the nation, ultimately contributing to happiness at the national level, which is the broader aspiration of this research. The paper highlights mindfulness practices that can be performed on a daily basis. The study is qualitative, with 10 key participants who are executives from various sectors such as hospitality, healthcare, retail, and power and energy. Three mindfulness-based courses were conducted over a period of 8 months, and in-depth interviews were held before the first course and at the end of every course, giving four in-depth interviews in total. The information collected from the interviews was analyzed in order to design the process for developing peaceful wellbeing, and focus group discussions with mindfulness specialists helped refine the mindfulness program. The research found that the executives faced the following problems: stress, negative thinking loops, losing their temper, seeking acceptance, worry about uncontrollable external factors, inability to control their words, and weight gain.
Cultivating the four foundations of mindfulness can develop peaceful wellbeing. The results showed that after the key-informant executives attended the mindfulness courses and practiced mindfulness regularly, they developed peaceful wellbeing in the physical, psychological, behavioral, and intellectual dimensions by applying 12 mindfulness-based activities. The resulting wellbeing-development process also includes various tools to support continued practice, including a handout on guided mindfulness practice, video clips about mindfulness practice, an online dhamma channel, and mobile applications that support regular mindfulness-based practice.

Keywords: executive, mindfulness activities, stress, wellbeing

Procedia PDF Downloads 100
513 Hedonic Price Analysis of Consumer Preference for Musa spp in Northern Nigeria

Authors: Yakubu Suleiman, S. A. Musa

Abstract:

The research was conducted to determine the physical characteristics of banana fruits that influence consumer preference for the fruit in Northern Nigeria. Socio-economic characteristics of the respondents were also identified. Simple descriptive statistics and a hedonic price model were used to analyze the socio-economic and consumer preference data, respectively, collected with the aid of 1,000 structured questionnaires. The results give an R² of 0.633, meaning that 63.3% of the variation in banana price is explained by the explanatory variables included in the model, namely colour, size, degree of ripeness, softness, surface blemish, cleanliness of the fruits, weight, length, and cluster size. The remaining 36.7% is attributed to the error term, i.e., random disturbance in the model. The estimated intercept was 1,886.5 and statistically significant (P < 0.01), meaning that about N1,886.5 worth of banana fruits would be bought by consumers without considering the attributes included in the model. Consumers showed significant preferences for colour, size, degree of ripeness, softness, weight, length, and cluster size of banana fruits, significant at P < 0.01, P < 0.05, or P < 0.1. By contrast, consumers showed no significant preference regarding surface blemish, cleanliness, or variety of the banana fruit, as all of these were non-significant with negative signs. Based on these findings, it is recommended that plant breeders and research institutes concentrate on producing banana fruits with the physical characteristics found to be statistically significant, namely cluster size, degree of ripeness, softness, length, size, and skin colour.
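A hedonic price model of the kind described above is just a multiple regression of price on attribute scores. The sketch below is a minimal stand-alone illustration, not the authors' estimation: it solves the normal equations by Gaussian elimination for a tiny invented data set with two attributes; all rows, attribute choices, and prices are assumptions.

```python
def ols(X, y):
    """Solve the normal equations (X'X) beta = X'y by Gaussian
    elimination with partial pivoting. X is a list of rows; a
    leading column of 1s supplies the intercept."""
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        s = sum(xtx[r][c] * beta[c] for c in range(r + 1, k))
        beta[r] = (xty[r] - s) / xtx[r][r]
    return beta

# Invented rows: [1 (intercept), ripeness score, cluster size]
X = [[1, 2, 4], [1, 3, 5], [1, 3, 6], [1, 4, 6], [1, 5, 8]]
prices = [1900.0, 2050.0, 2120.0, 2200.0, 2430.0]
beta = ols(X, prices)   # beta[0] plays the role of the N1,886.5 intercept
```

Each slope in `beta` is the implicit (hedonic) price of one unit of that attribute, which is exactly how the significant coefficients in the abstract are interpreted.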

Keywords: analysis, consumers, preference, variables

Procedia PDF Downloads 306
512 Simulation-Based Validation of Safe Human-Robot-Collaboration

Authors: Titanilla Komenda

Abstract:

Human-machine collaboration involves direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine collaboration enables flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reason is not a lack of technical progress but rather limitations in the planning processes that must ensure operator safety. Until now, humans and machines have mainly been considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine collaboration, these aspects must not be viewed in isolation but need to be analysed in interaction. Furthermore, a simulation model is needed that can validate system performance and ensure operator safety at any given time. Following on from this, a holistic simulation model is presented that enables a simulative representation of collaborative tasks, including both humans and machines. The presented model comprises not only a geometry and a motion model of the interacting humans and machines but also a numerical behaviour model of the human as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated, validating system behaviour in unplanned situations. As these models can be defined on the basis of a failure mode and effects analysis as well as error probabilities, their implementation in a collaborative model is discussed and evaluated with regard to limitations and simulation times. The functionality of the model is demonstrated on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to meeting system performance targets alone.
In this sense, an optimisation function is presented that handles the trade-off between human and machine factors and aids a successful and safe realisation of collaborative scenarios.
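The idea of a Boolean probabilistic sensor model can be conveyed with a toy Monte Carlo experiment. The sketch below, a simplified stand-in for the paper's model, treats a safety sensor as a Boolean check that detects a human intrusion with some probability on each of several independent checks per task cycle, and estimates how often an intrusion slips past all of them; the 90% detection probability and the three checks are invented FMEA-style figures.

```python
import random

def missed_detection_rate(p_detect, checks_per_cycle, trials, seed=42):
    """Monte Carlo estimate of the probability that a Boolean sensor
    misses a human intrusion on every check of one task cycle,
    assuming independent checks with detection probability p_detect."""
    rng = random.Random(seed)   # fixed seed for reproducibility
    missed = 0
    for _ in range(trials):
        if all(rng.random() > p_detect for _ in range(checks_per_cycle)):
            missed += 1
    return missed / trials

# Invented figures: 90% per-check detection probability, 3 checks per cycle
rate = missed_detection_rate(0.9, 3, 100_000)
# Analytic value for comparison: (1 - 0.9) ** 3 = 0.001
```

Simulated rates like this one are what allow "error scenarios" to be evaluated against a safety requirement before any hardware is deployed.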

Keywords: human-machine-system, human-robot-collaboration, safety, simulation

Procedia PDF Downloads 338
511 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward backpropagation network. The results revealed that the GDM algorithm, with its adaptive learning capability, generally used less time in both the training and validation phases than the LM and Br algorithms, though learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions for the 1-day-ahead and 5-day-ahead horizons. In average statistical terms, model performance efficiency measured by the coefficient of efficiency (CE) was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, for the training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANNs for real-time forecasting should employ training algorithms without the computational overhead of LM, which requires computation of the Hessian matrix, takes protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure, forecast quality, and the mitigation of network overfitting.
On the whole, it is recommended that any evaluation also consider the implications of (i) data quality and quantity and (ii) transfer functions for overall network forecast performance.
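The evaluation statistics named above are all short formulas. As a hedged illustration (the streamflow numbers are invented, not the study's data), the snippet below computes the coefficient of efficiency (the Nash–Sutcliffe form, 1 minus the ratio of squared forecast error to the variance of the observations) together with MAE, MAPE, and MSRE.

```python
def coefficient_of_efficiency(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency: 1 - SSE / SST."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / sst

def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def mape(obs, sim):
    """Mean absolute percentage error (%)."""
    return 100 * sum(abs((o - s) / o) for o, s in zip(obs, sim)) / len(obs)

def msre(obs, sim):
    """Mean squared relative error."""
    return sum(((o - s) / o) ** 2 for o, s in zip(obs, sim)) / len(obs)

# Invented daily streamflows (m^3/s): observed vs an ANN forecast
observed = [12.0, 15.5, 21.0, 34.2, 28.9, 19.4]
forecast = [11.4, 16.2, 20.1, 31.8, 30.2, 18.9]
ce = coefficient_of_efficiency(observed, forecast)
```

A CE of 1 indicates a perfect forecast and 0 indicates no improvement over the observed mean, which is why the study's 94–96% validation values denote strong fits while the relative-error statistics still discriminate between algorithms.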

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 126
510 Abdominal Exercises Can Modify Abdominal Function in Postpartum Women: A Randomized Control Trial Comparing Curl-up to Drawing-in Combined With Diaphragmatic Aspiration

Authors: Yollande Sènan Djivoh, Dominique de Jaeger

Abstract:

Background: Abdominal exercises are commonly practised nowadays. Specific techniques of abdominal muscle strengthening, such as hypopressive exercises, have recently emerged, and their practice is encouraged over the curl-up, especially postpartum. Neither the acute nor the training effects of these exercises allow one exercise to be recommended over the other. Nevertheless, physiotherapists remain reluctant to prescribe the curl-up to postpartum women because of its potentially harmful effect on the pelvic floor. Design: This study was a randomized controlled trial registered under the number PACTR202110679363984. Objective: To observe the training effect of two experimental protocols (curl-up versus drawing-in + diaphragmatic aspiration) on the abdominal wall (inter-recti distance, rectus and transversus abdominis thickness, abdominal strength) in Beninese postpartum women. Pelvic floor function (tone, endurance, urinary incontinence) was assessed to evaluate potential side effects of the exercises on the pelvic floor. Method: Postpartum women diagnosed with diastasis recti were randomly assigned to one of three groups (curl-up, drawing-in + diaphragmatic aspiration, and control). Abdominal and pelvic floor parameters were assessed before and at the end of the 6-week protocol. The inter-recti distance and the abdominal muscle thicknesses were assessed by ultrasound, and abdominal strength by dynamometer. Pelvic floor tone and strength were assessed with biofeedback, and urinary incontinence was quantified by pad test. To compare the results between the three groups and the two measurements, a two-way repeated-measures ANOVA was used (p<0.05). When the interaction was significant, a post hoc Student t test with Bonferroni correction was used to compare the three groups on the difference (end value minus initial value). To complete these results, a paired Student t test was used to compare the initial and end values within each group.
Results: Fifty-eight women participated in this study, divided into three groups with similar characteristics regarding age (29±5 years), parity (2±1 children), BMI (26±4 kg/m²), time since the last birth (10±2 weeks), and weight of the baby at birth (330±50 grams). The time effect and the interaction were significant (p<0.001) for all abdominal parameters. The experimental groups improved more than the control group. The curl-up group improved more (p=0.001) than the drawing-in + diaphragmatic aspiration group in inter-recti distance (9.3±4.2 mm versus 6.6±4.6 mm) and abdominal strength (20.4±16.4 N versus 11.4±12.8 N). The drawing-in + diaphragmatic aspiration group improved more (0.8±0.7 mm) than the curl-up group (0.5±0.7 mm) in transversus abdominis thickness (p=0.001). Only the curl-up group increased (p<0.001) rectus abdominis thickness (1.5±1.2 mm). For pelvic floor parameters, both experimental groups improved (p=0.01), except for tone, which improved (p=0.03) only in the drawing-in + diaphragmatic aspiration group, from 19.9±4.1 cmH2O to 22.2±4.5 cmH2O. Conclusion: The curl-up was more efficient at improving abdominal function than drawing-in + diaphragmatic aspiration; however, these exercises are complementary. Neither degraded the pelvic floor, and drawing-in + diaphragmatic aspiration further improved pelvic floor function. Clinical implications: The curl-up, drawing-in, and diaphragmatic aspiration can all be used to manage abdominal function in postpartum women; exercises should be chosen according to the specific needs of each woman's abdominal and pelvic floor function.
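The post hoc analysis above combines a paired Student t statistic with a Bonferroni-corrected significance threshold. The sketch below shows both pieces; the before/after values are invented for illustration and are not the trial's data, and the p-value lookup against a t distribution is omitted for brevity.

```python
import math

def paired_t(before, after):
    """Paired Student t statistic on before/after measurements.
    Returns (t, degrees of freedom)."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n), n - 1

def bonferroni_alpha(alpha, n_comparisons):
    """Per-test threshold after Bonferroni correction."""
    return alpha / n_comparisons

# Invented example: pelvic floor tone (cmH2O) before/after in one group
before = [18.5, 20.1, 19.7, 21.0, 19.2]
after = [21.0, 22.4, 21.9, 23.1, 21.6]
t, df = paired_t(before, after)
alpha = bonferroni_alpha(0.05, 3)   # three pairwise group comparisons
```

The computed t would then be compared against the critical value of the t distribution with `df` degrees of freedom at the corrected threshold `alpha`.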

Keywords: curl-up, drawing-in, diaphragmatic aspiration, hypopressive exercise, postpartum women

Procedia PDF Downloads 61
509 Correlation between Cephalometric Measurements and Visual Perception of Facial Profile in Skeletal Type II Patients

Authors: Choki, Supatchai Boonpratham, Suwannee Luppanapornlarp

Abstract:

The objective of this study was to find correlations between cephalometric measurements and the visual perception of the facial profile in skeletal type II patients. In this study, 250 lateral cephalograms of female patients aged 20 to 22 years were analyzed. The profile outlines of all the samples were hand traced and transformed into silhouettes by the principal investigator. Profile ratings were done by 9 orthodontists on a visual analogue scale from one to ten (increasing level of convexity). 37 hard tissue and soft tissue cephalometric measurements were analyzed by the principal investigator, and all measurements were repeated after a 2-week interval for error assessment. Finally, the rankings of visual perception were correlated with the cephalometric measurements using the Spearman correlation coefficient (P < 0.05). The results show that increased facial convexity was correlated with higher values of ANB (A point, nasion, and B point), AF-BF (distance from A point to B point in mm), L1-NB (distance from the lower incisor to the NB line in mm), anterior maxillary alveolar height, posterior maxillary alveolar height, overjet, hard tissue H angle, soft tissue H angle, and lower lip to E plane (absolute correlation values from 0.277 to 0.711). In contrast, increased facial convexity was correlated with lower values of Pg to N perpendicular and Pg to NB in mm (absolute correlation values -0.302 and -0.294, respectively). Among the soft tissue measurements, the H angles had a higher correlation with visual perception than the facial contour angle, nasolabial angle, and lower lip to E plane. In conclusion, the findings indicate that the correlation of cephalometric measurements with visual perception was lower than expected: only 29% of the cephalometric measurements had a significant correlation with visual perception. Therefore, a diagnosis based solely on cephalometric analysis can fail to meet the patient's esthetic expectations.
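The Spearman coefficient used above is the Pearson correlation of the two rank vectors, with tied observations sharing their mean rank. A minimal implementation (the ANB and VAS numbers are invented, not the study's measurements):

```python
def rank(values):
    """Average ranks, 1-based; ties share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented data: ANB angle (degrees) vs mean VAS convexity score
anb = [2.1, 3.4, 4.0, 5.2, 6.8, 7.5]
vas = [3.0, 4.5, 4.2, 6.1, 7.8, 8.4]
rho = spearman(anb, vas)
```

A positive rho near the study's upper bound of 0.711 would mean the orthodontists' convexity rankings track the measurement closely; values near 0.277 indicate only a loose association.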

Keywords: cephalometric measurements, facial profile, skeletal type II, visual perception

Procedia PDF Downloads 116
508 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. Nevertheless, better predictions remain the foundation of all science, so the development of accurate, robust, and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. Two major approaches still dominate, Box-Jenkins ARIMA and exponential smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available, owing to its simplicity, robustness, and accuracy as an automatic forecasting procedure, demonstrated notably in the famous M-competitions. Despite this success and widespread use, ES models have shortcomings that negatively affect forecast accuracy. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters; the two approaches therefore share similar structural forms, and ATA can easily be adapted to each of the individual ES models while gaining many advantages from its innovative weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by means of classical decomposition. The ATA method is therefore extended to the higher-order ES methods with additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performance is compared with that of their ES counterparts on the M3-competition data set, since it is still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts in almost all settings, and that when model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when forecasting accuracy is compared using popular error metrics.
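In its simplest form, the ATA counterpart of simple exponential smoothing replaces the fixed smoothing constant alpha with a time-varying weight p/t. The sketch below is a minimal illustration of that recursion based on the description above, not the authors' full trended models; the demand series and the choice p=2 are invented.

```python
def ata_level(series, p=1):
    """ATA-style simple smoothing: at time t the newest observation
    gets weight p/t (instead of a fixed alpha), so early data are
    weighted heavily at first and the weight decays as t grows."""
    level = None
    for t, x in enumerate(series, start=1):
        if t <= p:
            level = x            # initialisation: take the data directly
        else:
            w = p / t
            level = w * x + (1 - w) * level
    return level                 # serves as the one-step-ahead forecast

# Invented demand series
demand = [52.0, 48.0, 55.0, 53.0, 51.0, 54.0]
forecast = ata_level(demand, p=2)
```

The trended ATA models in the paper extend this same idea with a second time-varying parameter for the trend term, mirroring Holt-type ES.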

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 157
507 Surveillance of Adverse Events Following Immunization during New Vaccines Introduction in Cameroon: A Cross-Sectional Study on the Role of Mobile Technology

Authors: Andreas Ateke Njoh, Shalom Tchokfe Ndoula, Amani Adidja, Germain Nguessan Menan, Annie Mengue, Eric Mboke, Hassan Ben Bachir, Sangwe Clovis Nchinjoh, Yauba Saidu, Laurent Cleenewerck De Kiev

Abstract:

Vaccines play a great role in protecting populations globally. Vaccine products are subject to rigorous quality control and approval before use to ensure safety. Yet even when all actors take the required precautions, some people can still experience adverse events following immunization (AEFI), caused either by the vaccine composition or by an error in its administration. AEFI under-reporting is pronounced in low-income settings like Cameroon. The country introduced electronic platforms to strengthen surveillance, and with the introduction of several novel vaccines, such as the COVID-19 vaccines and the novel oral polio vaccine type 2 (nOPV2), there was a need to monitor AEFI in the country. A cross-sectional study was conducted from July to December 2022. Data on AEFI for each region of Cameroon were reviewed for the past five years, analyzed with MS Excel, and the results presented as proportions. AEFI reporting had been uncommon in Cameroon. With the introduction of the novel vaccines in 2021, the health authorities adopted new tools and training to capture cases. The number of AEFI detected almost doubled using the Open Data Kit (ODK) compared with previous platforms, especially following the introduction of the nOPV2 and COVID-19 vaccines. The AEFI rate was 1.9 per 100,000 administered doses for nOPV2 and 160 per 100,000 for the COVID-19 vaccines. The mobile tool captured individual information for people with AEFI from all regions and helped to identify the common AEFI following the use of these new vaccines. ODK mobile technology was vital in improving AEFI reporting and in providing data for monitoring the use of new vaccines in Cameroon.
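The rates quoted above follow from a simple ratio. The one-liner below shows the calculation; the case and dose counts are illustrative only, since the abstract reports the rates but not the underlying numerators and denominators.

```python
def aefi_rate_per_100k(cases, doses):
    """AEFI reporting rate per 100,000 administered doses."""
    return cases / doses * 100_000

# Illustrative counts only: 19 AEFI across one million doses
# would reproduce the abstract's nOPV2 rate of 1.9 per 100,000
rate = aefi_rate_per_100k(19, 1_000_000)
```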

Keywords: adverse events following immunization, Cameroon, COVID-19 vaccines, nOPV, ODK

Procedia PDF Downloads 59
506 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project

Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen

Abstract:

This article is a first step toward defining the structure of a geotechnical database for the geological survey of a power project; in this report, the database was created for the Bac Ai pumped storage hydropower project. To provide a method for organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used, bringing optimal efficiency to exploiting, using, and analyzing data in support of design work in power engineering consulting. Three-dimensional (3D) geotechnical models are created from the survey data, covering stratigraphy, lithology, porosity, and other properties. For the Bac Ai pumped storage hydropower project, the 3D geotechnical model comprises six closely stacked stratigraphic formations built with the Horizons method, while the engineering geological parameters are modeled by geostatistical methods. Accuracy and reliability are assessed through error statistics, empirical evaluation, and expert judgment. Analysis of the three-dimensional model allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of the power pipelines, and calculation of on-site construction material reserves. In general, engineering geological modeling makes the design work more intuitive and comprehensive, helping designers identify and offer the most optimal design solutions for the project. The database remains updated and synchronized, and it enables 3D modeling of geological and topographic data that integrates with the design data according to building information modeling (BIM). This is also the base platform for BIM and GIS integration.
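The geostatistical interpolation of parameters between boreholes can be illustrated with a far simpler stand-in: inverse distance weighting of sampled values onto a query point. The sketch below is not RockWorks' algorithm, and the borehole coordinates, porosities, and power parameter are all invented; it only demonstrates the kind of gridding step such software performs.

```python
def idw(points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value)
    borehole samples; returns the sample value exactly at a data point."""
    qx, qy = query
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 == 0:
            return v              # query coincides with a sample
        w = d2 ** (-power / 2)    # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Invented borehole porosities (%) at (x, y) positions in metres
boreholes = [(0, 0, 12.0), (100, 0, 18.0), (0, 100, 15.0), (100, 100, 21.0)]
estimate = idw(boreholes, (50, 50))
```

Repeating this estimate over a grid of query points yields the kind of continuous parameter surface that is then stacked between the modeled stratigraphic horizons.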

Keywords: database, engineering geology, 3D Model, RockWorks, Bac Ai pumped storage hydropower project

Procedia PDF Downloads 139
505 Genetically Informed Precision Drug Repurposing for Rheumatoid Arthritis

Authors: Sahar El Shair, Laura Greco, William Reay, Murray Cairns

Abstract:

Background: Rheumatoid arthritis (RA) is a chronic, systemic, inflammatory, autoimmune disease that involves damage to the joints and erosion of the associated bone and cartilage, resulting in reduced physical function and disability. RA is a multifactorial disorder influenced by heterogeneous genetic and environmental factors. Whilst different medications have proven successful in reducing the inflammation associated with RA, they often come with significant side effects and limited efficacy. To address this, the novel pharmagenic enrichment score (PES) algorithm was tested in self-reported RA patients from the UK Biobank (UKBB), a cohort of predominantly European ancestry, to identify individuals with high genetic risk in clinically actionable biological pathways and thereby find novel opportunities for precision interventions and drug repurposing in RA. Methods and materials: Genetic association data for rheumatoid arthritis were derived from publicly available genome-wide association study (GWAS) summary statistics (N=97,173). The PES framework exploits competitive gene set enrichment to identify pathways associated with RA and explore novel treatment opportunities. These data were then integrated with the WebGestalt, Drug Gene Interaction Database (DGIdb), and DrugBank databases to identify compounds with existing use or potential for repurposed use. The PES for each of these candidates was then profiled in individuals with RA in the UKBB (Ncases = 3,719; Ncontrols = 333,160). Results: A total of 209 pathways with known drug targets were identified after multiple testing correction. Several pathways, including interferon gamma signaling and the TID pathway (which relates to a chaperone that modulates interferon signaling), were significantly associated with self-reported RA in the UKBB when adjusting for age, sex, assessment centre month and location, RA polygenic risk, and 10 principal components. These pathways have a major role in RA pathogenesis, including autoimmune attack against citrullinated proteins, synovial inflammation, and bone loss. Encouragingly, many also relate to the mechanism of action of existing RA medications. The analyses also revealed a statistically significant association between RA polygenic scores and self-reported RA with individual PES scores, highlighting the potential utility of the PES algorithm in uncovering additional genetic insights that could aid in identifying individuals at risk of RA and provide opportunities for more targeted interventions. Conclusions: In this study, pharmacologically annotated genetic risk was explored through the PES framework to overcome inter-individual heterogeneity and enable precision drug repurposing in RA. The results showed a statistically significant association between RA polygenic scores and self-reported RA and individual PES scores for 3,719 RA patients. Interestingly, several enriched PES pathways are targeted by already approved RA drugs. In addition, the analysis revealed genetically supported drug repurposing opportunities with relatively safe profiles for the future treatment of RA.
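The core idea of a pathway-restricted polygenic score, as described above, can be sketched as a standard GWAS-weighted allele-dosage sum computed only over variants mapped to genes in one pathway. The variant IDs, effect sizes, and genotypes below are invented for illustration; this is not the authors' PES implementation.

```python
# Hypothetical sketch of a pathway-restricted polygenic score:
# an ordinary polygenic score, but summed only over variants that
# map to genes in a single candidate pathway.

gwas_beta = {"rs1": 0.12, "rs2": -0.08, "rs3": 0.30, "rs4": 0.05}
pathway_variants = {"rs1", "rs3"}  # e.g. variants in interferon-gamma signaling genes

def pathway_score(genotype, betas, variants):
    """genotype: {rsid: risk-allele dosage 0/1/2}. Returns the weighted sum
    of dosages restricted to the pathway's variants."""
    return sum(betas[v] * genotype.get(v, 0) for v in variants)

person = {"rs1": 2, "rs2": 1, "rs3": 0, "rs4": 1}
print(pathway_score(person, gwas_beta, pathway_variants))  # 0.12*2 + 0.30*0 = 0.24
```

In the actual framework, such scores are computed per pathway and per individual, then tested for association with case status while adjusting for covariates.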

Keywords: rheumatoid arthritis, precision medicine, drug repurposing, systems biology, bioinformatics

Procedia PDF Downloads 51
504 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss function and optimizer. The CNNs used in this paper are AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. Choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach by comparing generalization power. Note that the same subset of LivDet is used across all training and testing for each model, so that generalization performance on unseen data can be compared across all models. The best CNN (AlexNet), with the appropriate loss function and optimizer, yields more than a 3% performance gain over the other CNN models with the default loss function and optimizer. Beyond generalization performance, this paper also reports the models' accuracy together with parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning, have been applied to our final model.
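The point that different loss functions produce different error values on identical predictions can be illustrated with two of the losses named above. This is a minimal NumPy sketch with invented labels and predictions, not the authors' training code.

```python
import numpy as np

# Two of the compared losses, for a binary spoof/live task.
def cross_entropy(y_true, p):
    """Binary cross-entropy averaged over samples; p are probabilities."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

def hinge(y_true, score):
    """Hinge loss; labels mapped from {0,1} to {-1,+1}, scores are raw margins."""
    y_pm = 2 * y_true - 1
    return float(np.mean(np.maximum(0.0, 1.0 - y_pm * score)))

y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.2, 0.6, 0.8])  # predicted probabilities
s = 2 * p - 1                       # crude margin score in [-1, 1]

# The same predictions score differently under each loss, which is why
# the choice of loss can change which model looks best on evaluation.
print(cross_entropy(y, p), hinge(y, s))
```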

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 110
503 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques, based on either Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing total variation minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not yet been properly addressed, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions that considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level, and the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 58
502 In vivo Mechanical Characterization of Facial Skin Combining Digital Image Correlation and Finite Element

Authors: Huixin Wei, Shibin Wang, Linan Li, Lei Zhou, Xinhao Tu

Abstract:

Facial skin is a biomedical material with complex mechanical properties: anisotropy, viscoelasticity, and hyperelasticity. The mechanical properties of facial skin are crucial for a number of applications, including facial plastic surgery, animation, dermatology, the cosmetic industry, and impact biomechanics. Skin is a complex multi-layered material which can be broadly divided into three main layers: the epidermis, the dermis, and the hypodermis. Collagen fibers account for 75% of the dry weight of dermal tissue, and it is these fibers that are responsible for the mechanical properties of skin. Much research on the anisotropic mechanical properties has concentrated on in vitro measurement, but the mechanical properties of skin differ greatly between in vivo and in vitro conditions. In this study, we present a method to measure the mechanical properties of facial skin in vivo. Digital image correlation (DIC) and indentation tests were used to obtain the experimental data, namely the deformation of the facial surface and the indentation force-displacement curve. The experiment was then simulated using a finite element (FE) model. Computed Tomography (CT) and reconstruction techniques provided the real tissue geometry, yielding a three-dimensional FE model of facial skin as a bi-layer system. As the epidermis is relatively thin, the epidermis and dermis were regarded as one layer, with the hypodermis below it. The upper layer was modeled with a Gasser-Ogden-Holzapfel (GOH) model to describe the hyperelastic and anisotropic behavior of the dermis; the lower layer was modeled as linear elastic. In conclusion, the material properties of the two layers were determined by minimizing the error between the FE results and the experimental data.
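The inverse-identification step described above, fitting material parameters by minimizing the mismatch between a model's force-displacement curve and measured data, can be sketched with a deliberately simplified forward model. The Hertz spherical-contact formula and all numbers here are assumptions for demonstration only; the authors use a GOH/linear-elastic FE model, not Hertz contact.

```python
import numpy as np

# Simplified forward model: Hertz spherical indentation,
# F = (4/3) * E/(1-nu^2) * sqrt(R) * d^(3/2)
R, nu = 1.0e-3, 0.45                  # indenter radius (m), Poisson ratio (assumed)
d = np.linspace(0.0, 1.0e-3, 20)      # indentation depths (m)

E_true = 50e3                          # "unknown" modulus to recover, 50 kPa
F_exp = (4/3) * E_true / (1 - nu**2) * np.sqrt(R) * d**1.5  # stand-in for measured data

# The model is linear in the lumped coefficient k = (4/3) E sqrt(R) / (1 - nu^2),
# so the error-minimizing fit reduces to one-parameter least squares.
x = d**1.5
k = float(x @ F_exp / (x @ x))
E_fit = k * 3 * (1 - nu**2) / (4 * np.sqrt(R))
print(E_fit)  # recovers ~50 kPa
```

In the actual study, the forward model is a full FE simulation and the minimization is over the GOH parameters, but the fit-by-error-minimization structure is the same.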

Keywords: facial skin, indentation test, finite element, digital image correlation, computed tomography

Procedia PDF Downloads 93
501 The Effect of Low Power Laser on CK and Some Markers of Delayed Onset Muscle Soreness (DOMS)

Authors: Bahareh Yazdanparast Chaharmahali

Abstract:

The study examined the effect of low power laser therapy on knee range of motion (flexion and extension), resting angle of the knee joint, knee circumference, and rating of delayed onset muscle soreness-induced pain, 24 and 48 hours after eccentric training of the knee flexor muscles (hamstrings). Twenty female college student volunteers participated in this research. On day 1, to induce delayed onset muscle soreness, subjects eccentrically trained their knee flexor muscles. On day 2, subjects were randomly divided into two groups: control and low power laser therapy. 24 and 48 hours after eccentric training, the variables (knee flexion and extension range of motion, resting knee joint angle, and knee circumference) were measured and analyzed. Data are reported as means ± standard error (SE), and repeated measures analysis was used to assess differences within groups. The treatment (low power laser therapy) had significant effects on the markers of delayed onset muscle soreness. 24 and 48 hours after training, a significant difference was observed between the mean pain ratings of the two groups, namely between the low power laser therapy and control groups; the Bonferroni post hoc test was significant. Low power laser therapy as used in this study significantly diminished the effects of delayed onset muscle soreness on swelling and on the relaxed knee extension and flexion angles.

Keywords: creatine kinase, DOMS, eccentric training, low power laser

Procedia PDF Downloads 222
500 Evaluation of Element Impurities in Drugs According to Pharmacopoeia by Use of the FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure the safety of pharmaceuticals with respect to 24 elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with an energy dispersive spectrometer (EDS) can be applied as an alternative method, giving quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by very short analysis times, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. With simple dilution for powders or direct analysis for liquids, we assessed the usefulness of the EDS method using a field emission scanning electron microscope (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy dispersive spectrometer (XFlash 6|10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µL of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage chosen according to sample thickness. The result for each spot is an atomic percentage, which is converted by the Avogadro conversion factor into a final result in micrograms. Conclusion and recommendation: The conclusion of this study is that applying FESEM-EDS within the US Pharmacopeia and the ICH Q3D guideline provides a high-precision, accurate method for elemental impurity analysis of drugs or bulk materials, determining the permitted daily exposure (PDE) in liquid or solid specimens. It can obtain better results than other techniques because it does not require complex methods or chemicals for digestion, which can interfere with the final results, and the sample can be kept for re-analysis at any time. The recommendation is to adopt this technique in the pharmacopoeia as a standard method alongside inductively coupled plasma techniques such as ICP-AES, ICP-OES, and ICP-MS.
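The conversion step mentioned above, from the atomic percentages reported by EDS to element masses, can be sketched as follows: atomic fractions are weighted by molar masses to give mass fractions, then scaled by the residue mass of the spot. The element choice, composition, and sample mass below are illustrative assumptions, not the authors' measurements.

```python
# Hypothetical sketch: EDS atomic % -> mass fraction (via molar masses) -> micrograms.
MOLAR_MASS = {"Pb": 207.2, "C": 12.011, "O": 15.999}  # g/mol

def atomic_pct_to_mass_ug(atomic_pct, sample_mass_ug):
    """atomic_pct: {element: atomic %}. Returns {element: mass in micrograms}
    for a dried spot of total mass sample_mass_ug."""
    weighted = {el: pct * MOLAR_MASS[el] for el, pct in atomic_pct.items()}
    total = sum(weighted.values())
    return {el: sample_mass_ug * w / total for el, w in weighted.items()}

# e.g. a spot drying to 2 ug of residue, with 0.1 at% Pb detected by EDS
masses = atomic_pct_to_mass_ug({"Pb": 0.1, "C": 60.0, "O": 39.9}, 2.0)
print(masses["Pb"])  # ~0.03 ug of lead in the spot
```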

Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration

Procedia PDF Downloads 94
499 Analysis of the Evolution of Techniques and Review in Cleft Surgery

Authors: Tomaz Oliveira, Rui Medeiros, André Lacerda

Abstract:

Introduction: Cleft lip and/or palate are the most frequent congenital craniofacial anomalies, affecting mainly the middle third of the face and manifesting as functional and aesthetic changes. Bilateral cleft lip represents a reconstructive surgical challenge, not only for the labial component but also for the associated nasal deformity. Recently, the paradigm of the approach to this pathology has changed, placing the focus on muscle reconstruction and anatomical repositioning of the nasal cartilages in order to obtain the best aesthetic and functional results. The aim of this study is to carry out a systematic review of the surgical approach to bilateral cleft lip, retrospectively analyzing the case series of the Plastic Surgery Service at Hospital Santa Maria (Lisbon, Portugal) for this pathology, with a global assessment of the characteristics of the operated patients and a study of the different surgical approaches and their complications over the last 20 years. Methods: This is a retrospective, descriptive study of patients who underwent at least one reconstructive surgery for cleft lip and/or palate in the CPRE service of the HSM between January 1, 1997 and December 31, 2017. Data on 361 individuals were analyzed; after applying the exclusion criteria, the sample comprised 212 participants. The variables analyzed were the year of the first surgery, gender, age, type of orofacial cleft, surgical approach, and complications. Results: There was a higher overall prevalence in males, with cleft lip and cleft lip with palate occurring in greater proportion in males and cleft palate being more common in females. The most frequently recorded malformation was cleft lip and palate, complete in most cases. Regarding laterality, alterations with a unilateral labial component were the most commonly observed, with the left side most affected. The vast majority of patients underwent primary intervention by 12 months of age. The surgical techniques used in the approach to this pathology showed important chronological variation over the years. Discussion: Cleft lip and/or palate is a medical condition associated with high aesthetic and functional morbidity, requiring early treatment in order to optimize the long-term outcome. The existence of a nasolabial component and its surgical correction play a central role in the treatment of this pathology. High rates of post-surgical complications and unconvincing aesthetic results have motivated an evolution of the surgical technique, increasingly evident in recent years, which today allows satisfactory aesthetic results even in bilateral cleft lip of high deformational complexity. The introduction of techniques that favor nasolabial reconstruction based on anatomical principles has produced increasingly convincing results. Most of the results obtained in this study are, in general, compatible with those published in the literature. Conclusion: This work showed that small variations in surgical technique can bring significant improvements in the functional and aesthetic results of bilateral cleft lip treatment.

Keywords: cleft lip, cleft palate, congenital abnormalities, craniofacial malformations

Procedia PDF Downloads 85
498 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models used to predict oil well production in an arbitrarily shaped, multiple-lease reservoir. The limited validation of data for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and the BEM discretization is performed. In the second step, the 2D BEM discretization is implemented in COMSOL Multiphysics and in the MATLAB programming language. In the last step, the numerical performance indicators of both implementations are analyzed against a validated Fortran implementation. The performance comparisons are investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production in the multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the alternative software for implementing an accurate numerical simulation of the BEM. In conclusion, the high-level-language numerical computations and the numerical performance evaluation suffice to show that Fortran is well suited for capturing and visualizing the production of an oil well in an arbitrarily shaped reservoir.

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 467
497 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges: increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of such facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting existing ones, would be a great advance for this industry. Installing a matrix of temperature sensors distributed through the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and they are expensive. Therefore, less intrusive techniques are employed, in which each point characterizing the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for the first and second order derivatives of the resulting Burgers' equation are the key to obtaining results of greater or lesser accuracy, as characterized by the truncation error.
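The discretizations named above can be illustrated on the 1D viscous Burgers' equation u_t + u u_x = ν u_xx: a forward difference in time, a backward (upwind) difference for the first spatial derivative, and a central difference for the second. The grid, time step, viscosity, and step initial condition below are illustrative choices, not the paper's configuration.

```python
import numpy as np

# 1D viscous Burgers' equation, explicit scheme:
#   forward in time, backward (upwind) for u_x, central for u_xx.
nx, nt = 101, 200
dx, dt, nu_visc = 2.0 / (nx - 1), 1e-4, 0.07
x = np.linspace(0.0, 2.0, nx)
u = np.where((x >= 0.5) & (x <= 1.0), 2.0, 1.0)  # step initial condition

for _ in range(nt):
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - dt / dx * un[1:-1] * (un[1:-1] - un[:-2])          # backward u_x
               + nu_visc * dt / dx**2 * (un[2:] - 2*un[1:-1] + un[:-2]))  # central u_xx

# With these parameters the scheme is stable and monotone, so the solution
# stays within the initial bounds while diffusion smooths the step.
print(u.min(), u.max())
```

The choice among backward, forward, and central differences fixes the leading truncation error (first order for one-sided differences, second order for central), which is exactly the accuracy trade-off the abstract refers to.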

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 138
496 Development of a Systematic Design for Evaluating Force-on-Force Security Exercises at Nuclear Power Plants

Authors: Seungsik Yu, Minho Kang

Abstract:

As the threat of terrorism against nuclear facilities has increased globally since the attacks of September 11, efforts are being made to improve physical protection systems and strengthen emergency response systems. Since 2015, Korea has implemented physical protection security exercises for nuclear facilities. The exercises should be carried out with full cooperation between the operator and response forces. Performance testing of the physical protection system should include appropriate exercises, for example force-on-force exercises, to determine whether the response forces can provide an effective and timely response to prevent sabotage. Significant deficiencies and actions taken should be reported as stipulated by the competent authority. The IAEA (International Atomic Energy Agency) is also preparing force-on-force exercise program documents to support the exercises of member states. Currently, the ROK (Republic of Korea) conducts exercises with a force-on-force exercise evaluation system developed domestically for nuclear power plants, and it is necessary to establish exercise procedures that take the use of this evaluation system into account. The purpose of this study is to establish the working procedures of the three major organizations involved in the force-on-force exercises of nuclear power plants in the ROK, which conduct exercises using the force-on-force exercise evaluation system. The three major organizations are the licensee, KINAC (Korea Institute of Nuclear Nonproliferation and Control), and the NSSC (Nuclear Safety and Security Commission). The major activities are as follows. First, the licensee establishes and conducts an exercise plan; when recommendations are derived from the results of the exercise, it prepares and carries out a force-on-force result report, including a plan for implementing the recommendations. Its other detailed tasks include consultation with surrounding units about the adversary, interviews with exercise participants, support for document evaluation, and self-training to improve familiarity with MILES (Multiple Integrated Laser Engagement System). Second, KINAC prepares a review report on the force-on-force exercise plan established by the licensee, evaluates the force-on-force exercise using the exercise evaluation system, and prepares an exercise evaluation report. Its other detailed tasks include MILES training, adversary consultation, management of the exercise evaluation system, and analysis of exercise evaluation results. Finally, the NSSC decides whether or not to approve the force-on-force exercise and issues correction requests to the nuclear facility based on the exercise results. The most important part of the ROK's force-on-force exercise system is the analysis through the exercise evaluation system performed by KINAC after the exercise. The analysis proceeds by collecting data from the exercise evaluation system and then analyzing the collected data. The application process of the exercise evaluation system, introduced in the ROK in 2016, will be concretely defined, and a system will be established to provide objective and consistent conclusions across exercise sessions. Based on the conclusions drawn, the ultimate goal is to complement the licensee's physical protection system so that the licensee can respond effectively and in a timely manner to sabotage or the unauthorized removal of nuclear materials.

Keywords: Force-on-Force exercise, nuclear power plant, physical protection, sabotage, unauthorized removal

Procedia PDF Downloads 119
495 Hybrid Polymer Microfluidic Platform for Studying Endothelial Cell Response to Micro Mechanical Environment

Authors: Mitesh Rathod, Jungho Ahn, Noo Li Jeon, Junghoon Lee

Abstract:

Endothelial cells respond to cues from both their biochemical and their micromechanical environment. Significant effort has been directed at understanding the effects of biochemical signaling; however, relatively little is known about the regulation of endothelial cell biology by the micromechanical environment. Numerous studies have investigated how physical forces regulate endothelial cell behavior, focusing mainly on how fluid shear stress governs it. Parallel plate flow chambers and rectangular microchannels are routinely employed to apply fluid shear force to endothelial cells. However, these studies fall short of mimicking the topology of the in vivo microenvironment; only a few studies have used circular microchannels to replicate in vivo-like conditions, and efforts to elucidate the combined effect of topology, substrate rigidity, and fluid shear stress on the endothelial cell response have been rare. In this regard, we demonstrate a facile fabrication process for a hybrid polydimethylsiloxane microfluidic platform for studying endothelial cell biology. On a single chip, microchannels with different cross sections, i.e., circular, rectangular, and square, have been fabricated. In addition, our fabrication approach allows the substrate rigidity to vary along the channel length; two variants of polydimethylsiloxane, namely Sylgard 184 and Sylgard 527, were utilized to achieve this variation. Our approach also enables the creation of Y-bifurcation circular microchannels. Our microfluidic platform thus facilitates studies of endothelial cell morphology with respect to changes in topology, substrate rigidity, and fluid flow on a single chip. The hybrid platform was tested by culturing Human Umbilical Vein Endothelial Cells in circular microchannels with varying substrate rigidity, exposed either to a fluid shear stress of 12 dynes/cm² or to static conditions. The results indicate that the cell area response to flow-induced shear stress was governed by the underlying substrate mechanics.

Keywords: hybrid, microfluidic platform, PDMS, shear flow, substrate rigidity

Procedia PDF Downloads 249
494 Quantum Coherence Sets the Quantum Speed Limit for Mixed States

Authors: Debasis Mondal, Chandan Datta, S. K. Sazim

Abstract:

Quantum coherence is a key resource like entanglement and discord in quantum information theory. Wigner- Yanase skew information, which was shown to be the quantum part of the uncertainty, has recently been projected as an observable measure of quantum coherence. On the other hand, the quantum speed limit has been established as an important notion for developing the ultra-speed quantum computer and communication channel. Here, we show that both of these quantities are related. Thus, cast coherence as a resource to control the speed of quantum communication. In this work, we address three basic and fundamental questions. There have been rigorous attempts to achieve more and tighter evolution time bounds and to generalize them for mixed states. However, we are yet to know (i) what is the ultimate limit of quantum speed? (ii) Can we measure this speed of quantum evolution in the interferometry by measuring a physically realizable quantity? Most of the bounds in the literature are either not measurable in the interference experiments or not tight enough. As a result, cannot be effectively used in the experiments on quantum metrology, quantum thermodynamics, and quantum communication and especially in Unruh effect detection et cetera, where a small fluctuation in a parameter is needed to be detected. Therefore, a search for the tightest yet experimentally realisable bound is a need of the hour. It will be much more interesting if one can relate various properties of the states or operations, such as coherence, asymmetry, dimension, quantum correlations et cetera and QSL. Although, these understandings may help us to control and manipulate the speed of communication, apart from the particular cases like the Josephson junction and multipartite scenario, there has been a little advancement in this direction. Therefore, the third question we ask: (iii) Can we relate such quantities with QSL? 
In this paper, we address these fundamental questions and show that quantum coherence or asymmetry plays an important role in setting the QSL. An important question in the study of quantum speed limit may be how it behaves under classical mixing and partial elimination of states. This is because this may help us to choose properly a state or evolution operator to control the speed limit. In this paper, we try to address this question and show that the product of the time bound of the evolution and the quantum part of the uncertainty in energy or quantum coherence or asymmetry of the state with respect to the evolution operator decreases under classical mixing and partial elimination of states.
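For orientation, the two quantities the abstract connects can be written out with their standard definitions; this is a schematic of the objects involved, assuming textbook conventions, and the paper's actual bound may differ in constants and derivation.

```latex
% Wigner-Yanase skew information of a state \rho with respect to a
% Hamiltonian H (the "quantum part" of the energy uncertainty):
Q(\rho, H) \;=\; -\tfrac{1}{2}\,\mathrm{Tr}\!\left(\left[\sqrt{\rho},\, H\right]^{2}\right)

% A Mandelstam-Tamm-type quantum speed limit in which a skew-information
% quantity plays the role of the energy uncertainty, with s_0 the Bures
% angle between the initial and final states:
T \;\geq\; \frac{\hbar\, s_{0}}{\sqrt{2\, Q(\rho, H)}}
```

For pure states, Q(ρ,H) reduces to the ordinary energy variance, recovering the familiar Mandelstam-Tamm bound; for mixed states it isolates the coherent part of the uncertainty, which is what ties the QSL to coherence.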

Keywords: completely positive trace preserving maps, quantum coherence, quantum speed limit, Wigner-Yanase Skew information

Procedia PDF Downloads 324
493 Effect of Pioglitazone on Intracellular Na+ Homeostasis in Metabolic Syndrome-Induced Cardiomyopathy in Male Rats

Authors: Ayca Bilginoglu, Belma Turan

Abstract:

Metabolic syndrome is associated with impaired blood glucose levels, insulin resistance, and dyslipidemia caused by abdominal obesity. It is also related to the accumulation of cardiovascular risk and to cardiomyopathy. The hypothesis of this study was to examine the effect of thiazolidinediones such as pioglitazone, widely used insulin-sensitizing agents that improve glycemic control, on intracellular Na+ homeostasis in metabolic syndrome-induced cardiomyopathy in male rats. Male Wistar-Albino rats were randomly divided into three groups: control (Con, n=7), metabolic syndrome (MetS, n=7), and pioglitazone-treated metabolic syndrome (MetS+PGZ, n=7). Metabolic syndrome was induced by providing drinking water containing 32% sucrose for 18 weeks. All of the animals were kept on a 12 h light / 12 h dark cycle. Abdominal obesity and glucose intolerance were measured as markers of metabolic syndrome. Intracellular Na+ ([Na+]i) is an important modulator of excitation-contraction coupling in the heart. [Na+]i at rest and [Na+]i during pacing by electrical field stimulation at 0.2 Hz, 0.8 Hz, and 2.0 Hz stimulation frequencies were recorded in cardiomyocytes. The Na+ channel current (INa) density and its I-V curve were also measured to understand [Na+]i homeostasis. In the results, high sucrose intake on top of the normal daily diet significantly increased the body mass and blood glucose level of the rats in the metabolic syndrome group compared with the untreated control group. In the MetS+PGZ group, the blood glucose level and body mass tended to decrease toward those of the Con group. There was a decrease in INa density and a shift in both the activation and inactivation curves of INa; pioglitazone reversed the shift toward the control values. Basal [Na+]i did not differ significantly between the MetS and Con groups, but there was a significant increase in [Na+]i in stimulated cardiomyocytes of the MetS group. Furthermore, pioglitazone had no effect on basal [Na+]i, but it reversed the increase in [Na+]i in stimulated cardiomyocytes to that of the Con group. The results of the present study suggest that pioglitazone has a significant effect on Na+ homeostasis in metabolic syndrome-induced cardiomyopathy in rats. All animal procedures and experiments were approved by the Animal Ethics Committee of Ankara University Faculty of Medicine (2015-2-37).

Keywords: insulin resistance, intracellular sodium, metabolic syndrome, sodium current

Procedia PDF Downloads 261
492 Clinical Cases of Rare Types of 'Maturity Onset Diabetes of the Young' Diabetes

Authors: Alla Ovsyannikova, Oksana Rymar, Elena Shakhtshneider, Mikhail Voevoda

Abstract:

In Siberia, endocrinologists are increasingly seeing young patients whose course of diabetes mellitus differs from types 1 and 2. We therefore performed a molecular genetic study of this group of patients to verify monogenic forms of diabetes mellitus and investigated the characteristics of this pathology. When a monogenic form of diabetes was confirmed, we corrected therapy for many patients (transfer from insulin to tablets), prevented specific complications, examined relatives and diagnosed their diabetes at the preclinical stage, and revealed phenotypic characteristics of the pathology, which gives this work its high significance. Materials and Methods: We observed 5 patients (4 families). MODY (Maturity Onset Diabetes of the Young) was diagnosed by molecular genetic testing (direct automatic sequencing). All patients underwent a full clinical examination; blood sampling for biochemical research; determination of C-peptide, TSH, antibodies to beta-cells, and microalbuminuria; abdominal, heart, and thyroid ultrasound; and examination by an ophthalmologist. Results: We diagnosed 3 rare types of MODY: two women had MODY8, one man had MODY6, and a man and his mother had MODY12. The patients with types 8 and 12 had distinctive clinical features. The age of onset of hyperglycemia ranged from 26 to 34 years. In the patient with MODY6, fasting hyperglycemia was detected during a routine examination; clinical symptoms and complications were not diagnosed, and the patient follows a diet. In the first patient, MODY8 was detected during her first pregnancy; she had itchy skin and mostly postprandial hyperglycemia. On examination, we found glycated hemoglobin of 7.5%, non-proliferative retinopathy, and peripheral neuropathy. She uses basal-bolus insulin therapy. The second patient with MODY8 also had clinical manifestations of hyperglycemia (pruritus, thirst), postprandial hyperglycemia, and diabetic nephropathy at the microalbuminuria stage. This patient was also diagnosed with autoimmune thyroiditis.
She used DPP-4 inhibitors. The patient with MODY12 had an aggressive course. When hyperglycemia was detected, he complained of visual impairment, intense headaches, and leg cramps. He had a childhood history of convulsive seizures of non-epileptic genesis, without organic pathology, which resolved spontaneously at the age of 12 years. When diabetes was diagnosed, the patient was 28 years old and had hypertriglyceridemia, an atherosclerotic plaque in the carotid artery, and proliferative retinopathy (laser coagulation). Diabetes and early myocardial infarction were observed in three cases in the family. We prescribed therapy with sulfonylureas and SGLT-2 inhibitors with a positive effect. In the patient's mother, diabetes began at a later age (30 years) and followed a less aggressive course; she also has hypertriglyceridemia and uses oral hypoglycemic drugs. Conclusions: 1) When young patients with hyperglycemia have extrapancreatic pathologies and diabetic complications after a short duration of diabetes, one of the types of MODY can be suspected. 2) In patients with monogenic forms of diabetes mellitus, the clinical manifestations of hyperglycemia in each succeeding generation appear at an earlier age. This research has increased our knowledge of the monogenic forms of diabetes. The reported study was supported by RSCF, research project No. 14-15-00496-P.

Keywords: diabetes mellitus, MODY diabetes, monogenic forms, young patients

Procedia PDF Downloads 223
491 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model

Authors: Donatella Giuliani

Abstract:

In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. First, the Firefly Algorithm is applied to a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique inspired by the flashing behavior of fireflies; in this context, it is used to determine the number of clusters and the corresponding cluster means in a histogram-based segmentation approach. These means are then used in the initialization step of the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as the prior probabilities of the components. Applying Bayes' rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears solid and reliable even when applied to complex grayscale images. Validation was performed using several standard measures: the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is that using the maxima of the responsibilities for pixel assignment implies a consistent reduction in computational cost.
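As a rough illustration of the pipeline described above (a minimal sketch, not the authors' code), the following fits a 1D Gaussian Mixture Model to pixel intensities by Expectation-Maximization and assigns each pixel to the component of maximum posterior responsibility. The initial means, which in the paper come from the firefly-based histogram search, are here simply passed in as a list, and the toy two-population "image" is invented for the example.

```python
# Sketch: 1D GMM fitted by EM to grayscale intensities, with pixels assigned
# to the cluster of maximum posterior responsibility (Bayes' rule).
import numpy as np

def fit_gmm_1d(x, means, n_iter=50):
    """EM for a 1D GMM; `means` are the initial cluster means (in the paper,
    these would come from the firefly-based histogram search)."""
    k = len(means)
    mu = np.asarray(means, dtype=float)
    var = np.full(k, x.var() / k)          # initial component variances
    pi = np.full(k, 1.0 / k)               # mixing coefficients (priors)
    for _ in range(n_iter):
        # E-step: posterior responsibilities via Bayes' rule
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return mu, var, pi, resp

# Toy "image": two intensity populations around gray levels 60 and 180
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 8, 500), rng.normal(180, 10, 500)])
mu, var, pi, resp = fit_gmm_1d(pixels, means=[50, 200])
labels = resp.argmax(axis=1)               # maximum-responsibility assignment
print(np.sort(mu).round(0))
```

With well-separated initial means the responsibilities converge in a few iterations; the firefly step matters precisely because EM is sensitive to this initialization.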

Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation

Procedia PDF Downloads 193
490 Delineation of Different Geological Interfaces Beneath the Bengal Basin: Spectrum Analysis and 2D Density Modeling of Gravity Data

Authors: Md. Afroz Ansari

Abstract:

The Bengal basin is a spectacular example of a peripheral foreland basin formed by the convergence of the Indian plate beneath the Eurasian and Burmese plates. The basin is bounded on three sides (north, west, and east) by different fault-controlled tectonic features and open to the south, where the rivers drain into the Bay of Bengal. The Bengal basin in the eastern part of the Indian subcontinent constitutes the largest fluvio-deltaic to shallow-marine sedimentary basin in the world today. This continental basin, coupled with the offshore Bengal Fan under the Bay of Bengal, forms the biggest sediment dispersal system. The continental basin continuously receives sediments from the two major rivers, the Ganga and the Brahmaputra (known as the Jamuna in Bengal), from the Meghna (emerging from the confluence of the Ganga and Brahmaputra), and from a large number of rain-fed small tributaries originating from the eastern Indian Shield. The drained sediments are ultimately delivered into the Bengal Fan. The significance of the present study is to delineate the variations in sediment thickness, the different crustal structures, and the mantle lithosphere throughout the onshore-offshore Bengal basin. In the present study, the different crustal/geological units and the shallower mantle lithosphere were delineated by analyzing Bouguer Gravity Anomaly (BGA) data along two long traverses: South-North (running from the Bengal Fan across the offshore-onshore transition of the Bengal basin and intersecting the Main Frontal Thrust of the India-Himalaya collision zone in the Sikkim-Bhutan Himalaya) and West-East (running from the Peninsular Indian Shield across the Bengal basin to the Chittagong–Tripura Fold Belt). The BGA map was derived from the analysis of TOPEX data after incorporating the Bouguer correction and all terrain corrections.
The anomaly map was compared with the available ground gravity data in the western Bengal basin and the Indian subcontinent for consistency of the data used. Initially, the anisotropy associated with the thicknesses of the different crustal units, the crustal interfaces, and the Moho boundary was estimated through spectral analysis of the gravity data with varying window sizes over the study area. The 2D density sections along the traverses were finalized after a number of iterations with acceptable root mean square (RMS) errors. The estimated thicknesses of the different crustal units and the dips of the Moho boundary along both profiles are consistent with earlier results. Furthermore, the results were corroborated by examining the earthquake database and focal mechanism solutions for a better understanding of the geodynamics. The earthquake data were taken from the catalogue of the US Geological Survey, and the focal mechanism solutions were compiled from the Harvard Centroid Moment Tensor Catalogue. Concentrations of seismic events at different depth levels are not uncommon. The occurrence of earthquakes may be due to stress accumulation resulting from resistance on three sides.
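The depth estimation behind the spectral-analysis step can be illustrated with a minimal synthetic sketch (a Spector-Grant-type relation assumed for illustration; the profile, depth, and spacing below are invented, not the authors' data): the log power spectrum of a gravity profile over a source at depth h decays approximately linearly with angular wavenumber k, with slope -2h.

```python
# Sketch: spectral depth estimate from a synthetic gravity profile.
# A buried line source at depth h has spectrum ~ exp(-h|k|), so the log
# power spectrum ln P(k) has slope -2h; fitting that slope recovers h.
import numpy as np

h_true = 5.0                         # source depth, km (synthetic)
dx = 0.5                             # station spacing, km (synthetic)
x = np.arange(-512, 512) * dx        # profile coordinate, km
g = h_true / (x**2 + h_true**2)      # scaled anomaly of a buried line source

P = np.abs(np.fft.rfft(g)) ** 2                   # power spectrum
k = 2 * np.pi * np.fft.rfftfreq(len(x), d=dx)     # angular wavenumber, rad/km

# Fit ln P vs k over a low-wavenumber band; depth = -slope / 2
band = (k > 0.05) & (k < 0.5)
slope = np.polyfit(k[band], np.log(P[band]), 1)[0]
depth = -slope / 2
print(round(depth, 1))
```

In practice such fits are repeated over moving windows and separate wavenumber bands to resolve several interfaces (sediment base, crustal layers, Moho), which is what varying the window size over the study area refers to.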

Keywords: anisotropy, interfaces, seismicity, spectrum analysis

Procedia PDF Downloads 246
489 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over QualiPoc

Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez

Abstract:

The evolution of mobile broadband technologies has made it possible to increase user download rates for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and reliability of the connection, thereby avoiding large losses of data, time, and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station must be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that works stably for data and for VoIP (Voice over IP) on devices that support it. This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variable, together with an analysis of performance variables (throughput, Signal-to-Interference-plus-Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, in which two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the transmission modes vary depending on the BLER (Block Error Rate), throughput, and SNR (Signal-to-Noise Ratio) parameters. For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the two operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) as Operator 2 does, is reassigning users to a lower frequency band, AWS (1700 MHz). The difference in signal quality with respect to the data connections established by Operator 2, and the difference in the transmission modes assigned by the eNodeB of Operator 1, are remarkable.
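As context for the performance variables named above, a back-of-envelope sketch (not QualiPoc output, and not data from this study; the bandwidth and SINR values are hypothetical) relates a measured SINR to an upper bound on throughput via the Shannon capacity of the carrier bandwidth:

```python
# Sketch: Shannon-bound throughput for a single LTE carrier, from SINR in dB.
# C = B * log2(1 + SINR_linear); with B in MHz this yields Mbit/s.
import math

def sinr_db_to_linear(sinr_db):
    """Convert a SINR value in dB to its linear ratio."""
    return 10 ** (sinr_db / 10)

def shannon_throughput_mbps(bandwidth_mhz, sinr_db):
    """Upper-bound (single-stream) throughput for a carrier of the given bandwidth."""
    return bandwidth_mhz * math.log2(1 + sinr_db_to_linear(sinr_db))

# Hypothetical example: a 20 MHz Band 7 carrier at a moderate SINR of 10 dB
print(round(shannon_throughput_mbps(20, 10), 1))  # ≈ 69.2 Mbps upper bound
```

Real LTE throughput further depends on the transmission mode (MIMO rank), the modulation and coding scheme, and the retransmission overhead implied by the BLER, which is why measured rates can differ between operators even at similar SINR.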

Keywords: BLER, LTE, network, QualiPoc, SNR

Procedia PDF Downloads 90