Search results for: abnormal volume
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3035

2735 Sustainable Wood Harvesting from Juniperus procera Trees Managed under a Participatory Forest Management Scheme in Ethiopia

Authors: Mindaye Teshome, Evaldo Muñoz Braz, Carlos M. M. Eleto Torres, Patricia Mattos

Abstract:

Sustainable forest management planning requires up-to-date information on the structure, standing volume, biomass, and growth rate of trees in a given forest. This kind of information is lacking for many forests in Ethiopia. The objective of this study was to quantify the population structure, diameter growth rate, and standing volume of wood of Juniperus procera trees in the Chilimo forest. A total of 163 sample plots were set up in the forest to collect the relevant vegetation data. Growth ring measurements were conducted on stem disc samples collected from 12 J. procera trees. Diameter and height measurements were recorded for a total of 1399 individual trees with dbh ≥ 2 cm. The growth rate, maximum current and mean annual increments, minimum logging diameter, and cutting cycle were estimated, and alternative cutting cycles were established. Using these data, the harvestable volume of wood was projected by combining four minimum logging diameters with five cutting cycles, following the stand table projection method. The results show that J. procera trees have an average density of 183 stems ha⁻¹, a total basal area of 12.1 m² ha⁻¹, and a standing volume of 98.9 m³ ha⁻¹. The mean annual diameter growth ranges between 0.50 and 0.65 cm year⁻¹, with an overall mean of 0.59 cm year⁻¹. The population of J. procera trees followed a reverse J-shaped diameter distribution pattern. The maximum current annual increment in volume (CAI) occurred at around 49 years, when trees reached 30 cm in diameter. Trees showed the maximum mean annual increment in volume (MAI) at around 91 years, at a diameter of 50 cm. The simulation analysis revealed that a 40 cm minimum logging diameter (MLD) combined with a 15-year cutting cycle is the best choice: this combination yielded the largest harvestable volume of wood, the largest volume increments, and a 35% recovery of the initially harvested volume.
It is concluded that the forest is well stocked and holds a large harvestable volume of J. procera wood. This will enable the country to partly meet the national wood demand through domestic production. Using the current population structure together with diameter growth data from tree ring analysis enables accurate prediction of the harvestable volume of wood. The developed model gives an indication of the productivity of the J. procera population and enables policymakers to develop specific management criteria for wood harvesting.
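
The stand table projection idea can be sketched in a few lines: diameter classes advance by the mean annual increment, and trees that end up at or above a minimum logging diameter (MLD) after one cutting cycle are harvestable. The class widths and stem densities below are illustrative assumptions, not the study's measured stand table; only the 0.59 cm yr⁻¹ mean increment comes from the abstract.

```python
import numpy as np

def project_stand_table(midpoints_cm, stems_per_ha, growth_cm_per_yr, years):
    """Shift each diameter-class midpoint forward by growth * years."""
    return midpoints_cm + growth_cm_per_yr * years, stems_per_ha

def harvestable_stems(midpoints_cm, stems_per_ha, mld_cm):
    """Sum stem density in classes at or above the minimum logging diameter."""
    return stems_per_ha[midpoints_cm >= mld_cm].sum()

midpoints = np.array([10.0, 20.0, 30.0, 40.0])   # cm, illustrative classes
density = np.array([90.0, 50.0, 30.0, 13.0])     # stems/ha, reverse-J shape

# project one 15-year cutting cycle at the measured 0.59 cm/yr increment
grown, dens = project_stand_table(midpoints, density, 0.59, 15)
print(harvestable_stems(grown, dens, 40.0))       # stems/ha above a 40 cm MLD
```

A fuller implementation would also apply mortality and ingrowth rates per class, which the toy example omits.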

Keywords: logging, growth model, cutting cycle, minimum logging diameter

Procedia PDF Downloads 74
2734 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtime and the cost of periodic maintenance. However, there is little research on this topic, and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) with deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors, selected by correlation analysis. The difference signal between each input and its reconstruction is then used to classify the samples, using feature extraction and a random forest classifier. Data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while data between June 2020 and May 2021 are used to assess it. The model's performance is assessed a posteriori through the F1-score, by comparing detected anomalies with the data center's history. The proposed model reaches an F1-score of 83.60%, outperforming the state-of-the-art reconstruction method, which uses a single autoencoder over multivariate sequences and flags an anomaly with a threshold on the reconstruction error, at 24.16%.
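
The core mechanism, training a reconstruction model on normal samples only and scoring new samples by reconstruction error, can be sketched with a linear PCA "autoencoder" standing in for the paper's LSTM autoencoders (an assumption made here to keep the example dependency-free):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "normal" sensor data: three correlated channels driven by one latent factor.
w = np.array([[1.0, 0.5, 0.2]])
z = rng.normal(size=(200, 1))
normal = z @ w + 0.05 * rng.normal(size=(200, 3))

# "Train" on normal data only: the principal component acts as encoder/decoder.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
pc = vt[:1]                              # 1-D latent code

def reconstruction_error(x):
    code = (x - mean) @ pc.T             # encode
    recon = code @ pc + mean             # decode
    return np.linalg.norm(x - recon, axis=1)

ok = reconstruction_error(z[:5] @ w)                        # in-distribution samples
bad = reconstruction_error(np.array([[5.0, -4.0, 3.0]]))    # off-manifold sample
print(ok.max() < bad[0])                                    # anomaly scores higher
```

In the paper this error signal is not thresholded directly; features extracted from it feed a random forest classifier, which is what gives the method its edge over the plain thresholded baseline.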

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 173
2733 The Effect of Nutrition Education on Glycemic and Lipidemic Control in Iranian Patients with Type 2 Diabetes

Authors: Samira Rabiei, Faezeh Askari, Reza Rastmanesh

Abstract:

Objective: To evaluate the effects of nutrition education and adherence to a healthy diet on glycemic and lipidemic control in patients with T2DM. Material and Methods: A randomized controlled trial was conducted on 494 patients with T2DM, aged 14-87 years, of both sexes, selected by convenience sampling from patients referred to Aliebneabitaleb hospital in Ghom. The participants were divided into two 247-person groups by stratified randomization. Both groups received a diet adjusted to ideal body weight, and the intervention group was additionally educated about healthy food choices for diabetes. Information on medications, psychological factors, diet, and physical activity was obtained from questionnaires. Blood samples were collected to measure FBS, 2 hPG, HbA1c, cholesterol, and triglycerides. After 2 months, weight and biochemical parameters were measured again. The independent t-test, Mann-Whitney, chi-square, and Wilcoxon tests were used as appropriate. Logistic regression was used to determine the odds ratio of abnormal glycemic and lipidemic control according to the intervention. Results: Mean weight, FBS, 2 hPG, cholesterol, and triglycerides after the intervention were significantly lower than before it (p < 0.05). Discussion: Nutrition education plus a weight-reducing diet is more effective for glycemic and lipidemic control than a weight-reducing diet alone.
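
For a binary exposure like the education intervention, the odds ratio the logistic regression estimates reduces to the familiar 2x2-table ratio. The counts below are made up for demonstration; the abstract does not report the underlying table.

```python
def odds_ratio(exposed_cases, exposed_noncases, control_cases, control_noncases):
    """OR = (a/b) / (c/d) for a 2x2 table of abnormal control by study arm."""
    return (exposed_cases / exposed_noncases) / (control_cases / control_noncases)

# e.g. 60/247 abnormal in the education arm vs 100/247 in the diet-only arm
# (hypothetical counts; an OR below 1 would favour the intervention)
print(round(odds_ratio(60, 187, 100, 147), 3))
```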

Keywords: type 2 diabetes mellitus, nutrition education, glycemic control, lipid profile

Procedia PDF Downloads 187
2732 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm directly compares the pixel values of two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined using patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are derived from this new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient and can directly provide explicit lung regions without any post-processing operations, unlike the standard method.
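
The patch-based boundary term can be illustrated in isolation: instead of comparing two pixel intensities, compare the patches centred on the two neighbouring pixels, which is far less sensitive to noise. The Gaussian weight form and sigma value below are common illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def patch_weight(image, p, q, radius=1, sigma=10.0):
    """Graph-cut edge weight between neighbouring pixels p and q, computed
    from the sum of squared differences of their surrounding patches."""
    def patch(c):
        r, s = c
        return image[r - radius:r + radius + 1, s - radius:s + radius + 1]
    ssd = np.sum((patch(p).astype(float) - patch(q).astype(float)) ** 2)
    return np.exp(-ssd / (2 * sigma ** 2))       # high weight = similar regions

img = np.zeros((6, 6))
img[:, 3:] = 100.0                               # a vertical intensity edge
inside = patch_weight(img, (2, 1), (3, 1))       # both patches in the dark region
across = patch_weight(img, (2, 2), (2, 3))       # patches straddle the edge
print(inside > across)                           # weight drops across the boundary
```

In the full algorithm these weights populate the n-links of the graph, so the min-cut preferentially passes along low-weight (dissimilar) edges.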

Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric

Procedia PDF Downloads 153
2731 Effect of Volume Fraction of Fibre on the Mechanical Properties of Nanoclay Reinforced E-Glass-Epoxy Composites

Authors: K. Krushnamurty, D. Rasmitha, I. Srikanth, K. Ramji, Ch. Subrahmanyam

Abstract:

E-glass-epoxy laminated composites having different fiber volume fractions (40, 50, 60, and 70%) were fabricated with and without the addition of nanoclay. The flexural and tensile strengths of the composite laminates were determined. It was observed that, as the fiber volume fraction (Vf) increases from 40 to 60%, the ability of nanoclay to enhance the tensile and flexural strength of E-glass-epoxy composites decreases significantly. At 70% Vf, the tensile and flexural strengths of the nanoclay-reinforced E-glass-epoxy were the lowest, below those of the E-glass-epoxy composite made without nanoclay. Based on the obtained data and the microstructure of the tested samples, a plausible mechanism for the observed trends is proposed. The enhanced mechanical properties of nanoclay-reinforced E-glass-epoxy composites at 40-60% Vf may stem from higher interface toughness coupled with strong interfilament bonding, which ensures homogeneous load distribution across all the glass fibers. The decrease in mechanical properties at 70% Vf may be due to the inability of the matrix to bind the nanoclay and glass fibers.

Keywords: e-glass-epoxy composite laminates, fiber volume fraction, e-glass fiber, mechanical properties, delamination

Procedia PDF Downloads 324
2730 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that allows the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m and weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is a wavelet decomposition executed on two instances of the original image, to detect both hypotheses: dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of parameters possible in the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and in the selection of defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the permitted defect dimensions.
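
A one-level 2-D Haar decomposition gives the flavour of the approach: detail sub-bands respond strongly where the plate image deviates locally (dark or clear defects), while smooth stone texture stays in the approximation band. The plain Haar filter is an illustrative choice here; the abstract does not specify which wavelet basis is used.

```python
import numpy as np

def haar2d(img):
    """One-level Haar-style decomposition of a 2-D image with even dimensions.
    Returns the approximation (block means) and the detail energy per 2x2 block."""
    a = img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).astype(float)
    ll = a.mean(axis=(1, 3))                      # approximation band
    detail = a - ll[:, None, :, None]             # what the averaging removed
    return ll, (detail ** 2).sum(axis=(1, 3))     # detail energy per block

plate = np.full((8, 8), 120.0)                    # uniform stone texture
plate[4, 5] = 10.0                                # a dark spot "defect"
ll, energy = haar2d(plate)
print(np.unravel_index(energy.argmax(), energy.shape))   # block containing the defect
```

Running the same decomposition on the inverted image would expose clear (bright) defects, matching the paper's two-instance strategy.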

Keywords: automatic detection, defects, fracture lines, wavelets

Procedia PDF Downloads 237
2729 Low Volume High Intensity Interval Training Effect on Liver Enzymes in Chronic Hepatitis C Patients

Authors: Aya Gamal Khattab

Abstract:

Chronic infection with the hepatitis C virus (HCV) is now a leading cause of liver-related morbidity and mortality. Currently, alanine aminotransferase (ALT) measurement is not only widely used for detecting the incidence, development, and prognosis of liver disease with obvious clinical symptoms, but also serves as a reference for screening overall health status during check-ups. Exercise is a low-cost, reliable, and sustainable therapy for many chronic diseases. Low-volume high-intensity interval training (HIIT) is time-efficient and has wide application to different populations, including people at risk of chronic inflammatory diseases. The purpose of this study was to investigate the effect of low-volume high-intensity interval training on ALT and AST in HCV patients. All practical work was done in the outpatient physiotherapy clinic of the Suez Canal Authority Hospitals. Forty patients of both genders (27 male, 13 female), aged 40-60 years, underwent low-volume high-intensity interval training on a treadmill for two months, three sessions per week. Each session consisted of five minutes of warming up, followed by two 10-minute bouts, each comprising 30 s to 1 min of high intensity (75%-85% HRmax) and two to four minutes of active recovery at 40%-60% HRmax, so each session totalled one to two minutes of high-intensity intervals and four to eight minutes of active recovery, and ended with five minutes of cooling down. ALT and AST were measured from blood samples before the first exercise session and again two months later, after the final session. Results showed a significant decrease in ALT and AST, with improvement percentages of 18.85% and 23.87%, respectively. The study therefore concluded that low-volume high-intensity interval training significantly lowers the level of circulating liver enzymes (ALT, AST), indicating protection of hepatic cells and restoration of their function.
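
Turning the prescribed %HRmax intensities into beats-per-minute targets is straightforward; the sketch below assumes the common age-predicted formula HRmax = 220 − age, which the abstract does not state was the method actually used.

```python
def hr_band(age, low_pct, high_pct):
    """Target heart-rate band (bpm) for a given age and %HRmax range,
    assuming the age-predicted estimate HRmax = 220 - age."""
    hr_max = 220 - age
    return round(hr_max * low_pct), round(hr_max * high_pct)

print(hr_band(50, 0.75, 0.85))   # work interval for a 50-year-old
print(hr_band(50, 0.40, 0.60))   # active-recovery interval
```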

Keywords: alanine aminotransferase (ALT), aspartate aminotransferase (AST), hepatitis C (HCV), low volume high intensity interval training

Procedia PDF Downloads 284
2728 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge for doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia remains time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnostic tools has increased and a large volume of high-quality data has been produced, there is an urgent need for more advanced data analysis methods. One such method is the AI approach, which has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that causes mortality and morbidity across different ages. We selected acute lymphoblastic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia; the results of this work can be applied to the other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 total images, 8491 of abnormal cells and 5398 of normal cells. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches. The proposed diagnostic system detects and classifies leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50.
Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture that employs transfer learning techniques to extract features from each input image. In our approach, features fused from specific abstraction layers serve as auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimensional feature maps to improve the discriminative capability of intermediate features and to mitigate vanishing or exploding gradients. Comparing VGG19, ResNet50, and the proposed hybrid model, we conclude that the hybrid model has a significant advantage in accuracy. Detailed results on each model's performance, and their pros and cons, will be presented at the conference.
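
The fusion step can be sketched schematically: features from two frozen backbones are pooled and concatenated, with a lower-level feature map pooled in as an auxiliary feature. The random "backbones" below are placeholders standing in for pretrained VGG19/ResNet50 used via transfer learning; only the channel widths (512 and 2048) echo those networks' final convolutional stages.

```python
import numpy as np

rng = np.random.default_rng(1)

def backbone(image, channels):
    """Placeholder feature extractor: returns an (H, W, C) feature map.
    A real system would run a pretrained CNN here."""
    h, w = image.shape[0] // 32, image.shape[1] // 32
    return rng.normal(size=(h, w, channels))

def fused_features(image):
    low = backbone(image, 64)                 # lower-level auxiliary map
    a = backbone(image, 512)                  # "VGG19"-width top features
    b = backbone(image, 2048)                 # "ResNet50"-width top features
    gap = lambda m: m.mean(axis=(0, 1))       # global average pooling
    return np.concatenate([gap(a), gap(b), gap(low)])

x = fused_features(np.zeros((224, 224)))
print(x.shape)                                # 512 + 2048 + 64 = 2624 features
```

The fused vector would then feed a classification head; this sketch only shows how the three feature sources combine.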

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 175
2727 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng

Abstract:

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting).
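
The occlusion-handling role of the Kalman filter is easy to show in 1-D (the system works in image coordinates, i.e. 2-D, but the mechanism is the same): when a detection is missing, the predict step alone carries the track forward along the learned velocity. All noise covariances below are illustrative.

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity model: x += v per frame
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[1.0]])                    # measurement noise (assumed)

x = np.array([0.0, 0.0])                 # state: [position, velocity]
P = np.eye(2)
for z in [1.0, 2.0, 3.0, None, None, 6.0]:          # None = occluded frames
    x, P = F @ x, F @ P @ F.T + Q                    # predict
    if z is not None:                                # update only when visible
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
print(round(float(x[0]), 1))    # position estimate stays near the true path
```

During the two occluded frames the trajectory remains continuous, which is what lets the downstream speed-derivative analysis run on a complete track per person.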

Keywords: motion detection, motion tracking, trajectory analysis, video surveillance

Procedia PDF Downloads 524
2726 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)

Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj

Abstract:

Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye of a premature infant. The principal object of this work is to provide a technique for detecting the demarcation line and ridge in a given ROP image, which facilitates early detection of ROP in stage 1 and stage 2. The demarcation line is an indicator of stage 1 ROP, and the ridge is the hallmark of typical stage 2 ROP. Thirty RetCam images of Asian Indian infants, obtained during routine ROP screening, have been used for the analysis. A graphical user interface has been developed to detect the demarcation line/ridge and to extract ground truth. The novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, which is used for the identification of the ridge/demarcation line. Quantitative analysis has been presented based on gold standard images marked by an expert ophthalmologist. Image-based analysis was based on the length and position of the detected ridge. In image-based evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In pixel-based evaluation, the average sensitivity, specificity, positive predictive value, and negative predictive value achieved were 60.38%, 99.66%, 52.77%, and 99.75%, respectively.
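
The evaluation metrics quoted above follow directly from true/false positive and negative counts. A small helper with illustrative counts (not the study's actual confusion matrix, which the abstract does not report):

```python
def sensitivity(tp, fn):
    """Fraction of true ridges/demarcation lines that were detected."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of detections that were real."""
    return tp / (tp + fp)

# e.g. 24 detected ridges out of 26 true ridges, with 4 false detections
print(round(sensitivity(24, 2), 4))                 # 0.9231
print(round(positive_predictive_value(24, 4), 4))   # 0.8571
```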

Keywords: ROP, ridge, multilevel vessel enhancement, biomedical

Procedia PDF Downloads 389
2725 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations from single-cell RNA-seq data is crucial for studying the functional diversity of a cell. However, because of the high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low-dimensional space. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, solving the underlying semidefinite program (SDP) is computationally inefficient when the sample size is large, so MVE is not directly applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm based on an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.
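
To illustrate the optimisation machinery, here is an accelerated proximal gradient (FISTA-style) loop on a simple proxy problem, lasso regression. The actual MVE objective is a semidefinite program with a different proximal operator; this toy example only shows the gradient step / proximal step / momentum-extrapolation pattern the paper relies on.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, iters=300):
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = y = np.zeros(A.shape[1])
    s = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal step
        s_new = (1 + np.sqrt(1 + 4 * s * s)) / 2        # momentum schedule
        y = x_new + ((s - 1) / s_new) * (x_new - x)     # extrapolation
        x, s = x_new, s_new
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(50, 10))
true = np.zeros(10); true[0], true[3] = 3.0, -2.0       # sparse ground truth
b = A @ true
x = fista_lasso(A, b, lam=0.1)
print(np.abs(x - true).max() < 0.1)                     # recovers the signal
```

The acceleration comes from the extrapolation step, which improves the convergence rate from O(1/k) to O(1/k²) compared to plain proximal gradient descent.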

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 214
2724 Phenotypic and Genotypic Diagnosis of Gaucher Disease in Algeria

Authors: S. Hallal, Z. Chami, A. Hadji-Lehtihet, S. Sokhal-Boudella, A. Berhoune, L. Yargui

Abstract:

Gaucher disease is the most common lysosomal storage disorder in our population; it is due to a deficiency of acid β-glucosidase. The enzyme deficiency causes a pathological accumulation of undegraded substrate in lysosomes. This metabolic overload is responsible for a multisystemic disease with hepatosplenomegaly, anemia, thrombocytopenia, and bone involvement; neurological involvement is rare. The laboratory diagnosis of Gaucher disease consists of phenotypic diagnosis, by determining the enzymatic activity of β-glucosidase with a fluorimetric method, and genotypic diagnosis in the GBA gene, limiting the search to the recurrent mutations (N370S, L444P, 84GG) by PCR followed by enzymatic digestion. Abnormal profiles were verified by sequencing. Monitoring of treated patients is provided by the determination of chitotriosidase. Our experience, spanning a period of 6 years (2007-2014), has enabled us to diagnose 78 patients out of a total of 328 requests from the various departments of pediatrics, internal medicine, and neurology. Genotypic diagnosis focused on the entire families of 9 children treated at the pediatric CHU Mustapha, which helped define the clinical form; 5 of them had type III disease, carrying the L444P mutation in the homozygous state. Three others were compound heterozygotes (N370S/L444P, or N370S with a mutation not targeted in our study), and in only one family was no recurrent mutation found. This molecular study permits the screening of heterozygotes, which is essential for genetic counseling.

Keywords: Gaucher disease, mutations, N370S, L444P

Procedia PDF Downloads 388
2723 Design of a Photovoltaic Power Generation System Based on Artificial Intelligence and Internet of Things

Authors: Wei Hu, Wenguang Chen, Chong Dong

Abstract:

To improve the efficiency and safety of photovoltaic power generation devices, this photovoltaic power generation system combines Artificial Intelligence (AI) and the Internet of Things (IoT) to control sun-tracking photovoltaic generation devices, improving power generation efficiency, and to manage the converted energy. The system uses an artificial intelligence control terminal; each power generation device runs Linux with an Exynos4412 CPU. Each device collects sun image information through a Sony CCD. After the power generation devices feed their data back to their CPUs for processing, the CPUs send the data to the AI control terminal through the Internet. The control terminal integrates device information, time information, and environmental information to decide whether to generate electricity normally, and then whether to feed the converted electrical energy into the grid or store it in the battery pack. When the power generation environment is abnormal, the control terminal authorizes a protection strategy: the device stops generating and enters a self-protection posture, and the control terminal synchronizes the data with the cloud. As a result, the system is more intelligent, more adaptive, and has a longer service life.

Keywords: photo-voltaic power generation, the pursuit of light, artificial intelligence, internet of things, photovoltaic array, power management

Procedia PDF Downloads 108
2722 Reaching the Goals of Routine HIV Screening Programs: Quantifying and Implementing an Effective HIV Screening System in Northern Nigeria Facilities Based on Optimal Volume Analysis

Authors: Folajinmi Oluwasina, Towolawi Adetayo, Kate Ssamula, Penninah Iutung, Daniel Reijer

Abstract:

Objective: Routine HIV screening has been promoted as an essential component of efforts to reduce incidence, morbidity, and mortality. The objectives of this study were to identify the optimal annual volume needed to realize the public health goals of HIV screening in the AIDS Healthcare Foundation-supported hospitals and to establish an implementation process to realize that optimal annual volume. Methods: Starting in 2011, a program was established to routinize HIV screening within communities and government hospitals. In 2016, five years of HIV screening data were reviewed to identify the optimal annual proportions of age-eligible patients screened to realize the public health goals of reducing new diagnoses and ending late-stage diagnosis (tracked as concurrent HIV/AIDS diagnosis). The analysis demonstrated that rates of new diagnoses level off when 42% of age-eligible patients are screened, providing a baseline for routine screening efforts, and that concurrent HIV/AIDS diagnoses reach statistical zero at screening rates of 70%. Annual facility-based targets were restructured to meet these new target volumes. Restructuring efforts focused on right-sizing HIV screening programs to align and transition programs to integrated HIV screening within standard medical care and treatment. Results: Over one million patients were screened for HIV during the five years; there were 16,033 new HIV diagnoses, 82% (13,206) of whom were successfully linked to care and treatment, and concurrent diagnosis rates fell from 32.26% to 25.27%. While screening rates increased by 104.7% over the 5 years, volume analysis demonstrated that rates need to increase by a further 62.52% to reach the desired 20% baseline and more than double to reach the optimal annual screening volume. Facility targets for HIV screening were subsequently increased to reflect the volume analysis, and in the third year, 12 of the 19 facilities reached or exceeded the new baseline targets.
Conclusions and Recommendation: Quantifying targets against routine HIV screening goals identified the optimal annual screening volume and allowed facilities to scale their program size and allocate resources accordingly. The program transitioned from non-evidence-based annual volume increases to annual targets based on optimal volume analysis. This has allowed efforts to be evaluated on their ability to realize quantified goals related to the public health value of HIV screening. Optimal volume analysis helps to determine the size of an HIV screening program; it is a public health tool, not a tool to determine whether an individual patient should receive screening.

Keywords: HIV screening, optimal volume, HIV diagnosis, routine

Procedia PDF Downloads 242
2721 Combination Approach Using Experiments and Optimal Experimental Design to Optimize Chemical Concentration in Alkali-Surfactant-Polymer Process

Authors: H. Tai Pham, Bae Wisup, Sungmin Jung, Ivan Efriza, Ratna Widyaningsih, Byung Un Min

Abstract:

The middle-phase microemulsion in an Alkaline-Surfactant-Polymer (ASP) solution and oil plays an important role in the success of an ASP flooding process. A high-quality microemulsion phase has ultralow interfacial tensions and can increase oil recovery. This research used optimal experimental design and response surface methodology to predict the optimum concentration of chemicals in the ASP solution for maximum microemulsion quality. Secondly, this optimal ASP formulation was implemented in a core flooding test to investigate the effective injection volume. As a result, the optimum concentration of surfactants in the ASP solution is 0.57 wt.% and the highest effective injection volume is 19.33% of pore volume.

Keywords: optimize, ASP, response surface methodology, solubilization ratio

Procedia PDF Downloads 330
2720 Study of Laminar Convective Heat Transfer, Friction Factor, and Pumping Power Advantage of Aluminum Oxide-Water Nanofluid through a Channel

Authors: M. Insiat Islam Rabby, M. Mahbubur Rahman, Eshanul Islam, A. K. M. Sadrul Islam

Abstract:

A numerical analysis of laminar convective heat transfer of aluminum oxide (Al₂O₃)-water nanofluid in the developed region between two parallel plates is presented in this work. The single-phase mass, momentum, and second-order energy equations are solved using the finite volume method in the ANSYS FLUENT 16 software. The distance between the two parallel plates is 4 mm and their length is 600 mm. Aluminum oxide (Al₂O₃) is used as the nanoparticle and water as the base/working fluid. In the simulations, Al₂O₃ nanoparticle volume concentrations of 1% to 5% are mixed with water to produce the nanofluid, over a range of Reynolds numbers from 500 to 1100 at a constant heat flux of 500 W/m² at the channel wall. The results reveal that, with increasing Reynolds number, the Nusselt number and heat transfer coefficient increase linearly and the friction factor decreases linearly in the developed region, for both water and the Al₂O₃-H₂O nanofluid. Increasing the volume fraction of the Al₂O₃-H₂O nanofluid from 1% to 5% increases the Nusselt number by 0.7% to 7.32%, the heat transfer coefficient by 7.14% to 31.5%, and the friction factor only slightly, by 0.1% to 4%, at constant Reynolds number compared to pure water. At a constant heat transfer coefficient of 700 W/m²·K, pumping power advantages of 20% for 1% volume concentration and 62% for 3% volume concentration of nanofluid were achieved compared to pure water.
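
The heat transfer coefficient relates to the Nusselt number through h = Nu·k/D_h, where D_h is the hydraulic diameter (for parallel plates of gap H, D_h = 2H). The values below are illustrative: Nu = 8.23 is the classical fully developed, constant-heat-flux value for parallel plates, and k = 0.68 W/m·K is an assumed thermal conductivity for a dilute Al₂O₃-water nanofluid, not a figure from the abstract.

```python
def heat_transfer_coefficient(nusselt, k_fluid, gap_m):
    """h = Nu * k / D_h for a parallel-plate channel of the given gap."""
    d_h = 2 * gap_m                       # hydraulic diameter, parallel plates
    return nusselt * k_fluid / d_h

# classical Nu = 8.23 with an assumed k = 0.68 W/m.K and the paper's 4 mm gap
h = heat_transfer_coefficient(8.23, 0.68, 0.004)
print(round(h, 1))                        # W/m2.K, close to the 700 benchmark above
```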

Keywords: convective heat transfer, pumping power, constant heat flux, nanofluid, nanoparticles, volume concentration, thermal conductivity

Procedia PDF Downloads 146
2719 Lennox-Gastaut Syndrome Associated with Dysgenesis of Corpus Callosum

Authors: A. Bruce Janati, Muhammad Umair Khan, Naif Alghassab, Ibrahim Alzeir, Assem Mahmoud, M. Sammour

Abstract:

Rationale: Lennox-Gastaut syndrome (LGS) is an electro-clinical syndrome composed of the triad of mental retardation, multiple seizure types, and characteristic generalized slow spike-wave complexes in the EEG. In this article, we report on two patients with LGS whose brain MRI showed dysgenesis of the corpus callosum (CC). We review the literature and stress the role of the CC in the genesis of secondary bilateral synchrony (SBS). Method: This was a clinical study conducted at King Khalid Hospital. Results: The EEG was consistent with LGS in patient 1 and showed unilateral slow spike-wave complexes in patient 2. The MRI showed hypoplasia of the splenium of the CC in patient 1, and global hypoplasia of the CC combined with Joubert syndrome in patient 2. Conclusion: Based on the data, we proffer the following hypotheses: (1) hypoplasia of the CC interferes with the functional integrity of this structure; (2) the genu of the CC plays a pivotal role in the genesis of secondary bilateral synchrony; (3) electrodecremental seizures in LGS emanate from pacemakers generated in the brain stem, in particular the mesencephalon, projecting abnormal signals to the cortex via thalamic nuclei; (4) unilateral slow spike-wave complexes in the context of mental retardation and multiple seizure types may represent a variant of LGS, justifying neuroimaging studies.

Keywords: EEG, Lennox-Gastaut syndrome, corpus callosum, MRI

Procedia PDF Downloads 428
2718 Effect of Friction Parameters on the Residual Bagging Behaviors of Denim Fabrics

Authors: M. Gazzah, B. Jaouachi, F. Sakli

Abstract:

This research focuses on the effects of yarn-to-yarn and metal-to-fabric friction on the residual bagging behavior of some denim fabrics, expressed by residual bagging height, volume, and recovery. The results show that both residual bagging height and residual bagging volume, determined using an image analysis method, are significantly affected by variations in the most influential fabric parameters, namely the weft yarn density and the mean frictional coefficients. After the applied number of fatigue cycles, the findings revealed that weft yarn rigidity contributes appreciably to fabric bagging behavior. Among the tested samples, the elastic fabrics showed a high recovery ability, giving low bagging height and volume values.

Keywords: bagging recovery, denim fabric, metal-to-fabric friction, residual bagging height, yarn-to-yarn friction

Procedia PDF Downloads 564
2717 Exploring the Issue of Occult Hypoperfusion in the Pre-Hospital Setting

Authors: A. Fordham, A. Hudson

Abstract:

Background: Studies have suggested that 16-25% of normotensive trauma patients with no clinical signs of shock have abnormal lactate and base deficit (BD) readings evidencing shock, a phenomenon known as occult hypoperfusion (OH). In light of the scarce evidence currently documenting OH, this study aimed to identify the prevalence of OH in the pre-hospital setting and explore ways to improve its identification and management. Methods: A quantitative retrospective analysis was carried out on 75 sets of patient records for trauma patients treated by Kent Surrey Sussex Air Ambulance Trust (KSS HEMS) between November 2013 and October 2014, comprising the KSS HEMS notes and the subsequent ED notes. Relationships were assessed between patients' on-scene SBP, whether they received PRBCs on scene, and lactate and BD readings in the ED. The KSS HEMS notes written by the HEMS crew were also reviewed. Results: Suspected OH was identified in 7% of the patients who did not receive PRBCs in the pre-hospital phase, and SBP heavily influenced the physicians' decision on whether to transfuse PRBCs in the pre-hospital phase. Preliminary conclusions: OH is an under-studied and underestimated phenomenon. We suggest a prospective trial to evaluate whether assessing trauma patients' tissue perfusion in the pre-hospital phase, using portable devices capable of measuring serum BD and/or lactate, could aid more accurate detection and management of all haemorrhaging trauma patients, including those with OH.

Keywords: occult hypoperfusion, PRBC transfusion, point of care testing, pre-hospital emergency medicine, trauma

Procedia PDF Downloads 348
2716 Upregulation of CD40/CD40L System in Rheumatic Mitral Stenosis With or Without Atrial Fibrillation

Authors: Azzam H., Abousamra N. K., Wafa A. A., Hafez M. M., El-Gilany A. H.

Abstract:

Platelet activation occurs in the peripheral blood of patients with rheumatic mitral stenosis (MS) and atrial fibrillation (AF) and could be related to abnormal thrombogenesis. The CD40/CD40 ligand (CD40L) system, which reflects platelet activation, plays a central role in thrombotic diseases; however, its role in rheumatic MS with or without AF remains unclear. Expression of CD40 on monocytes and of CD40L on platelets was determined by whole blood flow cytometry, and serum levels of soluble CD40L (sCD40L) were measured by enzyme-linked immunosorbent assay, in group 1 (19 patients with MS) and group 2 (20 patients with MS and AF) compared to group 3 (10 controls). Patients in groups 1 and 2 had a significant increase in CD40 expression on monocytes (P1 and P2 = 0.000) and in serum sCD40L levels (P1 = 0.014 and P2 = 0.033, respectively), but a nonsignificant increase in CD40L expression on platelets (P1 = 0.109 and P2 = 0.060, respectively) compared to controls. There was no significant difference in any of the parameters between groups 1 and 2. Correlation analysis demonstrated a significant direct relationship between the severity of MS and serum sCD40L levels (r = -0.469, p = 0.043). In conclusion, rheumatic MS patients with or without AF showed upregulation of the CD40/CD40L system as well as elevated sCD40L levels. sCD40L levels were directly related to the severity of MS, and it was the stenotic mitral valve, not AF, that had a significant impact on platelet activation.

Keywords: CD40, CD40L, mitral stenosis, atrial fibrillation

Procedia PDF Downloads 77
2715 Gallbladder Amyloidosis Causing Gangrenous Cholecystitis: A Case Report

Authors: Christopher Leung, Guillermo Becerril-Martinez

Abstract:

Amyloidosis is a rare systemic disease in which abnormal proteins invade various organs and impede their function. Occasionally it manifests in a solitary organ such as the heart, lung, or nervous system; rarely does it manifest in the gallbladder. Diagnosis often requires biopsy of the affected area, with histopathology showing deposition of abnormally folded globular proteins called amyloid. This case presents a 69-year-old male with a 3-month history of right upper quadrant (RUQ) pain, diarrhea, and non-specific symptoms such as tiredness. On imaging, both his abdominal US and CT showed gallbladder wall thickening and pericholecystic fluid, which may represent acute cholecystitis, with hypodense lesions around the gallbladder possibly representing liver abscesses. Given his symptoms of abdominal pain and the imaging findings, this gentleman eventually underwent laparoscopic cholecystectomy, which revealed a gangrenous gallbladder with a mass on the liver bed. Histopathology showed amorphous hyaline eosinophilic material, and Congo red staining confirmed amyloidosis. Amyloidosis explained his non-specific symptoms; he avoided further biopsy and was commenced immediately on lenalidomide. Involvement of the gallbladder is extremely rare, with fewer than 30 cases reported worldwide, about half of which are reported as primary amyloidosis. This case adds to the current literature on primary gallbladder amyloidosis and, importantly, highlights how laparoscopic cholecystectomy can aid the diagnosis of gallbladder amyloidosis.

Keywords: amyloidosis, cholecystitis, gangrenous cholecystitis, gallbladder, systemic amyloidosis

Procedia PDF Downloads 182
2714 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored, and transferred; dimensionality reduction techniques can be used to reduce this volume. In this paper, an approach to band selection based on clustering algorithms is presented, which allows the volume of data to be reduced. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes new in relation to other studies in the literature, such as kurtosis and low correlation, are also considered. The results of the approach using Fuzzy C-Means and K-Means with different attributes are compared. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and the approach proves applicable to hyperspectral images.
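The clustering-based selection described here can be illustrated with a plain K-Means variant: cluster the per-band feature vectors (e.g. variance and kurtosis) and keep one representative band per cluster. This is a hypothetical sketch with toy data, not the authors' Fuzzy C-Means/NWHFC pipeline:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    n = len(pts)
    return [sum(col) / n for col in zip(*pts)]

def kmeans(points, k, iters=50, seed=0):
    """Plain K-Means on small feature vectors (pure-Python sketch)."""
    cents = random.Random(seed).sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: dist2(p, cents[j]))].append(p)
        cents = [mean(g) if g else cents[j] for j, g in enumerate(groups)]
    return cents

def select_bands(band_features, k):
    """Keep, per cluster, the index of the band closest to the centroid."""
    cents = kmeans(band_features, k)
    chosen = {min(range(len(band_features)),
                  key=lambda i: dist2(band_features[i], c)) for c in cents}
    return sorted(chosen)

# Toy example: 6 "bands" described by two attributes (variance, kurtosis).
bands = [[0.10, 0.20], [0.12, 0.19], [0.90, 0.80],
         [0.88, 0.82], [0.50, 0.50], [0.52, 0.48]]
print(select_bands(bands, 3))
```

Replacing the hard assignment with fuzzy memberships would turn this into Fuzzy C-Means; the representative-band step stays the same.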

Keywords: band selection, fuzzy c-means, k-means, hyperspectral image

Procedia PDF Downloads 380
2713 Analysis of Ionosphere Anomaly Before Great Earthquake in Java in 2009 Using GPS TEC Data

Authors: Aldilla Damayanti Purnama Ratri, Hendri Subakti, Buldan Muslim

Abstract:

Ionospheric anomalies caused by earthquake activity are a phenomenon now being studied under the heading of seismo-ionospheric coupling. Generally, variation in the ionosphere caused by earthquake activity is weaker than the disturbances generated by other sources, such as geomagnetic storms. However, geomagnetic storm disturbances show a more global behavior, while seismo-ionospheric anomalies occur only locally, in an area largely determined by the magnitude of the earthquake. This makes earthquake-related ionospheric activity distinctive, and because of this uniqueness much research has been done on it, in the expectation that it can give clues for early warning before an earthquake. One approach developed for this purpose is seismo-ionospheric coupling, which relates the state of the lithosphere, atmosphere, and ionosphere before and during an earthquake. This paper takes the vertical total electron content (VTEC) of the ionosphere as its parameter. Total electron content (TEC) is defined as the number of electrons in a vertical column (cylinder) with a cross-section of 1 m² along the GPS signal trajectory in the ionosphere, at around 350 km of height. Analysis of data obtained from the LAPAN agency to identify abnormal signals by statistical methods showed an anomaly in the ionosphere, characterized by a decrease in its electron content of 1 TECU before the earthquake occurred. This decrease in VTEC was not associated with a magnetic storm and is therefore indicated as an earthquake precursor, which is supported by the Dst index showing no magnetic disturbance.
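A simple version of the statistical screening described here is to flag epochs where VTEC departs from a running mean by more than a set number of standard deviations. The sketch below uses an invented toy series and a hypothetical 2-sigma rule, not LAPAN data or the authors' exact method:

```python
import statistics

def tec_anomalies(vtec, window=15, k=2.0):
    """Flag samples lying outside mean +/- k*sigma of the preceding
    `window` samples (a simple running-bound anomaly detector)."""
    flags = []
    for i in range(window, len(vtec)):
        ref = vtec[i - window:i]
        mu, sd = statistics.mean(ref), statistics.pstdev(ref)
        if sd > 0 and abs(vtec[i] - mu) > k * sd:
            flags.append((i, vtec[i] - mu))   # (index, deviation in TECU)
    return flags

# Toy series: quiet background around 20 TECU plus one depletion of
# roughly 1.5 TECU, mimicking the kind of drop reported before the event.
series = [20.0 + 0.1 * ((i * 7) % 5 - 2) for i in range(40)]
series[30] -= 1.5
print(tec_anomalies(series))
```

In practice the reference bound would be built from storm-quiet days, and the Dst index consulted to rule out geomagnetic causes, as the abstract describes.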

Keywords: earthquake, DST Index, ionosphere, seismoionospheric coupling, VTEC

Procedia PDF Downloads 572
2712 Refractive Index, Excess Molar Volume and Viscometric Study of Binary Liquid Mixture of Morpholine with Cumene at 298.15 K, 303.15 K, and 308.15 K

Authors: B. K. Gill, Himani Sharma, V. K. Rattan

Abstract:

Experimental data for the refractive index, excess molar volume, and viscosity of a binary mixture of morpholine with cumene were measured over the whole composition range at 298.15 K, 303.15 K, and 308.15 K and normal atmospheric pressure. The experimental data were used to compute the density, deviation in molar refraction, deviation in viscosity, and excess Gibbs free energy of activation as functions of composition. The experimental viscosity data were correlated with empirical equations such as the Grunberg-Nissan equation, the Heric correlation, and the three-body McAllister equation. The excess thermodynamic properties were fitted to the Redlich-Kister polynomial equation. The variation of these properties with the composition and temperature of the binary mixtures is discussed in terms of intermolecular interactions.
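The Redlich-Kister form used for the excess-property fits can be sketched directly; by construction it vanishes at both pure components. The coefficients below are illustrative placeholders, not the fitted values from this work:

```python
# Redlich-Kister expansion for an excess property such as excess molar
# volume: Y^E = x1*x2 * sum_k A_k * (x1 - x2)^k, with x2 = 1 - x1.

def redlich_kister(x1, coeffs):
    x2 = 1.0 - x1
    return x1 * x2 * sum(a * (x1 - x2) ** k for k, a in enumerate(coeffs))

A = [-0.85, 0.12, 0.05]   # hypothetical A0, A1, A2 in cm^3/mol

# Sweep the composition range and locate the extremum of the curve.
grid = [i / 100 for i in range(101)]
curve = [redlich_kister(x, A) for x in grid]
x_min = grid[curve.index(min(curve))]
print(round(x_min, 2), round(min(curve), 4))
```

Fitting amounts to a linear least-squares problem in the coefficients A_k, since the model is linear in them at fixed composition.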

Keywords: cumene, excess Gibbs free energy, excess molar volume, morpholine

Procedia PDF Downloads 313
2711 Performance Comparison of Resource Allocation without Feedback in Wireless Body Area Networks by Various Pseudo Orthogonal Sequences

Authors: Ojin Kwon, Yong-Jin Yoon, Liu Xin, Zhang Hongbao

Abstract:

A Wireless Body Area Network (WBAN) provides short-range wireless communication around the human body for various applications such as wearable devices, entertainment, military use, and especially medical devices. WBANs are attracting attention for continuous health monitoring systems, including diagnostic procedures, early detection of abnormal conditions, and prevention of emergency situations. Compared to a cellular network, it is more difficult in a WBAN system to control inter- and intra-cell interference, due to limited power, limited computation capability, patient mobility, and non-cooperation among WBANs. In this paper, we compare the performance of resource allocation schemes based on several Pseudo Orthogonal Codewords (POCs) for mitigating inter-WBAN interference. POCs have previously been widely exploited as protocol sequences and optical orthogonal codes. Each POC has different auto- and cross-correlation properties and spectral efficiency according to its construction. To identify different WBANs, several pseudo orthogonal patterns based on POCs are exploited for resource allocation. By simulating these pseudo orthogonal resource allocations in MATLAB, we obtain the performance of WBANs for the different POCs and can analyze and evaluate the suitability of each POC for resource allocation in WBAN systems.
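The identification property the scheme relies on can be illustrated by computing periodic auto- and cross-correlations of two short binary codewords: a good codeword has a high in-phase autocorrelation peak and low values everywhere else. The codewords below are toy examples, not the POC constructions evaluated in the paper:

```python
def periodic_xcorr(a, b):
    """Periodic (cyclic) correlation of two equal-length 0/1 sequences,
    returned for every cyclic shift."""
    n = len(a)
    return [sum(a[i] & b[(i + s) % n] for i in range(n)) for s in range(n)]

# Toy binary codewords of length 7 (hypothetical examples).
c1 = [1, 1, 0, 1, 0, 0, 0]
c2 = [1, 0, 1, 0, 0, 1, 0]

auto = periodic_xcorr(c1, c1)    # peak at shift 0, low sidelobes
cross = periodic_xcorr(c1, c2)   # should stay below the auto peak
print(auto, cross)
```

Low cross-correlation at every shift is what lets unsynchronized WBANs share the channel without feedback, since no coordination of shifts is needed.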

Keywords: wireless body area network, body sensor network, resource allocation without feedback, interference mitigation, pseudo orthogonal pattern

Procedia PDF Downloads 335
2710 Functional Profiling of a Circular RNA from the Huntingtin (HTT) Gene

Authors: Laura Gantley, Vanessa M. Conn, Stuart Webb, Kirsty Kirk, Marta Gabryelska, Duncan Holds, Brett W. Stringer, Simon J. Conn

Abstract:

Trinucleotide repeat disorders comprise ~20 severe, inherited human neuromuscular and neurodegenerative disorders that result from an abnormal expansion of repetitive sequences in the DNA. The most common of these, Huntington's disease, results from the expansion of the CAG repeat region in exon 1 of the HTT gene via an unknown mechanism. Non-coding RNAs have been implicated in the initiation and progression of many diseases; we therefore focus on one circular RNA (circRNA) molecule arising from non-canonical splicing (back splicing) of the HTT pre-mRNA. This circRNA and its mouse orthologue were transgenically overexpressed in human cells (SH-SY5Y and HEK293T) and mouse cells (Mb1), respectively. High-content imaging and flow cytometry demonstrated that overexpression of this circRNA reduces cell proliferation, reduces nuclear size independent of cell size, and alters cell cycle progression. Analysis of protein by western blot and immunofluorescence demonstrated no change in HTT protein levels but an altered nuclear-cytoplasmic distribution, without impacting the expansion of the HTT repeat region. As these phenotypic and genotypic changes are found in Huntington's disease patients, these results suggest that this circRNA may play a functional role in the progression of Huntington's disease.

Keywords: cell biology, circular RNAs, Huntington’s disease, molecular biology, neurodegenerative disorders

Procedia PDF Downloads 83
2709 Porosity and Ultraviolet Protection Ability of Woven Fabrics

Authors: Polona Dobnik Dubrovski, Abhijit Majumdar

Abstract:

Increasing awareness of the negative effects of ultraviolet (UV) radiation, and the need for regular, effective protection, are topical themes in many countries. Woven fabrics as clothing items can provide convenient personal protection; however, not all fabrics offer sufficient UV protection. The porous structure of a material has a great effect on its ultraviolet protection factor (UPF). The paper opens with an overview of porosity in woven fabrics, including the determination of porosity parameters on the basis of an ideal geometrical model of the porous structure. Our experiment focused on 100% cotton woven fabrics in the grey state with the same yarn fineness (14 tex), different thread densities (giving relative fabric densities between 59% and 87%), and different weave types (plain, 4-end twill, 5-end satin). The results of modelling UPF, and of the influence of the volume and open porosity of the tested samples on UPF, are presented. They show that open porosity should be lower than 12% for the tested samples to achieve good UV protection according to the AS/NZS standard. The results also indicate that there is no direct correlation between volume porosity and UPF; rather, volume porosity depends on the type of weave and affects UPF as well. Plain fabrics did not offer any UV protection, while twill and satin fabrics offered good UV protection when volume porosity was less than 64% and 66%, respectively.
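The two porosity parameters can be estimated from an ideal geometrical model: open porosity from the uncovered fraction of the fabric plane, and volume porosity from the fabric's bulk density relative to the fibre density. The sketch below uses invented sample dimensions, not the measured values of the tested fabrics:

```python
# Ideal-geometry porosity estimates for a woven fabric (illustrative
# inputs; only the cotton fibre density of 1.54 g/cm^3 is a standard value).

def open_porosity(d_warp, d_weft, p_warp, p_weft):
    """Open (projected) porosity in %: the fraction of the fabric plane
    covered by neither warp nor weft. d = yarn diameter, p = thread
    spacing (the inverse of thread density), in the same units."""
    return (1 - d_warp / p_warp) * (1 - d_weft / p_weft) * 100

def volume_porosity(areal_density_gm2, thickness_mm, fibre_density_gcm3=1.54):
    """Volume porosity in %: one minus the ratio of fabric bulk density
    to fibre density."""
    bulk = areal_density_gm2 / (thickness_mm * 1000)   # g/cm^3
    return (1 - bulk / fibre_density_gcm3) * 100

print(round(open_porosity(0.16, 0.16, 0.40, 0.45), 1))   # loose fabric
print(round(volume_porosity(150, 0.35), 1))
```

Tightening the thread spacing drives the open porosity toward the sub-12% regime the study associates with good UV protection.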

Keywords: fabric engineering, UV radiation, porous materials, woven fabric construction, modelling

Procedia PDF Downloads 251
2708 Correlation between Clinical Measurements of Static Foot Posture in Young Adults

Authors: Phornchanok Motantasut, Torkamol Hunsawong, Lugkana Mato, Wanida Donpunha

Abstract:

Identifying abnormal foot posture is important for prescribing appropriate management in patients with lower limb disorders and chronic non-specific low back pain. The normalized navicular height truncated (NNHt) and the foot posture index-6 (FPI-6) have been recommended as common, simple, valid, and reliable static measures for clinical application. The NNHt is a single-plane measure, while the FPI-6 is a triple-plane measure. At present, there is inadequate information about the correlation between the NNHt and the FPI-6 for categorizing foot posture, which makes it difficult to choose the appropriate assessment. The present study therefore aimed to determine the correlation between the NNHt and the FPI-6 in adult participants with asymptomatic feet. Methods: A cross-sectional descriptive study was conducted in 47 asymptomatic individuals (23 males and 24 females) aged 28.89 ± 7.67 years with body mass index 21.73 ± 1.76 kg/m². The right foot was measured twice by an experienced rater using the NNHt and the FPI-6. The sequence of measures was randomly arranged for each participant, with a 10-minute rest between tests. Pearson's correlation coefficient (r) was used to determine the relationship between the measures. Results: The mean NNHt score was 0.23 ± 0.04 (range 0.15 to 0.36) and the mean FPI-6 score was 4.42 ± 4.36 (range -6 to +11). Pearson's correlation coefficient between the NNHt and FPI-6 scores was -0.872 (p < 0.01). Conclusion: The present finding demonstrates a strong inverse correlation between the NNHt and the FPI-6 in adult feet and implies that the two measures could be substituted for each other in identifying foot posture.
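Pearson's r for paired NNHt/FPI-6 scores is straightforward to compute. The sketch below uses invented paired scores that merely mimic the reported inverse trend (higher FPI-6 with lower navicular height), not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores: flatter feet score higher on the FPI-6
# and lower on the NNHt, so r should come out strongly negative.
nnht = [0.15, 0.18, 0.21, 0.24, 0.27, 0.30, 0.33, 0.36]
fpi6 = [10, 8, 6, 3, 1, -1, -4, -6]
print(round(pearson_r(nnht, fpi6), 3))
```

A value near -1, as in the study's r = -0.872, is what justifies treating the two measures as interchangeable for classifying foot posture.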

Keywords: foot posture index, foot type, measurement of foot posture, navicular height

Procedia PDF Downloads 119
2707 The Efficacy of Clobazam for Landau-Kleffner Syndrome

Authors: Nino Gogatishvili, Davit Kvernadze, Giorgi Japharidze

Abstract:

Background and aims: Landau-Kleffner syndrome (LKS) is a rare disorder with epileptic seizures and acquired aphasia. It usually starts in initially healthy children; the first symptoms are language regression and behavioral disturbances, and the sleep EEG reveals abnormal epileptiform activity. The aim was to discuss the efficacy of clobazam for LKS. Case report: We report the case of an 11-year-old boy with an uneventful pregnancy and delivery. He began to walk at 11 months and to speak in simple phrases at the age of 2.5 years. At the age of 18 months he had febrile convulsions, and at the age of 5 years his parents noticed language regression, stuttering, and serious behavioral dysfunction, including hyperactivity and temper outbursts. No epileptic seizures were observed. MRI was without abnormality. Neuropsychological testing revealed verbal auditory agnosia. Sleep EEG showed abundant left fronto-temporal spikes, occupying over 85% of non-rapid eye movement (non-REM) sleep. Treatment was started with clobazam; after ten weeks the EEG had improved, and stuttering and behavior also improved. Results: Since the start of clobazam treatment, stuttering and behavior have improved. Now, at 11 years of age, he is without antiseizure medication. Sleep EEG shows fronto-temporal spikes on the left side over 10-49% of non-REM sleep, bioccipital spikes, slow-wave discharges, and spike-waves. Conclusions: This case provides further support for the efficacy of clobazam in patients with LKS.

Keywords: Landau-Kleffner syndrome, antiseizure medication, stuttering, aphasia

Procedia PDF Downloads 53
2706 Grain Growth Behavior of High Carbon Microalloyed Steels Containing Very Low Amounts of Niobium

Authors: Huseyin Zengin, Muhammet Emre Turan, Yunus Turen, Hayrettin Ahlatci, Yavuz Sun

Abstract:

This study aimed to understand the effects of dilute Nb additions on the austenite microstructure of microalloyed steels at five reheating temperatures from 950 °C to 1300 °C. Four microalloyed high-carbon steels containing 0.8 wt% C were examined: three with Nb concentrations varying from 0.005 wt% to 0.02 wt% and one with no Nb. Quantitative metallographic techniques were used to measure the average prior austenite grain size in order to compare the grain-growth pinning effect of the Nb precipitates as a function of reheating temperature. Owing to the higher stability of the precipitates at higher Nb concentrations, the grain coarsening temperature, above which grain growth is no longer effectively impeded and a bimodal grain distribution appears in the microstructure, increased with increasing Nb concentration. The grain coarsening temperatures (T_GC) for the steels with 0.005 wt% Nb, 0.01 wt% Nb, and 0.02 wt% Nb were 950 °C, 1050 °C, and 1150 °C, respectively. From these observed values, an approximation relating T_GC to the complete dissolution temperature (T_DISS) of the second-phase particles was made as T_GC = T_DISS - 300. The plain carbon steel, by contrast, did not show abnormal grain growth behavior, due to the absence of second-phase particles. It was also observed that the higher the Nb concentration, the smaller the average prior austenite grain size, although the small increments in Nb concentration did not change the average grain size considerably.
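The T_GC = T_DISS - 300 rule of thumb can be combined with a solubility product to estimate how the coarsening temperature shifts with Nb content. The sketch below assumes the often-quoted Irvine-type solubility relation for NbC in austenite, log₁₀([Nb][C]) = 2.26 - 6770/T; both that relation and its constants are an assumption for illustration, not values taken from the study:

```python
import math

# Estimate NbC dissolution and grain-coarsening temperatures from a
# solubility product (assumed constants, not data from the study).

def t_dissolution_c(wt_nb, wt_c, a=2.26, b=6770.0):
    """Solve log10([Nb][C]) = a - b/T (T in kelvin) for the temperature
    at which all NbC is in solution; return degrees Celsius."""
    t_k = b / (a - math.log10(wt_nb * wt_c))
    return t_k - 273.15

def t_grain_coarsening_c(wt_nb, wt_c):
    """Apply the abstract's empirical offset T_GC = T_DISS - 300 (in C)."""
    return t_dissolution_c(wt_nb, wt_c) - 300.0

for nb in (0.005, 0.01, 0.02):
    print(nb, round(t_grain_coarsening_c(nb, 0.8)))
```

With these assumed constants the estimate reproduces the observed trend, a roughly 200 °C rise in T_GC as Nb increases from 0.005 wt% to 0.02 wt%, though the absolute values depend on which solubility product is adopted.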

Keywords: microalloyed steels, prior austenite grains, second phase particles, grain coarsening temperature

Procedia PDF Downloads 247