Search results for: artificial artery
1395 Artificial Neural Network Modeling of a Closed Loop Pulsating Heat Pipe
Authors: Vipul M. Patel, Hemantkumar B. Mehta
Abstract:
Technological innovations in the electronics world demand novel, compact, simple, inexpensive, and effective heat transfer devices. The Closed Loop Pulsating Heat Pipe (CLPHP) is a passive phase-change heat transfer device with the potential to transfer heat quickly and efficiently from source to sink. The thermal performance of a CLPHP is governed by parameters such as the number of U-turns, orientation, heat input, working fluid, and filling ratio. The present paper attempts to predict the thermal performance of a CLPHP using an Artificial Neural Network (ANN). Filling ratio and heat input are the input parameters, while thermal resistance is the target parameter. The neural network types considered in the present paper are radial basis, generalized regression, linear layer, cascade forward back propagation, feed forward back propagation, feed forward distributed time delay, layer recurrent, and Elman back propagation. Linear, logistic sigmoid, tangent sigmoid, and radial basis Gaussian functions are used as transfer functions. Prediction accuracy is measured against experimental data reported in the open literature using the Mean Absolute Relative Deviation (MARD). The predictions of a generalized regression ANN model with a spread constant of 4.8 agree with the experimental data to within a MARD of ±1.81%.
Keywords: ANN models, CLPHP, filling ratio, generalized regression, spread constant
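The generalized regression network the abstract selects is, at its core, Gaussian-kernel-weighted regression, so the prediction step can be sketched in a few lines. The spread constant of 4.8 comes from the abstract; the filling-ratio/thermal-resistance pairs below are illustrative values, not the paper's data.

```python
import math

def grnn_predict(x_train, y_train, x_new, spread=4.8):
    # Generalized regression NN (Nadaraya-Watson form): the prediction is a
    # Gaussian-kernel weighted average of the training targets; the spread
    # constant controls the kernel width.
    weights = [math.exp(-((x_new - x) ** 2) / (2.0 * spread ** 2)) for x in x_train]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# Hypothetical (filling ratio %, thermal resistance K/W) training pairs.
filling_ratio = [30.0, 50.0, 70.0]
thermal_res = [1.2, 0.9, 1.1]
pred = grnn_predict(filling_ratio, thermal_res, 50.0)
```

Because the prediction is a convex combination of the training targets, it always stays within their range, which is one reason GRNNs are robust on small experimental data sets like this one.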
Procedia PDF Downloads 292
1394 Arginase Activity and Nitric Oxide Levels in Patients Undergoing Open Heart Surgery with Cardiopulmonary Bypass
Authors: Mehmet Ali Kisaçam, P. Sema Temizer Ozan, Ayşe Doğan, Gonca Ozan, F. Sarper Türker
Abstract:
Cardiovascular disease, one of the most common health problems worldwide, is of crucial importance because of its morbidity and mortality rates. Nitric oxide synthase and arginase both use L-arginine as a substrate, producing nitric oxide (NO) and citrulline, and urea and ornithine, respectively. Endothelial dysfunction is characterized by reduced bioavailability of NO, a vasodilator and anti-inflammatory molecule. The purpose of the study was to assess endothelial function via arginase activity and NO levels in patients undergoing coronary artery bypass grafting (CABG) surgery. The study was conducted on 26 patients (14 male, 12 female) undergoing CABG surgery. Blood samples were collected from the subjects before surgery, after the termination of surgery, and 24 hours after surgery. Arginase activity and NO levels were measured spectrophotometrically in the collected samples. Arginase activity decreased significantly after the termination of surgery compared to before surgery. At 24 hours after surgery, arginase activity showed no significant difference from either the before-surgery or post-termination values. On the other hand, NO levels increased significantly after the termination of surgery; the further increase at 24 hours after surgery relative to before surgery was not significant. The results indicate that vascular and endothelial function improved after the termination of surgery and that arginase activity and NO levels returned to normal within 24 hours.
Keywords: arginase, bypass, cardiopulmonary, nitric oxide
Procedia PDF Downloads 205
1393 Geometrical Fluid Model for Blood Rheology and Pulsatile Flow in Stenosed Arteries
Authors: Karan Kamboj, Vikramjeet Singh, Vinod Kumar
Abstract:
Treating blood as a non-Newtonian Carreau fluid, this numerical model investigates pulsatile blood flow through a narrow constricted artery with multiple mild stenoses in the presence of periodic body acceleration. Asymptotic solutions are obtained for the flow rate, pressure gradient, velocity profile, wall shear stress, and longitudinal impedance to flow by applying a two-parameter perturbation method to the resulting nonlinear boundary value problem. It is observed that blood velocity increases with increases in the angle of tapering of the artery, the body acceleration, and the power-law index, whereas the longitudinal impedance to flow and the wall shear stress show the opposite behaviour as each of these parameters increases. It is also seen that the wall shear stress increases markedly with the maximum depth of the stenosis but decreases significantly with increasing pulsatile Reynolds number. The estimated increase in longitudinal impedance to flow generally grows with the maximum depth of the stenosis and with the Weissenberg number. Additionally, the mean velocity of blood increases noticeably with the angle of tapering of the artery and with the body acceleration parameter.
Keywords: geometry of artery, pulsatile blood flow, multiple stenoses
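The Carreau constitutive law the abstract assumes for blood gives an apparent viscosity that falls smoothly from a zero-shear plateau toward an infinite-shear plateau. A minimal sketch follows; the default parameter values are commonly cited literature values for blood (not taken from this abstract) and are stated as assumptions.

```python
def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345, lam=3.313, n=0.3568):
    # Carreau model: mu = mu_inf + (mu0 - mu_inf) * (1 + (lam*gamma)^2)^((n-1)/2)
    # Defaults are literature values often used for blood (zero- and
    # infinite-shear viscosities in Pa.s, relaxation time lam in s,
    # power-law index n); they are assumptions, not the paper's parameters.
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

low = carreau_viscosity(0.1)     # near the zero-shear plateau
high = carreau_viscosity(1000.0) # shear-thinned toward mu_inf
```

With n < 1 the exponent is negative, so viscosity decreases monotonically with shear rate, which is the shear-thinning behaviour that distinguishes blood from a Newtonian fluid in stenosed-artery models.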
Procedia PDF Downloads 99
1392 Atherosclerotic Plaques and Immune Microenvironment: From Lipid-Lowering to Anti-inflammatory and Immunomodulatory Drug Approaches in Cardiovascular Diseases
Authors: Husham Bayazed
Abstract:
A growing number of studies indicate that atherosclerotic coronary artery disease (CAD) has a complex pathogenesis that extends beyond cholesterol intimal infiltration. The atherosclerosis process may involve an immune micro-environment driven by local activation of the adaptive and innate immune systems, resulting in the formation of atherosclerotic plaques. Therefore, despite the wide usage of lipid-lowering agents, these devastating coronary diseases are not averted at either the primary or secondary prevention level. Many recent trials have shown an interest in immune targeting of the inflammatory process of atherosclerotic plaques, with promising improvement in atherosclerotic cardiovascular disease outcomes. This includes the immune-modulatory drug canakinumab, an anti-interleukin-1 beta monoclonal antibody, in addition to colchicine, which is established as a broad-effect drug in the management of other inflammatory conditions. Recent trials and studies highlight the importance of inflammation and immune reactions in the pathogenesis of atherosclerosis and plaque formation. This provides an insight to discuss and extend therapies from older lipid-lowering drugs (statins) to anti-inflammatory drugs (colchicine) and new targeted immune-modulatory therapies, such as inhibitors of IL-1 beta (canakinumab), currently under investigation.
Keywords: atherosclerotic plaques, immune microenvironment, lipid-lowering agents, immunomodulatory drugs
Procedia PDF Downloads 69
1391 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward back propagation networks. Results revealed that the GDM algorithm, with its adaptive learning capability, generally used a shorter time in both training and validation phases than the LM and Br algorithms, though learning may not be consummated; this held in all instances considered, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiency by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANN for real-time forecasting should employ training algorithms without the computational overhead of LM, which requires computation of the Hessian matrix, demands protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure, forecast quality, and mitigation of network overfitting. On the whole, it is recommended that evaluation also consider the implications of (i) data quality and quantity and (ii) transfer functions on overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
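The coefficient of efficiency used above to score the three training algorithms is the Nash-Sutcliffe statistic, which can be computed directly; the streamflow values below are illustrative, not the study's data.

```python
def coefficient_of_efficiency(obs, sim):
    # Nash-Sutcliffe CE: 1 minus the residual sum of squares over the
    # variance of observations about their mean. 1.0 is a perfect forecast;
    # 0.0 means no better than always predicting the observed mean.
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Illustrative daily streamflow (observed vs. simulated), not study data.
observed = [12.0, 15.0, 9.0, 20.0, 14.0]
simulated = [11.5, 14.0, 10.0, 19.0, 15.0]
ce = coefficient_of_efficiency(observed, simulated)
```

Because CE penalises squared residuals against the variability of the observations, a model can score a high CE yet still distribute its errors poorly, which is why the study also reports MAE, MAPE, and MSRE.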
Procedia PDF Downloads 152
1390 Rights-Based Approach to Artificial Intelligence Design: Addressing Harm through Participatory ex ante Impact Assessment
Authors: Vanja Skoric
Abstract:
The paper examines whether the impacts of artificial intelligence (AI) can be meaningfully addressed through a rights-based approach to AI design, investigating in particular how an inclusive, participatory process of assessing AI impact would make this viable. There is a significant gap between envisioning rights-based AI systems and their practical application. Plausibly, internalizing a human rights approach within the AI design process might be achieved by identifying and assessing the implications of AI features for human rights, especially in the case of vulnerable individuals and communities. However, there is no clarity or consensus on how such an instrument should be operationalised to usefully identify impacts, mitigate harms, and meaningfully ensure relevant stakeholders' participation. In practice, ensuring the meaningful inclusion of the individuals, groups, or entire communities affected by the use of an AI system is a prerequisite for any process seeking to assess human rights impacts and risks. Engagement throughout the impact assessment should enable those affected and interested to access information and better understand the technology, product, or service and its impacts, and also to learn about their rights and the corresponding obligations and responsibilities of developers and deployers to protect and/or respect those rights. This paper provides an overview of the study and practice of participatory design processes for AI, including inclusive impact assessment and its main elements; proposes a framework; and discusses lessons learned from existing theory. In addition, it explores pathways for enhancing and promoting individual and group rights through such engagement by discussing when, how, and whom to include, at which stage of the process, and what the prerequisites for meaningful engagement are. The overall aim is to ensure the technology is used for the benefit of society, individuals, and particular (historically marginalised) groups.
Keywords: rights-based design, AI impact assessment, inclusion, harm mitigation
Procedia PDF Downloads 150
1389 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment for different types of structures. The first step is the development of ground-motion models, which forecast ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate that these algorithms satisfy physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method; in particular, Random Forest outperforms the other algorithms. However, the conventional method is the better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding damage for pre-defined limit states, and therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as Artificial Neural Networks, Random Forests, and Support Vector Machines are trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analysis.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
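The fragility curves mentioned above are conventionally lognormal in the IM: the exceedance probability at a given intensity is the standard normal CDF of the log-ratio of demand to capacity. A minimal sketch, with illustrative median capacity and dispersion values (not from the study):

```python
import math

def fragility(im, median_capacity, beta):
    # Lognormal fragility curve: probability that structural demand exceeds
    # a limit state given intensity measure `im`, with a median capacity and
    # logarithmic dispersion beta. Uses the standard normal CDF via erf.
    z = (math.log(im) - math.log(median_capacity)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# e.g., a moderate-damage limit state: median PGA 0.4 g, beta 0.5 (assumed)
p_at_02g = fragility(0.2, 0.4, 0.5)
p_at_08g = fragility(0.8, 0.4, 0.5)
```

The PSDM supplies the median and dispersion for each limit state; swapping an ML-based PSDM into this closed form is what lets the study regenerate fragility curves quickly during sensitivity analysis.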
Procedia PDF Downloads 106
1388 CRYPTO COPYCAT: A Fashion Centric Blockchain Framework for Eliminating Fashion Infringement
Authors: Magdi Elmessiry, Adel Elmessiry
Abstract:
The fashion industry represents a significant portion of the global gross domestic product; however, it is plagued by cheap imitators that infringe on trademarks and destroy the industry's hard work and investment. While the copycats are eventually found and stopped, the damage has already been done: sales are missed, and direct and indirect jobs are lost. The infringer thrives on two main facts: the time it takes to discover them and the lack of tracking technologies that can help the consumer distinguish them. Blockchain is a new emerging technology that provides a distributed, encrypted, immutable, and fault-resistant ledger. Blockchain presents a ripe technology for resolving the infringement epidemic facing the fashion industry. The significance of the study is that a new approach, leveraging state-of-the-art blockchain technology coupled with artificial intelligence, is used to create a framework addressing the fashion infringement problem. It transforms the current focus on legal enforcement, which is difficult at best, to consumer awareness, which is far more effective. The framework, Crypto CopyCat, creates an immutable digital asset representing the actual product to empower the customer with a near real-time query system. This combination emphasizes the consumer's awareness and appreciation of the product's authenticity, while providing real-time feedback to the producer regarding fake replicas. The main findings of this study are that implementing this approach can delay fake products' penetration of the original product's market, thus giving the original product time to take advantage of the market. The shift in fake adoption results in reduced returns, which impedes the copycat market and moves the emphasis to original product innovation.
Keywords: fashion, infringement, blockchain, artificial intelligence, textiles supply chain
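The immutability the abstract relies on comes from hash chaining: each ledger record commits to the hash of its predecessor, so tampering with any earlier entry invalidates everything after it. A minimal sketch follows; the field names and SKUs are illustrative, not the actual Crypto CopyCat schema.

```python
import hashlib
import json

def make_asset_record(product_id, prev_hash):
    # Each record commits to the previous record's hash, so altering any
    # earlier entry changes its hash and breaks every later link.
    payload = {"product_id": product_id, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "hash": digest}

# A two-entry chain: the second record references the first by hash.
genesis = make_asset_record("SKU-001", "0" * 64)
second = make_asset_record("SKU-002", genesis["hash"])
```

A consumer query then amounts to recomputing the chain of hashes for a product's record and checking it against the ledger, which is what makes the near real-time authenticity check feasible.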
Procedia PDF Downloads 260
1387 Antioxidant Mediated Neuroprotective Effects of Allium Cepa Extract Against Ischemia Reperfusion Induced Cognitive Dysfunction and Brain Damage in Mice
Authors: Jaspal Rana, Varinder Singh
Abstract:
Oxidative stress has been identified as an underlying cause of ischemia-reperfusion (IR) related cognitive dysfunction and brain damage. Therefore, antioxidant-based therapies to treat IR injury are being investigated. Allium cepa L. (onion) is used as a culinary medicine and is documented to have marked antioxidant effects. Hence, the present study was designed to evaluate the effect of A. cepa outer scale extract (ACE) against IR induced cognitive and biochemical deficits in mice. ACE was prepared by maceration with 70% methanol and fractionated into ethyl acetate and aqueous fractions. Bilateral common carotid artery occlusion for 10 min, followed by 24 h reperfusion, was used to induce cerebral IR injury. Following IR injury, ACE (100 and 200 mg/kg) was administered orally to animals once daily for 7 days. Behavioral outcomes (memory and sensorimotor functions) were evaluated using the Morris water maze and neurological severity score. Cerebral infarct size, brain thiobarbituric acid reactive species, reduced glutathione, and superoxide dismutase activity were also determined. Treatment with ACE significantly ameliorated the IR mediated deterioration of memory and sensorimotor functions and the rise in brain oxidative stress in animals. The results of the present investigation revealed that ACE improved functional outcomes after cerebral IR injury, which may be attributed to its antioxidant properties.
Keywords: allium cepa, cerebral ischemia, memory, sensorimotor
Procedia PDF Downloads 113
1386 Comparative Vector Susceptibility for Dengue Virus and Their Co-Infection in A. aegypti and A. albopictus
Authors: Monika Soni, Chandra Bhattacharya, Siraj Ahmed Ahmed, Prafulla Dutta
Abstract:
Dengue is now a globally important arboviral disease. Extensive vector surveillance has already established A. aegypti as a primary vector, but A. albopictus is now accelerating the situation through gradual adaptation to human surroundings. Global destabilization and gradual climatic shift, with rising temperatures, have significantly expanded the geographic range of these species. These versatile vectors also host the Chikungunya, Zika, and yellow fever viruses. The biggest challenge now faced by endemic countries is the upsurge in co-infections reported with multiple serotypes and virus co-circulation. To foster vector control interventions and mitigate disease burden, knowledge is needed on vector susceptibility and viral tolerance in response to multiple infections. To address our understanding of transmission dynamics and reproductive fitness, both vectors were exposed to single and dual combinations of all four dengue serotypes by artificial feeding and followed up to the third generation. Artificial feeding showed a significant difference in feeding rate between the two species: A. albopictus was a poor artificial feeder (35-50%) compared to A. aegypti (95-97%). Robust sequential screening of viral antigen in mosquitoes was performed by dengue NS1 ELISA, RT-PCR, and quantitative PCR. To observe viral dissemination in different mosquito tissues, an indirect immunofluorescence assay was performed. Results showed that both vectors were initially infected with all dengue (1-4) serotypes and the co-infection combinations (D1 and D2, D1 and D3, D1 and D4, D2 and D4). For DENV-2, there was a significant difference in the peak titre observed at the 16th day post infection. When exposed to dual infections, A. aegypti supported all combinations of viruses, whereas A. albopictus continued only single infections in successive days. There was a significant negative effect on the fecundity and fertility of both vectors compared to controls (PANOVA < 0.001). For dengue-2 infected mosquitoes, fecundity in the parent generation was significantly higher (PBonferroni < 0.001) for A. albopictus compared to A. aegypti, but there was a complete loss of fecundity from the second to the third generation for A. albopictus. It was observed that A. aegypti becomes infected with multiple serotypes frequently, even at low viral titres, compared to A. albopictus. Possible reasons for this include the presence of Wolbachia infection in A. albopictus, the mosquito innate immune response, small RNA interference, etc. Based on these observations, it can be anticipated that transovarial transmission may not be an important phenomenon for clinical disease outcome, owing to the absence of viral positivity by the third generation. Also, dengue NS1 ELISA can be used for preliminary viral detection in mosquitoes, as more than 90% of the samples were found positive compared to RT-PCR and viral load estimation.
Keywords: co-infection, dengue, reproductive fitness, viral quantification
Procedia PDF Downloads 201
1385 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller, communicating with a Personal Computer (PC) through USB (Universal Serial Bus). The research deployed initial knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, along with artificial intelligence (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data mining (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), coefficient of determination (R2), and Mean Percentage Error (MPE) were deployed as standard evaluation metrics for the models' precipitation prediction. The results show that the developed device has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
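Three of the evaluation metrics named above (RMSE, MAE, MPE) can be computed in a few lines; the rainfall values below are illustrative, not the FUTA measurements.

```python
import math

def forecast_errors(obs, pred):
    # Standard error metrics for comparing forecasts against observations:
    # RMSE penalises large errors quadratically, MAE averages absolute
    # errors, and MPE (signed, in %) reveals systematic over/under-prediction.
    n = len(obs)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)
    mae = sum(abs(o - p) for o, p in zip(obs, pred)) / n
    mpe = 100.0 * sum((o - p) / o for o, p in zip(obs, pred)) / n
    return rmse, mae, mpe

# Illustrative monthly rainfall values (mm), not the study's data.
observed = [120.0, 80.0, 150.0]
predicted = [110.0, 85.0, 140.0]
rmse, mae, mpe = forecast_errors(observed, predicted)
```

Note that RMSE is always at least as large as MAE, and the gap between the two indicates how much the error distribution is dominated by a few large misses.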
Procedia PDF Downloads 91
1384 AI-Based Information System for Hygiene and Safety Management of Shared Kitchens
Authors: Jongtae Rhee, Sangkwon Han, Seungbin Ji, Junhyeong Park, Byeonghun Kim, Taekyung Kim, Byeonghyeon Jeon, Jiwoo Yang
Abstract:
The shared kitchen is a concept that transfers the value of the sharing economy to the kitchen. It is a type of kitchen equipped with cooking facilities that allows multiple companies or chefs to share time and space and use it jointly. Shared kitchens provide economic benefits and convenience, such as reduced investment costs and rent, but also increase safety management risks, such as cross-contamination of food ingredients. Therefore, to manage the safety of food ingredients and finished products in a shared kitchen where several entities jointly use the kitchen and handle various types of food ingredients, it is critical to manage the following: the freshness of food ingredients, user hygiene and safety, and cross-contamination of cooking equipment and facilities. In this study, we propose a machine learning-based system for hygiene safety and cross-contamination management, which are highly difficult to manage. User clothing management and user access management, which are most relevant to the hygiene and safety of shared kitchens, are solved through a machine learning-based methodology, and cutting board usage management, which is most relevant to cross-contamination management, is implemented as an integrated safety management system based on artificial intelligence. First, to prevent cross-contamination of food ingredients, we use images collected through a real-time camera to determine whether the food ingredients match a given cutting board, based on the real-time object detection model YOLOv7. To manage the hygiene of user clothing, we use a camera-based facial recognition model to recognize the user and a real-time object detection model to determine whether a sanitary hat and mask are worn. In addition, to manage access for users qualified to enter the shared kitchen, we utilize a machine learning-based signature recognition module. By comparing the pairwise distance between the contract signature and the signature at the time of entrance to the shared kitchen, access permission is determined through a pre-trained signature verification model. These machine learning-based safety management tasks are integrated into a single information system, and each result is managed in an integrated database. Through this, users are warned of safety hazards through a tablet PC installed in the shared kitchen, and managers can trace the causes of hygiene and safety incidents. As a result of the system integration analysis, real-time safety management services can be continuously provided by artificial intelligence, and machine learning-based methodologies are used for integrated safety management of shared kitchens, which allows dynamic contracts among various users. By solving this problem, we were able to secure the feasibility and safety of the shared kitchen business.
Keywords: artificial intelligence, food safety, information system, safety management, shared kitchen
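The pairwise-distance access decision described above reduces to thresholding the distance between two signature embeddings. A minimal sketch, assuming the embeddings come from the pre-trained signature verification model; the vectors and the threshold value are illustrative, not the system's actual parameters.

```python
import math

def euclidean(a, b):
    # Pairwise Euclidean distance between two embedding vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def grant_access(contract_emb, entry_emb, threshold=0.5):
    # Access is granted when the at-entrance signature embedding is close
    # enough to the contract signature embedding. The threshold is an
    # illustrative assumption, not the deployed system's value.
    return euclidean(contract_emb, entry_emb) <= threshold

same_signer = grant_access([0.12, 0.80, 0.33], [0.10, 0.78, 0.35])
impostor = grant_access([0.12, 0.80, 0.33], [0.90, 0.10, 0.75])
```

In practice the threshold trades false rejections of legitimate users against false acceptances of impostors, and would be calibrated on held-out signature pairs.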
Procedia PDF Downloads 69
1383 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs
Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.
Abstract:
Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL become the definitive tool for data classification? Current solutions all consist of repositioning the variables in a 2D matrix using their correlation proximity; this yields an image whose pixels are the variables. We implement a technology, DeepNIC, that instead produces one image per variable, which can be analyzed by simple CNNs. Material and method: The ROP (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision tree, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in three dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 hyperparameters used in the Neurops. By varying these 2 hyperparameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR; the total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of each pixel is proportional to the probability of the associated NIC, and its color depends on the associated NIC. This image actually contains considerable information about the variable's ability to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data set (omic markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification
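The step of turning a NIC probability matrix into pixel intensities is straightforward to sketch: each probability maps linearly to an 8-bit grey level. The 2x2 matrix below is illustrative; the actual DeepNIC images tile many such matrices across the full 1166x1167 canvas.

```python
def nic_to_grey(prob):
    # Map a NIC (a probability in [0, 1]) linearly to an 8-bit grey level,
    # as DeepNIC renders each criterion's probability matrix as pixel
    # intensities. Out-of-range values are rejected rather than clipped.
    if not 0.0 <= prob <= 1.0:
        raise ValueError("a NIC must be a probability")
    return round(prob * 255)

# Illustrative 2x2 NIC probability matrix (from varying the 2 hyperparameters).
nic_matrix = [[0.1, 0.9], [0.5, 1.0]]
grey_tile = [[nic_to_grey(p) for p in row] for row in nic_matrix]
```

Because the mapping is monotone, regions of the image that the CNN finds salient correspond directly to hyperparameter settings where the variable's predictive quality is high.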
Procedia PDF Downloads 125
1382 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average five-year life expectancy; early diagnosis, detection, and prediction reduce the need for risky invasive surgery among the treatment options and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are common for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement. Lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid, and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) for determining whether the patient's condition is normal or abnormal, and Artificial Neural Networks (ANN) for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology finds encouraging results for real-time information and online detection for future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
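The DWT feature-extraction step named above splits a signal into approximation (low-frequency) and detail (high-frequency) coefficients. A one-level Haar transform in 1-D shows the core operation; the paper's pipeline would apply a 2-D transform to the segmented lung image, and the input signal here is illustrative.

```python
def haar_dwt_1d(signal):
    # One level of the Haar discrete wavelet transform: pairwise averages
    # give the approximation coefficients, pairwise half-differences give
    # the detail coefficients. Assumes an even-length input.
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

approx, detail = haar_dwt_1d([4.0, 2.0, 5.0, 7.0])
```

Recursing on the approximation coefficients yields a multi-level decomposition, and the subband statistics at each level are the kind of compact texture features typically fed to KNN or ANN classifiers.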
Procedia PDF Downloads 153
1381 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification
Authors: Meimei Shi
Abstract:
Objectives: The identification of endangered species, especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and in preventing illegal trade. This study aimed to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to the conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at 1% to 100% DNA concentrations. After multiplex-PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. The DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering; representative sequences were blasted using BLASTn. Results: According to the parameter modification and multiplex-PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S, respectively, were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at 1% (DNA concentration). In conclusion, the established species identification method provides technical support for customs and forensic scientists to prevent the illegal trade of endangered animals and their products.
Keywords: DNA metabarcoding, endangered animal species, mitochondria nucleic acid, multi-locus
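As an illustrative aside, the reads-filtering step can be mimicked with a simple relative-abundance threshold; the read counts below are hypothetical stand-ins, not data from the study:

```python
# Hypothetical reads-per-species tallies after OTU clustering and BLASTn
# assignment (made-up numbers; real counts come from the MiSeq run).
otu_counts = {
    "Moschus berezovskii": 5200,
    "Ursus thibetanus": 4100,
    "Panthera tigris": 3900,
    "Manis pentadactyla": 980,
    "Cervus nippon": 870,
    "putative artifact": 40,   # spurious low-abundance cluster
}

def filter_by_abundance(counts, threshold=0.01):
    """Keep only assignments at or above the relative-abundance threshold."""
    total = sum(counts.values())
    return {sp: n for sp, n in counts.items() if n / total >= threshold}

detected = filter_by_abundance(otu_counts)   # drops the 40-read artifact
```

The 1% threshold here simply echoes the lowest DNA concentration tested in the mixtures; real pipelines tune this cutoff per locus and run.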
Procedia PDF Downloads 139
1380 Detection of Atrial Fibrillation Using Wearables via Attentional Two-Stream Heterogeneous Networks
Authors: Huawei Bai, Jianguo Yao, Fellow, IEEE
Abstract:
Atrial fibrillation (AF) is the most common form of heart arrhythmia and is closely associated with mortality and morbidity in heart failure, stroke, and coronary artery disease. The development of single-spot optical sensors enables widespread photoplethysmography (PPG) screening, especially for AF, since it represents a more convenient and noninvasive approach. To our knowledge, most existing studies, based on public and unbalanced datasets, can barely handle the multiple noise sources of the real world and also lack interpretability. In this paper, we construct a large-scale PPG dataset using measurements collected from PPG wristwatch devices worn by volunteers and propose an attention-based two-stream heterogeneous neural network (TSHNN). The first stream is a hybrid neural network consisting of a three-layer one-dimensional convolutional neural network (1D-CNN) and a two-layer attention-based bidirectional long short-term memory (Bi-LSTM) network to learn representations from temporally sampled signals. The second stream extracts latent representations from the PPG time-frequency spectrogram using a five-layer CNN. The outputs from both streams are fed into a fusion layer for the outcome. Visualization of the learned attention weights demonstrates the effectiveness of the attention mechanism against noise. The experimental results show that the TSHNN outperforms all the competitive baseline approaches and, with 98.09% accuracy, achieves state-of-the-art performance.
Keywords: PPG wearables, atrial fibrillation, feature fusion, attention mechanism, hybrid network
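As a schematic illustration of the fusion layer (not the trained TSHNN), the NumPy sketch below attention-weights two stream outputs and sums them; the 64-dimensional features and the scoring vector are random stand-ins for learned quantities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two stream outputs for one PPG segment:
# temporal features (1D-CNN + Bi-LSTM) and spectrogram features (2D CNN).
temporal_feat = rng.standard_normal(64)
spectral_feat = rng.standard_normal(64)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fusion(feats, w):
    """Score each stream, softmax the scores, and form a weighted sum."""
    scores = np.array([f @ w for f in feats])     # one scalar per stream
    alpha = softmax(scores)                       # attention weights
    fused = sum(a * f for a, f in zip(alpha, feats))
    return fused, alpha

w = rng.standard_normal(64)   # stand-in for a learned scoring vector
fused, alpha = attention_fusion([temporal_feat, spectral_feat], w)
```

In the paper's setting the weights are learned end-to-end, which is what lets the visualized attention down-weight the noisier stream.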
Procedia PDF Downloads 121
1379 Forecast Financial Bubbles: Multidimensional Phenomenon
Authors: Zouari Ezzeddine, Ghraieb Ikram
Abstract:
Drawing on the academic literature, which highlights the limitations of previous studies, this article sets out the reasons why the prediction of financial bubbles is multidimensional. It proposes a new framework for modeling and predicting financial bubbles that links a set of variables spread across several dimensions, dictating its multidimensional character, and takes into account the preferences of financial actors. A multicriteria anticipation of the appearance of bubbles in international financial markets helps to guard against a possible crisis.
Keywords: classical measures, predictions, financial bubbles, multidimensional, artificial neural networks
Procedia PDF Downloads 577
1378 Innovation Management in E-Health Care: The Implementation of New Technologies for Health Care in Europe and the USA
Authors: Dariusz M. Trzmielak, William Bradley Zehner, Elin Oftedal, Ilona Lipka-Matusiak
Abstract:
The use of new technologies should create new value for all stakeholders in the healthcare system. The article focuses on demonstrating that technologies or products typically enable new functionality, a higher standard of service, or a higher level of knowledge and competence for clinicians. It also highlights the key benefits that can be achieved through the use of artificial intelligence, such as relieving clinicians of many tasks and enabling the expansion and greater specialisation of healthcare services. The comparative analysis allowed the authors to create a classification of new technologies in e-health according to health needs and benefits for patients, doctors, and healthcare systems, i.e., the main stakeholders in the implementation of new technologies and products in healthcare. The added value of the development of new technologies in healthcare is diagnosed. The work is both theoretical and practical in nature. The primary research methods are bibliographic analysis and analysis of research data and market potential of new solutions for healthcare organisations. The bibliographic analysis is complemented by the author's case studies of implemented technologies, mostly based on artificial intelligence or telemedicine. In the past, patients were often passive recipients, the end point of the service delivery system, rather than stakeholders in the system. One of the dangers of powerful new technologies is that patients may become even more marginalised. Healthcare will be provided and delivered in an increasingly administrative, programmed way. The doctor may also become a robot, carrying out programmed activities - using 'non-human services'. An alternative approach is to put the patient at the centre, using technologies, products, and services that allow them to design and control technologies based on their own needs. 
An important contribution to the discussion is to open up the different dimensions of the user (carer and patient) and to make them aware of healthcare units implementing new technologies. The authors of this article outline the importance of three types of patients in the successful implementation of new medical solutions. The impact of implemented technologies is analysed based on: 1) "Informed users", who are able to use the technology based on a better understanding of it; 2) "Engaged users" who play an active role in the broader healthcare system as a result of the technology; 3) "Innovative users" who bring their own ideas to the table based on a deeper understanding of healthcare issues. The authors' research hypothesis is that the distinction between informed, engaged, and innovative users has an impact on the perceived and actual quality of healthcare services. The analysis is based on case studies of new solutions implemented in different medical centres. In addition, based on the observations of the Polish author, who is a manager at the largest medical research institute in Poland, with analytical input from American and Norwegian partners, the added value of the implementations for patients, clinicians, and the healthcare system will be demonstrated.
Keywords: innovation, management, medicine, e-health, artificial intelligence
Procedia PDF Downloads 20
1377 Leadership in the Era of AI: Growing Organizational Intelligence
Authors: Mark Salisbury
Abstract:
The arrival of artificially intelligent avatars and the automation they bring is worrying many of us, not only for our livelihood but for the jobs that may be lost to our kids. We worry about what our place will be as human beings in this new economy where much of it will be conducted online in the metaverse – in a network of 3D virtual worlds – working with intelligent machines. The Future of Leadership was written to address these fears and show what our place will be – the right place – in this new economy of AI avatars, automation, and 3D virtual worlds. But to be successful in this new economy, our job will be to bring wisdom to our workplace and the marketplace. And we will use AI avatars and 3D virtual worlds to do it. However, this book is about more than AI and the avatars that we will work with in the metaverse. It’s about building Organizational intelligence (OI) -- the capability of an organization to comprehend and create knowledge relevant to its purpose; in other words, it is the intellectual capacity of the entire organization. To increase organizational intelligence requires a new kind of knowledge worker, a wisdom worker, that requires a new kind of leadership. This book begins your story for how to become a leader of wisdom workers and be successful in the emerging wisdom economy. After this presentation, conference participants will be able to do the following: Recognize the characteristics of the new generation of wisdom workers and how they differ from their predecessors. Recognize that new leadership methods and techniques are needed to lead this new generation of wisdom workers. Apply personal and professional values – personal integrity, belief in something larger than yourself, and keeping the best interest of others in mind – to improve your work performance and lead others. Exhibit an attitude of confidence, courage, and reciprocity of sharing knowledge to increase your productivity and influence others. 
Leverage artificial intelligence to accelerate your ability to learn, augment your decision-making, and influence others. Utilize new technologies to communicate with human colleagues and intelligent machines to develop better solutions more quickly.
Keywords: metaverse, generative artificial intelligence, automation, leadership, organizational intelligence, wisdom worker
Procedia PDF Downloads 43
1376 Comparison of Surface Hardness of Filling Material Glass Ionomer Cement Which Soaked in Alcohol Containing Mouthwash and Alcohol-Free Mouthwash
Authors: Farid Yuristiawan, Aulina R. Rahmi, Detty Iryani, Gunawan
Abstract:
Glass ionomer cement is one of the filling materials often used in the field of dentistry because it is relatively inexpensive and widely available. Surface hardness is one of the most important properties of a restoration material; it is the ability of the material to resist indentation, which is directly connected to its compressive strength and its ability to withstand abrasion. The higher the surface hardness of a material, the better it withstands abrasion. The presence of glass ionomer cement in the mouth makes it susceptible to any substance that enters the mouth; one of these is mouthwash, a solution used for many purposes: as an antiseptic or astringent, and to prevent caries and bad breath. The presence of alcohol in mouthwash could affect properties of glass ionomer cement such as surface hardness. Objective: To compare the surface hardness of glass ionomer cement soaked in alcohol-containing mouthwash and in alcohol-free mouthwash. Methods: This research is a laboratory experimental study. Thirty samples were made from GC FUJI IX GP EXTRA and then soaked in artificial saliva for the first 24 hours inside an incubator with controlled temperature and humidity. The samples were then divided into three groups: the first group was soaked in alcohol-containing mouthwash, the second group in alcohol-free mouthwash, and the control group in artificial saliva, each for 6 hours inside the incubator. Listerine was the mouthwash used in this research, and surface hardness was examined using a Vickers Hardness Tester. The results show a mean surface hardness of 16.36 VHN for the first group, 24.04 VHN for the second group, and 43.60 VHN for the control group. One-way ANOVA with a post hoc Bonferroni comparison test showed significant results (p = 0.00).
Conclusions: The data showed statistically significant differences in surface hardness between the groups: the surface hardness of the first group was lower than that of the second group, and the surface hardness of both the first (alcohol-containing mouthwash) and second (alcohol-free mouthwash) groups was lower than that of the control group (p = 0.00).
Keywords: glass ionomer cement, mouthwash, surface hardness, Vickers hardness tester
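For readers unfamiliar with the statistics, a one-way ANOVA F statistic can be computed from scratch as below; the three sample lists are hypothetical illustrations whose means merely echo the reported 16.36 / 24.04 / 43.60 VHN group means:

```python
# Hypothetical Vickers hardness readings (VHN), three per group, chosen
# only so the group means roughly match the values reported above.
groups = [
    [15.9, 16.4, 16.8],   # alcohol-containing mouthwash
    [23.6, 24.0, 24.5],   # alcohol-free mouthwash
    [43.1, 43.6, 44.1],   # artificial saliva (control)
]

def f_statistic(groups):
    """One-way ANOVA F: between-group vs. within-group mean squares."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

F = f_statistic(groups)   # very large F: group means differ strongly
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) is what yields the reported p = 0.00.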
Procedia PDF Downloads 224
1375 Safeguarding the Construction Industry: Interrogating and Mitigating Emerging Risks from AI in Construction
Authors: Abdelrhman Elagez, Rolla Monib
Abstract:
This empirical study investigates the observed risks associated with adopting Artificial Intelligence (AI) technologies in the construction industry and proposes potential mitigation strategies. While AI has transformed several industries, the construction industry is slowly adopting advanced technologies like AI, introducing new risks that lack critical analysis in the current literature. A comprehensive literature review identified a research gap, highlighting the lack of critical analysis of risks and the need for a framework to measure and mitigate the risks of AI implementation in the construction industry. Consequently, an online survey was conducted with 24 project managers and construction professionals, possessing experience ranging from 1 to 30 years (with an average of 6.38 years), to gather industry perspectives and concerns relating to AI integration. The survey results yielded several significant findings. Firstly, respondents exhibited a moderate level of familiarity (66.67%) with AI technologies, while the industry's readiness for AI deployment and current usage rates remained low at 2.72 out of 5. Secondly, the top-ranked barriers to AI adoption were identified as lack of awareness, insufficient knowledge and skills, data quality concerns, high implementation costs, absence of prior case studies, and the uncertainty of outcomes. Thirdly, the most significant risks associated with AI use in construction were perceived to be a lack of human control (decision-making), accountability, algorithm bias, data security/privacy, and lack of legislation and regulations. Additionally, the participants acknowledged the value of factors such as education, training, organizational support, and communication in facilitating AI integration within the industry. 
These findings emphasize the necessity for tailored risk assessment frameworks, guidelines, and governance principles to address the identified risks and promote the responsible adoption of AI technologies in the construction sector.
Keywords: risk management, construction, artificial intelligence, technology
Procedia PDF Downloads 98
1374 Bridging Minds and Nature: Revolutionizing Elementary Environmental Education Through Artificial Intelligence
Authors: Hoora Beheshti Haradasht, Abooali Golzary
Abstract:
Environmental education plays a pivotal role in shaping the future stewards of our planet. Leveraging the power of artificial intelligence (AI) in this endeavor presents an innovative approach to captivate and educate elementary school children about environmental sustainability. This paper explores the application of AI technologies in designing interactive and personalized learning experiences that foster curiosity, critical thinking, and a deep connection to nature. By harnessing AI-driven tools, virtual simulations, and personalized content delivery, educators can create engaging platforms that empower children to comprehend complex environmental concepts while nurturing a lifelong commitment to protecting the Earth. With the pressing challenges of climate change and biodiversity loss, cultivating an environmentally conscious generation is imperative. Integrating AI in environmental education revolutionizes traditional teaching methods by tailoring content, adapting to individual learning styles, and immersing students in interactive scenarios. This paper delves into the potential of AI technologies to enhance engagement, comprehension, and pro-environmental behaviors among elementary school children. Modern AI technologies, including natural language processing, machine learning, and virtual reality, offer unique tools to craft immersive learning experiences. Adaptive platforms can analyze individual learning patterns and preferences, enabling real-time adjustments in content delivery. Virtual simulations, powered by AI, transport students into dynamic ecosystems, fostering experiential learning that goes beyond textbooks. AI-driven educational platforms provide tailored content, ensuring that environmental lessons resonate with each child's interests and cognitive level. By recognizing patterns in students' interactions, AI algorithms curate customized learning pathways, enhancing comprehension and knowledge retention. 
Utilizing AI, educators can develop virtual field trips and interactive nature explorations. Children can navigate virtual ecosystems, analyze real-time data, and make informed decisions, cultivating an understanding of the delicate balance between human actions and the environment. While AI offers promising educational opportunities, ethical concerns must be addressed. Safeguarding children's data privacy, ensuring content accuracy, and avoiding biases in AI algorithms are paramount to building a trustworthy learning environment. By merging AI with environmental education, educators can empower children not only with knowledge but also with the tools to become advocates for sustainable practices. As children engage in AI-enhanced learning, they develop a sense of agency and responsibility to address environmental challenges. The application of artificial intelligence in elementary environmental education presents a groundbreaking avenue to cultivate environmentally conscious citizens. By embracing AI-driven tools, educators can create transformative learning experiences that empower children to grasp intricate ecological concepts, forge an intimate connection with nature, and develop a strong commitment to safeguarding our planet for generations to come.
Keywords: artificial intelligence, environmental education, elementary children, personalized learning, sustainability
Procedia PDF Downloads 82
1373 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises. Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (Quantitative and Qualitative Analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. 
These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
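As a toy illustration of one cleansing task mentioned above (duplicate-record detection), the stdlib sketch below flags fuzzily matching names; the records and the 0.6 similarity threshold are assumptions for demonstration, not the system described in the paper:

```python
from difflib import SequenceMatcher

# Hypothetical master-data records; a real MDM pipeline would match on
# many attributes, not just one name field.
records = [
    {"id": 1, "name": "Acme Corporation"},
    {"id": 2, "name": "ACME Corp."},
    {"id": 3, "name": "Globex Industries"},
]

def normalize(name):
    """Lowercase and strip non-alphanumerics before comparison."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def duplicate_pairs(records, threshold=0.6):
    """Flag pairs of records whose normalized names are similar."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a = normalize(records[i]["name"])
            b = normalize(records[j]["name"])
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((records[i]["id"], records[j]["id"]))
    return pairs

dups = duplicate_pairs(records)   # ids 1 and 2 are flagged as duplicates
```

Reductions like the reported 75% drop in duplicate records come from running matching of this general kind, at scale and with learned similarity models rather than a fixed ratio.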
Procedia PDF Downloads 17
1372 Low Enrollment in Civil Engineering Departments: Challenges and Opportunities
Authors: Alaa Yehia, Ayatollah Yehia, Sherif Yehia
Abstract:
There is a recurring issue of low enrollments across many civil engineering departments in postsecondary institutions. While there have been moments where enrollments begin to increase, civil engineering departments find themselves facing low enrollments at around 60% over the last five years across the Middle East. Many reasons could be attributed to this decline, such as low entry-level salaries, over-saturation of civil engineering graduates in the job market, and a lack of construction projects due to the impending or current recession. However, this recurring problem points to an intrinsic issue with the curriculum. The societal shift to the usage of high technology such as machine learning (ML) and artificial intelligence (AI) demands individuals who are proficient at utilizing it. Therefore, existing curriculums must adapt to this change in order to provide an education that is suitable for potential and current students. In this paper, in order to provide potential solutions for this issue, the analysis considers two possible implementations of high technology into the civil engineering curriculum. The first approach is to implement a course that introduces applications of high technology in civil engineering contexts, while the other approach is to intertwine applications of high technology throughout the degree. Both approaches, however, should meet the requirements of accreditation agencies. In addition to the proposed improvement in the civil engineering curriculum, a different pedagogical practice must be adopted as well. The passive learning approach might not be appropriate for Gen Z students; current students, now more than ever, need to be introduced to engineering topics and practice following different learning methods to ensure they will have the necessary skills for the job market.
Different learning methods that incorporate high technology applications, like AI, must be integrated throughout the curriculum to make the civil engineering degree more attractive to prospective students. Moreover, the paper provides insight into the importance of, and an approach to, adapting the civil engineering curriculum to address the current low-enrollment crisis that civil engineering departments globally, but especially in the Middle East, are facing.
Keywords: artificial intelligence (AI), civil engineering curriculum, high technology, low enrollment, pedagogy
Procedia PDF Downloads 166
1371 Efficacy of Erector Spinae Plane Block for Postoperative Pain Management in Coronary Artery Bypass Graft Patients
Authors: Santosh Sharma Parajuli, Diwas Manandhar
Abstract:
Background: Perioperative pain management plays an integral part in the care of patients undergoing cardiac surgery. We studied the effect of Erector Spinae Plane (ESP) block on acute postoperative pain reduction and 24-hour opioid consumption in adult cardiac surgical patients. Methods: Twenty-five adult patients who underwent cardiac surgery with sternotomy and in whom ESP catheters were placed preoperatively were kept in group E; the other 25 patients, who underwent cardiac surgery without an ESP catheter and whose pain was managed with conventional opioid injection, were placed in group C. Fentanyl was used for pain management. The primary study endpoint was to compare fentanyl consumption and to assess the numeric rating scale (NRS) in the first 24 hours postoperatively in both groups. Results: The 24-hour fentanyl consumption was 43.00±51.29 micrograms in the Erector Spinae Plane catheter group and 147.00±60.94 micrograms in the control group postoperatively, which was statistically significant (p < 0.001). The numeric rating scale was also significantly reduced in the Erector Spinae Plane group compared to the control group in the first 24 hours postoperatively. Conclusion: Erector Spinae Plane block is superior to the conventional opioid injection method for postoperative pain management in CABG patients. Erector Spinae Plane block decreases not only the overall opioid consumption but also the NRS score in these patients.
Keywords: erector, spinae, plane, numerical rating scale
Procedia PDF Downloads 66
1370 The Efficacy of Box Lesion+ Procedure in Patients with Atrial Fibrillation: Two-Year Follow-up Results
Authors: Oleg Sapelnikov, Ruslan Latypov, Darina Ardus, Samvel Aivazian, Andrey Shiryaev, Renat Akchurin
Abstract:
OBJECTIVE: The MAZE procedure is one of the most effective surgical methods of atrial fibrillation (AF) treatment, and several modifications of it are now in use. In our study, we conducted a clinical analysis of the “Box lesion+” approach during the MAZE procedure over a two-year follow-up. METHODS: We studied the results of the open-heart, on-pump procedures performed in our hospital from 2017 to 2018. Thirty-two (32) patients with atrial fibrillation (AF) were included in this study. Fifteen (15) patients had concomitant coronary bypass grafting, and seventeen (17) patients had mitral valve repair. Mean age was 62.3±8.7 years; a predominance of men was noted (56.1%). Mean duration of AF was 4.75±5.44 and 7.07±8.14 years, respectively. In all cases, we performed an endocardial Cryo-MAZE procedure with concomitant myocardial revascularization or mitral valve surgery. All patients in this study underwent pulmonary vein (PV) isolation and ablation of the mitral isthmus with additional isolation of the LA posterior wall (Box lesion+ procedure). Mean follow-up was 2 years. RESULTS: All cases were performed without any complications. Additional isolation of the posterior wall did not significantly prolong the operative or artificial circulation time. The Cryo-MAZE procedure itself lasted 20±2.1 min, the whole operation took 192±24 min, and the artificial circulation time was 103±12 min. According to the design of the study, patients were clinically examined at 12 months and at 2 years after the initial procedure. At 12 months, 81.8% of patients were free of AF, and 75.8% at two years of follow-up. CONCLUSIONS: Isolation of the left atrial posterior wall and the perimitral area may considerably improve the efficacy of surgical treatment, as demonstrated by the significant decrease in AF recurrences during the whole follow-up period.
Keywords: atrial fibrillation, cryoablation, left atrium isolation, open heart procedure
Procedia PDF Downloads 126
1369 Comparison of GIS-Based Soil Erosion Susceptibility Models Using Support Vector Machine, Binary Logistic Regression and Artificial Neural Network in the Southwest Amazon Region
Authors: Elaine Lima Da Fonseca, Eliomar Pereira Da Silva Filho
Abstract:
Modeling areas susceptible to soil loss by hydro-erosive processes is a simplified representation of reality intended to predict future behavior from the observation and interaction of a set of geoenvironmental factors. Models of potential areas for soil loss will be obtained through binary logistic regression, artificial neural networks, and support vector machines. The choice of the municipality of Colorado do Oeste, in the south of the western Amazon, is due to soil degradation caused by anthropogenic activities such as agriculture, road construction, overgrazing, and deforestation, together with its environmental and socioeconomic configuration. Initially, a soil erosion inventory map will be constructed through various field investigations, including the use of remotely piloted aircraft, orbital imagery, and the PLANAFLORO/RO database. One hundred sampling units with the presence of erosion will be selected based on the assumptions indicated in the literature, and, to complement the dichotomous analysis, 100 units with no erosion will be randomly designated. The next step will be the selection of the predictive parameters that exert, jointly, directly, or indirectly, some influence on the mechanism of occurrence of soil erosion events. The chosen predictors are altitude, declivity, aspect or orientation of the slope, curvature of the slope, composite topographic index, stream power index, lineament density, normalized difference vegetation index, drainage density, lithology, soil type, erosivity, and ground surface temperature. After evaluating the relative contribution of each predictor variable, the erosion susceptibility model will be applied to the municipality of Colorado do Oeste - Rondônia using the SPSS Statistics 26 software.
Evaluation of the model will occur through the determination of the Cox & Snell R² and Nagelkerke R² values, the Hosmer and Lemeshow test, the log-likelihood value, and the Wald test, in addition to analysis of the confusion matrix, the ROC curve, and the cumulative gain according to the model specification. Validation of the synthesis map of potential soil erosion risk resulting from both models will occur by means of Kappa indices, accuracy, and sensitivity, as well as by field verification of the erosion susceptibility classes using drone photogrammetry. Thus, it is expected to obtain a map of the following erosion susceptibility classes: very low, low, moderate, high, and very high, which may constitute a screening tool to identify areas where more detailed investigations need to be carried out, applying social resources more efficiently.
Keywords: modeling, susceptibility to erosion, artificial intelligence, Amazon
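To make the binary logistic regression step concrete, a minimal from-scratch fit is sketched below on synthetic, hypothetical data (two standardized predictors standing in for, e.g., declivity and NDVI, with a 100 + 100 erosion/no-erosion design echoing the sampling scheme above; nothing here uses the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training set: 100 "erosion" and 100 "no erosion" cells with
# two standardized predictors (hypothetical stand-ins for declivity, NDVI).
X_pos = rng.normal(loc=[1.0, -1.0], scale=0.7, size=(100, 2))
X_neg = rng.normal(loc=[-1.0, 1.0], scale=0.7, size=(100, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 100 + [0] * 100)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the log-loss for P(erosion) = sigmoid(Xw + b).
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

accuracy = float(((sigmoid(X @ w + b) > 0.5) == y).mean())
```

In practice the study fits this model in SPSS with thirteen predictors; the fitted coefficients are then examined with the Wald test and the fit with the pseudo-R² measures listed above.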
Procedia PDF Downloads 66
1368 Impact of Water Storage Structures on Groundwater Recharge in Jeloula Basin, Central Tunisia
Abstract:
An attempt has been made to examine the effect of water storage structures on groundwater recharge in a semi-arid agroclimatic setting in Jeloula Basin (Central Tunisia). In this area, surface water in rivers is seasonal, and therefore groundwater is the perennial source of water supply for domestic and agricultural purposes. Three pumped storage water power plants (PSWPP) have been built to increase the overall water availability in the basin and support agricultural livelihoods of rural smallholders. The scale and geographical dispersion of these multiple lakes restrict the understanding of these coupled human-water systems and the identification of adequate strategies to support riparian farmers. In the present review, hydrochemistry and isotopic tools were combined to get an insight into the processes controlling mineralization and recharge conditions in the investigated aquifer system. This study showed a slight increase in the groundwater level, especially after the artificial recharge operations and a decline when the water volume moves down during drought periods. Chemical data indicate that the main sources of salinity in the waters are related to water-rock interactions. Data inferred from stable isotopes in groundwater samples indicated recharge with modern rainfall. The investigated surface water samples collected from the PSWPP are affected by a significant evaporation and reveal large seasonal variations, which could be controlled by the water volume changes in the open surface reservoirs and the meteorological conditions during evaporation, condensation, and precipitation. The geochemical information is comparable to the isotopic results and illustrates that the chemical and isotopic signatures of reservoir waters differ clearly from those of groundwaters. These data confirm that the contribution of the artificial recharge operations from the PSWPP is very limited.
Keywords: Jeloula basin, recharge, hydrochemistry, isotopes
Procedia PDF Downloads 152
1367 Robotic Lingulectomy for Primary Lung Cancer: A Video Presentation
Authors: Abraham J. Rizkalla, Joanne F. Irons, Christopher Q. Cao
Abstract:
Purpose: Lobectomy was considered the standard of care for early-stage non-small cell lung cancer (NSCLC) after the Lung Cancer Study Group trial demonstrated increased locoregional recurrence with sublobar resections. However, interest in segmentectomy has grown for selected patients with peripheral lesions ≤2 cm, as investigated by the JCOG0802 and CALGB140503 trials. Minimally invasive robotic surgery facilitates segmentectomy with improved maneuverability and visualization of intersegmental planes using indocyanine green. We present a patient who underwent robotic lingulectomy for an undiagnosed ground-glass opacity. Methodology: This video demonstrates a robotic portal lingulectomy using three 8 mm ports and a 12 mm port. Stereoscopic direct vision facilitated identification of the lingular artery and vein, and intra-operative bronchoscopy was performed to confirm the lingular bronchus. The intersegmental plane was identified with indocyanine green and a near-infrared camera. Thorough lymph node sampling was performed in accordance with international standards. Results: The 18 mm lesion was excised with clear margins to achieve R0 resection, with no evidence of malignancy in the 8 lymph nodes sampled. Histopathological examination revealed lepidic-predominant adenocarcinoma, pathological stage IA. Conclusion: This video presentation exemplifies the standard approach to robotic portal lingulectomy in appropriately selected patients.
Keywords: lung cancer, robotic segmentectomy, indocyanine green, lingulectomy
Procedia PDF Downloads 67
1366 Enhancing Efficiency of Building through Translucent Concrete
Authors: Humaira Athar, Brajeshwar Singh
Abstract:
Generally, the brightness of the indoor environment of buildings is maintained entirely by artificial lighting, which consumes a large amount of resources. Lighting is reported to consume about 19% of total generated electricity, which accounts for about 30-40% of total building energy consumption. One way to reduce lighting energy is to exploit sunlight, either through suitable devices or through energy-efficient materials such as translucent concrete. Translucent concrete is an architectural concrete that allows the passage of natural as well as artificial light. Several attempts have been made on different aspects of translucent concrete, such as light-guiding materials (glass fibers, plastic fibers, cylinders, etc.), concrete mix design, and manufacturing methods for use as building elements. Concerns remain, however, over related issues such as poor compatibility between the optical fibers and the cement paste, unaesthetic appearance caused by disturbance of the fiber arrangement during vibration, and high shrinkage in flowable concrete due to its high water/cement ratio. There is a need to develop translucent concrete that meets the structural safety requirements of OPC concrete while maximizing energy savings in illumination and thermal load in buildings. Translucent concrete was produced using pre-treated plastic optical fibers (POF, 2 mm dia.) and high-slump white concrete. The concrete mix was proportioned in the ratio 1:1.9:2.1 with a w/c ratio of 0.40. The POF content was varied from 0.8 to 9 vol.%. The mechanical properties and light transmission of this concrete were determined. Thermal conductivity of the samples was measured by a transient plane source technique. Daylight illumination was measured by the lux grid method as per BIS:SP-41. It was found that the compressive strength of translucent concrete increased with decreasing optical fiber content.
An increase of ~28% in compressive strength was observed when the fiber was pre-treated. FE-SEM images showed little debonding between the fibers and the cement paste, consistent with the pull-out bond strength results (~187% improvement over untreated fibers). The light transmission of the concrete was in the range of 3-7%, depending on fiber spacing (5-20 mm). The average daylight illuminance (~75 lux) was nearly equivalent to the 80 lux specified for circulation lighting. The thermal conductivity of the translucent concrete was 28-40% lower than that of plain concrete, while the thermal load calculated from the heat conduction equation was ~16% higher than for plain concrete. Based on Design-Builder software, the total annual illumination energy load of a room with one translucent concrete wall was 162.36 kWh, compared with 249.75 kWh for a room without it. The calculated energy saving in illumination was ~25%. A marginal improvement in thermal comfort was also noticed. It is concluded that translucent concrete retains the load-bearing advantages of conventional concrete while adding translucency and insulation characteristics, and that it saves a significant amount of energy by providing natural daylight in place of artificial illumination.
Keywords: energy saving, light transmission, microstructure, plastic optical fibers, translucent concrete
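The thermal comparison above rests on one-dimensional steady-state conduction through the wall, q = k·ΔT/d. A minimal sketch with assumed values (typical plain-concrete conductivity and a mid-range 34% reduction; these are illustrative, not the paper's measurements):

```python
def conductive_flux(k: float, delta_t: float, thickness: float) -> float:
    """Steady-state 1-D conduction through a wall: q = k * dT / d, in W/m^2."""
    return k * delta_t / thickness

# Assumed illustrative values, not measured data:
k_plain = 1.4                        # W/(m*K), typical for plain concrete
k_translucent = k_plain * (1 - 0.34) # mid-range of the reported 28-40% reduction
dT, d = 10.0, 0.15                   # 10 K across a 0.15 m wall

q_plain = conductive_flux(k_plain, dT, d)
q_trans = conductive_flux(k_translucent, dT, d)
# Lower conductivity reduces the conducted flux; the ~16% higher total thermal
# load reported for the translucent wall comes from the additional solar gain
# transmitted through the optical fibers, not from conduction.
```

This separation of conducted loss from transmitted solar gain is why a lower-conductivity translucent wall can still carry a higher total thermal load.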
Procedia PDF Downloads 128