Search results for: deep vein imaging
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3306

2556 Challenges of Management of Subaortic Membrane in a Young Adult Patient: A Case Review and Literature Review

Authors: Talal Asif, Maya Kosinska, Lucas Georger, Krish Sardesai, Muhammad Shah Miran

Abstract:

This article presents a case review and literature review focused on the challenges of managing subaortic membranes (SAM) in young adult patients with mild aortic regurgitation (AR) or aortic stenosis (AS). The study aims to discuss the diagnosis of SAM, imaging studies used for assessment, management strategies in young patients, the risk of valvular damage, and the controversy surrounding prophylactic resection in mild AR. The management of SAM in adults poses challenges due to limited treatment options and potential complications, necessitating further investigation into the progression of AS and AR in asymptomatic SAM patients. The case presentation describes a 40-year-old male with muscular dystrophy who presented with symptoms and was diagnosed with SAM. Various imaging techniques, including CT chest, transthoracic echocardiogram (TTE), and transesophageal echocardiogram (TEE), were used to confirm the presence and severity of SAM. Based on the patient's clinical profile and the absence of surgical indications, medical therapy was initiated, and regular outpatient follow-up was recommended to monitor disease progression. The discussion highlights the challenges in diagnosing SAM, the importance of imaging studies, and the potential complications associated with SAM in young patients. The article also explores the management options for SAM, emphasizing surgical resection as the definitive treatment while acknowledging the limited success rates of alternative approaches. Close monitoring and prompt intervention for complications are crucial in the management of SAM. The concluding statement emphasizes the need for further research to explore alternative treatments for SAM in young patients.

Keywords: subaortic membrane, management, case report, literature review, aortic regurgitation, aortic stenosis, left ventricular outflow obstruction, guidelines, heart failure

Procedia PDF Downloads 78
2555 Geology and Geochemistry of the Paleozoic Basement, Western Algeria

Authors: Hadj Mohamed Nacera, Boutaleb Abdelhak

Abstract:

The Hercynian granite in Western Algeria has a typical high-K calc-alkaline evolution with a peraluminous trend. U-Pb zircon geochronology yielded a minimum emplacement age of 297 ± 1 Ma. The granite shows dark microgranular enclaves and veins of pegmatite, aplite, tourmaline and quartz. The plutons selected for this study formed during the late Variscan phase and intrude the Lower Silurian metasediments, which were affected by the major Hercynian folding phases. An important quartz vein field cross-cuts the metasedimentary and granitic rocks. Invisible gold occurs in very small arsenopyrite grains. The purpose of this study is to highlight the relationship between the gold mineralisation and the intrusion by combining petrographic and geochemical studies.

Keywords: Algeria, basement, geochemistry, granite

Procedia PDF Downloads 251
2554 Comparative Analysis of Reinforcement Learning Algorithms for Autonomous Driving

Authors: Migena Mana, Ahmed Khalid Syed, Abdul Malik, Nikhil Cherian

Abstract:

In recent years, advancements in deep learning have enabled researchers to tackle the problem of self-driving cars. Car companies use huge datasets to train their deep learning models to make autonomous cars a reality. However, this approach has a drawback: the state space of possible actions for a car is so huge that there cannot be a dataset for every possible road scenario. To overcome this problem, the concept of reinforcement learning (RL) is investigated in this research. Since the problem of autonomous driving can be modeled in a simulation, it lends itself naturally to the domain of reinforcement learning. The advantage of this approach is that we can model different and complex road scenarios in a simulation without having to deploy in the real world. The autonomous agent can learn to drive by finding the optimal policy, and the learned model can then be easily deployed in a real-world setting. In this project, we focus on three RL algorithms: Q-learning, Deep Deterministic Policy Gradient (DDPG), and Proximal Policy Optimization (PPO). To model the environment, we have used TORCS (The Open Racing Car Simulator), which provides a strong foundation for testing our model. The inputs to the algorithms are the sensor data provided by the simulator, such as velocity and distance from the side pavement. The outcome of this research project is a comparative analysis of these algorithms. Based on the comparison, the PPO algorithm gives the best results: the reward is greater, and the acceleration, steering angle and braking are more stable than with the other algorithms, which means that the agent learns to drive in a better and more efficient way. Additionally, we have compiled a dataset from the training of the agent with the DDPG and PPO algorithms. It contains all the steps of the agent during one full training run in the form (all input values, acceleration, steering angle, brake, loss, reward).
This study can serve as a basis for more complex road scenarios. Furthermore, it can be extended into the field of computer vision, using images to find the best policy.
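
The clipping behaviour that stabilizes PPO's policy updates, which the abstract credits for the smoother acceleration, steering and braking, can be sketched as follows. This is an illustrative minimal sketch of the clipped surrogate objective, not the authors' TORCS implementation; the function name and the sample values are invented for demonstration.

```python
def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Clipped surrogate loss for one (state, action) sample.

    ratio: pi_new(a|s) / pi_old(a|s), the policy probability ratio
    advantage: estimated advantage A(s, a)
    eps: clipping threshold (0.2 is the common default)
    """
    unclipped = ratio * advantage
    clipped = max(min(ratio, 1.0 + eps), 1.0 - eps) * advantage
    # PPO maximizes the minimum of the two terms; as a loss we negate it.
    return -min(unclipped, clipped)

# A ratio far above 1 + eps earns no extra credit when the advantage is
# positive, so a single minibatch cannot push the policy arbitrarily far:
assert abs(ppo_clip_loss(1.5, advantage=1.0) - (-1.2)) < 1e-9
# A harmful update (negative advantage) is never hidden by clipping,
# because min() keeps the worse of the two terms:
assert abs(ppo_clip_loss(1.5, advantage=-1.0) - 1.5) < 1e-9
```

The bounded per-update policy change this objective enforces is one plausible reason the PPO agent's control outputs were the most stable of the three algorithms compared.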

Keywords: autonomous driving, DDPG (deep deterministic policy gradient), PPO (proximal policy optimization), reinforcement learning

Procedia PDF Downloads 123
2553 O-(2-18F-Fluoroethyl)-L-Tyrosine Positron Emission Tomography/Computed Tomography in Patients with Suspected Recurrent Low- and High-Grade Glioma

Authors: Mahkameh Asadi, Habibollah Dadgar

Abstract:

Precise definition of tumor margins in high- and low-grade glioma is crucial for choosing the best treatment approach after surgery and radio-chemotherapy. The aim of the current study was to assess O-(2-18F-fluoroethyl)-L-tyrosine (18F-FET) positron emission tomography (PET)/computed tomography (CT) in patients with low-grade (LGG) and high-grade glioma (HGG). We retrospectively analyzed 18F-FET PET/CT of 10 patients (age: 33 ± 12 years) with suspected recurrent LGG or HGG. The final decision of recurrence was made by magnetic resonance imaging (MRI) and registered clinical data. While assessing response to radio-chemotherapy by MRI is often complex due to edema, necrosis, and inflammation, emerging amino acid PET allows better interpretation, more specifically differentiating true tumor boundaries from equivocal lesions. Therefore, integrating amino acid PET into the management of glioma to complement MRI will significantly improve early therapy response assessment, treatment planning, and clinical trial design.

Keywords: positron emission tomography, amino acid positron emission tomography, magnetic resonance imaging, low and high grade glioma

Procedia PDF Downloads 149
2552 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging

Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi

Abstract:

Introduction: Thyroid nodules have an incidence of 33-68% in the general population. More than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and allow optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consists of 70 patients (26 benign and 44 malignant), reported by a radiologist and proven by biopsy. Two slices per patient were loaded into MaZda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within each ROI were normalized according to three schemes: N1, default or original gray levels; N2, dynamic intensity limited to µ +/- 3σ; and N3, intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI per normalization scheme were computed from the well-known statistical methods implemented in the software. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis, so feature selection based on the maximum Fisher coefficient and on the minimum probability of classification error plus average correlation coefficient (POE+ACC) reduced them to the 10 best and most effective features per normalization scheme.
These features were analyzed under two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Non-Linear Discriminant Analysis (NDA). A 1-NN classifier was applied to distinguish between benign and malignant tumors. Confusion matrix and receiver operating characteristic (ROC) curve analysis were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as descriptors of discrimination power and on classification results. The subset of features selected under 1%-99% normalization, POE+ACC reduction and NDA texture analysis yielded a high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, a specificity of 100%, and an accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
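
Two of the building blocks above, the µ +/- 3σ ("N2") gray-level normalization and the 1-NN classifier, can be sketched directly. The ROI values and 2-D feature vectors below are invented stand-ins, not the actual texture features computed by the software.

```python
import numpy as np

def normalize_mu_3sigma(roi):
    """The 'N2' scheme: clip ROI gray levels to the mean +/- 3 standard deviations."""
    mu, sigma = roi.mean(), roi.std()
    return np.clip(roi, mu - 3 * sigma, mu + 3 * sigma)

def nn1_classify(train_feats, train_labels, query):
    """Minimal 1-NN classifier over texture-feature vectors (Euclidean distance)."""
    dists = np.linalg.norm(train_feats - query, axis=1)
    return train_labels[int(np.argmin(dists))]

# An ROI with one extreme outlier pixel gets clipped toward the bulk:
roi = np.array([10.0] * 50 + [500.0])
assert normalize_mu_3sigma(roi).max() < 500.0

# Invented 2-D feature vectors for the two classes (0 = benign, 1 = malignant):
feats = np.array([[0.10, 0.20], [0.15, 0.25], [0.90, 0.80], [0.85, 0.75]])
labels = np.array([0, 0, 1, 1])
assert nn1_classify(feats, labels, np.array([0.12, 0.22])) == 0
assert nn1_classify(feats, labels, np.array([0.88, 0.79])) == 1
```

In the study, the query vector would be the reduced 10-feature descriptor of a new nodule ROI rather than these toy coordinates.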

Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA

Procedia PDF Downloads 261
2551 IoT and Deep Learning Approach for Growth Stage Segregation and Harvest Time Prediction of Aquaponic and Vermiponic Swiss Chards

Authors: Praveen Chandramenon, Andrew Gascoyne, Fideline Tchuenbou-Magaia

Abstract:

Aquaponics offers a simple, closed-loop solution to the world's food and environmental crises. The approach combines aquaculture (growing fish) with hydroponics (growing vegetables and plants without soil). Smart aquaponics explores the use of smart technology, including artificial intelligence and IoT, to assist farmers with better decision making and online monitoring and control of the system. Identifying the growth stages of Swiss chard plants and predicting their harvest time is important in aquaponic yield management. This paper presents a comparative analysis of standard aquaponics and vermiponics (aquaponics with worms) grown in a controlled environment, implementing IoT and deep learning-based growth stage segregation and harvest time prediction of Swiss chards before and after applying optimal freshwater replenishment. Data collection, growth stage classification and harvest time prediction were performed with and without water replenishment. The paper discusses the experimental design, the IoT and sensor communication architecture, the data collection process, image segmentation, the various regression and classification models, and the error estimation used in the project. It concludes with a comparison of results, including the best-performing models for growth stage segregation and harvest time prediction on the aquaponic and vermiponic testbeds with and without freshwater replenishment.
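
As a minimal stand-in for the regression models mentioned above, a least-squares fit mapping an image-derived growth feature to the remaining days until harvest might look as follows. The leaf-area readings and day counts are entirely hypothetical, chosen only to make the mechanics visible.

```python
import numpy as np

def fit_harvest_model(feature, days_to_harvest):
    """Least-squares linear model mapping an image-derived growth feature
    (e.g. segmented leaf area) to the remaining days until harvest."""
    X = np.column_stack([feature, np.ones(len(feature))])  # slope + intercept
    w, *_ = np.linalg.lstsq(X, days_to_harvest, rcond=None)
    return w

def predict_days(w, feature_value):
    """Apply the fitted model to a new plant's feature value."""
    return w[0] * feature_value + w[1]

# Hypothetical readings: leaf area in cm^2 vs. days left to harvest.
leaf_area = np.array([5.0, 10.0, 20.0, 25.0])
days_left = np.array([40.0, 30.0, 10.0, 0.0])   # exactly linear: 50 - 2*area
w = fit_harvest_model(leaf_area, days_left)
assert abs(predict_days(w, 15.0) - 20.0) < 1e-6
```

In the actual system the feature would come from the image-segmentation step, and comparing such fits across the aquaponic and vermiponic testbeds is what the error estimation quantifies.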

Keywords: aquaponics, deep learning, internet of things, vermiponics

Procedia PDF Downloads 51
2550 Treatment and Diagnostic Imaging Methods of Fetal Heart Function in Radiology

Authors: Mahdi Farajzadeh Ajirlou

Abstract:

Prior evidence of normal cardiac anatomy is desirable to relieve the anxiety of patients with a family history of congenital heart disease, or to offer the option of early termination of pregnancy or close follow-up should a cardiac anomaly be proved. Fetal heart assessment plays an important part in the evaluation of the fetus, and it can reflect fetal heart function, which is regulated by the central nervous system. Acquisition of ventricular volume and inflow data would be useful to quantify valve regurgitation and ventricular function, and thereby to determine the degree of cardiovascular compromise in fetal conditions at risk for hydrops fetalis. This study discusses imaging the fetal heart with transvaginal ultrasound, Doppler ultrasound, three-dimensional ultrasound (3DUS) and four-dimensional (4D) ultrasound, spatiotemporal image correlation (STIC), magnetic resonance imaging and cardiac catheterization. Doppler ultrasound (DUS) provides real-time imaging with good depiction of blood vessels and soft tissues. DUS imaging can show the shape of the fetus, but it cannot show whether the fetus is hypoxic or distressed. Spatiotemporal image correlation (STIC) enables the acquisition of a volume of data synchronized with the beating heart. The automated volume acquisition is made possible by the array in the transducer performing a slow single sweep, recording a single 3D data set composed of numerous 2D frames one behind the other. The volume acquisition can be done as a static 3D scan, as online 4D (direct volume scan, live 3D ultrasound, or so-called 4D (3D/4D)), or as spatiotemporal image correlation (STIC; offline 4D, a cyclic volume acquisition). Fetal cardiovascular MRI would appear to be an ideal approach to the noninvasive investigation of the impact of abnormal cardiovascular hemodynamics on antenatal brain growth and development.
Still, there are practical limitations to the use of conventional MRI for fetal cardiovascular assessment, including the small size and high heart rate of the human fetus, the lack of conventional cardiac gating methods to synchronize data acquisition, and the potential corruption of MRI data due to maternal respiration and unpredictable fetal movements. Fetal cardiac MRI has the potential to complement ultrasound in detecting cardiovascular malformations and extracardiac lesions. Fetal cardiac intervention (FCI), minimally invasive catheter intervention, is a new and evolving technique that allows in-utero treatment of a subset of severe forms of congenital heart disease. In special cases, it may be possible to modify the natural history of congenital heart disorders. It is entirely possible that future generations will 'repair' congenital heart disease in utero using nanotechnologies or remote computer-guided micro-robots working at the cellular level.

Keywords: fetal, cardiac MRI, ultrasound, 3D, 4D, heart disease, invasive, noninvasive, catheter

Procedia PDF Downloads 9
2549 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis

Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen

Abstract:

The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicability in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets, with CNN, LSV-CNN, and SDG-CNN designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four datasets; the best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and the LSV-SDG-CNN model improves disease classification accuracy by a clear margin.
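
The first step, concatenating each word2vec embedding with its Lexical Semantic Vector, is a simple feature-level fusion. A minimal sketch follows; the dimensions are invented for illustration, since the abstract does not specify them.

```python
import numpy as np

def build_input_matrix(word_vecs, lexical_vecs):
    """Concatenate each token's word2vec embedding with its Lexical Semantic
    Vector (LSV), giving the enriched per-token representation fed to the CNN."""
    assert len(word_vecs) == len(lexical_vecs)
    return np.concatenate([word_vecs, lexical_vecs], axis=1)

rng = np.random.default_rng(0)
# Invented dimensions: a 5-token record, 8-d word2vec embeddings, 3-d LSVs.
tokens = rng.random((5, 8))
lsv = rng.random((5, 3))
enriched = build_input_matrix(tokens, lsv)
assert enriched.shape == (5, 11)   # each token now carries both views
```

The CNN then convolves over this widened matrix, so every filter sees the distributional (word2vec) and the lexical-semantic view of each token at once.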

Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision

Procedia PDF Downloads 116
2548 Perception and Control in the Age of Surrealism: A Critical History and a Survey of Pita Amor’s Poetic Ontology

Authors: Oliver Arana

Abstract:

Within the common vein of social understanding, surrealism is often understood to rely on disconcerting images and fragmented collage, both in its visual representation and its literary manifestations. By tracing the history and literature of surrealism, the author argues that certain factions within Latin America employed characteristics of surrealism in order to reach some sense of understanding, not to further complicate or disorient, an aim that most closely aligns with Freudian psychoanalysis. Psychoanalysis should, however, be a comparable practice only insofar as it shows how Latin American surrealism had a more concrete goal than its European counterpart. The primary subject of the paper is the Mexican poet Pita Amor, who has retroactively been associated with the movement; it should therefore be duly noted that the adjective 'surrealist' applies to her only as a description of traits within her literary lexicon.

Keywords: Latin America, Pita Amor, poetry, surrealism

Procedia PDF Downloads 125
2547 A U-Net Based Architecture for Fast and Accurate Diagram Extraction

Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal

Abstract:

In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Document analysis therefore requires extracting the diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., satisfies the need for real-time processing with high accuracy as required in multiple applications. In the education domain, diagrams can have varied characteristics, e.g. line-based geometric diagrams, chemical bonds, mathematical formulas, etc. Two broad categories of approaches try to solve similar problems: traditional computer vision approaches and deep learning approaches. The traditional computer vision approaches mainly leverage connected components and distance-transform-based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either YOLO or Faster R-CNN architectures; these approaches suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with a much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
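
The encoder-decoder-with-skip-connections structure that lets a U-Net produce pixel-accurate segmentation masks can be illustrated with plain array operations. This is a toy sketch of the three characteristic moves (downsample, upsample, skip-concatenate) on a single feature map, not the proposed network.

```python
import numpy as np

def max_pool2x2(x):
    """2x2 max-pooling: the downsampling step of the U-Net encoder."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample2x2(x):
    """Nearest-neighbour upsampling: the decoder's expansion step."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def skip_connect(decoder_map, encoder_map):
    """U-Net's defining move: stack the encoder feature map onto the upsampled
    decoder map, so fine spatial detail survives the low-resolution bottleneck."""
    return np.stack([decoder_map, encoder_map], axis=0)

img = np.arange(16.0).reshape(4, 4)   # stand-in for an input feature map
down = max_pool2x2(img)               # (2, 2) bottleneck
up = upsample2x2(down)                # back to (4, 4), but spatially coarse
merged = skip_connect(up, img)        # (2, 4, 4): coarse + fine channels
assert merged.shape == (2, 4, 4)
assert down[0, 0] == 5.0              # max of the top-left 2x2 block
```

The skip channel is what allows a segmentation mask to hug irregular diagram boundaries, the property the paper exploits.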

Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO

Procedia PDF Downloads 113
2546 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to enough data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule, iteratively eliminating redundant and irrelevant descriptors. Our prediction engine is based on a portfolio of machine learning algorithms; we found the Random Forest algorithm to be the better choice for this analysis. We captured non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties, which we expect to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods and our implementation protocols, and discuss the usefulness of these techniques in biomedical and health informatics.
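
The "averaging across a large number of randomized decision trees" idea can be sketched in miniature. The toy below uses bagged depth-1 trees (decision stumps), the simplest possible random-forest-style ensemble; it is not the authors' pipeline, and the descriptor values and active/inactive labels are hypothetical.

```python
import random

def majority(ys):
    """Majority label of a list; defaults to 0 when the list is empty."""
    return max(set(ys), key=ys.count) if ys else 0

def train_stump(data):
    """Fit the best single-feature midpoint-threshold split (a depth-1 tree)."""
    best = None
    for f in range(len(data[0][0])):
        vals = sorted({x[f] for x, _ in data})
        for a, b in zip(vals, vals[1:]):
            t = (a + b) / 2
            lm = majority([y for x, y in data if x[f] <= t])
            rm = majority([y for x, y in data if x[f] > t])
            acc = sum((lm if x[f] <= t else rm) == y
                      for x, y in data) / len(data)
            if best is None or acc > best[0]:
                best = (acc, f, t, lm, rm)
    if best is None:  # degenerate bootstrap: every row identical
        lab = majority([y for _, y in data])
        return (0, float("inf"), lab, lab)
    return best[1:]

def forest_predict(forest, x):
    """Averaging across randomized trees: majority vote over the ensemble."""
    votes = [(lm if x[f] <= t else rm) for f, t, lm, rm in forest]
    return max(set(votes), key=votes.count)

random.seed(0)
# Invented per-compound descriptors [molecular weight / 100, logP]; 1 = active.
data = [([1.8, 0.2], 0), ([2.0, 0.3], 0), ([3.5, 2.1], 1), ([3.8, 2.4], 1)]
# Bagging: each stump is trained on a bootstrap resample of the data.
forest = [train_stump([random.choice(data) for _ in range(len(data))])
          for _ in range(15)]
assert forest_predict(forest, [1.9, 0.25]) == 0
assert forest_predict(forest, [3.6, 2.2]) == 1
```

A production Random Forest adds deeper trees and per-split feature subsampling, but the bias-reducing vote over bootstrap-randomized learners is the same mechanism.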

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 339
2545 In Vitro Anthelmintic Effects of Citrullus colocynthis Fruit Extract on Fasciola gigantica of Domestic Buffalo (Bubalus bubalis) in Udaipur, India

Authors: Rajnarayan Damor, Gayatri Swarnakar

Abstract:

Fasciola gigantica flukes inhabit the biliary ducts of the liver and the gall bladder of domestic buffaloes. They are very harmful and cause significant losses to livestock owners on account of the poor growth and lower productivity of infected buffaloes. Synthetic veterinary drugs have been used to eliminate parasites from cattle, but these drugs are unaffordable and inaccessible for poor cattle farmers. The in vitro anthelmintic effect of Citrullus colocynthis fruit extract against Fasciola gigantica parasites was observed by light and scanning electron microscopy. Fruit extracts of C. colocynthis exhibited the highest mortality, 100% at 50 mg/ml, by the 15th hour of exposure. The oral and ventral suckers appeared slightly more swollen than with the control and the synthetic drug albendazole. The tegument of the middle region showed deep furrows and folding, with spines either lying very flat against the surface or submerged in the swollen tegument around them. The posterior region showed deep folding of the tegument and complete disappearance of spines, the swelling of the tegument leaving only empty spine sockets. The present study revealed that fruit extracts of Citrullus colocynthis are a potential source of novel anthelmintics and justify their ethno-veterinary use.

Keywords: anthelmintic, buffalo, Citrullus colocynthis, Fasciola gigantica, mortality, tegument

Procedia PDF Downloads 215
2544 Predicting Shortage of Hospital Beds during COVID-19 Pandemic in United States

Authors: Saba Ebrahimi, Saeed Ahmadian, Hedie Ashrafi

Abstract:

The worldwide spread of coronavirus has grown the concern about planning for the excess demand for hospital services in response to the COVID-19 pandemic. A surge in demand for hospital services beyond current capacity leads to shortages of ICU beds and ventilators in some parts of the US. In this study, we forecast the required number of hospital beds and the possible shortage of beds in the US during the COVID-19 pandemic, to be used in planning and in the hospitalization of new cases. We used data on COVID-19 deaths and patient hospitalizations, along with data on hospital capacities and utilization in the US, from publicly available sources and national government websites. We used a novel ensemble of deep learning networks, based on stacking different linear and non-linear layers, to predict the shortage in hospital beds. The results showed that our proposed approach can predict the excess demand for hospital beds very well, which can be helpful in developing strategies and plans to mitigate this gap.
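
The stacking idea, feeding several base forecasts into a meta-model that learns how to blend them, can be sketched minimally. The bed-demand numbers and the two base "models" below are invented for illustration; the authors' ensemble stacks deep learning layers rather than this least-squares blender.

```python
import numpy as np

def fit_stack(base_preds, y):
    """Fit linear meta-weights over base-model forecasts by least squares:
    a minimal stand-in for the stacked layers in the proposed ensemble."""
    X = np.column_stack(base_preds)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def stack_predict(base_preds, w):
    """Blend the base forecasts with the learned meta-weights."""
    return np.column_stack(base_preds) @ w

# Invented daily bed-demand series and two systematically biased base forecasts:
true_demand = np.array([100.0, 120.0, 150.0, 170.0])
model_a = true_demand * 0.9   # always 10% low
model_b = true_demand * 1.2   # always 20% high
w = fit_stack([model_a, model_b], true_demand)
blended = stack_predict([model_a, model_b], w)
# The meta-model learns a blend that cancels both biases on the training data:
assert np.allclose(blended, true_demand)
```

The appeal of stacking for capacity planning is exactly this: individually biased predictors can combine into a forecast better than any one of them.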

Keywords: COVID-19, deep learning, ensembled models, hospital capacity planning

Procedia PDF Downloads 137
2543 Malignancy Assessment of Brain Tumors Using Convolutional Neural Network

Authors: Chung-Ming Lo, Kevin Li-Chun Hsieh

Abstract:

The World Health Organization classification of central nervous system tumors defines grade 2, 3, and 4 gliomas according to their aggressiveness. For brain tumors, image examination carries a lower risk than biopsy; moreover, it is a challenge to extract the relevant tissue during a biopsy operation. Observing the whole tumor structure and composition can provide a more objective assessment. This study proposes a computer-aided diagnosis (CAD) system based on a convolutional neural network to quantitatively evaluate a tumor's malignancy from brain magnetic resonance imaging. A total of 30 grade 2, 43 grade 3, and 57 grade 4 gliomas were collected in the experiment. Parameters transferred from AlexNet were fine-tuned to classify the target brain tumors and achieved an accuracy of 98% and an area under the receiver operating characteristic curve (Az) of 0.99. Without pre-trained features, only 61% accuracy was obtained. The proposed convolutional neural network can accurately and efficiently classify grade 2, 3, and 4 gliomas, and the promising accuracy can provide diagnostic suggestions to radiologists in the clinic.
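
The transfer-learning recipe behind the 98%-versus-61% gap, reusing pre-trained features and fitting only a new classifier head, can be illustrated in miniature. The sketch below is not AlexNet: the "frozen extractor" is a fixed random projection, the 64-pixel "images" and grade labels are synthetic, and the head is fit by least squares purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the frozen pre-trained layers: a fixed projection.
# (In the real system these would be AlexNet weights pre-trained on ImageNet.)
W_frozen = rng.standard_normal((64, 16))

def extract_features(images):
    """Frozen feature extractor; only the head fitted below is 'fine-tuned'."""
    return np.maximum(images @ W_frozen, 0.0)   # linear map + ReLU

def fit_head(feats, labels):
    """Fit a fresh linear classifier head on top of the frozen features."""
    w, *_ = np.linalg.lstsq(feats, labels, rcond=None)
    return w

# Synthetic 64-pixel 'images' from two well-separated tumor-grade clusters:
grade_a = rng.standard_normal((20, 64)) + 4.0
grade_b = rng.standard_normal((20, 64)) - 4.0
X = np.vstack([grade_a, grade_b])
y = np.array([1.0] * 20 + [-1.0] * 20)
head = fit_head(extract_features(X), y)
preds = np.sign(extract_features(X) @ head)
assert (preds == y).mean() >= 0.9   # separable toy data: near-perfect fit
```

The point mirrored from the study: with the extractor reused rather than learned from the small dataset, only the small head must be estimated, which is why transferred parameters help so much when labeled gliomas number in the dozens.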

Keywords: convolutional neural network, computer-aided diagnosis, glioblastoma, magnetic resonance imaging

Procedia PDF Downloads 125
2542 Optimizing Detection Methods for THz Bio-imaging Applications

Authors: C. Bolakis, I. S. Karanasiou, D. Grbovic, G. Karunasiri, N. Uzunoglu

Abstract:

A new approach for efficient detection of THz radiation in biomedical imaging applications is proposed. A double-layered absorber consisting of a 32 nm thick aluminum (Al) metallic layer on a glass medium (SiO2) of 1 mm thickness was fabricated and used to design a fine-tuned absorber through a theoretical and finite element modeling process. The results indicate that the proposed low-cost, double-layered absorber can be tuned via the metal layer's sheet resistance and the thickness of various glass media, taking advantage of the variation in the absorption of metal films across the desired THz domain (6 to 10 THz). It was found that the composite absorber could absorb up to 86% (exceeding the 50% previously shown to be the highest achievable with a single thin metal layer) and reflect less than 1% of the incident THz power. This approach will enable monitoring of the transmission coefficient (the THz transmission 'fingerprint') of the biosample with high accuracy, while also making the proposed double-layered absorber a good candidate for a microbolometer pixel's active element. Based on these promising results, a more sophisticated and effective double-layered absorber is under development. The glass medium has been substituted by diluted poly-Si, with twofold results: an absorption factor of 96% was reached, and high TCR properties were acquired. In addition, these results and properties were generalized over the active frequency spectrum. Specifically, through the development of a theoretical equation that takes as input any arbitrary frequency in the IR spectrum (0.3 to 405.4 THz) and outputs the appropriate thickness of the poly-Si medium, the double-layered absorber retains the ability to absorb 96% and reflect less than 1% of the incident power.
As a result, through this post-optimization process and spread-spectrum frequency adjustment, the microbolometer detector efficiency could be further improved.
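
The 50% ceiling cited above for a single thin metal layer follows from a simple transmission-line picture: a free-standing film acts as a resistive sheet shunting the free-space impedance. The sketch below reproduces that limit numerically. It is an illustrative model only (normal incidence, no backing medium), not the paper's finite element analysis; it is precisely the glass or poly-Si backing, acting through interference, that lets the fabricated double-layered device exceed this ceiling.

```python
import numpy as np

Z0 = 376.73  # impedance of free space, ohms

def free_standing_absorption(rs):
    """Fraction of normally incident power absorbed by a free-standing thin
    resistive film with sheet resistance rs (ohms per square)."""
    z_in = rs * Z0 / (rs + Z0)        # film in parallel with the space behind it
    r = (z_in - Z0) / (z_in + Z0)     # field reflection coefficient
    t = 1.0 + r                       # transmitted field amplitude
    return 1.0 - r**2 - t**2          # absorbed power: A = 1 - R - T

rs_grid = np.linspace(10.0, 2000.0, 20000)
a = np.array([free_standing_absorption(rs) for rs in rs_grid])
# The peak sits at rs = Z0/2 (~188 ohm/sq) and never exceeds 50%:
assert abs(a.max() - 0.5) < 1e-3
assert abs(rs_grid[a.argmax()] - Z0 / 2) < 1.0
```

Tuning the aluminum layer's sheet resistance, as the abstract describes, is tuning exactly this parameter; the substrate then reshapes the curve beyond the free-standing limit.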

Keywords: bio-imaging, fine-tuned absorber, fingerprint, microbolometer

Procedia PDF Downloads 331
2541 Feasibility of Washing/Extraction Treatment for the Remediation of Deep-Sea Mining Tailings

Authors: Kyoungrean Kim

Abstract:

The importance of deep-sea mineral resources is dramatically increasing as land mineral resources are depleted by growing human economic activity. Korea has acquired exclusive exploration licenses in four areas: the Clarion-Clipperton Fracture Zone in the Pacific Ocean (2002), Tonga (2008), Fiji (2011) and the Indian Ocean (2014). Commercial mining by Nautilus Minerals (Canada) and Lockheed Martin (USA) is expected by 2020. The London Protocol 1996 (LP) under the International Maritime Organization (IMO) and the International Seabed Authority (ISA) will set environmental guidelines for deep-sea mining by 2020 to protect the marine environment. In this research, the applicability of washing/extraction treatment for the remediation of deep-sea mining tailings was evaluated in order to provide preliminary data for developing a practical remediation technology in the near future. Polymetallic nodule samples were collected at the Clarion-Clipperton Fracture Zone in the Pacific Ocean and stored at room temperature. Samples were pulverized using a jaw crusher and a ball mill, then classified into three particle sizes (> 63 µm, 63-20 µm, < 20 µm) using vibratory sieve shakers (Analysette 3 Pro, Fritsch, Germany) with 63 µm and 20 µm sieves. Only the 63-20 µm fraction was used for the investigation, considering the lower limit of the ore dressing process, which is tens of µm to 100 µm. Rhamnolipid and sodium alginate, biosurfactant-type additives, and aluminum sulfate, which is mainly used as a flocculant, were chosen as environmentally friendly additives. Samples were adjusted to a 2% suspension with deionized water and then mixed with various concentrations of additives. The mixture was stirred with a magnetic bar for specific reaction times, and the liquid phase was then separated in a centrifugal separator (Thermo Fisher Scientific, USA) at 4,000 rpm for 1 h. The separated liquid was filtered with a syringe and an acrylic-based filter (0.45 µm).
The extracted heavy metals in the filtered liquid were then determined using a UV-Vis spectrometer (DR-5000, Hach, USA) and a heat block (DBR 200, Hach, USA) following US EPA methods (8506, 8009, 10217 and 10220). The polymetallic nodules were mainly composed of manganese (27%), iron (8%), nickel (1.4%), copper (1.3%), cobalt (1.3%) and molybdenum (0.04%). Based on the remediation standards of various countries, nickel (Ni), copper (Cu), cadmium (Cd) and zinc (Zn) were selected as the primary target materials. Throughout this research, the use of rhamnolipid was shown to be an effective approach for removing heavy metals from samples originating from manganese nodules. Sodium alginate might also be an effective additive for the remediation of deep-sea mining tailings such as polymetallic nodules. Compared to rhamnolipid and sodium alginate, aluminum sulfate was the more effective additive at short reaction times, within 4 h. Based on these results, sequential particle separation, selective extraction/washing, advanced filtration of the liquid phase, water treatment without dewatering, and solidification/stabilization may be considered candidate technologies for the remediation of deep-sea mining tailings.

Keywords: deep-sea mining tailings, heavy metals, remediation, extraction, additives

Procedia PDF Downloads 141
2540 Evaluating the Use of Manned and Unmanned Aerial Vehicles in Strategic Offensive Tasks

Authors: Yildiray Korkmaz, Mehmet Aksoy

Abstract:

In today's operations, countries want to reach their aims in the shortest way possible for economic, political and humanitarian reasons. The most effective way of achieving this goal is to be able to penetrate strategic targets. Strategic targets are generally located deep inside a country and are defended by modern, efficient surface-to-air missile (SAM) platforms operated in integration with Intelligence, Surveillance and Reconnaissance (ISR) systems. Moreover, these high-value targets are buried deep underground and hardened with strong materials against attack. Penetrating such targets therefore requires very detailed intelligence. This intelligence process should cover a wide range, from weaponry to threat assessment. Accordingly, the framework of the attack package will be determined. This mission package has to execute missions in a high-threat environment. The way to minimize the risk of loss of life is to use packages formed of UAVs. However, limitations arising from the characteristics of UAVs restrict the performance of a mission package consisting solely of UAVs. The mission package should therefore be formed of UAVs under the leadership of a fifth-generation manned aircraft. Thus, we can minimize these limitations, penetrate deep inside enemy territory with minimal risk, make decisions according to ever-changing conditions and, finally, destroy the strategic targets. In this article, the strengths and weaknesses of UAVs are examined through SWOT analysis. The features of such a mission package are outlined, and an example is presented of the kind of package that should be formed to obtain marginal benefit and penetrate strategic targets as autonomous mission execution capability develops in the near future.

Keywords: UAV, autonomy, mission package, strategic attack, mission planning

Procedia PDF Downloads 532
2539 Emotional Labor Strategies and Intentions to Quit among Nurses in Pakistan

Authors: Maham Malik, Amjad Ali, Muhammad Asif

Abstract:

The current study examines the relationship of emotional labor strategies (deep acting and surface acting) with employees' job satisfaction, organizational commitment and intentions to quit. It also examines the mediating role of job satisfaction and organizational commitment in the relationship between emotional labor strategies and intentions to quit. Data were collected from 307 nurses through convenience sampling, using a self-administered questionnaire. Linear regression was applied to test the relationships between the variables. Mediation was checked using the Baron and Kenny approach and the Sobel test. Results support partial mediation by job satisfaction between emotional labor strategies and quitting intentions. The study recommends that deep acting be promoted, because it is positively associated with employees' quality of work life, work engagement and organizational citizenship behavior.
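The Sobel test mentioned above reduces to a single z-statistic for the indirect effect. A minimal sketch follows; the coefficients and standard errors are illustrative placeholders, not the study's estimates:

```python
import math

def sobel_test(a, s_a, b, s_b):
    """Sobel z-statistic for the indirect effect a*b.

    a, s_a: IV -> mediator path coefficient and its standard error
    b, s_b: mediator -> DV path coefficient and its standard error
    """
    se_ab = math.sqrt(b ** 2 * s_a ** 2 + a ** 2 * s_b ** 2)
    return (a * b) / se_ab

# Illustrative values only (not the study's estimates): surface acting ->
# job satisfaction (a), job satisfaction -> intention to quit (b)
z = sobel_test(a=-0.40, s_a=0.10, b=-0.50, s_b=0.12)
print(round(z, 3))  # |z| > 1.96 indicates a significant indirect effect at p < .05
```

A significant z alongside a still-significant (but reduced) direct effect is what the partial-mediation conclusion above rests on.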

Keywords: emotional labor strategies, intentions to quit, job satisfaction, organizational commitment, nursing

Procedia PDF Downloads 126
2538 Discovering User Behaviour Patterns from Web Log Analysis to Enhance the Accessibility and Usability of Website

Authors: Harpreet Singh

Abstract:

Finding relevant information on the World Wide Web is becoming more challenging day by day. Web usage mining is used for the extraction of relevant and useful knowledge, such as user behaviour patterns, from web access log records. A web access log records all the requests for individual files that users have made to a website. Web usage mining is important for Customer Relationship Management (CRM), as it can help ensure customer satisfaction in the interaction between the customer and the organization. Web usage mining is also helpful in improving website structure or design to match users' requirements, by analyzing the access log file of a website through a log analyzer tool. The focus of this paper is to enhance the accessibility and usability of a guitar-selling website by analyzing its access log through the Deep Log Analyzer tool. The results show that the largest number of users come from the United States and that they use the Opera 9.8 web browser and the Windows XP operating system.

Keywords: web usage mining, web mining, log file, data mining, deep log analyzer

Procedia PDF Downloads 234
2537 A Dynamic Cardiac Single Photon Emission Computed Tomography Using Conventional Gamma Camera to Estimate Coronary Flow Reserve

Authors: Maria Sciammarella, Uttam M. Shrestha, Youngho Seo, Grant T. Gullberg, Elias H. Botvinick

Abstract:

Background: Myocardial perfusion imaging (MPI) is typically performed with static imaging protocols and visually assessed for perfusion defects based on the relative intensity distribution. Dynamic cardiac SPECT, on the other hand, is a newer imaging technique based on time-varying information about radiotracer distribution, which permits quantification of myocardial blood flow (MBF). In this abstract, we report the progress and current status of dynamic cardiac SPECT using a conventional gamma camera (Infinia Hawkeye 4, GE Healthcare) for estimation of myocardial blood flow and coronary flow reserve. Methods: A group of patients at high risk of coronary artery disease was enrolled to evaluate our methodology. A low-dose/high-dose rest/pharmacologic-induced-stress protocol was implemented. Standard rest and stress radionuclide doses of ⁹⁹ᵐTc-tetrofosmin (140 keV) were administered. The dynamic SPECT data for each patient were reconstructed using the standard 4-dimensional maximum likelihood expectation maximization (ML-EM) algorithm. The acquired data were used to estimate the myocardial blood flow (MBF). The correspondence between flow values in the main coronary vasculature and myocardial segments defined by the standardized myocardial segmentation and nomenclature was derived. The coronary flow reserve (CFR) was defined as the ratio of stress to rest MBF values. CFR values estimated with SPECT were also validated against dynamic PET. Results: The range of territorial MBF in the LAD, RCA, and LCX was 0.44 ml/min/g to 3.81 ml/min/g. The MBF estimates from PET and SPECT in an independent cohort of 7 patients showed a statistically significant correlation, r = 0.71 (p < 0.001), but the corresponding CFR correlation was moderate, r = 0.39, yet statistically significant (p = 0.037). The mean stress MBF value was significantly lower for angiographically abnormal territories than for normal ones (normal mean MBF = 2.49 ± 0.61, abnormal mean MBF = 1.43 ± 0.62, p < .001). Conclusions: The visually assessed image findings in clinical SPECT are subjective and may not reflect direct physiologic measures of a coronary lesion. The MBF and CFR measured with dynamic SPECT are fully objective and available only with the data generated by the dynamic SPECT method. A quantitative approach such as measuring CFR with dynamic SPECT imaging is a better mode of diagnosing CAD than visual assessment of stress and rest images from static SPECT.
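The CFR definition used above (stress MBF divided by rest MBF, per vascular territory) can be sketched directly; the MBF values below are illustrative, not patient data from the study:

```python
def coronary_flow_reserve(stress_mbf, rest_mbf):
    """CFR per vascular territory: stress MBF / rest MBF (ml/min/g units cancel)."""
    return {t: stress_mbf[t] / rest_mbf[t] for t in stress_mbf}

# Illustrative MBF values in ml/min/g (not data from the study)
rest = {"LAD": 0.8, "LCX": 0.9, "RCA": 0.85}
stress = {"LAD": 2.4, "LCX": 1.35, "RCA": 2.55}
cfr = coronary_flow_reserve(stress, rest)
print({t: round(v, 2) for t, v in cfr.items()})  # {'LAD': 3.0, 'LCX': 1.5, 'RCA': 3.0}
```

In this illustration the LCX territory fails to augment flow under stress, the kind of territorial finding a relative-intensity static read could miss.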

Keywords: dynamic SPECT, clinical SPECT/CT, selective coronary angiograph, ⁹⁹ᵐTc-Tetrofosmin

Procedia PDF Downloads 137
2536 Assisting Dating of Greek Papyri Images with Deep Learning

Authors: Asimina Paparrigopoulou, John Pavlopoulos, Maria Konstantinidou

Abstract:

Dating papyri accurately is crucial not only to editing their texts but also for our understanding of palaeography and the history of writing, ancient scholarship, material culture, networks in antiquity, etc. Most ancient manuscripts offer little evidence regarding the time of their production, forcing papyrologists to date them on palaeographical grounds, a method often criticized for its subjectivity. By experimenting with data obtained from the Collaborative Database of Dateable Greek Bookhands and the PapPal online collections of objectively dated Greek papyri, this study shows that deep learning dating models, pre-trained on generic images, can achieve accurate chronological estimates for a test subset (67.97% accuracy for book hands and 55.25% for documents). To compare the estimates of these models with those of humans, experts were asked to complete a questionnaire with samples of literary and documentary hands that had to be sorted chronologically by century. The same samples were dated by the models in question. The results are presented and analysed.

Keywords: image classification, papyri images, dating

Procedia PDF Downloads 62
2535 Classification of IoT Traffic Security Attacks Using Deep Learning

Authors: Anum Ali, Kashaf ad Dooja, Asif Saleem

Abstract:

The future trend in smart cities is towards the Internet of Things (IoT); IoT creates dynamic connections in a ubiquitous manner. Smart cities offer ease and flexibility for daily life matters. With small devices connected to cloud servers based on IoT, network traffic between these devices is growing exponentially, and securing it is a pressing concern, since the rising rate of cyber attacks leaves this traffic vulnerable. This paper discusses the latest machine learning approaches in related work; further, to tackle the increasing rate of cyber attacks, a machine learning algorithm is applied to IoT-based network traffic data. The proposed algorithm trains itself on the data and identifies different segments of device interaction using supervised learning, acting as a classifier associated with a specific IoT device class. The simulation results clearly identify the attacks and produce fewer false detections.
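The abstract does not specify the classifier used, so as a hedged illustration, here is a minimal supervised classifier (nearest centroid) on synthetic per-flow traffic features; all feature values and class labels are invented for the example:

```python
import random

random.seed(0)

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def train(samples):
    """samples: {class_label: [feature_vectors]} -> {class_label: centroid}."""
    return {label: centroid(rows) for label, rows in samples.items()}

def classify(model, x):
    """Assign x to the class whose centroid is closest (squared Euclidean)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(model, key=lambda label: dist(model[label]))

# Synthetic features per flow: [mean packet size (bytes), packets per second]
train_data = {
    "sensor": [[60 + random.gauss(0, 5), 2 + random.gauss(0, 0.5)] for _ in range(50)],
    "camera": [[900 + random.gauss(0, 50), 30 + random.gauss(0, 3)] for _ in range(50)],
    "attack": [[300 + random.gauss(0, 40), 400 + random.gauss(0, 30)] for _ in range(50)],
}
model = train(train_data)
print(classify(model, [310, 390]))  # flood-like traffic rate -> "attack"
```

A real deployment would use richer flow features and a stronger model, but the train-on-labeled-device-classes, then-flag-outliers-as-attacks structure is the same.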

Keywords: IoT, traffic security, deep learning, classification

Procedia PDF Downloads 132
2534 Non-Invasive Imaging of Tissue Using Near Infrared Radiations

Authors: Ashwani Kumar Aggarwal

Abstract:

NIR light is non-ionizing and can pass easily through living tissues such as the breast without any harmful effects. Therefore, the use of NIR light to image biological tissue and quantify its optical properties is a good choice over other, invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of transmitted light through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography there are standard reconstruction methods, such as filtered back projection and the algebraic reconstruction techniques, but these cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid algorithm for reconstruction has been implemented in this work which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function. This blurred reconstructed image has been enhanced using a digital filter which is optimal in the mean-square sense.
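A classic digital filter that is optimal in the mean-square sense is the Wiener filter. Below is a 1-D frequency-domain sketch with an illustrative PSF and noise-to-signal ratio; these are assumptions for the example, not the parameters used in the paper:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-4):
    """Frequency-domain Wiener filter: F_hat = conj(H) * G / (|H|^2 + k).

    k plays the role of the noise-to-signal power ratio; the filter is
    mean-square optimal for stationary signal and noise.
    """
    H = np.fft.fft(psf, n=len(blurred))
    G = np.fft.fft(blurred)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft(F_hat))

# Illustrative 1-D "scan line": two point absorbers blurred by a small kernel
x = np.zeros(64)
x[20], x[40] = 1.0, 0.5
psf = np.array([0.25, 0.5, 0.25])  # normalised blur kernel
blurred = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(psf, 64)))
restored = wiener_deconvolve(blurred, psf)
print(int(np.argmax(restored)))  # dominant absorber recovered at index 20
```

Raising k trades sharpness for noise suppression, which matters in practice because the Monte Carlo forward data are themselves noisy.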

Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering

Procedia PDF Downloads 299
2533 Income and Factor Analysis of Small Scale Broiler Production in Imo State, Nigeria

Authors: Ubon Asuquo Essien, Okwudili Bismark Ibeagwa, Daberechi Peace Ubabuko

Abstract:

The broiler poultry subsector is dominated by small-scale production with low aggregate output. The high cost of inputs currently experienced in Nigeria tends to aggravate the situation; hence many broiler farmers struggle to break even. This study was designed to examine income and input factors in small-scale deep-litter broiler production in Imo State, Nigeria. Specifically, the study examined the socio-economic characteristics of small-scale deep-litter broiler farmers; estimated the costs and returns of broiler production in the area; analyzed input factors in broiler production in the area; and examined the marketability age and profitability of the enterprise. A multi-stage sampling technique was adopted in selecting 60 small-scale broiler farmers who use the deep-litter system from 6 communities, using a structured questionnaire. The socio-economic characteristics of the broiler farmers and the profitability/marketability age of the birds were described using descriptive statistical tools such as frequencies, means and percentages. Gross margin analysis was used to analyze the costs and returns of broiler production, while a Cobb-Douglas production function was employed to analyze input factors. The results revealed that the cost of feed (P<0.1), deep-litter material (P<0.05) and medication (P<0.05) had a significant positive relationship with the gross return of broiler farmers in the study area, while the costs of labour, fuel and day-old chicks were not significant. Furthermore, the gross profit margin of farmers who market their broilers at the 8th week of rearing was 80.7%, against 78.7% and 60.8% for farmers who market at the 10th and 12th week of rearing, respectively. The business is, therefore, profitable but to varying degrees.
Government and development partners should make deliberate efforts to curb the current rise in the prices of poultry feeds, drugs and the timber materials used as bedding, so as to widen the profit margin and encourage more farmers to go into the business. The farmers equally need more technical assistance from extension agents with regard to timely and profitable marketing.
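The Cobb-Douglas analysis described above amounts to a log-linear regression of gross return on input costs. A hedged sketch on synthetic data follows; the elasticities (0.6, 0.2, 0.1) and cost ranges are invented for illustration, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic farm data: gross return = A * feed^0.6 * litter^0.2 * meds^0.1 * noise
n = 200
feed = rng.uniform(50, 500, n)     # cost of feed
litter = rng.uniform(5, 50, n)     # cost of deep-litter material
meds = rng.uniform(5, 40, n)       # cost of medication
output = 3.0 * feed**0.6 * litter**0.2 * meds**0.1 * np.exp(rng.normal(0, 0.05, n))

# Taking logs linearises the model:
# ln(Y) = ln(A) + b1*ln(feed) + b2*ln(litter) + b3*ln(meds)
X = np.column_stack([np.ones(n), np.log(feed), np.log(litter), np.log(meds)])
coef, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
print(np.round(coef[1:], 2))  # recovered elasticities, close to [0.6, 0.2, 0.1]
```

The fitted slopes are output elasticities, so a coefficient of 0.6 on feed means a 1% rise in feed spending is associated with roughly a 0.6% rise in gross return.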

Keywords: broilers, factor analysis, income, small scale

Procedia PDF Downloads 53
2532 FMR1 Gene Carrier Screening for Premature Ovarian Insufficiency in Females: An Indian Scenario

Authors: Sarita Agarwal, Deepika Delsa Dean

Abstract:

Like the task of transferring photo images to artistic images, image-to-image translation aims to translate data into imitated data belonging to the target domain. Neural Style Transfer and CycleGAN are two well-known deep learning architectures used for photo-to-art image transfer. However, studies involving these two models concentrate on one-to-one domain translation, not one-to-many domain translation. Our study investigates deep learning architectures that can be controlled to yield multiple artistic style translations simply by adding a conditional vector. We have expanded CycleGAN and constructed a Conditional CycleGAN for translation across 5 categories. Our study found that an architecture inserting the conditional vector into the middle layer of the generator could output multiple artistic images.

Keywords: genetic counseling, FMR1 gene, fragile x-associated primary ovarian insufficiency, premutation

Procedia PDF Downloads 107
2531 Optimizing Privacy, Accuracy and Calibration in Deep Learning Models

Authors: Rizwan Rizwan

Abstract:

Differentially private ({DP}) training preserves data privacy but often leads to slower convergence and lower accuracy, along with notable mis-calibration, compared to non-private training. We analyze {DP} training through a continuous-time approach based on the neural tangent kernel ({NTK}). The {NTK} helps characterize per-sample ({PS}) gradient clipping and the incorporation of noise during {DP} training across arbitrary network architectures and loss functions. Our analysis reveals that noise addition affects privacy risk exclusively, leaving convergence and calibration unaffected. In contrast, {PS} gradient clipping (in flat and layerwise styles) influences convergence and calibration but not privacy risk. Models with a small clipping norm generally achieve optimal accuracy but exhibit poor calibration, making them less reliable. Conversely, {DP} models trained with a large clipping norm maintain similar accuracy and the same privacy guarantee, yet demonstrate notably improved calibration.
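The per-sample clipping and noise mechanism analyzed above can be sketched as one DP-SGD-style update in numpy; the clipping norm and noise multiplier below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(per_sample_grads, clip_norm=1.0, noise_multiplier=1.1):
    """One DP-SGD-style update: flat per-sample clipping, then Gaussian noise."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any per-sample gradient whose norm exceeds clip_norm
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise scale follows the usual sigma = noise_multiplier * clip_norm convention
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_sample_grads)

grads = rng.normal(0.0, 3.0, size=(32, 10))  # 32 per-sample gradients, 10 parameters
update = dp_sgd_step(grads)
print(update.shape)  # (10,)
```

The sketch makes the abstract's separation concrete: the noise term governs the privacy guarantee, while the clipping step is what distorts the gradient direction and hence convergence and calibration.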

Keywords: deep learning, convergence, differential privacy, calibration

Procedia PDF Downloads 20
2530 Advanced Particle Characterisation of Suspended Sediment in the Danube River Using Automated Imaging and Laser Diffraction

Authors: Flóra Pomázi, Sándor Baranya, Zoltán Szalai

Abstract:

Harmonized monitoring of suspended sediment transport along a river as large as the Danube, the world's most international river, is a rather challenging task. The traditional monitoring method in Hungary is obsolete, but indirect measurement devices and techniques such as optical backscatter sensors (OBS), laser diffraction or acoustic backscatter sensors (ABS) could provide a fast and efficient alternative to direct methods. However, these methods are strongly sensitive to particle characteristics (i.e. particle shape, particle size and mineral composition). The current method does not provide sufficient information about particle size distribution, mineral analysis is rarely done, and the shape of the suspended sediment particles has not been examined yet. The aims of the study are (1) to characterise the suspended sediment particles of the Danube River using advanced methods such as laser diffraction and automated imaging, and (2) to perform a sensitivity analysis of the indirect methods in order to determine the impact of suspended particle characteristics. The particle size distribution is determined by laser diffraction. The particle shape and mineral composition analysis is done with the Morphologi G3-ID image analyser. The investigated indirect measurement devices are the LISST-Portable|XR, the LISST-ABS (Sequoia Inc.) and the Rio Grande 1200 kHz ADCP (Teledyne Marine). The major findings of this study are (1) the statistical shape of the suspended sediment particles - this is the first research in this context, (2) an updated particle size distribution, which can be compared with historical information so that morphological changes can be tracked, (3) the actual mineral composition of the suspended sediment in the Danube River, and (4) increased reliability of the tested indirect methods, based on the results of the sensitivity analysis and the previous findings.

Keywords: advanced particle characterisation, automated imaging, indirect methods, laser diffraction, mineral composition, suspended sediment

Procedia PDF Downloads 128
2529 The Twelfth Rib as a Landmark for Surgery

Authors: Jake Tempo, Georgina Williams, Iain Robertson, Claire Pascoe, Darren Rama, Richard Cetti

Abstract:

Introduction: The twelfth rib is commonly used as a landmark for surgery; however, its variability in length has not been formally studied. The highly variable rib length presents a challenge for urologists seeking a consistent landmark for percutaneous nephrolithotomy and retroperitoneoscopic surgery. Methods and materials: We analysed CT scans of 100 adults who had imaging between 23rd March and 12th April 2020 at an Australian hospital. We measured the distance from the mid-sagittal line to the twelfth rib tip in the axial plane as a surrogate for true rib length. We also measured the distance from the twelfth rib tip to the kidney, spleen, and liver. Results: Length from the mid-sagittal line to the right twelfth rib tip varied from 46 mm (percentile 95%CI 40 to 57) to 136 mm (percentile 95%CI 133 to 138). On the left, the distances varied from 55 mm (percentile 95%CI 50 to 64) to 134 mm (percentile 95%CI 131 to 135). Twenty-three percent of people had an organ lying between the tip of the twelfth rib and the kidney on the right, and 11% of people had the same finding on the left. Conclusion: The twelfth rib is highly variable in its length. Similar variability was recorded in the distance from the tip to the intra-abdominal organs. Due to the frequency of organs lying between the tip of the rib and the kidney, it should not be used as a landmark for accessing the kidney without prior knowledge of the individual patient's anatomy, as seen on imaging.

Keywords: PCNL, rib, anatomy, nephrolithotomy

Procedia PDF Downloads 91
2528 Long-Term Conservation Tillage Impact on Soil Properties and Crop Productivity

Authors: Danute Karcauskiene, Dalia Ambrazaitiene, Regina Skuodiene, Monika Vilkiene, Regina Repsiene, Ieva Jokubauskaite

Abstract:

The main ambition of present-day agriculture is to obtain an economically effective yield while securing the soil's ecological sustainability. According to their effect on the main soil quality indexes, tillage systems may be separated into two types: conventional and conservation tillage. The goal of this study was to determine the impact of conservation and conventional primary soil tillage methods and soil fertility improvement measures on soil properties and crop productivity. Methods: The soil of the experimental site is a Dystric Glossic Retisol (WRB 2014) with a sandy loam texture. The trial was established in 2003 in the experimental crop rotation field of the Vėžaičiai Branch of the Lithuanian Research Centre for Agriculture and Forestry. Trial factors and treatments: factor A, primary soil tillage (in autumn): deep ploughing (20-25 cm), shallow ploughing (10-12 cm), shallow ploughless tillage (8-10 cm); factor B, soil fertility improvement measures: plant residues, plant residues + straw, green manure 1st cut + straw, farmyard manure 40 t ha⁻¹ + straw. The four-course crop rotation consisted of red clover, winter wheat, spring rape and spring barley with undersowing. Results: Tillage had no statistically significant effect on the topsoil (0-10 cm) pHKCl level, which was 5.5-5.7. Throughout the experimental period, the highest soil pHKCl level (5.65) was found under shallow ploughless tillage. The organic fertilizers, particularly the grass biomass and farmyard manure, tended to increase the soil pHKCl. The content of plant-available phosphorus and potassium increased significantly under shallow ploughing compared with the other tillage systems. Farmyard manure increased those elements throughout the arable layer. The dissolved organic carbon concentration was significantly higher in the 0-10 cm soil layer under shallow ploughless tillage compared with deep ploughing.
After the incorporation of clover biomass and farmyard manure, the concentration of dissolved organic carbon increased in the top soil layer. Throughout the experimental period, the largest amount of water-stable aggregates was determined in the soil under shallow ploughless tillage; it was 12% higher than under deep ploughing. Soil moisture was also higher under shallow ploughing and shallow ploughless tillage (by 9-27%) than under deep ploughing. The lowest CO2 emission was determined in the deep-ploughed soil, and the highest rate of CO2 emission under shallow ploughless tillage. The addition of organic fertilisers tended to increase CO2 emission, but there was no statistically significant difference between the types of organic fertilisers. Crop yield was larger under deep ploughing than under the shallow and shallow ploughless tillage.

Keywords: reduced tillage, soil structure, soil pH, biological activity, crop productivity

Procedia PDF Downloads 246
2527 DEEPMOTILE: Motility Analysis of Human Spermatozoa Using Deep Learning in Sri Lankan Population

Authors: Chamika Chiran Perera, Dananjaya Perera, Chirath Dasanayake, Banuka Athuraliya

Abstract:

Male infertility is a major problem worldwide, and it is a neglected and sensitive health issue in Sri Lanka. It can be assessed by analyzing human semen samples. Sperm motility is one of many factors used to evaluate a male's fertility potential. In Sri Lanka, this analysis is performed manually. Manual methods are time-consuming and operator-dependent; their reliability rests on the expertise of the analyst. Machine learning and deep learning technologies are currently being investigated to automate spermatozoa motility analysis, but existing automatic methods remain unreliable: they tend to produce false positives and false detections. Current automatic methods rely on various techniques, and some of them are very expensive. Due to the geographical variance in spermatozoa characteristics, current automatic methods are not reliable for motility analysis in Sri Lanka. The suggested system, DeepMotile, explores a method to analyze the motility of human spermatozoa automatically and presents it to andrology laboratories to overcome the current issues. DeepMotile is a novel deep learning method for analyzing spermatozoa motility parameters in the Sri Lankan population. To implement the current approach, Sri Lankan patient data were collected anonymously as a dataset, and glass slides were used as a low-cost technique to analyze semen samples. The problem was framed as microscopic object detection and tracking. YOLOv5 was customized and used as the object detector, achieving 94% mAP (mean average precision), 86% precision, and 90% recall on the gathered dataset. StrongSORT was used as the object tracker and was validated with andrology experts due to the unavailability of annotated ground-truth data. Furthermore, this research has identified many potential avenues for further investigation, and andrology experts can use this system to analyze motility parameters with realistic accuracy.
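The detector metrics reported above (mAP, precision, recall) all rest on intersection-over-union (IoU) matching between predicted and ground-truth boxes; a minimal sketch:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)          # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A detection counts as a true positive when its IoU with a ground-truth box
# exceeds a threshold; 0.5 is the usual default for mAP@0.5.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # overlap 25 / union 175
```

Precision and recall then follow from counting matched and unmatched boxes, and mAP averages precision over recall levels and classes.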

Keywords: computer vision, deep learning, convolutional neural networks, multi-target tracking, microscopic object detection and tracking, male infertility detection, motility analysis of human spermatozoa

Procedia PDF Downloads 88