Search results for: automated imaging
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2011

1801 Evaluation of Fetal brain using Magnetic Resonance Imaging

Authors: Mahdi Farajzadeh Ajirlou

Abstract:

Normal fetal brain development can be assessed by in vivo magnetic resonance imaging (MRI) from the 18th gestational week (GW) to term and depends mainly on T2-weighted and diffusion-weighted (DW) sequences. The most commonly suspected brain pathologies referred to fetal MRI for further assessment are ventriculomegaly, absent corpus callosum, and anomalies of the posterior fossa. Brain segmentation is a crucial first step in neuroimage analysis. In the case of fetal MRI it is especially challenging and important due to the arbitrary orientation of the fetus, the organs that surround the fetal head, and irregular fetal movement. Several promising methods have been proposed but are limited in their performance in challenging cases and in real-time segmentation. Fetal MRI is routinely performed on a 1.5-Tesla scanner without maternal or fetal sedation. The mother lies supine during the course of the examination, the length of which is typically 45 to 60 minutes. The availability and continued validation of normative fetal brain growth references will provide important tools for early detection of impaired fetal brain growth upon which to manage high-risk pregnancies.

Keywords: brain, fetal, MRI, imaging

Procedia PDF Downloads 44
1800 Analysis of Splicing Methods for High Speed Automated Fibre Placement Applications

Authors: Phillip Kearney, Constantina Lekakou, Stephen Belcher, Alessandro Sordon

Abstract:

The focus in the automotive industry is to reduce human operator and machine interaction, so manufacturing becomes more automated and safer. The aim is to lower part cost and construction time as well as defects in the parts, sometimes occurring due to the physical limitations of human operators. A move to automate the layup of reinforcement material in composites manufacturing has resulted in the use of tapes that are placed in position by a robotic deposition head, also described as Automated Fibre Placement (AFP). The process of AFP is limited with respect to the finite amount of material that can be loaded into the machine at any one time. Joining two batches of tape material together involves a splice to secure the ends of the finishing tape to the starting edge of the new tape. The splicing method of choice for the majority of prepreg applications is a hand stitch method, which, as the name suggests, requires human input to achieve. This investigation explores three methods for automated splicing, namely, adhesive, binding and stitching. The adhesive technique uses an additional adhesive placed on the tape ends to be joined. Binding uses the binding agent that is already impregnated onto the tape through the application of heat. The stitching method is used as a baseline to compare the new splicing methods to the traditional technique currently in use. As the methods will be used within a High Speed Automated Fibre Placement (HSAFP) process, the splices had to meet certain specifications: (a) the splice must be able to endure a load of 50 N in tension applied at a rate of 1 mm/s; (b) the splice must be created in less than 6 seconds, dictated by the capacity of the tape accumulator within the system. The samples for experimentation were manufactured with controlled overlaps, alignment and splicing parameters; these were then tested in tension using a tensile testing machine. Initial analysis explored the use of the impregnated binding agent present on the tape, as in the binding splicing technique, and analysed the effect of temperature and overlap on the strength of the splice. It was found that the optimum splicing temperature was at the higher end of the activation range of the binding agent, 100 °C. The optimum overlap was found to be 25 mm; there was no improvement in bond strength from 25 mm to 30 mm overlap. The final analysis compared the different splicing methods to the baseline of a stitched bond. It was found that the addition of an adhesive was the best splicing method, achieving a maximum load of over 500 N compared to the 26 N load achieved by a stitching splice and 94 N by the binding method.

Keywords: analysis, automated fibre placement, high speed, splicing

Procedia PDF Downloads 117
1799 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities

Authors: Khaled M. Alhawiti

Abstract:

This paper aims to analyze the role of natural language processing (NLP) in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial settings. There are various complexities involved in processing natural language text in a way that satisfies the needs of decision makers. The paper begins with a description of the qualities of NLP practices, then focuses on the challenges in natural language processing and discusses the major techniques of NLP. The last section describes opportunities and challenges for future research.

Keywords: data retrieval, information retrieval, natural language processing, text structuring

Procedia PDF Downloads 312
1798 Determination of Neighbor Node in Consideration of the Imaging Range of Cameras in Automatic Human Tracking System

Authors: Kozo Tanigawa, Tappei Yotsumoto, Kenichi Takahashi, Takao Kawamura, Kazunori Sugahara

Abstract:

An automatic human tracking system using mobile agent technology is realized, in which a mobile agent moves in accordance with the migration of a target person. In this paper, we propose a method for determining the neighbor node in consideration of the imaging range of the cameras.

Keywords: human tracking, mobile agent, Pan/Tilt/Zoom, neighbor relation

Procedia PDF Downloads 475
1797 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia

Authors: Carol Anne Hargreaves

Abstract:

A key issue in stock investment is how to select representative features for stock selection. The objective of this paper is firstly to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market index returned 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
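
A minimal sketch of the supervised selection step, assuming a hypothetical feature matrix of technical indicators and binary "price went up" labels (this is not the authors' actual dataset or pipeline): logistic regression ranks stocks by predicted probability of a rise, and the top three form a candidate portfolio.

```python
# Minimal sketch (not the authors' pipeline): rank stocks by predicted
# probability of a price rise using logistic regression on technical features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score, accuracy_score

rng = np.random.default_rng(0)
# Hypothetical data: rows = stocks, columns = technical features
# (e.g. momentum, volatility, moving-average ratios); label = 1 if price rose.
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

proba = model.predict_proba(X_te)[:, 1]          # probability the price goes up
print("sensitivity:", recall_score(y_te, model.predict(X_te)))
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
portfolio = np.argsort(proba)[-3:]               # top three candidate stocks
print("selected stock indices:", portfolio)
```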

Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system

Procedia PDF Downloads 128
1796 Friction Estimation and Compensation for Steering Angle Control for Highly Automated Driving

Authors: Marcus Walter, Norbert Nitzsche, Dirk Odenthal, Steffen Müller

Abstract:

This contribution presents a friction estimator for industrial purposes which identifies Coulomb friction in a steering system. The estimator only needs a few, usually known, steering system parameters. Friction occurs on almost every mechanical system and has a negative influence on high-precision position control. This is demonstrated on a steering angle controller for highly automated driving. In this steering system the friction induces limit cycles which cause oscillating vehicle movement when the vehicle follows a given reference trajectory. When compensating the friction with the introduced estimator, limit cycles can be suppressed. This is demonstrated by measurements in a series vehicle.
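
A minimal sketch of the idea, not the authors' estimator: with measured steering (residual) torque and angular velocity as hypothetical inputs, the Coulomb friction level can be identified from steady motion and cancelled by a sign-dependent feedforward term.

```python
# Minimal sketch (not the authors' estimator): identify a Coulomb friction
# torque from steady-state steering data and cancel it by feedforward.
import numpy as np

def estimate_coulomb_friction(torque, velocity, v_min=0.05):
    """Estimate Coulomb friction as the mean residual torque magnitude that is
    correlated with the sign of the angular velocity (steady motion only)."""
    moving = np.abs(velocity) > v_min          # ignore near-zero velocities
    return float(np.mean(torque[moving] * np.sign(velocity[moving])))

def compensate(torque_cmd, velocity, f_c, v_min=0.05):
    """Add a feedforward term that cancels the estimated Coulomb friction."""
    return torque_cmd + f_c * np.sign(velocity) * (np.abs(velocity) > v_min)

# Hypothetical measurement: friction torque of 0.8 Nm opposing motion plus noise
v = np.sin(np.linspace(0, 10, 1000))
tau = 0.8 * np.sign(v) + 0.05 * np.random.randn(1000)
f_c = estimate_coulomb_friction(tau, v)
print(f"estimated Coulomb friction: {f_c:.2f} Nm")
```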

Keywords: friction estimation, friction compensation, steering system, lateral vehicle guidance

Procedia PDF Downloads 484
1795 An Encapsulation of a Navigable Tree Position: Theory, Specification, and Verification

Authors: Nicodemus M. J. Mbwambo, Yu-Shan Sun, Murali Sitaraman, Joan Krone

Abstract:

This paper presents a generic data abstraction that captures a navigable tree position. The mathematical modeling of the abstraction encapsulates the current tree position, which can be used to navigate and modify the tree. The encapsulation of the tree position in the data abstraction specification avoids the use of explicit references and aliasing, thereby simplifying verification of (imperative) client code that uses the data abstraction. To ease the tasks of such specification and verification, a general tree theory, rich with mathematical notations and results, has been developed. The paper contains an example to illustrate automated verification ramifications. With sufficient tree theory development, automated proving seems plausible even in the absence of a special-purpose tree solver.
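
A minimal Python sketch of the encapsulated-position idea (the paper itself works with a formal specification language and verification tooling, not Python): the client navigates and modifies the tree only through the current position, never through explicit node references.

```python
# Minimal sketch of the abstraction: a tree navigated through an encapsulated
# current position rather than through node references held by the client.
class TreePos:
    def __init__(self, label, children=None):
        self._label = label
        self._children = children or []
        self._parent = None
        for c in self._children:
            c._parent = self
        self._cursor = self            # current position starts at the root

    # navigation
    def to_child(self, i):
        self._cursor = self._cursor._children[i]

    def to_parent(self):
        if self._cursor._parent is not None:
            self._cursor = self._cursor._parent

    # observation / modification at the current position
    def label(self):
        return self._cursor._label

    def add_child(self, label):
        child = TreePos(label)
        child._parent = self._cursor
        self._cursor._children.append(child)

t = TreePos("root", [TreePos("a"), TreePos("b")])
t.to_child(1); t.add_child("b1"); t.to_child(0)
print(t.label())   # -> b1
```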

Keywords: automation, data abstraction, maps, specification, tree, verification

Procedia PDF Downloads 134
1794 Pre-Analysis of Printed Circuit Boards Based on Multispectral Imaging for Vision Based Recognition of Electronics Waste

Authors: Florian Kleber, Martin Kampel

Abstract:

The increasing demand for gallium, indium and rare-earth elements for the production of electronics, e.g. solid-state lighting, photovoltaics, integrated circuits, and liquid crystal displays, will exceed the worldwide supply according to current forecasts. Recycling systems to reclaim these materials are not yet in place, which challenges the sustainability of these technologies. This paper proposes a multispectral imaging system as a basis for a vision-based recognition system for valuable components of electronics waste. Multispectral imaging is intended to enhance the contrast of images of printed circuit boards (single components, as well as labels) for further analysis, such as optical character recognition and entire printed circuit board recognition. The results show that a higher contrast is achieved in the near infrared compared to ultraviolet and visible light.

Keywords: electronics waste, multispectral imaging, printed circuit boards, rare-earth elements

Procedia PDF Downloads 392
1793 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for the programmer to manually test this issue for all activities. The result is data loss: data entered by the user is not saved when an interruption occurs. This issue can degrade the user experience because the user needs to re-enter the information after each interruption. Automated testing to detect such data loss is therefore important to improve the user experience. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect and can be used by Android developers to avoid such errors.

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 211
1792 Seismic Protection of Automated Stocker System by Customized Viscous Fluid Dampers

Authors: Y. P. Wang, J. K. Chen, C. H. Lee, G. H. Huang, M. C. Wang, S. W. Chen, Y. T. Kuan, H. C. Lin, C. Y. Huang, W. H. Liang, W. C. Lin, H. C. Yu

Abstract:

The hi-tech industries in the Science Park in southern Taiwan were heavily damaged by a strong earthquake in early 2016. The financial loss in this event was attributed primarily to the automated stocker system handling fully processed products, and recovery of the automated stocker system from the aftermath accounted for a major share of the lost lead time. Therefore, development of effective means for protection of stockers against earthquakes has become the highest priority for risk minimization and business continuity. This study proposes to mitigate the seismic response of the stockers by introducing viscous fluid dampers between the ceiling and the top of the stockers. The stocker is expected to vibrate less violently with a passive control force on top. A linear damper is considered in this application, with an optimal damping coefficient determined from a preliminary parametric study. The damper is small in size in comparison with those adopted for building or bridge applications. Component tests of the dampers have been carried out to make sure they meet the design requirements. Shake table tests have been further conducted to verify the proposed scheme under realistic earthquake conditions. Encouraging results have been achieved, reducing the seismic responses by up to 60% and preventing the FOUPs from falling off the shelves, which would otherwise be the case if left unprotected. The effectiveness of adopting a viscous fluid damper for seismic control of the stocker against the ceiling has been confirmed. This technique has been adopted by Macronix International Co., LTD for seismic retrofit of existing stockers. Demonstration projects on the application of the proposed technique are underway for other companies in the display industry as well.
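
A minimal sketch with hypothetical parameters, not the actual stocker model: a single-degree-of-freedom mass-spring system under sinusoidal base excitation, compared with and without an added linear viscous damper acting between its top and a rigid ceiling.

```python
# Minimal sketch (hypothetical parameters): peak response of a stocker modelled
# as a single-degree-of-freedom system under sinusoidal base excitation,
# with and without an added linear viscous damper on its top.
import numpy as np

def peak_response(c_damper, m=2000.0, k=8.0e5, dt=1e-3, t_end=20.0):
    """Peak relative displacement for mass m [kg], stiffness k [N/m], and an
    added damper coefficient c_damper [N*s/m]; 2% inherent damping assumed."""
    c = 2 * 0.02 * np.sqrt(k * m) + c_damper
    t = np.arange(0.0, t_end, dt)
    ag = 2.0 * np.sin(2 * np.pi * 3.2 * t)    # base acceleration near resonance
    x, v, peak = 0.0, 0.0, 0.0
    for a_g in ag:                            # semi-implicit Euler time stepping
        a = (-c * v - k * x) / m - a_g
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

p0 = peak_response(0.0)        # without the added damper
p1 = peak_response(5.0e4)      # with a hypothetical 50 kN*s/m damper
print(f"reduction of peak drift: {100 * (1 - p1 / p0):.1f}%")
```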

Keywords: hi-tech industries, seismic protection, automated stocker system, viscous fluid damper

Procedia PDF Downloads 330
1791 Comparison of the Chest X-Ray and Computerized Tomography Scans Requested from the Emergency Department

Authors: Sahabettin Mete, Abdullah C. Hocagil, Hilal Hocagil, Volkan Ulker, Hasan C. Taskin

Abstract:

Objectives and Goals: An emergency department is a place where people can come for a multitude of reasons 24 hours a day, and it is an easily accessible place thanks to the self-sacrificing people who work there. However, the workload and overcrowding of emergency departments are increasing day by day. Under these circumstances, it is important to choose a quick, easily accessible and effective test for diagnosis; as a result, laboratory and imaging tests account for more than 40% of all emergency department costs. Despite all of the technological advances in imaging methods and the availability of computerized tomography (CT), the chest X-ray, the older imaging method, has not lost its appeal and effectiveness for nearly all emergency physicians. Progress in imaging methods is very convenient, but physicians should consider the radiation dose, cost, and effectiveness, and imaging methods should be carefully selected and used. The aim of the study was to investigate the effectiveness of the chest X-ray in immediate diagnosis against the advancing technology by comparing chest X-ray and chest CT scan results of patients in the emergency department. Methods: Patients who presented to Bulent Ecevit University Faculty of Medicine’s emergency department between 1 September 2014 and 28 February 2015 were investigated retrospectively. Data were obtained via MIAMED (Clear Canvas Image Server v6.2, Toronto, Canada), the information management system in which patients’ files are saved electronically in the clinic, and were retrospectively reviewed. The study included 199 patients who were 18 or older and had both chest X-ray and chest CT imaging. Chest X-ray images were evaluated by the emergency medicine senior assistant in the emergency department, and the findings were saved to the study form. CT findings were obtained from data already reported by the radiology department in the clinic. The chest X-ray was evaluated with seven questions in terms of technique and dose adequacy. Patients’ age, gender, presenting complaints, comorbid diseases, vital signs, physical examination findings, diagnosis, chest X-ray findings and chest CT findings were evaluated. Data were saved and statistical analyses performed using SPSS 19.0 for Windows, and p < 0.05 was accepted as statistically significant. Results: 199 patients were included in the study. Pneumonia was the most common diagnosis, made in 38.2% (n=76) of all patients. The chest X-ray imaging technique was appropriate in 31% (n=62) of all patients. There was no statistically significant difference (p > 0.05) between the two imaging methods (chest X-ray and chest CT) in terms of determining the rates of displacement of the trachea, pneumothorax, parenchymal consolidation, increased cardiothoracic ratio, lymphadenopathy, diaphragmatic hernia, free air levels in the abdomen (in sections included in the image), pleural thickening, parenchymal cyst, parenchymal mass, parenchymal cavity, parenchymal atelectasis and bone fractures. Conclusions: When imaging findings for cases that needed to be diagnosed quickly were investigated, chest X-ray and chest CT findings matched at a high rate in patients with an appropriate imaging technique. However, chest X-rays evaluated in the emergency department were frequently taken with an inappropriate technique.

Keywords: chest x-ray, chest computerized tomography, chest imaging, emergency department

Procedia PDF Downloads 156
1790 RoboWeedSupport-Semi-Automated Unmanned Aerial System for Cost Efficient High Resolution in Sub-Millimeter Scale Acquisition of Weed Images

Authors: Simon L. Madsen, Mads Dyrmann, Morten S. Laursen, Rasmus N. Jørgensen

Abstract:

Recent advances in Unmanned Aerial System (UAS) safety and perception systems enable safe, low-altitude, autonomous terrain-following flights, recently demonstrated by the consumer DJI Mavic Pro and Phantom 4 Pro drones. This paper presents the first prototype system utilizing this functionality in the form of semi-automated UAS-based collection of crop/weed images, where the embedded perception system ensures a significantly safer and faster gathering of weed images with sub-millimeter resolution. The system is to be used when the weeds are at the cotyledon stage and again prior to harvest for recognizing grass weed species, which cannot be discriminated at the cotyledon stage.

Keywords: weed mapping, UAV, DJI SDK, automation, cotyledon plants

Procedia PDF Downloads 280
1789 C-Spine Imaging in a Non-trauma Centre: Compliance with NEXUS Criteria Audit

Authors: Andrew White, Abigail Lowe, Kory Watkins, Hamed Akhlaghi, Nicole Winter

Abstract:

The timing and appropriateness of diagnostic imaging are critical to the evaluation and management of traumatic injuries. Within the subclass of trauma patients, the prevalence of c-spine injury is less than 4%. However, the incidence of delayed diagnosis within this cohort has been documented as up to 20%, with inadequate radiological examination the most cited issue. In order to assess those patients in whom c-spine injury cannot be fully excluded based on clinical examination alone and who, therefore, should undergo diagnostic imaging, a set of criteria is used to provide clinical guidance. The NEXUS (National Emergency X-Radiography Utilisation Study) criteria are a validated clinical decision-making tool used to facilitate selective c-spine radiography. The criteria allow clinicians to determine whether cervical spine imaging can be safely avoided in appropriate patients. The NEXUS criteria are widely used within the Emergency Department setting, given their ease of use and relatively straightforward application, and are used in the Victorian State Trauma System’s guidelines. This audit used retrospective data collection to examine the concordance of c-spine imaging in trauma patients with the NEXUS criteria and to assess compliance with state guidance on diagnostic imaging in trauma. Of the 183 patients who presented with trauma to the head, neck, or face (244 excluded due to incorrect triage), 98 did not undergo imaging of the c-spine. Of those 98, 44% fulfilled at least one of the NEXUS criteria, meaning the c-spine could not be clinically cleared as per the current guidelines. The criterion most often met was intoxication, comprising 42% (18 of 43), with midline spinal tenderness (or absence of documentation of this) the second most common at 23% (10 of 43). Intoxication being the most commonly met criterion is significant but not unexpected given the cohort of patients seen at St Vincent’s and within many emergency departments in general. Since these patients will always meet the NEXUS criteria, an element of clinical judgment is likely needed, or concurrent use of the Canadian C-Spine Rule to exclude the need for imaging. Midline tenderness as a met criterion was often recorded in the context of poor or absent documentation, emphasizing the importance of clear and accurate assessments. A distracting injury was identified in 7 of the 43 patients; however, only one of these patients exhibited a thoracic injury (T11 compression fracture), with the remainder comprising injuries to the extremities – some studies suggest that c-spine imaging may not be required in the evaluable blunt trauma patient despite distracting injuries in any body regions that do not involve the upper chest. This emphasises the need for standardised definitions of distracting injury, at least at a departmental/regional level. The data highlight the currently poor application of the NEXUS guidelines, with likely common themes throughout emergency departments, highlighting the need for further education regarding implementation and potential refinement/clarification of the criteria. Of note, there appeared to be no significant differences between levels of experience with respect to inappropriately clearing the c-spine clinically with respect to the guidelines.
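
The decision rule the audit measures compliance against can be stated compactly; the sketch below encodes the five NEXUS low-risk criteria as a simple check (illustrative only, not the audit's data-collection tooling).

```python
# Minimal sketch of the NEXUS decision rule used in the audit: imaging can be
# avoided only when all five low-risk criteria are satisfied.
def nexus_imaging_indicated(midline_tenderness: bool,
                            intoxicated: bool,
                            altered_alertness: bool,
                            focal_neuro_deficit: bool,
                            distracting_injury: bool) -> bool:
    """Return True when c-spine imaging is indicated (any criterion met)."""
    return any([midline_tenderness, intoxicated, altered_alertness,
                focal_neuro_deficit, distracting_injury])

# Example: an intoxicated patient cannot be clinically cleared.
print(nexus_imaging_indicated(False, True, False, False, False))  # True
```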

Keywords: imaging, guidelines, emergency medicine, audit

Procedia PDF Downloads 48
1788 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials present in the scene ("endmembers") share the spectral pixels in different amounts called "abundances". Unmixing of the data cube is an important task for knowing which endmembers are present in the cube for the analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem "basis pursuit" can be used as a sparsity-based approach to solve this unmixing problem, where the endmember abundances are assumed to be sparse in an appropriate domain known as a dictionary. The optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
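
A minimal sketch of the sparsity-based step on the common lasso/BPDN relaxation of basis pursuit, with a hypothetical random dictionary standing in for the endmember library: iterative soft thresholding (ISTA) recovers a sparse abundance vector for one pixel spectrum.

```python
# Minimal sketch of sparse unmixing by iterative soft thresholding (ISTA) on
# the relaxation min_a 0.5*||y - D a||^2 + lam*||a||_1, where y is a pixel
# spectrum and D a (hypothetical) endmember dictionary.
import numpy as np

def ista(D, y, lam=0.05, n_iter=500):
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(1)
D = rng.normal(size=(120, 40))             # 120 bands, 40 dictionary atoms
a_true = np.zeros(40); a_true[[3, 17]] = [0.7, 0.3]   # two active endmembers
y = D @ a_true + 0.01 * rng.normal(size=120)
a_hat = ista(D, y)
print(np.flatnonzero(np.abs(a_hat) > 0.05))            # recovered support
```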

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 175
1787 Production, Quality Control, and Biodistribution Assessment of 111In-BPAMD as a New Bone Imaging Agent

Authors: H. Yousefnia, A. Aghanejad, A. Mirzaei, R. Enayati, A. R. Jalilian, S. Zolghadri

Abstract:

Bone metastases occur in many cases at an early stage of the tumour disease; however, their symptoms are recognized rather late. The aim of this study was the preparation and quality control of 111In-BPAMD for diagnostic purposes. 111In was produced at the Agricultural, Medical, and Industrial Research School (AMIRS) by means of a 30 MeV cyclotron via the natCd(p,x)111In reaction. Complexation of 111In with BPAMD was carried out using an acidic solution of 111InCl3 and BPAMD in absolute water. The effect of various parameters such as temperature, ligand concentration, pH, and time on the radiolabeling yield was studied. 111In-BPAMD was prepared successfully with a radiochemical purity of 95% under the optimized conditions (100 µg of BPAMD, pH=5, and at 90°C for 1 h), as measured by the ITLC method. The final solution was injected into wild-type mice and the biodistribution was determined up to 72 h. SPECT images were acquired 2 and 24 h post injection. Both the biodistribution studies and SPECT imaging indicated high bone uptake, while accumulation in other organs was approximately negligible. The results show that 111In-BPAMD can be used as an excellent tracer for diagnosis of bone metastases by SPECT imaging.

Keywords: biodistribution, BPAMD, 111In, SPECT

Procedia PDF Downloads 533
1786 Distributed Coordination of Connected and Automated Vehicles at Multiple Interconnected Intersections

Authors: Zhiyuan Du, Baisravan Hom Chaudhuri, Pierluigi Pisu

Abstract:

In connected vehicle systems where wireless communication is available among the involved vehicles and intersection controllers, it is possible to design an intersection coordination strategy that lets the connected and automated vehicles (CAVs) travel through road intersections without conventional traffic light control. In this paper, we present a distributed coordination strategy for CAVs at multiple interconnected intersections that aims at improving system fuel efficiency and system mobility. We present a distributed control solution where, at the higher level, the intersection controllers calculate the desired average road velocity and optimally assign reference velocities to each vehicle. At the lower level, every vehicle uses model predictive control (MPC) to track the reference velocity obtained from the higher level controller. The proposed method has been implemented on a simulation-based case with a network of two interconnected intersections. Additionally, the effects of mixed vehicle types on the coordination strategy have been explored. Simulation results indicate the improvement in vehicle fuel efficiency and traffic mobility achieved by the proposed method.
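
A minimal sketch of the lower-level tracking idea, not the authors' MPC formulation: over a short horizon, a vehicle picks accelerations that trade velocity-tracking error against control effort, applies the first one, and re-plans.

```python
# Minimal sketch (not the authors' MPC formulation): plan accelerations over a
# short horizon so the vehicle velocity tracks the reference assigned by the
# intersection controller, balancing tracking error and control effort.
import numpy as np

def plan_accelerations(v0, v_ref, horizon=10, dt=0.5, rho=0.1):
    # v_k = v0 + dt * sum_{j<=k} a_j -> stack tracking and effort equations
    A_track = dt * np.tril(np.ones((horizon, horizon)))
    b_track = np.full(horizon, v_ref - v0)
    A = np.vstack([A_track, np.sqrt(rho) * np.eye(horizon)])
    b = np.concatenate([b_track, np.zeros(horizon)])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a            # apply a[0], then re-plan (receding horizon)

a = plan_accelerations(v0=8.0, v_ref=12.0)   # hypothetical velocities in m/s
print(a[:3])                                 # first planned acceleration steps
```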

Keywords: connected vehicles, automated vehicles, intersection coordination systems, multiple interconnected intersections, model predictive control

Procedia PDF Downloads 327
1785 Wikipedia World: A Computerized Process for Cultural Heritage Data Dissemination

Authors: L. Rajaonarivo, M. N. Bessagnet, C. Sallaberry, A. Le Parc Lacayrelle, L. Leveque

Abstract:

TCVPYR is a European FEDER (European Regional Development Fund) project which aims to promote tourism in the French Pyrenees region by leveraging its cultural heritage. It involves scientists from various domains (geographers, historians, anthropologists, computer scientists...). This paper presents a fully automated process to publish any dataset as Wikipedia articles as well as the corresponding linked information on Wikidata and Wikimedia Commons. We validate this process on a sample of geo-referenced cultural heritage data collected by TCVPYR researchers in different regions of the Pyrenees. The main result concerns the technological prerequisites, which are now in place. Moreover, we demonstrated that we can automatically publish cultural heritage data on Wikimedia.

Keywords: cultural heritage dissemination, digital humanities, open data, Wikimedia automated publishing

Procedia PDF Downloads 99
1784 External Noise Distillation in Quantum Holography with Undetected Light

Authors: Sebastian Töpfer, Jorge Fuenzalida, Marta Gilaberte Basset, Juan P. Torres, Markus Gräfe

Abstract:

This work presents an experimental and theoretical study of the noise resilience of quantum holography with undetected photons. Quantum imaging has become an important research topic in recent years after its first publication in 2014. Following this research, advances towards different spectral ranges in detection and different optical geometries have been made. In particular, interest has developed in near-infrared to mid-infrared measurements because of the unique characteristic that allows a probe to be sampled with photons of a different wavelength than the photons arriving at the detector. This promising effect can be used for medical applications, to measure in the so-called molecular fingerprint region while using broadly available detectors for the visible spectral range. Further advances in quantum imaging methods have been made through new measurement and detection schemes. One of these is quantum holography with undetected light. It combines digital phase-shifting holography with quantum imaging to extend the obtainable sample information by measuring not only the object transmission but also its influence on the phase shift experienced by the transmitted light. This work presents extended research on the quantum holography with undetected light scheme regarding the influence of external noise. It is shown experimentally and theoretically that the sample information can still be retrieved at noise levels 250 times higher than the signal level, because the information is transmitted by the interferometric pattern. A detailed theoretical explanation is also provided.

Keywords: distillation, quantum holography, quantum imaging, quantum metrology

Procedia PDF Downloads 35
1783 Morphology of Cartographic Words: A Perspective from Chinese Characters

Authors: Xinyu Gong, Zhilin Li, Xintao Liu

Abstract:

Maps are a means of communication. Cartographic language draws on established theories of natural language for understanding maps. "Cartographic words", or "map symbols", are crucial elements of cartographic language. Personalized mapping is increasingly popular, with growing demand for customized map-making by the general public. Automated symbol-making and customization play a key role in personalized mapping. However, formal representations for the automated construction of map symbols are still lacking. In natural language, the process of word and sentence construction can be formalized. Through the analogy between natural language and graphical language, formal representations of natural language construction can be used as a reference for constructing cartographic language. We selected Chinese character structures (i.e., S

Keywords: personalized mapping, Chinese character, cartographic language, map symbols

Procedia PDF Downloads 140
1782 Scaling Siamese Neural Network for Cross-Domain Few Shot Learning in Medical Imaging

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Cross-domain learning in the medical field is a research challenge, as many conditions, as in oncology imaging, use different imaging modalities. Moreover, in most medical learning applications, the training sample size is relatively small. Although few-shot learning (FSL) through the use of a Siamese neural network can be trained on a small sample with remarkable accuracy, FSL fails to be effective across multiple domains as its convolution weights are set for task-specific applications. In this paper, we address this problem by enabling FSL to shift across domains by designing a two-layer FSL network that can learn individually from each domain and produce a shared feature map with extra modulation to be used at the second layer, which can recognize important targets from mixed domains. Our initial experiments based on mixed medical datasets like Medical-MNIST reveal promising results. We aim to continue this research to perform full-scale analytics for testing our cross-domain FSL learning.
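
A minimal PyTorch sketch of the metric-learning core on a single domain (the paper's two-layer cross-domain modulation is not reproduced here); the 28x28 single-channel input size is an assumption in the spirit of Medical-MNIST.

```python
# Minimal PyTorch sketch of the Siamese metric-learning core (single domain
# only; the cross-domain modulation described in the paper is not reproduced).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Embedder(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.fc = nn.Linear(32 * 7 * 7, 64)    # assumes 28x28 grayscale inputs

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

def contrastive_loss(z1, z2, same, margin=1.0):
    d = F.pairwise_distance(z1, z2)
    return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

net = Embedder()
x1, x2 = torch.randn(8, 1, 28, 28), torch.randn(8, 1, 28, 28)   # image pairs
same = torch.randint(0, 2, (8,)).float()        # 1 = same class, 0 = different
loss = contrastive_loss(net(x1), net(x2), same)
loss.backward()
```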

Keywords: Siamese neural network, few-shot learning, meta-learning, metric-based learning, thick data transformation and analytics

Procedia PDF Downloads 7
1781 Initial Dip: An Early Indicator of Neural Activity in Functional Near Infrared Spectroscopy Waveform

Authors: Mannan Malik Muhammad Naeem, Jeong Myung Yung

Abstract:

Functional near infrared spectroscopy (fNIRS) holds a favorable position among non-invasive brain imaging techniques. The concentration changes of oxygenated hemoglobin and de-oxygenated hemoglobin during a particular cognitive activity are the basis of this neuro-imaging modality. Two wavelengths of near-infrared light can be used with the modified Beer-Lambert law to infer the status of neuronal activity inside the brain. The temporal resolution of fNIRS is very good for real-time brain-computer interface applications. The portability, low cost and acceptable temporal resolution of fNIRS put it in a favorable position among neuro-imaging modalities. In this study, an optimization model for the impulse response function has been used to estimate/predict the initial dip using fNIRS data. In addition, the activity strength parameter related to a motor-based cognitive task has been analyzed. We found an initial dip that lasts around 200-300 milliseconds and better localizes neural activity.
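
A minimal sketch of the modified Beer-Lambert step that underlies the hemoglobin estimates; the extinction coefficients, pathlength and DPF values are illustrative placeholders, not calibrated values.

```python
# Minimal sketch of the modified Beer-Lambert law step: convert two-wavelength
# intensity changes into oxy-/deoxy-hemoglobin concentration changes.
# Extinction coefficients and DPF values below are illustrative placeholders.
import numpy as np

def mbll(I, I0, eps, d=3.0, dpf=6.0):
    """I, I0: measured and baseline intensities at two wavelengths;
    eps: 2x2 extinction matrix [[eHbO(l1), eHbR(l1)], [eHbO(l2), eHbR(l2)]];
    d: source-detector separation [cm]; dpf: differential pathlength factor."""
    d_od = -np.log10(np.asarray(I) / np.asarray(I0))        # optical density change
    return np.linalg.solve(np.asarray(eps) * d * dpf, d_od)  # [dHbO, dHbR]

eps = [[1.5, 3.8],     # placeholder extinction coefficients at ~760 nm
       [2.5, 1.8]]     # and ~850 nm (units absorbed into the example)
d_hbo, d_hbr = mbll(I=[0.98, 0.95], I0=[1.0, 1.0], eps=eps)
print(d_hbo, d_hbr)    # an initial dip appears as a brief drop in dHbO / rise in dHbR
```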

Keywords: fNIRS, brain-computer interface, optimization algorithm, adaptive signal processing

Procedia PDF Downloads 194
1780 Comparison between High Resolution Ultrasonography and Magnetic Resonance Imaging in Assessment of Musculoskeletal Disorders Causing Ankle Pain

Authors: Engy S. El-Kayal, Mohamed M. S. Arafa

Abstract:

There are various causes of ankle pain (AP), including traumatic and non-traumatic causes. Various imaging techniques are available for the assessment of AP. MRI is considered to be the imaging modality of choice for ankle joint evaluation, with the advantage of high spatial resolution and multiplanar capability, and hence its ability to visualize small, complex anatomical structures around the ankle. However, the high costs, the relatively limited availability of MRI systems, and the relatively long duration of the examination are all considered disadvantages of MRI examination. Therefore, there is a need for a more rapid and less expensive examination modality with good diagnostic accuracy to fill this gap. HRU has become increasingly important in the assessment of ankle disorders, with the advantages of being fast, reliable, low cost and readily available. US can visualize detailed anatomical structures and assess tendinous and ligamentous integrity. The aim of this study was to compare the diagnostic accuracy of HRU with MRI in the assessment of patients with AP. We included forty patients complaining of AP. All patients were subjected to real-time HRU and MRI of the affected ankle. Results of both techniques were compared to surgical and arthroscopic findings. All patients were examined according to a defined protocol that includes imaging for tendon tears or tendinitis, muscle tears, masses or fluid collections, ligament sprains or tears, inflammation or fluid effusion within the joint or bursa, bone and cartilage lesions, erosions and osteophytes. Analysis of the results showed that the mean age of the patients was 38 years. The study comprised 24 women (60%) and 16 men (40%). The accuracy of HRU in detecting causes of AP was 85%, while the accuracy of MRI in the detection of causes of AP was 87.5%. In conclusion: HRU and MRI are two complementary tools of investigation, with the former used as a primary tool of investigation and the latter used to confirm the diagnosis and the extent of the lesion, especially when surgical interference is planned.

Keywords: ankle pain (AP), high-resolution ultrasound (HRU), magnetic resonance imaging (MRI) ultrasonography (US)

Procedia PDF Downloads 170
1779 A Comparison between Different Segmentation Techniques Used in Medical Imaging

Authors: Ibtihal D. Mustafa, Mawia A. Hassan

Abstract:

Tumor segmentation from MRI images is an important task for medical imaging experts. It is particularly challenging because of the highly varied appearance of tumor tissue among different patients. MRI is an advanced medical imaging modality because it gives richer information about human soft tissue. There are different segmentation techniques to detect brain tumors in MRI. In this paper, different segmentation methods are used to segment brain tumors, and the segmentation results are compared using correlation and the structural similarity index (SSIM) to analyze and determine the best technique that could be applied to MRI images.
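
A minimal sketch of the comparison step, assuming binary masks produced by two segmentation procedures: each result is scored against a reference with Pearson correlation and the structural similarity index from scikit-image.

```python
# Minimal sketch of the comparison step: score a segmentation result against
# a reference mask with Pearson correlation and the structural similarity index.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def compare(segmentation, reference):
    seg = segmentation.astype(float)
    ref = reference.astype(float)
    corr = np.corrcoef(seg.ravel(), ref.ravel())[0, 1]
    sim = ssim(seg, ref, data_range=ref.max() - ref.min())
    return corr, sim

# Hypothetical binary masks standing in for thresholding/region-growing outputs
rng = np.random.default_rng(0)
reference = (rng.random((128, 128)) > 0.7).astype(np.uint8)
result = reference.copy()
result[:5] = 0                                   # simulate a small error
print(compare(result, reference))                # higher values = better match
```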

Keywords: MRI, segmentation, correlation, structural similarity

Procedia PDF Downloads 376
1778 Low-Density Lipoproteins Mediated Delivery of Paclitaxel and MRI Imaging Probes for Personalized Medicine Applications

Authors: Sahar Rakhshan, Simonetta Geninatti Crich, Diego Alberti, Rachele Stefania

Abstract:

The combination of imaging and therapeutic agents in the same smart nanoparticle is a promising option for performing minimally invasive imaging-guided therapy. In this study, low density lipoproteins (LDL), one of the most attractive biodegradable and biocompatible nanoparticles, were used for the simultaneous delivery of Paclitaxel (PTX), a hydrophobic antitumour drug, and an amphiphilic contrast agent, Gd-AAZTA-C17, in the B16-F10 melanoma cell line. These cells overexpress LDL receptors, as assessed by flow cytometry analysis. PTX and Gd-AAZTA-C17 loaded LDLs (LDL-PTX-Gd) were prepared and characterized, and their stability was assessed under 72 h incubation at 37 °C and compared to LDL loaded with Gd-AAZTA-C17 (LDL-Gd) and LDL-PTX. The cytotoxic effect of LDL-PTX-Gd was evaluated by MTT assay. The anti-tumour drug loaded into LDLs showed a significantly higher toxicity on B16-F10 cells with respect to the commercially available formulation Paclitaxel Kabi (PTX Kabi) used in clinical applications. It was possible to demonstrate a high uptake of LDL-Gd in B16-F10 cells. As a consequence of the high cell uptake, melanoma cells showed a significantly high cytotoxic effect when incubated in the presence of PTX (LDL-PTX-Gd). Furthermore, B16-F10 cells have been used to perform Magnetic Resonance Imaging. By analysis of the image signal intensity, it was possible to estimate the amount of internalized PTX indirectly through the decrease in relaxation times caused by Gd, which is proportional to its concentration. Finally, the treatment with PTX-loaded LDL in B16-F10 tumour-bearing mice resulted in a marked reduction of tumour growth compared to the administration of PTX Kabi alone. In conclusion, LDLs are selectively taken up by tumour cells and can be successfully exploited for the selective delivery of Paclitaxel and imaging agents.

Keywords: low density lipoprotein, melanoma cell lines, MRI, paclitaxel, personalized medicine application, theragnostic system

Procedia PDF Downloads 95
1777 Effect of Concentration Level and Moisture Content on the Detection and Quantification of Nickel in Clay Agricultural Soil in Lebanon

Authors: Layan Moussa, Darine Salam, Samir Mustapha

Abstract:

Heavy metal contamination in agricultural soils in Lebanon poses serious environmental and health problems. Intensive efforts are being made to improve existing methods for quantifying heavy metals in contaminated environments, since conventional detection techniques have shown to be time-consuming, tedious, and costly. The application of hyperspectral remote sensing in this field is possible and promising. However, factors impacting the efficiency of hyperspectral imaging in detecting and quantifying heavy metals in agricultural soils have not been thoroughly studied. This study assesses the use of hyperspectral imaging for the detection of Ni in agricultural clay soil collected from the Bekaa Valley, a major agricultural area in Lebanon, under different contamination levels and soil moisture contents. Soil samples were contaminated with Ni at concentrations ranging from 150 mg/kg to 4000 mg/kg. In addition, soil with background contamination was subjected to increased moisture levels varying from 5 to 75%. Hyperspectral imaging was used to detect and quantify Ni contamination in the soil at the different contamination levels and moisture contents. IBM SPSS statistical software was used to develop models that predict the concentration of Ni and the moisture content in agricultural soil. The models were constructed using linear regression algorithms. The spectral curves obtained showed an inverse correlation of both Ni concentration and moisture content with reflectance. The models developed resulted in high predicted R2 values of 0.763 for Ni concentration and 0.854 for moisture content. These predictions indicated that the presence of Ni was well expressed near 2200 nm and that of moisture at 1900 nm. The results from this study help define the potential of hyperspectral imaging (HSI) as a reliable and cost-effective alternative for detecting heavy metal pollution in contaminated soils and predicting soil moisture.
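
A minimal sketch of the regression step (the study used IBM SPSS; scikit-learn is used here purely for illustration, with synthetic reflectance values at a few hypothetical bands): a linear model predicts Ni concentration from reflectance.

```python
# Minimal sketch (not the authors' SPSS models): fit a linear model that
# predicts Ni concentration from reflectance sampled at a few wavelengths,
# e.g. around the 2200 nm feature noted in the study.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
conc = rng.uniform(150, 4000, size=60)                 # mg/kg, training range
# Hypothetical reflectance at selected bands: decreases as Ni increases
X = 0.6 - 5e-5 * conc[:, None] + 0.01 * rng.normal(size=(60, 3))

model = LinearRegression().fit(X, conc)
pred = model.predict(X)
print("predicted R^2:", round(r2_score(conc, pred), 3))
print("estimate for a new spectrum:", model.predict(X[:1])[0], "mg/kg")
```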

Keywords: heavy metals, hyperspectral imaging, moisture content, soil contamination

Procedia PDF Downloads 68
1776 Mathematical Description of Functional Motion and Application as a Feeding Mode for General Purpose Assistive Robots

Authors: Martin Leroux, Sylvain Brisebois

Abstract:

Eating a meal is among the Activities of Daily Living, but it takes a lot of time and effort for people with physical or functional limitations. Dedicated technologies are cumbersome and not portable, while general-purpose assistive robots such as wheelchair-based manipulators are too hard to control for elaborate continuous motion like eating. Eating with such devices has not previously been automated, since there existed no description of a feeding motion for uncontrolled environments. In this paper, we introduce a feeding mode for assistive manipulators, including a mathematical description of trajectories for motions that are difficult to perform manually such as gathering and scooping food at a defined/desired pace. We implement these trajectories in a sequence of movements for a semi-automated feeding mode which can be controlled with a very simple 3-button interface, allowing the user to have control over the feeding pace. Finally, we demonstrate the feeding mode with a JACO robotic arm and compare the eating speed, measured in bites per minute of three eating methods: a healthy person eating unaided, a person with upper limb limitations or disability using JACO with manual control, and a person with limitations using JACO with the feeding mode. We found that the feeding mode allows eating about 5 bites per minute, which should be sufficient to eat a meal under 30min.

Keywords: assistive robotics, automated feeding, elderly care, trajectory design, human-robot interaction

Procedia PDF Downloads 133
1775 Resistivity Tomography Optimization Based on Parallel Electrode Linear Back Projection Algorithm

Authors: Yiwei Huang, Chunyu Zhao, Jingjing Ding

Abstract:

Electrical Resistivity Tomography has been widely used in medicine and geology, for example in the imaging of lung impedance and the analysis of soil impedance. Linear Back Projection is the core algorithm of Electrical Resistivity Tomography, but traditional Linear Back Projection cannot make full use of the information in the electric field. In this paper, a Parallel Electrode Linear Back Projection imaging method for Electrical Resistivity Tomography is proposed. By changing the connection mode of the electrodes, it generates an electric field distribution that is not linearly related to that of traditional Linear Back Projection, captures new information, and improves the imaging accuracy without increasing the number of electrodes. The simulation results show that the accuracy of the image obtained by the inverse operation of the Parallel Electrode Linear Back Projection can be improved by about 20%.
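
A minimal sketch of the core Linear Back Projection step, assuming a precomputed sensitivity (Jacobian) matrix from a forward model; the parallel-electrode variant changes the excitation/measurement pattern and hence the sensitivity matrix, which is not reproduced here.

```python
# Minimal sketch of the core Linear Back Projection step: back-project
# normalized boundary-voltage changes through a sensitivity (Jacobian) matrix.
import numpy as np

def lbp_reconstruct(S, v_meas, v_ref):
    """S: (n_measurements x n_pixels) sensitivity matrix from a forward model;
    v_meas, v_ref: boundary voltages with and without the inclusion."""
    dv = (v_ref - v_meas) / v_ref              # normalized voltage changes
    img = S.T @ dv                             # back projection
    return img / (S.sum(axis=0) + 1e-12)       # normalize by coverage per pixel

# Hypothetical sizes: 104 measurements, 32x32 pixel grid (random stand-in data)
rng = np.random.default_rng(0)
S = np.abs(rng.normal(size=(104, 32 * 32)))
v_ref = np.ones(104)
v_meas = v_ref - S @ (np.eye(32 * 32)[500] * 1e-3)   # small local change
image = lbp_reconstruct(S, v_meas, v_ref).reshape(32, 32)
print(image.shape)
```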

Keywords: electrical resistivity tomography, finite element simulation, image optimization, parallel electrode linear back projection

Procedia PDF Downloads 120
1774 Photoplethysmography-Based Device Designing for Cardiovascular System Diagnostics

Authors: S. Botman, D. Borchevkin, V. Petrov, E. Bogdanov, M. Patrushev, N. Shusharina

Abstract:

In this paper, we report the development of a device for diagnostics of the cardiovascular system state and an associated automated workstation for large-scale medical measurement data collection and analysis. It was shown that the optimal design for the monitoring device is a wristband, as it represents an engineering trade-off between accuracy and usability. The monitoring device is based on an infrared reflective photoplethysmographic sensor, which allows multiple physiological parameters to be collected, such as heart rate and pulse wave characteristics. The developed device uses a BLE interface for transmission of medical and supplementary data to a coupled mobile phone, which processes it and sends it to the doctor's automated workstation. Results of testing this experimental model confirmed the applicability of the proposed approach.
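
A minimal sketch of one derived parameter, not the device firmware: heart rate estimated from the reflective PPG waveform by peak detection.

```python
# Minimal sketch: heart rate estimated from a PPG waveform by peak detection.
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg, fs):
    """ppg: reflective PPG samples; fs: sampling rate in Hz."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))   # >= 0.4 s between beats
    intervals = np.diff(peaks) / fs                      # beat-to-beat intervals [s]
    return 60.0 / intervals.mean()

fs = 100                                                 # hypothetical 100 Hz sensor
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)  # ~72 bpm signal
print(round(heart_rate_bpm(ppg, fs), 1))
```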

Keywords: cardiovascular diseases, health monitoring systems, photoplethysmography, pulse wave, remote diagnostics

Procedia PDF Downloads 473
1773 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information details provided, is the gold standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for future analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational methods for the analysis of images were carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping from the original image. In the skull stripper for MRI images of the brain, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. Then this mask is eroded and used for a refined brain extraction based on level-sets (edge of the brain-skull border with dedicated expansion, curvature, and advection terms). In the second step, the brain volume quantification was performed by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for the brain extraction process and brain volume quantification in MRI. The development and use of computer programs can contribute to assisting health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future works, we expect to implement more automated methods for the assessment of cerebral atrophy and brain lesion quantification, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
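
A minimal sketch of the second step (voxel counting), assuming the skull-stripping stage already produced a binary brain mask in NIfTI format; the file name is hypothetical.

```python
# Minimal sketch of the second step (voxel counting; not the skull-stripping
# pipeline itself): count mask voxels and convert to cubic centimetres using
# the voxel dimensions stored in the NIfTI header.
import numpy as np
import nibabel as nib

def brain_volume_cc(mask_path):
    img = nib.load(mask_path)                       # binary brain mask (NIfTI)
    voxels = int(np.count_nonzero(img.get_fdata() > 0))
    dx, dy, dz = img.header.get_zooms()[:3]         # voxel size in mm
    return voxels * dx * dy * dz / 1000.0           # mm^3 -> cc

# volume = brain_volume_cc("patient01_brain_mask.nii.gz")  # hypothetical file
# print(f"brain volume: {volume:.1f} cc")
```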

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 102
1772 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 – 100 nm. This is a critical advantage as growing larger crystals, as required by X-ray crystallography methods, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron beam sensitive, which means there is a limitation on the maximum electron dose one can use to collect the required data for a high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based fiber-coupled camera brings additional challenges. This is because of the inherent noise introduced during the electron-to-photon conversion in the scintillator and the transfer of light via the fibers to the sensor, which results in a poor signal-to-noise ratio and requires relatively higher and commonly specimen-damaging electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron counting detector optimized for low-dose applications (like structural biology cryo-EM), and Stela is also a counting electron detector but optimized for diffraction applications with high speed and high dynamic range. Lastly, data collection workflows, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, can be one of the other challenges of the 3DED technique. Traditionally this has all been done manually or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher quality 3DED data enables structure determination with higher confidence, while automated workflows allow these to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 to better than 1 Angstrom) for protein and small molecule structure determination. We will also show how Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, Digitalmicrograph, proteins, small molecules

Procedia PDF Downloads 56