Search results for: Digital Image Correlation (DIC)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8725

6655 Analysing Time Series for a Forecasting Model to the Dynamics of Aedes Aegypti Population Size

Authors: Flavia Cordeiro, Fabio Silva, Alvaro Eiras, Jose Luiz Acebal

Abstract:

Aedes aegypti is present in tropical and subtropical regions of the world and is a vector of several diseases, such as dengue fever, yellow fever, chikungunya, and zika. The growth in the number of arbovirus cases in recent decades has become a matter of great concern worldwide. Meteorological factors such as mean temperature and precipitation are known to influence infestation by the species through effects on its physiology and ecology, altering the fecundity, mortality, lifespan, dispersion behaviour, and abundance of the vector. Models able to describe the dynamics of the vector population size should therefore take meteorological variables into account. The relationship between meteorological factors and the population dynamics of Ae. aegypti adult females is studied to provide a good set of predictors for modelling the dynamics of the mosquito population size. The time series of captures of adult females from a public health surveillance program in the city of Lavras, MG, Brazil was analysed for associations with precipitation, humidity, and temperature using a set of statistical methods for time-series analysis commonly adopted in signal processing, information theory, and neuroscience. Cross-correlation, a multicollinearity test, and whitened cross-correlation were applied to determine at which time lags the meteorological variables influence the dynamics of mosquito abundance. Among the findings, the studied case indicated strong collinearity between humidity and precipitation, and precipitation was selected to form a pair of descriptors together with temperature. The techniques revealed significant associations between infestation indicators and both temperature and precipitation in the short, mid, and long terms, evincing that these variables should be considered in entomological models and as public health indicators. A descriptive model used to test the results exhibits a strong correlation with the data.
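
The abstract does not give its exact formulas, and the whitened variant is omitted here; as a rough sketch of two of the named techniques, lagged cross-correlation and a variance-inflation-factor (VIF) check for multicollinearity can be written with NumPy as follows (function names and thresholds are illustrative, not the authors'):

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation of y against x for lags 0..max_lag.

    Lag k pairs x(t) with y(t + k), i.e. x leading y by k steps."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return np.array([np.mean(x[:n - k] * y[k:]) for k in range(max_lag + 1)])

def vif(X):
    """Variance inflation factor per column of X (n_samples x n_predictors).

    Values above ~5-10 are a common rule of thumb for strong collinearity."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([others, np.ones(len(X))])  # intercept term
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ coef
        r2 = 1.0 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```

In a toy run, a predictor pair built to be nearly collinear (as humidity and precipitation were found to be) yields large VIFs, and the cross-correlation peak recovers an injected lag.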

Keywords: Aedes aegypti, cross-correlation, multicollinearity, meteorological variables

Procedia PDF Downloads 173
6654 Dose Saving and Image Quality Evaluation for Computed Tomography Head Scanning with Eye Protection

Authors: Yuan-Hao Lee, Chia-Wei Lee, Ming-Fang Lin, Tzu-Huei Wu, Chih-Hsiang Ko, Wing P. Chan

Abstract:

Computed tomography (CT) of the head is a good method for investigating cranial lesions. However, radiation-induced oxidative stress can accumulate in the eyes and promote carcinogenesis and cataract formation. We therefore aimed to protect the eyes with barium sulfate shields during CT scans and to investigate the resultant image quality and radiation dose to the eye. Patients who underwent health examinations were selectively enrolled in this study in compliance with the protocol approved by the Ethics Committee of the Joint Institutional Review Board at Taipei Medical University. Participants' brains were scanned, together with a water-based marker, by a multislice CT scanner (SOMATOM Definition Flash) under either a fixed tube current-time setting or automatic tube current modulation (TCM). The lens dose was measured with Gafchromic films, whose dose-response curve had previously been fitted using thermoluminescent dosimeters, with or without a barium sulfate or bismuth-antimony shield laid above. For the assessment of image quality, CT images at slice planes covering the regions of interest on the zygomatic, orbital, and nasal bones of the head phantom, as well as the water-based marker, were used to calculate signal-to-noise ratios (SNR) and contrast-to-noise ratios (CNR). The application of the barium sulfate and bismuth-antimony shields decreased the lens dose by 24% and 47% on average, respectively. Under topogram-based TCM, the dose-saving power of the bismuth-antimony shield was mitigated, whereas that of the barium sulfate shield was enhanced. On the other hand, the SNR and CNR of the DSCT images were each decreased by the barium sulfate and bismuth-antimony shields, resulting in an overall reduction of the CNR. In contrast, integrating topogram-based TCM elevated the signal difference between the ROIs on the zygomatic bones and the eyeballs while preferentially decreasing the SNR when the barium sulfate shield was used.
The results of this study indicate that the balance between eye exposure and image quality can be optimized by combining eye shields with topogram-based TCM on the multislice scanner. Eye shielding can change the photon attenuation characteristics of tissues close to the shield; the application of either shield for eye protection is hence not recommended when seeking intraorbital lesions.
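
The abstract does not spell out the SNR/CNR definitions used; one common convention (ROI mean over ROI standard deviation for SNR, absolute mean difference over background noise for CNR) can be sketched as:

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest: mean over std."""
    roi = np.asarray(roi, dtype=float)
    return roi.mean() / roi.std()

def cnr(roi, background):
    """Contrast-to-noise ratio: absolute signal difference between an ROI
    and a background region, over the background noise (std)."""
    roi = np.asarray(roi, dtype=float)
    background = np.asarray(background, dtype=float)
    return abs(roi.mean() - background.mean()) / background.std()
```

With pixel arrays extracted from the zygomatic-bone and eyeball ROIs, these two numbers are what the shield comparisons in the abstract report.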

Keywords: computed tomography, barium sulfate shield, dose saving, image quality

Procedia PDF Downloads 259
6653 High Performance Methyl Orange Capture on Magnetic Nanoporous MCM-41 Prepared by Incipient Wetness Impregnation Method

Authors: Talib M. Albayati, Omar S. Mahdy, Ghanim M. Alwan

Abstract:

This work aimed to prepare the magnetic nanoporous material Fe/MCM-41, characterize its physical properties in order to enhance its magnetic behaviour, and study the effect of operating conditions on the efficiency of separating methyl orange (MO) from wastewater by adsorption. The experimental results were analysed to select the best operating conditions for the studied parameters, obtained for both adsorbents, the mesoporous material MCM-41 and magnetic Fe/MCM-41, as follows: constant temperature (20 ºC), pH 2, adsorbent dosage (0.03 g), contact time (10 minutes), and concentration (30 mg/L). The results demonstrate that the adsorption process is well fitted by the Langmuir isotherm model for pure MCM-41, with a high correlation coefficient (0.999), and by the Freundlich isotherm model for magnetic Fe/MCM-41, with a high correlation coefficient (0.994).
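
The isotherm fits reported here are classically done on the linearized forms of the two models; a minimal sketch (the correlation coefficients in the abstract refer to exactly these linear fits, though the authors' data are not reproduced):

```python
import numpy as np

def _linfit(x, y):
    """Least-squares line plus the r-squared of the fit."""
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r * r

def fit_langmuir(Ce, qe):
    """Langmuir: qe = qmax*KL*Ce / (1 + KL*Ce).
    Linearized as Ce/qe = Ce/qmax + 1/(KL*qmax)."""
    slope, intercept, r2 = _linfit(Ce, Ce / qe)
    qmax = 1.0 / slope
    KL = slope / intercept
    return qmax, KL, r2

def fit_freundlich(Ce, qe):
    """Freundlich: qe = KF * Ce**(1/n).
    Linearized as ln(qe) = ln(KF) + (1/n)*ln(Ce)."""
    slope, intercept, r2 = _linfit(np.log(Ce), np.log(qe))
    return np.exp(intercept), 1.0 / slope, r2
```

Fitting both models to the same equilibrium data and comparing the two r-squared values is how one adsorbent ends up "Langmuir" and the other "Freundlich".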

Keywords: adsorption, nanoporous materials, MCM-41, magnetic material, methyl orange, wastewater

Procedia PDF Downloads 385
6652 Developing Digital Competencies in Aboriginal Students through University-College Partnerships

Authors: W. S. Barber, S. L. King

Abstract:

This paper reports on a pilot project to develop a collaborative partnership between a community college in rural northern Ontario, Canada, and an urban university in Oshawa, in the greater Toronto area. The partner institutions will collaborate to address the learning needs of university applicants whose goal is to attain an undergraduate BA in Educational Studies and Digital Technology but who may not live in a geographical location that facilitates this pathways process. The UOIT BA degree is attained through a 2+2 program, in which students with a two-year college diploma or equivalent can attain a four-year undergraduate degree. The goals of the project are: 1. to expand the BA program to include an additional stream covering serious educational games, simulations, and virtual environments; 2. to develop fully online learning modules (using both synchronous and asynchronous technologies) for use by university applicants who are not geographically located close to a physical university site; 3. to assess the digital competencies of all students, including members of local, distance, and Indigenous communities, using a validated tool developed and tested by UOIT across numerous populations. This tool, the General Technical Competency Use and Scale (GTCU), will provide the collaborating institutions with data for analyzing how well students are prepared to succeed in fully online learning communities. Philosophically, the UOIT BA program is based on a fully online learning communities (FOLC) model that can be accessed from anywhere in the world through digital learning environments via audio-video conferencing tools such as Adobe Connect. It also follows models of adult learning and mobile learning, making a university degree accessible to the growing demographic of adult learners who may use mobile devices to learn anywhere, anytime.
The program is based on key principles of Problem-Based Learning, allowing students to build their own understandings through co-design of the learning environment in collaboration with instructors and peers. In this way, the degree allows students to personalize and individualize their learning based on their own culture, background, and professional and personal experiences. Using modified flipped-classroom strategies, students can work through video modules on their own time in preparation for one-hour discussions held in video conferencing sessions. As a consequence of this flexibility, students may continue to work full or part time. The partner institutions will co-develop four new modules, administer the GTCU, and share data while creating a new stream of the UOIT BA degree. This will increase accessibility for students bridging from community colleges to university through a fully digital environment. We aim to work collaboratively with Indigenous elders, community members, and distance education instructors to increase opportunities for more students to attain a university education.

Keywords: aboriginal, college, competencies, digital, universities

Procedia PDF Downloads 211
6651 Relationship between the Level of Perceived Self-Efficacy of Children with Learning Disability and Their Mother’s Perception about the Efficacy of Their Child, and Children’s Academic Achievement

Authors: Payal Maheshwari, Maheaswari Brindavan

Abstract:

The present study aimed to examine the level of perceived self-efficacy of children with learning disability, their mothers' perception of the efficacy of the child, and the relationship between the two. The study further aimed to find the relationship between the level of perceived self-efficacy of children with learning disability and their academic achievement, and between the mothers' perception of the child's efficacy and the child's academic achievement. The sample comprised 80 respondents (40 children with learning disability and their mothers). Children with learning disability as their primary condition, belonging to the middle or upper-middle class, living with both parents, and residing in Mumbai, together with their mothers, were selected. Purposive (judgmental) and snowball sampling techniques were used to select the sample. Proformas in the form of questionnaires were used to obtain background information on the children with learning disability and their mothers. A self-constructed Mother's Perceived Efficacy of their Child Assessment Scale was used to measure the mothers' perceived level of efficacy of their child with learning disability, and a self-constructed Child's Perceived Self-Efficacy Assessment Scale was used to measure the level of the child's perceived self-efficacy. Academic scores of the child were collected from the child's parents or teachers and converted into percentages. The data were analyzed quantitatively using frequencies, means, and standard deviations. Correlations were computed to ascertain the relationships between the different variables. The findings revealed that the majority of mothers perceived the efficacy of their child with learning disability as above average, and the majority of the children with learning disability also perceived themselves as having an above-average level of self-efficacy.
Further, in the domains of self-regulated learning and emotional self-efficacy, the majority of mothers perceived their child as having average or below-average efficacy, and 50% of the children also perceived their self-efficacy in these two domains at an average or below-average level. A significant (r = .322, p < .05) weak correlation (Spearman's rho) was found between the mothers' perceived efficacy of their child and the child's perceived self-efficacy, and a significant (r = .377, p < .01) weak correlation (Pearson) was found between the mothers' perceived efficacy of their child and the child's academic achievement. A significant weak positive correlation was also found between the child's perceived self-efficacy and academic achievement (r = .332, p < .05). Based on the findings, the study discussed the need for intervention programs for children in non-academic skills such as self-regulation and emotional competence.
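
The two correlation statistics used here differ only in whether the raw values or their ranks are correlated; a minimal NumPy sketch (the simple rank transform below ignores tie handling, which a full Spearman implementation would average):

```python
import numpy as np

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks.
    Note: ties are not given average ranks in this simplified version."""
    def ranks(a):
        order = np.argsort(a)
        r = np.empty(len(a))
        r[order] = np.arange(1, len(a) + 1)
        return r
    return pearson(ranks(np.asarray(x)), ranks(np.asarray(y)))
```

On monotonic but nonlinear data, Spearman reports perfect association while Pearson does not, which is why studies of ordinal scale scores often prefer the rank-based statistic.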

Keywords: learning disability, perceived self efficacy, academic achievement, mothers, children

Procedia PDF Downloads 313
6650 An Experimental Investigation of Air Entrainment Due to Water Jets in Crossflows

Authors: Mina Esmi Jahromi, Mehdi Khiadani

Abstract:

Vertical water jets discharging into free surface turbulent cross flows result in the ingression of a large amount of air in the body of water and form a region of two-phase air-water flow with a considerable interfacial area. This research presents an experimental study of the two-phase bubbly flow using image processing technique. The air ingression and the trajectories of bubble swarms under different experimental conditions are evaluated. The rate of air entrainment and the bubble characteristics such as penetration depth, and dispersion pattern were found to be affected by the most influential parameters of water jet and cross flow including water jet-to-crossflow velocity ratio, water jet falling height, and cross flow depth. This research improves understanding of the underwater flow structure due to the water jet impingement in crossflow and advances the practical applications of water jets such as artificial aeration, circulation, and mixing where crossflow is present.

Keywords: air entrainment, image processing, jet in cross flow, two-phase flow

Procedia PDF Downloads 365
6649 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction. As these applications are of great importance and demand more accuracy, greater robustness is expected from face recognition systems with less computation time. In this paper, a hybrid approach for face recognition combining Gabor wavelets and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead of the Gabor filters. This image is convolved with a bank of Gabor filters of varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class space and maximize the inter-class space. The variants used are two-dimensional Linear Discriminant Analysis (2D-LDA), two-dimensional bidirectional LDA ((2D)2LDA), and weighted two-dimensional bidirectional Linear Discriminant Analysis (Wt (2D)2LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training-set features. The HGWLDA approach is robust against varying illumination conditions, as the Gabor features are illumination invariant. The approach also aims at a better recognition rate using fewer features for varying expressions. The performance of the proposed HGWLDA approach is evaluated on the AT&T database, the MIT-India face database, and the faces94 database. The proposed HGWLDA approach is found to provide better results than the existing Gabor approach.
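
The full HGWLDA pipeline cannot be reproduced from the abstract alone; the sketch below shows only the Gabor-filter-bank feature stage and the k-NN stage (the LDA projection is omitted), with toy stripe images standing in for faces and all parameter values chosen purely for illustration:

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam):
    """Real Gabor kernel: Gaussian envelope times an oriented cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / lam)

def convolve_valid(img, kernel):
    """Direct 'valid'-mode correlation; fine for tiny toy images."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

def gabor_features(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
                   lams=(2.0, 4.0)):
    """Mean absolute filter response per (orientation, wavelength) pair."""
    return np.array([np.abs(convolve_valid(img, gabor_kernel(5, 2.0, t, l))).mean()
                     for t in thetas for l in lams])

def knn_predict(train_X, train_y, x, k=1):
    """Plain k-NN on Euclidean distance in feature space."""
    dist = np.linalg.norm(train_X - x, axis=1)
    nearest = np.asarray(train_y)[np.argsort(dist)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[counts.argmax()]
```

Because the oriented filters respond selectively, vertically and horizontally striped images separate cleanly in this feature space even under mild noise.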

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 464
6648 Dirty Martini vs Martini: The Contrasting Duality Between Big Bang and BTS Public Image and Their Latest MVs Analysis

Authors: Patricia Portugal Marques de Carvalho Lourenco

Abstract:

Big Bang is like a dirty martini, embroiled in a stew of individual scandals that have rocked the group's image and perception: from G-Dragon's and T.O.P.'s marijuana episodes in 2011 and 2016, respectively, to Daesung's building hosting illicit entertainment activities in 2018, to the Burning Sun scandal that led to the Titanic-like sinking of Big Bang's youngest member, Seungri, in 2019, and the migration of positive sentiment to the antithetical side. BTS, on the other hand, are like a martini: clear and clean, attracting as many crowds to their performances and online content as the Pope attracts believers to Sunday Mass in the Vatican, as exemplified by their latest MVs. Big Bang's 2022 'Still Life' achieved 16.4 million views on YouTube in 24 hours, whilst BTS's 'Permission to Dance' achieved 68.5 million in the same period. The difference is significant when Big Bang's and BTS's overall award wins are added: a total of 117 in contrast to 460. Both groups are uniquely talented, exceptional performers that have contributed greatly to the dissemination of Korean pop music on a global scale in their own inimitable ways. Both are exceptional in their own right, and while the artists cannot, ought not, and should not be compared, given the grave injustice of comparing one planet with one solar system, a contrast is merited and hence done. The reality, nonetheless, is about disengagement from a group that lives life humanly, learning and evolving with each challenge and mistake, without a clean, perfect tag attached to it, demonstrating not only an inability to dissociate the person from the artist and the music but also an inability to understand the difference between a private and a public life.

Keywords: K-Pop, big bang, BTS, music, public image, entertainment, korean entertainment

Procedia PDF Downloads 95
6647 Correlation between Body Mass Index and Blood Sugar/Serum Lipid Levels in Fourth-Grade Boys in Japan

Authors: Kotomi Yamashita, Hiromi Kawasaki, Satoko Yamasaki, Susumu Fukita, Risako Sakai

Abstract:

Lifestyle-related diseases develop from the long-term accumulation of health consequences of a poor lifestyle. Schoolchildren, who have not yet accumulated long-term lifestyle habits, are therefore believed to be at lower risk of lifestyle-related diseases. However, schoolchildren rarely receive blood tests unless they are under treatment for a serious disease, and without such data the impact of their lifestyle at this age cannot be known. Blood data combined with physical measurements can support more effective health education. We therefore examined the correlation between body mass index (BMI) and blood sugar/serum lipid (BS/SL) levels. From 2014 to 2016, we measured the blood data of fourth-grade students living in a city in Japan. The present study reports the results for the 281 fourth-grade boys only (80.3% of the total). We analyzed their BS/SL levels by comparing the blood data against the criteria of the National Center for Child Health and Development in Japan, and then examined the correlation between BMI and BS/SL levels. IBM SPSS Statistics for Windows, Version 25, was used for the analysis. A total of 69 boys (24.6%) were within the normal range for BMI (18.5-24), whereas 193 (71.5%) and 8 boys (2.8%) had lower and higher BMI, respectively. Regarding BS levels, 280 boys were within the normal range (70-90 mg/dl); 1 boy had a higher value. All the boys were within the normal range for glycated hemoglobin (HbA1c) (4.6-6.2%). Regarding SL levels, 271 boys were within the normal range (125-230 mg/dl) for total cholesterol (TC), whereas 5 boys (1.8%) had lower and 5 boys (1.8%) had higher levels. A total of 243 boys (92.7%) were within the normal range (36-138 mg/dl) for triglycerides (TG), whereas 19 boys (7.3%) had lower and 19 boys (7.3%) had higher levels. Regarding high-density lipoprotein cholesterol (HDL-C), 276 boys (98.2%) were within the normal range (40 mg/dl or above), whereas 5 boys (1.8%) had lower values.
All but one boy (280, 99.6%) were within the normal range (170 mg/dl or below) for low-density lipoprotein cholesterol (LDL-C); the exception (0.4%) had a higher level. BMI and BS were not correlated. BMI and HbA1c were weakly positively correlated (r = 0.139, p = 0.019). We also observed positive correlations between BMI and TG (r = 0.328, p < 0.01), TC (r = 0.239, p < 0.01), and LDL-C (r = 0.324, p < 0.01). BMI and HDL-C were weakly negatively correlated (r = -0.185, p = 0.002). Most of the boys were within the normal ranges for BS/SL levels. However, some boys exceeded the normal TG range. Fourth graders with high TG may develop a lifestyle-related disease in the future; given its relation to TG, eating habits should be improved in this group. Our findings suggest a positive correlation between BMI and BS/SL levels. Fourth-grade schoolboys with a high BMI may be at high risk of developing lifestyle-related diseases, and lifestyle improvement may be recommended to lower BS/SL levels in this group.
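
The per-analyte tallies above are straightforward range checks; a minimal sketch (ranges copied from the abstract, with the one-sided HDL-C and LDL-C limits interpreted as open-ended):

```python
def classify(value, low=None, high=None):
    """Return 'low', 'normal', or 'high' against an inclusive range.
    A bound of None means the range is open on that side."""
    if low is not None and value < low:
        return "low"
    if high is not None and value > high:
        return "high"
    return "normal"

# Normal ranges quoted in the abstract, in mg/dl.
RANGES = {
    "TC": (125, 230),
    "TG": (36, 138),
    "HDL-C": (40, None),   # lower bound only
    "LDL-C": (None, 170),  # upper bound only
}

def tally(name, values):
    """Count low/normal/high results for one analyte across a cohort."""
    counts = {"low": 0, "normal": 0, "high": 0}
    low, high = RANGES[name]
    for v in values:
        counts[classify(v, low, high)] += 1
    return counts
```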

Keywords: blood sugar level, lifestyle-related diseases, school students, serum lipid level

Procedia PDF Downloads 135
6646 The Impact of Digitalization and Sustainability on Professionals’ Performance in the Built Environment in Nigeria

Authors: Taiwo, Richard Oluseyi, Morakinyo, Kolawole O., Oyeniran, Demilade O.

Abstract:

This study examines the effects of digitalization and sustainability on professionals' performance within the built environment. By considering the interplay between these two transformative forces, the study seeks to unravel the complexities and opportunities presented by digital technologies in fostering sustainable practices across various professional disciplines. Through an extensive analysis of the literature and expert interviews, this research explores how digitalization can enhance professionals' ability to incorporate sustainability principles, optimize resource utilization, and promote resilient and inclusive built environments. Furthermore, it examines the challenges and barriers professionals face in adapting to and harnessing the potential of digital tools and processes. The findings will contribute to a greater comprehension of the beneficial interactions between digitalization and sustainable development and provide valuable insights for policymakers, practitioners, and educators in fostering an ecosystem that supports professionals' capacity building, collaboration, and innovation toward achieving sustainability goals in the built environment.

Keywords: digitisation, sustainability, professional performance, built environment

Procedia PDF Downloads 17
6645 A Particle Image Velocimetric (PIV) Experiment on Simplified Bottom Hole Flow Field

Authors: Heqian Zhao, Huaizhong Shi, Zhongwei Huang, Zhengliang Chen, Ziang Gu, Fei Gao

Abstract:

Hydraulic mechanics is significantly important in the drilling process for oil and gas exploration, especially at the drill bit. The fluid flows through the nozzles on the bit and generates a water jet that removes cuttings from the bottom hole. In this paper, a simplified bottom-hole model is established, and Particle Image Velocimetry (PIV) is used to capture the flow field of a single nozzle. Owing to the confinement of the bottom and the wellbore, the potential core is shorter than that of a free water jet. The velocity magnitude attenuates rapidly once the fluid is closer than about 5 mm to the bottom. Besides, a vortex zone appears near the middle of the bottom, beside the water-jet zone. A modified exponential function can fit the centerline velocity well. On the one hand, the results of this paper provide verification for numerical simulations of the bottom-hole flow field; on the other hand, they provide an experimental basis for the hydraulic design of drill bits.
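
The exact "modified exponential" form is not given in the abstract; as a stand-in illustration of the fitting step, a plain exponential decay u = u0·exp(-a·x) along the jet centerline can be fitted by log-linear least squares:

```python
import numpy as np

def fit_exp_decay(x, u):
    """Least-squares fit of u = u0 * exp(-a * x) via ln(u) = ln(u0) - a*x.

    Stand-in for the paper's unspecified 'modified exponential' form;
    x is distance along the centerline, u the measured velocity magnitude."""
    a_neg, ln_u0 = np.polyfit(x, np.log(u), 1)
    return np.exp(ln_u0), -a_neg
```

Given PIV centerline samples, the recovered (u0, a) pair summarizes the jet's initial velocity and attenuation rate.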

Keywords: oil and gas, hydraulic mechanic of drilling, PIV, bottom hole

Procedia PDF Downloads 206
6644 Analysis of Knowledge Circulation in Digital Learning Environments: A Case Study of the MOOC 'Communication des Organisations'

Authors: Hasna Mekkaoui Alaoui, Mariem Mekkaoui Alaoui

Abstract:

In a context marked by a growing and pressing demand for online training within Moroccan universities, massive open online courses (MOOCs) are undergoing constant evolution, amplified by the widespread use of digital technology and accentuated by the coronavirus pandemic. However, despite their growing popularity and expansion, these courses still lack tools enabling teachers and researchers to carry out a fine-grained analysis of the learning processes taking place within them. Moreover, the circulation and sharing of knowledge within these environments is becoming increasingly important. The crucial aspect of traceability emerges here, as MOOCs record and generate traces from the most minute to the most visible. This leads us to consider traceability as a valuable approach in educational research, where the trace is envisaged as a research tool in its own right. In this exploratory research project, we look at aspects of community knowledge sharing based on traces observed in the 'Communication des organisations' MOOC. Focusing in particular on the mediating trace and its impact in identifying knowledge-circulation processes in this learning space, we mobilized the traces of video capsules as an index of knowledge circulation in the MOOC platform. Our study uses a methodological approach based on thematic analysis; although the results show that learners reproduce knowledge from the different video capsules in almost identical ways, they do not limit themselves to the knowledge provided to them. This research offers concrete perspectives for improving the dynamics of online platforms, with a potentially positive impact on the quality of online university teaching.

Keywords: circulation, index, digital environments, mediation, trace

Procedia PDF Downloads 56
6643 Satellite Statistical Data Approach for Upwelling Identification and Prediction in South of East Java and Bali Sea

Authors: Hary Aprianto Wijaya Siahaan, Bayu Edo Pratama

Abstract:

Sea fisheries have the potential to become one of the nation's assets, contributing greatly to Indonesia's economy. This fishery potential depends on the availability of chlorophyll in the territorial waters of Indonesia. The research was conducted using three methods: statistical, comparative, and analytical. The data used include MODIS sea-temperature imagery from the Aqua satellite at 4 km resolution for 2002-2015, MODIS chlorophyll-a imagery from the Aqua satellite at 4 km resolution for 2002-2015, and ASCAT imagery from the MetOp and NOAA satellites at 27 km resolution for 2002-2015. Processing of these data shows that upwelling events in the sea south of East Java begin in June, identified by a below-normal sea-surface-temperature anomaly, an air mass moving from east to west, and high chlorophyll-a concentrations. In July, the upwelling region expands increasingly westward, reaching its peak in August. Prediction of chlorophyll-a concentration using multiple-linear-regression equations shows excellent results for 2002 through 2015, with a correlation of 0.8 between predicted and observed concentrations and an RMSE of 0.3. The prediction for 2016 also shows good results despite a decline in correlation: the correlation for 2016 is 0.6, but the RMSE improves to 0.2.
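
The abstract does not list its predictor set; a minimal sketch of multiple linear regression with the two skill scores it reports (correlation and RMSE), using ordinary least squares on hypothetical predictors such as SST anomaly and wind stress:

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least squares with an intercept; coefficient vector, intercept last."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    """Apply a fitted coefficient vector to a predictor matrix."""
    return np.column_stack([X, np.ones(len(X))]) @ coef

def rmse(y, yhat):
    """Root-mean-square error between observed and predicted series."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))
```

The model is trained on one period (here, 2002-2015) and then scored on a held-out year by the same correlation/RMSE pair quoted in the abstract.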

Keywords: satellite, sea surface temperature, upwelling, wind stress

Procedia PDF Downloads 153
6642 Digital Immunity System for Healthcare Data Security

Authors: Nihar Bheda

Abstract:

Protecting digital assets such as networks, systems, and data from advanced cyber threats is the aim of Digital Immunity Systems (DIS), which are a subset of cybersecurity. With features like continuous monitoring, coordinated reactions, and long-term adaptation, DIS seeks to mimic biological immunity. This minimizes downtime by automatically identifying and eliminating threats. Traditional security measures, such as firewalls and antivirus software, are insufficient for enterprises, such as healthcare providers, given the rapid evolution of cyber threats. The number of medical record breaches that have occurred in recent years is proof that attackers are finding healthcare data to be an increasingly valuable target. However, obstacles to enhancing security include outdated systems, financial limitations, and a lack of knowledge. DIS is an advancement in cyber defenses designed specifically for healthcare settings. Protection akin to an "immune system" is produced by core capabilities such as anomaly detection, access controls, and policy enforcement. Coordination of responses across IT infrastructure to contain attacks is made possible by automation and orchestration. Massive amounts of data are analyzed by AI and machine learning to find new threats. After an incident, self-healing enables services to resume quickly. The implementation of DIS is consistent with the healthcare industry's urgent requirement for resilient data security in light of evolving risks and strict guidelines. With resilient systems, it can help organizations lower business risk, minimize the effects of breaches, and preserve patient care continuity. DIS will be essential for protecting a variety of environments, including cloud computing and the Internet of medical devices, as healthcare providers quickly adopt new technologies. DIS lowers traditional security overhead for IT departments and offers automated protection, even though it requires an initial investment. 
In the near future, DIS may prove to be essential for small clinics, blood banks, imaging centers, large hospitals, and other healthcare organizations. Cyber resilience can become attainable for the whole healthcare ecosystem with customized DIS implementations.
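
The anomaly-detection stage described above can take many forms; as a deliberately tiny stand-in (real DIS deployments use far richer models), a z-score detector over a metric stream looks like this:

```python
import numpy as np

def zscore_anomalies(values, threshold=3.0):
    """Indices whose absolute z-score exceeds the threshold.

    A toy stand-in for the statistical anomaly-detection stage of a DIS,
    e.g. flagging a spike in failed-login counts or record-access volume."""
    v = np.asarray(values, dtype=float)
    z = (v - v.mean()) / v.std()
    return [i for i, s in enumerate(np.abs(z)) if s > threshold]
```

Flagged indices would then feed the orchestration layer that isolates the affected host or account.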

Keywords: digital immunity system, cybersecurity, healthcare data, emerging technology

Procedia PDF Downloads 60
6641 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning, and management for regional development. Remotely sensed imagery is widely used to provide information in many change-detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented using an integration of the marker-controlled watershed algorithm and a coupled Markov random field (MRF). Then, object-based classification is performed to label image objects as changed or unchanged. Compared with a pixel-based support vector machine (SVM) classifier, this novel segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object level. Experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show improvements of 3% and 6% in overall accuracy and kappa coefficient, respectively. The proposed method also correctly distinguishes homogeneous image parcels.
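
The two accuracy metrics quoted (overall accuracy and the kappa coefficient) are standard functions of the change/no-change confusion matrix; a minimal sketch with an illustrative matrix:

```python
def overall_accuracy(cm):
    """Overall accuracy from a square confusion matrix (rows = reference, cols = prediction)."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    total = sum(sum(row) for row in cm)
    po = overall_accuracy(cm)
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm)
             for i in range(len(cm))) / total ** 2
    return (po - pe) / (1 - pe)
```

The reported 3% and 6% gains are differences in exactly these two quantities between the proposed object-based result and the pixel-based SVM baseline.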

Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images

Procedia PDF Downloads 211
6640 Perception of Predictive Confounders for the Prevalence of Hypertension among Iraqi Population: A Pilot Study

Authors: Zahraa Albasry, Hadeel D. Najim, Anmar Al-Taie

Abstract:

Background: Hypertension is considered one of the most important causes of cardiovascular complications and one of the leading causes of mortality worldwide. Identifying the potential risk factors associated with this health problem plays an important role in minimizing its incidence and related complications. The objective of this study is to assess and understand the perception of specific predictive confounding factors on the prevalence of hypertension (HT) among a sample of the Iraqi population in Baghdad, Iraq. Materials and Methods: A randomized cross-sectional study was carried out on 100 adult subjects during their visits to the outpatient clinic of a certain sector of Baghdad Province, Iraq. Demographic, clinical, and health records, alongside specific screening and laboratory tests of the participants, were collected and analyzed to detect the effect of potential confounding factors on the prevalence of HT. Results: 63% of the study participants suffered from HT, most of them female (P < 0.005). Patients aged 41-50 years suffered from HT significantly more than other age groups (63.5%, P < 0.001). 88.9% of the participants were obese (P < 0.001), and 47.6% had diabetes with HT. A positive family history and a sedentary lifestyle were significantly more common among all hypertensive groups (P < 0.05). High salt and fatty food intake was significantly more frequent among patients suffering from isolated systolic hypertension (ISHT) (P < 0.05). A significant positive correlation between packed cell volume (PCV) and systolic blood pressure (SBP) (r = 0.353, P = 0.048) was found among normotensive participants. Among hypertensive patients, a significant positive correlation was found between triglycerides (TG) and both SBP (r = 0.484, P = 0.031) and diastolic blood pressure (DBP) (r = 0.463, P = 0.040), while low-density lipoprotein cholesterol (LDL-c) showed a significant positive correlation with DBP (r = 0.443, P = 0.021). Conclusion: The prevalence of HT among the Iraqi population is of major concern. Further work is required to detect the impact of potential risk factors, minimize blood pressure (BP) elevation, and reduce the risk of other cardiovascular complications later in life.
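The PCV-SBP association reported above is a plain Pearson correlation. As a hedged sketch, the same statistic can be recomputed on synthetic data; the values below are illustrative, not the study's records.

```python
# Pearson correlation sketch on synthetic PCV/SBP values (not the study data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
pcv = rng.normal(42, 3, 30)                    # packed cell volume, %
sbp = 110 + 0.9 * pcv + rng.normal(0, 4, 30)   # systolic BP, mmHg (assumed link)
r, p = pearsonr(pcv, sbp)                      # r in [-1, 1], two-sided p-value
```

A reported pair such as r = 0.353, P = 0.048 would be obtained the same way from the measured PCV and SBP columns.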

Keywords: correlation, hypertension, Iraq, risk factors

Procedia PDF Downloads 126
6639 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among its possible fields of application. In all these fields, the amount of collected data is increasing quickly, and with this growth, computation speed becomes the critical factor. Data reduction is one solution to this problem. Removing redundancy from rough sets can be achieved with a reduct. Many algorithms for generating reducts have been developed, but most of them are software-only implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes considerable time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes of the core. In this paper, a hardware implementation of a two-stage greedy algorithm for finding one reduct is presented. The decision table is used as input. The output of the algorithm is a superreduct, which is a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are most frequent in the decision table. The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the first stage may be unnecessary if the core is empty. For systems focused on fast computation of a reduct, however, the first disadvantage is not a key problem, and the core calculation can be implemented as a combinational logic block that adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. The core calculation is performed by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. The number of occurrences of each attribute is counted in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For comparison, the algorithm was also implemented in the C language and run on a PC. The execution times of the reduct calculation in hardware and software were compared, and the results show an increase in data processing speed.
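For readers who want the software baseline, the two-stage greedy procedure described above can be sketched in Python. The toy decision table and the integer encoding of attributes are assumptions for illustration; the paper's FPGA design replaces these loops with comparators and adder cascades.

```python
# Two-stage greedy superreduct sketch: core from singleton discernibility
# entries, then enrichment with the most frequent attributes (toy data).
from collections import Counter

def discernibility(table, decisions):
    """For each object pair with different decisions, the set of
    condition attributes on which the two objects differ."""
    entries = []
    for i in range(len(table)):
        for j in range(i + 1, len(table)):
            if decisions[i] != decisions[j]:
                diff = {a for a in range(len(table[i]))
                        if table[i][a] != table[j][a]}
                if diff:
                    entries.append(diff)
    return entries

def superreduct(table, decisions):
    entries = discernibility(table, decisions)
    # Stage 1: the core is the union of all singleton entries.
    result = {next(iter(e)) for e in entries if len(e) == 1}
    # Stage 2: greedily add the most frequent attribute among
    # entries not yet covered, until every entry is discerned.
    uncovered = [e for e in entries if not e & result]
    while uncovered:
        freq = Counter(a for e in uncovered for a in e)
        result.add(freq.most_common(1)[0][0])
        uncovered = [e for e in uncovered if not e & result]
    return result

# Toy decision table: rows are objects, columns are condition attributes.
table = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
decisions = [0, 0, 1, 1]
reduct = superreduct(table, decisions)
```

By construction the returned set discerns every object pair with different decisions; as the abstract notes, it may still contain removable attributes, i.e., it is a superreduct rather than a minimal reduct.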

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 214
6638 AI-based Digital Healthcare Application to Assess and Reduce Fall Risks in Residents of Nursing Homes in Germany

Authors: Knol Hester, Müller Swantje, Danchenko Natalya

Abstract:

Objective: Falls in older people cause a loss of autonomy and result in an economic burden. LCare is an AI-based application to manage fall risks. The study's aim was to assess the effect of LCare use on patient outcomes in nursing homes in Germany. Methods: LCare identifies and monitors fall risks through a 3D gait analysis and a digital questionnaire, resulting in tailored recommendations on fall prevention. A study was conducted with AOK Baden-Württemberg (01.09.2019-31.05.2021) in 16 care facilities. Assessments at baseline and follow-up included: a fall risk score; falls (baseline: fall history in the past 12 months; follow-up: a fall record since the last analysis); fall-related injuries and hospitalizations; gait speed; fear of falling; psychological stress; and nurses' experience with app use. Results: 94 seniors aged 65-99 years (average 84±7 years) underwent the initial analysis; 566 mobility analyses were carried out in total. On average, fall risk was reduced by 17.8% compared to baseline (p<0.05). The risk of falling decreased across all subgroups, including a trend in dementia patients (p=0.06), constituting 43% of analyzed patients, and a significant decrease in patients with walking aids (p<0.05), constituting 76% of analyzed patients. There was a trend (p<0.1) towards fewer falls, fall-related injuries, and hospitalizations (baseline: 23 seniors who fell, 13 injuries, 9 hospitalizations; follow-up: 14 seniors who fell, 2 injuries, 0 hospitalizations). There was a 16% improvement in gait speed (p<0.05). Residents reported 38% less fear of falling and psychological stress (p<0.05 for both outcomes). 81% of nurses found LCare effective. Conclusions: In the presented study, use of the LCare app was associated with a reduction in fall risk among nursing home residents, improvement in health-related outcomes, and a trend toward fewer injuries and hospitalizations. LCare may help improve senior resident care and save healthcare costs.

Keywords: falls, digital healthcare, falls prevention, nursing homes, seniors, AI, digital assessment

Procedia PDF Downloads 126
6637 Data-Centric Anomaly Detection with Diffusion Models

Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu

Abstract:

Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data through image generation with diffusion models. The approach addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that, with only 30% of the original set of normal images, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated by diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.

Keywords: diffusion models, anomaly detection, data-centric, generative AI

Procedia PDF Downloads 77
6636 Performance Analysis of New Types of Reference Targets Based on Spaceborne and Airborne SAR Data

Authors: Y. S. Zhou, C. R. Li, L. L. Tang, C. X. Gao, D. J. Wang, Y. Y. Guo

Abstract:

The triangular trihedral corner reflector (CR) has been widely used as a point target for synthetic aperture radar (SAR) calibration and image quality assessment. The additional 'tip' of the triangular plate does not contribute to the reflector's theoretical RCS, and if it interacts with a perfectly reflecting ground plane, it yields an increase in RCS at the radar bore-sight, decreasing the accuracy of SAR calibration and image quality assessment. To address this problem, two types of CRs were manufactured. One was the hexagonal trihedral CR, a self-illuminating CR with a relatively small plate edge length, since a large edge length usually introduces unexpected edge diffraction error. The other was a triangular trihedral CR with an extended bottom plate, which incorporates the effect of the 'tip' into the total RCS. To assess the performance of the two new types of CRs, a flight campaign over the National Calibration and Validation Site for High Resolution Remote Sensors was carried out. Six hexagonal trihedral CRs and two bottom-extended trihedral CRs, as well as several traditional triangular trihedral CRs, were deployed. A KOMPSAT-5 X-band SAR image was acquired for the performance analysis of the hexagonal trihedral CRs, and C-band airborne SAR images were acquired for the performance analysis of the bottom-extended trihedral CRs. The analysis showed that the impulse response functions of both the hexagonal trihedral CRs and the bottom-extended trihedral CRs were much closer to the ideal sinc function than those of the traditional triangular trihedral CRs. The flight campaign results validated the advantages of the new types of CRs, which may be useful in future SAR calibration missions.
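For context, the theoretical bore-sight RCS referred to above is, for an ideal triangular trihedral CR of inner edge length a illuminated at wavelength lambda, the standard textbook result:

```latex
\sigma_{\max} = \frac{4\pi a^{4}}{3\lambda^{2}}
```

The tip and any tip-ground interaction add returns on top of this value at bore-sight, which is what the bottom-extended design attempts to fold into the total RCS.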

Keywords: synthetic aperture radar, calibration, corner reflector, KOMPSAT-5

Procedia PDF Downloads 265
6635 Typology of Gaming Tourists Based on the Perception of Destination Image

Authors: Mi Ju Choi

Abstract:

This study investigated the perception of gaming tourists toward Macau and developed a typology of gaming tourists. A total of 1,497 responses from tourists in Macau were collected through a convenience sampling method. The dimensions of multi-culture, convenience, economy, gaming, and unsafety were subsequently extracted as the factors of gaming tourists' perception of Macau. Cluster analysis was performed using the delineated factors. Four heterogeneous groups were generated, namely gaming lovers (n = 467, 31.2%), exotic lovers (n = 509, 34.0%), reasonable budget seekers (n = 269, 18.0%), and convenience seekers (n = 252, 16.8%). Further analysis was performed to investigate differences in gaming behavior and tourist activities. The findings are expected to contribute to the efforts of destination marketing organizations (DMOs) in establishing effective business strategies, provide a profile of gaming tourists in certain market segments, and assist DMOs and casino managers in establishing more effective marketing strategies for target markets.
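The factor-then-cluster procedure described above can be sketched with k-means on synthetic factor scores; the four synthetic segments, score distributions, and sample size below are assumptions, not the survey data.

```python
# K-means clustering sketch on synthetic perception-factor scores
# (five factors, four assumed segments; not the study's data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Synthetic factor scores for 400 respondents on the five dimensions
# (multi-culture, convenience, economy, gaming, unsafety).
centers = rng.normal(0, 2, size=(4, 5))
scores = np.vstack([c + rng.normal(0, 0.5, size=(100, 5)) for c in centers])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores)
sizes = np.bincount(km.labels_, minlength=4)   # respondents per segment
```

In the study, the inputs would be the factor scores extracted from the questionnaire, and the resulting segment sizes correspond to counts such as n = 467 for gaming lovers.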

Keywords: destination image, gaming tourists, Macau, segmentation

Procedia PDF Downloads 295
6634 The Morphing Avatar of Startup Sales - Destination Virtual Reality

Authors: Sruthi Kannan

Abstract:

The ongoing COVID-19 pandemic has accelerated digital transformation like never before. The physical barriers introduced as a result of the pandemic are being bridged by digital alternatives. While basic collaborative activities like voice and video calling and screen sharing have been replicated in these alternatives, several others require a more intimate setup. Pitching, showcasing, and providing demonstrations are an integral part of selling strategies for startups. Traditionally these have been in-person engagements, enabling a deep understanding of the startups' offerings. In the new normal of virtual-only interactions, startups are feeling the brunt of the lack of in-person connections with potential customers and investors. This poster demonstrates how a virtual reality platform has been conceptualized and custom-built for startups to engage with their stakeholders and redefine their selling strategies. The platform is intended to provide an immersive experience for startup showcases and offers the nearest possible alternative to physical meetings for the startup ecosystem, thereby opening new frontiers for entrepreneurial collaboration.

Keywords: collaboration, sales, startups, strategy, virtual reality

Procedia PDF Downloads 298
6633 Evaluation of Digital Marketing Strategies by Behavioral Economics

Authors: Sajjad Esmaeili Aghdam

Abstract:

Economics typically conceptualizes individual behavior as the consequence of external states, for example, budgets and prices (or respective beliefs), and choices. Our main goal is to examine the influence of a range of behavioral economics factors on digital marketing strategies, to evaluate those strategies, and to transform them into highly prospective marketing strategies. The different forms of behavioral prospects lead to two main results. First, the stability of the economic dynamics in a currency union depends critically on the level of economic integration: more economic integration leads to more stable economic dynamics. Electronic word-of-mouth (eWOM) is "all informal communications directed at consumers through Internet-based technology related to the usage or characteristics of particular goods and services or their sellers." eWOM can take many forms, the most significant being online reviews. For this paper, 72 articles were gathered, selected on the basis of title and aim, from research search engines such as Google Scholar, Web of Science, and PubMed. Recent research in strategic management and marketing proposes that markets should not be viewed as a given, deterministic setting exogenous to the firm; instead, firms are increasingly conceived of as dynamic creators of market opportunities. The use of new technologies touches all spheres of the modern lifestyle; social and economic life is unmanageable without fast, relevant, high-quality, and fitting information. Psychology and economics (together known as behavioral economics) are two prominent disciplines underlying many theories in marketing. The broad marketing literature documents consumers' non-rational behavior, even though behavioral biases might not always be consistently named or formally labeled.

Keywords: behavioral economics, digital marketing, marketing strategy, high impact strategies

Procedia PDF Downloads 175
6632 Network Conditioning and Transfer Learning for Peripheral Nerve Segmentation in Ultrasound Images

Authors: Harold Mauricio Díaz-Vargas, Cristian Alfonso Jimenez-Castaño, David Augusto Cárdenas-Peña, Guillermo Alberto Ortiz-Gómez, Alvaro Angel Orozco-Gutierrez

Abstract:

Precise identification of nerves is a crucial task performed by anesthesiologists for effective Peripheral Nerve Blocking (PNB). Anesthesiologists currently use ultrasound imaging equipment to guide the PNB and detect nervous structures. However, visual identification of nerves in ultrasound images is difficult, even for trained specialists, due to artifacts and low contrast. Recent advances in deep learning make neural networks a potential tool for accurate nerve segmentation systems that address the above issues from raw data. The widely used U-Net yields pixel-by-pixel segmentation by encoding the input image and decoding the attained feature vector into a semantic image. This work proposes a conditioning approach and encoder pre-training to enhance the nerve segmentation of traditional U-Nets. Conditioning is achieved by one-hot encoding the kind of target nerve at the network input, while pre-training considers five well-known deep networks for image classification. The proposed approach is tested on a collection of 619 ultrasound images, where the best C-UNet architecture yields an 81% Dice coefficient, outperforming the 74% of the best traditional U-Net. The results prove that pre-trained models with the conditioning approach outperform their equivalent baselines by supporting the learning of new features and enriching the discriminative capability of the tested networks.
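The conditioning idea, one-hot encoding the target-nerve class and feeding it alongside the image, can be sketched in plain NumPy. The channel layout and shapes here are illustrative assumptions, not the paper's exact implementation.

```python
# One-hot conditioning sketch: broadcast the class code to constant
# feature maps and stack them with the image as extra input channels.
import numpy as np

def condition_input(image, nerve_id, n_classes):
    """image: (B, 1, H, W) batch; nerve_id: (B,) integer class per sample."""
    b, _, h, w = image.shape
    onehot = np.eye(n_classes)[nerve_id]                          # (B, C)
    maps = np.broadcast_to(onehot[:, :, None, None],
                           (b, n_classes, h, w))                  # (B, C, H, W)
    return np.concatenate([image, maps], axis=1)                  # (B, 1+C, H, W)

# Two toy samples, targeting nerve classes 0 and 2 out of 3.
x = condition_input(np.zeros((2, 1, 8, 8)), np.array([0, 2]), 3)
```

A conditioned U-Net would simply take the widened tensor as its input, so the same encoder weights can be pre-trained and reused across target nerves.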

Keywords: nerve segmentation, U-Net, deep learning, ultrasound imaging, peripheral nerve blocking

Procedia PDF Downloads 97
6631 Financial Markets Integration between Morocco and France: Implications on International Portfolio Diversification

Authors: Abdelmounaim Lahrech, Hajar Bousfiha

Abstract:

This paper examines equity market integration between Morocco and France and its implications for international portfolio diversification. In the absence of stock market linkages, Morocco can act as a diversification destination for European investors, allowing higher returns at a level of risk comparable to developed markets. In contrast, this attractiveness is limited if the two financial markets show significant linkage. The research empirically measures financial market integration by capturing the conditional correlation between the two markets using a Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model, and then uses the Dynamic Conditional Correlation (DCC) model of Engle (2002) to track the correlations. The findings show no important increase over the years in the correlation between the Moroccan and French equity markets, even though France is considered Morocco's first trading partner. Failing to find evidence of stock index linkage between the two countries, the volatility series of each market were modeled separately. The study thus reveals that, despite the important historical and economic linkages between Morocco and France, there is no evidence that their equity markets follow suit. The small correlations and their stationarity over time show that, over the 10 years studied, correlations fluctuated around a stable mean with no significant change in level. Different explanations can be offered for the absence of linkage between the two equity markets.
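For reference, the DCC model of Engle (2002) used above tracks a time-varying pseudo-correlation matrix from the GARCH-standardized residuals; the standard recursion is:

```latex
Q_t = (1 - a - b)\,\bar{Q} + a\,\varepsilon_{t-1}\varepsilon_{t-1}^{\top} + b\,Q_{t-1},
\qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t \,\operatorname{diag}(Q_t)^{-1/2}
```

Here Q-bar is the unconditional covariance of the standardized residuals and a + b < 1 for stationarity; the off-diagonal element of R_t is the conditional Morocco-France correlation whose stability around a constant mean the study reports.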

Keywords: equity market linkage, DCC GARCH, international portfolio diversification, Morocco, France

Procedia PDF Downloads 436
6630 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization for assessing the performance of its processes. It guides businesses to stay on track with their objectives and benchmark themselves against the market. With the growing digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of methodologies for performance measurement, overviews the major AI and Big Data applications in the banking sector, and compiles an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure for intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged to facilitate the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt and to help them meet their business objectives by understanding the potential impact of such solutions on the entire organization.

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 194
6629 Factors Influencing Consumer Adoption of Digital Banking Apps in the UK

Authors: Sevelina Ndlovu

Abstract:

Financial technology (fintech) advancement is recognised as one of the most transformational innovations in the financial industry. Fintech has given rise to internet-only digital banking, a novel innovation that delivers banking services through internet applications with no need for physical branches. This technology is becoming a new banking normal among consumers for its ubiquitous, real-time access advantages. There is evident switching and migration from traditional banking towards these fintech facilities, which could pose a systemic risk if not properly understood and monitored. Fintech advancement has also brought about the emergence and escalation of financial technology consumption themes such as trust, security, perceived risk, and sustainability within the banking industry, themes scarcely covered in the existing theoretical literature. To that end, the objective of this research is to investigate the factors that determine fintech adoption and to propose an integrated adoption model. This study aims to establish the significant drivers of adoption and to develop a conceptual model that integrates technological, behavioural, and environmental constructs by extending the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2). It proposes integrating constructs that influence financial consumption themes, such as trust, perceived risk, security, financial incentives, micro-investing opportunities, and environmental consciousness, to determine the impact of these factors on adoption and intention to use digital banking apps. The main advantage of this conceptual model is the consolidation of a greater number of predictor variables, providing a fuller explanation of consumers' adoption of digital banking apps. Moderating variables of age, gender, and income are incorporated.
To the best of the author's knowledge, this is the first study to extend the UTAUT2 model with this combination of constructs to investigate users' intention to adopt internet-only digital banking apps in the UK context. By investigating factors that are not included in existing theories but are highly pertinent to the adoption of internet-only banking services, this research adds to existing knowledge and extends the generalisability of UTAUT2 in a financial services adoption context. This fills a gap in knowledge, as the 2016 review of the theory highlighted the need for further research on UTAUT2 beyond its original 2003 version. To achieve its objectives, this research takes a quantitative approach to empirically test hypotheses derived from existing literature and pilot studies, giving statistical support for generalising the research findings to further possible applications in theory and practice. The research is explanatory, or causal, in nature and uses cross-sectional primary data collected through a survey. Convenience and purposive sampling with structured, self-administered online questionnaires is used for data collection. The proposed model is tested using Structural Equation Modelling (SEM), and the analysis of the primary data is processed using SmartPLS software with a sample size of 386 digital bank users. The results are expected to establish whether there are significant relationships between the dependent and independent variables and to identify the most influential factors.

Keywords: banking applications, digital banking, financial technology, technology adoption, UTAUT2

Procedia PDF Downloads 63
6628 Smokeless Tobacco Oral Manifestation and Inflammatory Biomarkers in Saliva

Authors: Sintija Miļuna, Ričards Melderis, Loreta Briuka, Dagnija Rostoka, Ingus Skadiņš, Juta Kroiča

Abstract:

Objectives: Smokeless tobacco products in Latvia have become more available and more popular among young adults, especially students and athletes such as hockey and floorball players. The aim of the research was to detect visual mucosal changes in the oral cavity of smokeless tobacco users and to evaluate pro-inflammatory and anti-inflammatory cytokine (IL-6, IL-1, IL-8, TNF-alpha) levels in the saliva of smokeless tobacco users. Methods: A smokeless tobacco group (n=10) and a control group of non-tobacco users (n=10) were examined intraorally for oral lesions, and 5 ml of saliva was collected. Saliva was analysed for IL-6, IL-1, IL-8, and TNF-alpha using Sigma-Aldrich ELISA kits. For statistical analysis, IBM SPSS Statistics 27 was used (Mann-Whitney U test, Spearman's rank correlation coefficient). This research was approved by the Ethics Committee of Rīga Stradiņš University No. 22/28.01.2016 and has been developed with financing from the European Social Fund and the Latvian state budget within project no. 8.2.2.0/20/I/004, "Support for involving doctoral students in scientific research and studies", at Rīga Stradiņš University. Results: IL-1, IL-6, IL-8, and TNF-alpha levels were higher in the smokeless tobacco group (IL-1: 83.34 pg/ml vs. 74.26 pg/ml; IL-6: 195.10 pg/ml vs. 6.16 pg/ml; IL-8: 736.34 pg/ml vs. 285.26 pg/ml; TNF-alpha: 489.27 pg/ml vs. 200.9 pg/ml), but the differences between the control group and the smokeless tobacco group were not statistically significant (IL-1 p=0.190, IL-6 p=0.052, IL-8 p=0.165, TNF-alpha p=0.089). There were statistical correlations between IL-1 and IL-6 (p=0.023), IL-6 and TNF-alpha (p=0.028), and IL-8 and IL-6 (p=0.005). Conclusions: White localized lesions were detected in the places where smokeless tobacco users placed sachets. There are statistical correlations between IL-6 and IL-1, IL-6 and TNF-alpha, and IL-8 and IL-6 levels in saliva, but no significant differences in inflammatory cytokine levels between the control group and the smokeless tobacco group.
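The group comparison reported above is a Mann-Whitney U test, appropriate for small, non-normal samples like these. A minimal sketch on synthetic cytokine values (not the measured data; the log-normal shapes are assumptions) looks like:

```python
# Mann-Whitney U sketch on synthetic IL-6 values for two groups of 10.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
il6_users = rng.lognormal(mean=4.0, sigma=1.0, size=10)     # pg/ml, snus group
il6_controls = rng.lognormal(mean=2.0, sigma=1.0, size=10)  # pg/ml, controls
u, p = mannwhitneyu(il6_users, il6_controls, alternative="two-sided")
```

With n=10 per group, even large differences in medians can yield p-values near the 0.05 threshold, which matches the borderline IL-6 result (p=0.052) in the abstract.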

Keywords: smokeless tobacco, Snus, inflammatory biomarkers, oral lesions, oral pathology

Procedia PDF Downloads 130
6627 Effect of Depth on Texture Features of Ultrasound Images

Authors: M. A. Alqahtani, D. P. Coleman, N. D. Pugh, L. D. M. Nokes

Abstract:

In diagnostic ultrasound, the echographic B-scan texture is an important area of investigation, since it can be analyzed to characterize the histological state of internal tissues. An important factor requiring consideration when evaluating ultrasonic tissue texture is depth. The attenuation of ultrasound with depth, the size of the region of interest, gain, and dynamic range are important variables to consider, as they can influence the analysis of texture features. These sources of variability have to be considered carefully when evaluating image texture, as different settings might influence the resultant image. The aim of this study is to investigate the effect of depth on texture features in vivo using a 3D ultrasound probe. The medial head of the left gastrocnemius muscle of 10 healthy subjects was scanned. Two regions, A and B, were defined at different depths within the gastrocnemius muscle boundary. The size of both ROIs was 280 × 20 pixels, and the distance between regions A and B was kept constant at 5 mm. Texture parameters including gray level, variance, skewness, kurtosis, co-occurrence matrix, run length matrix, gradient, autoregressive (AR) model, and wavelet transform features were extracted from the images. The paired t-test was used to test the depth effect for normally distributed data, and the Wilcoxon-Mann-Whitney test was used for non-normally distributed data. The gray level, variance, and run length matrix were significantly lower when the depth increased (p < 0.05), while the other texture parameters showed no significant difference between depths A and B (p > 0.05). This indicates that gray level, variance, and run length matrix are depth dependent.
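One of the texture families listed above, the gray-level co-occurrence matrix (GLCM), is simple to sketch in NumPy. The 4-level toy image and single horizontal offset are assumptions for illustration; real analyses quantize the ultrasound ROI to more gray levels and average several offsets and angles.

```python
# GLCM sketch: normalized co-occurrence counts of gray-level pairs at a
# fixed pixel offset, plus the derived contrast feature (toy 4-level image).
import numpy as np

def glcm(image, levels, dx=1, dy=0):
    """Normalized counts of gray-level pairs at offset (dy, dx)."""
    counts = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            counts[image[y, x], image[y + dy, x + dx]] += 1
    return counts / counts.sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
p = glcm(img, levels=4)
i, j = np.indices(p.shape)
contrast = ((i - j) ** 2 * p).sum()   # one common GLCM texture feature
```

Other GLCM statistics used in texture studies (energy, homogeneity, correlation) are different weighted sums over the same matrix p.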

Keywords: ultrasound image, texture parameters, computational biology, biomedical engineering

Procedia PDF Downloads 288
6626 Deep Learning for Image Correction in Sparse-View Computed Tomography

Authors: Shubham Gogri, Lucia Florescu

Abstract:

Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose exposure to patients and minimizing scanning time. A solution to this is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem: the incomplete projection data result in lower quality of the reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of sparse-view CT images. A first approach employed Mir-Net, a dedicated deep neural network designed for image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks, along with the Charbonnier loss. However, this approach was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based generator and a discriminator based on convolutional neural networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of a perceptual loss, calculated from feature vectors extracted from a VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data were conducted, exploring various GAN loss functions, including Wasserstein, Charbonnier, and perceptual loss. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as the Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) between the corrected images and the ground truth.
Furthermore, learning curves and qualitative comparisons added evidence of the enhanced image quality and the network's increased stability, while preserving pixel value intensity. The experiments underscored the potential of deep learning frameworks in enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
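The Charbonnier loss used in both approaches above is a smooth, differentiable variant of the L1 loss; a minimal NumPy sketch (the epsilon value is a common default, not necessarily the authors' choice) is:

```python
# Charbonnier loss sketch: sqrt((x - y)^2 + eps^2), averaged over pixels.
import numpy as np

def charbonnier(pred, target, eps=1e-3):
    """Smooth L1-like reconstruction loss; eps controls smoothness near zero."""
    return np.mean(np.sqrt((pred - target) ** 2 + eps ** 2))

# For a uniform error of 1, the loss is close to the plain L1 value of 1.
loss = charbonnier(np.ones((8, 8)), np.zeros((8, 8)))
```

Near zero error the function behaves like L2 (so gradients stay finite), while for large errors it behaves like L1, which is why it is popular for image restoration networks such as the ones described above.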

Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net

Procedia PDF Downloads 150