Search results for: magnetic data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26072

25112 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia responded by enacting the Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is being revised through the PDPA 2023 Amendment Bill to align with the world's key personal data protection regimes, such as the European Union General Data Protection Regulation (GDPR). Among the proposed adjustments is the appointment by the Data User of a Data Protection Officer (DPO) to ensure a commercial entity's compliance with the PDPA 2010 requirements. The amendment is expected to be enacted by parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, a range of additional concerns is likely to accompany the DPO mandate. The goal of this article is therefore to highlight the issues the DPO will encounter and how the PDPD should respond to them. The findings were produced using a qualitative approach based on an examination of the current literature. The research reveals probable obstacles facing the DPO and argues that a definite, clear guideline should be in place to aid DPOs in executing their tasks. Appointing a DPO is nonetheless a wise measure for ensuring that the legal data protection requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 75
25111 Data Collection Based on a Questionnaire Survey in Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods used for data collection are diverse: electronic media, focus-group interviews, and short-answer questionnaires [1]. Poor-quality data, resulting for example from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data, can lead to conclusions that are not supported by the data, or to a focus only on the average effect of a program or policy. Several solutions exist to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better anonymity in the responses [2]. In this context, we opted to collect good-quality data through a sizeable questionnaire-based survey on hospital emergencies, with the aim of improving emergency services and alleviating the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent, and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 103
25110 Fire Effects on Soil Properties of Meshchera Plain, Russia

Authors: Anna Tsibart, Timur Koshovskii

Abstract:

The properties of soils affected by the wildfires of 2002, 2010, and 2012 on the Meshchera plain (Moscow region, Russia) were examined in the present study. The formation of ash horizons in place of organic peat horizons was detected in both histosols and histic podzols. An increase in pH and magnetic susceptibility was observed in the soil profiles. Significant burning-out of organic matter was observed, but a new stage of organic matter accumulation had already begun two years after the fires.

Keywords: wildfires, peat soils, organic matter, Meshchera plain

Procedia PDF Downloads 652
25109 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional neural network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulation around data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (such as smartphones) with feeble wireless connectivity in rural or remote areas. In this paper, I highlight the federated learning approach as a way to mitigate these data privacy and security issues.
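The idea can be illustrated with a toy federated-averaging (FedAvg-style) round: each hospital fits a small model on its own private data, and only the model weights, never the raw records, are sent back to the server for averaging. This is a minimal sketch under invented data, not the paper's implementation.

```python
# Minimal sketch of federated averaging (FedAvg) for a toy linear model.
# Each client performs one local gradient step on its private least-squares
# data; the server averages the returned weights. No raw data leaves a client.

def local_update(weights, data, lr=0.1):
    """One gradient-descent step on a client's private (x, y) pairs."""
    grad = [0.0] * len(weights)
    for x, y in data:
        pred = sum(w * xi for w, xi in zip(weights, x))
        err = pred - y
        for i, xi in enumerate(x):
            grad[i] += 2 * err * xi / len(data)
    return [w - lr * g for w, g in zip(weights, grad)]

def federated_round(global_weights, client_datasets):
    """Each client trains locally; the server averages the returned weights."""
    locals_ = [local_update(list(global_weights), d) for d in client_datasets]
    n = len(locals_)
    return [sum(ws[i] for ws in locals_) / n for i in range(len(global_weights))]

clients = [
    [([1.0, 1.0], 3.0), ([2.0, 1.0], 5.0)],   # hospital A's private data
    [([1.0, 2.0], 4.0), ([3.0, 1.0], 7.0)],   # hospital B's private data
]
w = [0.0, 0.0]
for _ in range(200):
    w = federated_round(w, clients)
# both datasets are consistent with y = 2*x1 + 1*x2, so w converges near [2, 1]
```

Here each client does a single local step per round; real deployments run several local epochs and add secure aggregation on top.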

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 138
25108 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The weight of knowledge in the world, and within the repositories of organizations, has already reached immense capacity and is constantly increasing. To accommodate this, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides transformational changes that deliver tangible benefits to those implementing these practices. Today, organizations derive knowledge from observations and intuitions, and this information is translated into best practices for knowledge acquisition, generation, and sharing. Through the widespread use of Big Data, the main intention is to provide information that has been cleaned and analyzed so as to nurture tangible insights that an organization can apply to its knowledge-creation practices, based on facts and figures. The translation of data into knowledge generates the value an organization needs to make decisive choices and carry best practices forward. Without a strong foundation of knowledge and Big Data, businesses cannot grow and improve within a competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 112
25107 Survey on Data Security Issues Through Cloud Computing Amongst SMEs in Nairobi County, Kenya

Authors: Masese Chuma Benard, Martin Onsiro Ronald

Abstract:

Businesses have increasingly adopted cloud computing in order to exploit its benefits. However, cloud computing also introduces new security concerns, particularly with regard to data security, the potential risks and weaknesses that attackers could exploit, and the various tactics and strategies that could lessen these risks. This study examines data security issues in cloud computing among SMEs in Nairobi County, Kenya. The study used a sample size of 48 and a mixed-methods research approach. The findings show that the data owner has no control over the cloud vendor's data management procedures and has no way to ensure that data is handled legally; in effect, control over data stored in the cloud is lost. Data and information stored in the cloud may also face a range of availability issues due to internet outages, which can pose a significant risk to data kept in shared clouds. Integrity, availability, and confidentiality concerns are all identified.

Keywords: data security, cloud computing, information, information security, small and medium-sized firms (SMEs)

Procedia PDF Downloads 81
25106 Cloud Design for Storing Large Amount of Data

Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás

Abstract:

The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a solid technological backend for data analysis based on parallel processing and business intelligence. We tested hypervisors, cloud management tools, storage for all of the data, and Hadoop for the analysis of unstructured data. Providing high availability, virtual network management, logical separation of projects, and rapid deployment of physical servers into our environment was also required.

Keywords: cloud, glusterfs, hadoop, juju, kvm, maas, openstack, virtualization

Procedia PDF Downloads 350
25105 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in a covariate, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of them can be applied to spatial data, since such data are autocorrelated. Hence, there is a need for a method that estimates missing values in both the response variable and the covariates of spatial data while taking the spatial autocorrelation into account. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating missing values in spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlations that exist within and between covariates and between the response variable and covariates. Precise estimation of missing data points in spatial data will increase the precision of the estimated effects of independent variables on the response variable in spatial regression analysis.
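As a simple illustration of the kind of spatial structure such a model exploits (and not the authors' model itself), an inverse-distance-weighted estimate fills a missing value by leaning more heavily on nearby observed units than on distant ones:

```python
# Illustrative sketch only: imputing a missing value at one aggregate spatial
# unit by inverse-distance weighting (IDW) of observed neighbours. A full model,
# as proposed in the abstract, would also use covariate correlations.

def idw_impute(target_xy, known, power=2):
    """known: list of ((x, y), value); returns the IDW estimate at target_xy."""
    num = den = 0.0
    for (x, y), v in known:
        d2 = (x - target_xy[0]) ** 2 + (y - target_xy[1]) ** 2
        w = 1.0 / (d2 ** (power / 2) + 1e-12)   # closer units weigh more
        num += w * v
        den += w
    return num / den

# Four observed districts on a grid; the centre district's value is missing.
observed = [((0, 0), 10.0), ((0, 2), 14.0), ((2, 0), 12.0), ((2, 2), 16.0)]
estimate = idw_impute((1, 1), observed)   # equidistant neighbours -> plain mean
```

With all four neighbours equidistant, the estimate reduces to the plain mean, 13.0; unequal distances would tilt it toward the nearer units.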

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 375
25104 Association Rules Mining and NOSQL Oriented Document in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to recent technology for manipulating voluminous and unstructured data sets drawn from multiple sources, and NoSQL has emerged to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases, and the problem of finding association dependencies maps well onto MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce together with a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
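For reference, the classical single-machine Apriori baseline that the proposed MapReduce variant is compared against can be sketched as follows (the distributed MongoDB version is not shown):

```python
# Minimal single-machine Apriori: level-wise frequent-itemset mining.
# Candidates of size k+1 are generated by joining frequent k-itemsets,
# then pruned by a minimum support count.
from itertools import combinations

def apriori(transactions, min_support):
    """Return {frozenset(itemset): support_count} for all frequent itemsets."""
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}   # level-1 candidates
    frequent = {}
    k = 1
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # join step: frequent k-itemsets whose union has exactly k+1 items
        keys = list(level)
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent

txns = [frozenset(t) for t in (["bread", "milk"],
                               ["bread", "butter"],
                               ["bread", "milk", "butter"],
                               ["milk"])]
freq = apriori(txns, min_support=2)
# frequent: 3 single items plus {bread, milk} and {bread, butter}
```

The repeated full passes over the transaction list at every level are exactly the cost that a MapReduce formulation distributes across workers.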

Keywords: Apriori, association rules mining, big data, data mining, Hadoop, MapReduce, MongoDB, NoSQL

Procedia PDF Downloads 156
25103 Engineering the Topological Insulator Structures for Terahertz Detectors

Authors: M. Marchewka

Abstract:

The article is devoted to the possible optical transitions in double quantum well systems based on HgTe/HgCd(Mn)Te heterostructures. Such structures can find application as detectors and sources of radiation in the terahertz range. A double quantum well (DQW) system consists of two QWs separated by a barrier that is transparent to electrons; such systems look promising because of their additional degrees of freedom. In the topological-insulator regime, in HgTe QWs about 6.4 nm wide or in strained 3D HgTe films, topologically protected states appear at the interfaces/surfaces. Electrons in these edge states move along the interfaces/surfaces without backscattering due to time-reversal symmetry. Combining these topological properties, which have already been verified experimentally, with the well-known properties of DQWs is very interesting from an applications point of view, especially in the THz area. Importantly, at the present stage, technology makes it possible to create high-quality structures of this type, and intensive experimental and theoretical studies of their properties are already underway. The approach presented in this paper is based on an eight-band KP model that includes additional terms related to structural inversion asymmetry, interface inversion asymmetry, the influence of the magnetic content, and uniaxial strain, describing the full picture of possible real structures. All of these terms, together with an external electric field, can be sources of symmetry breaking in the investigated materials. Using the eight-band KP model, we investigated the electronic band structure with and without a magnetic field from the point of view of application as a THz detector in small magnetic fields (below 2 T). We believe that such structures are a route to tunable topological insulators and multilayer topological insulators. Because the one-dimensional electrons in the topologically protected interface states act as fast, collision-free charge and signal carriers, detection of the optical signal should be fast, which is very important for high-resolution detection of signals in the THz range. The proposed engineering of the investigated structures is now one of the important steps toward obtaining structures with the predicted properties.

Keywords: topological insulator, THz spectroscopy, KP model, II-VI compounds

Procedia PDF Downloads 115
25102 Immunization-Data-Quality in Public Health Facilities in the Pastoralist Communities: A Comparative Study Evidence from Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data in health facilities in Ethiopia's pastoralist communities before and after these interventions. To this end, a comparative cross-sectional study was conducted in 51 health facilities. Baseline data were collected in May 2019 and endline data in August 2021, using the WHO data quality self-assessment (DQS) tool. A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). A highly significant improvement was also observed in the accuracy of tetanus toxoid (TT)2 data at the HPs (p < 0.001). The level of over- or under-reporting for PT3 was found to be < 8% at the HPs and < 10% at the HCs. Data completeness at the HCs increased from 72.09% to 88.89%. Nearly 74% of the health facilities reported their immunization data on time, much better than at baseline (7.1%) (p < 0.001). These findings may provide hints for policies and programs targeting improved immunization data quality in pastoralist communities.
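The accuracy dimension discussed above is commonly quantified in WHO DQS terms with a verification factor, the ratio of recounted doses (from tally sheets) to reported doses; the sketch below uses made-up facility numbers purely for illustration:

```python
# Illustrative WHO-DQS-style accuracy check with hypothetical facility counts.
# Verification factor (VF) = recounted / reported:
#   VF ~ 1.0 -> accurate reporting
#   VF < 1.0 -> over-reporting (more doses reported than can be verified)
#   VF > 1.0 -> under-reporting

def verification_factor(recounted, reported):
    return recounted / reported

# facility name -> (recounted doses, reported doses); numbers are invented
facilities = {"HP-1": (96, 100), "HP-2": (110, 100), "HC-1": (100, 100)}
vf = {name: verification_factor(*counts) for name, counts in facilities.items()}
over_reporting = [name for name, v in vf.items() if v < 1.0]   # -> ["HP-1"]
```

In a real DQS exercise the same ratio is computed per antigen and per reporting level, which is how per-vaccine results such as the PT1 and TT2 figures above are derived.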

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 110
25101 Synthesis of Core-Shell Particle Colloidal Solutions for Imaging Processes

Authors: Yoshio Kobayashi, Kohsuke Gonda

Abstract:

In medical diagnostics, contrast agents are used to improve the performance of X-ray, magnetic resonance, and fluorescence-based imaging. A variety of contrast agents are commercially available; typical examples include solutions of iodine compounds for X-ray imaging and gadolinium complexes for magnetic resonance imaging. Metallic gold nanoparticles can also provide X-ray imaging capability. Because of their small size, these contrast agents are not strongly retained in liquids and therefore cannot remain in vivo for long periods of time, making it difficult to obtain stable images. Formulating contrast agents as particles and increasing their apparent size is expected to solve this problem, since particles have a larger projected area than molecules or nanoparticles and thus a longer residence time. In addition, contrast agents can cause adverse reactions arising from iodine, metallic gold, gadolinium ions, and cadmium. Coating the particles with a shell that is inert to living organisms is a candidate strategy for suppressing such side reactions, because the core material can no longer contact the organism. Our laboratory has recently developed a method for preparing colloidal solutions of core-shell particles with an imaging-competent material as the core and biologically inert silica as the shell, and has studied their imaging properties. The method is based on the hydrolysis and condensation of silicon alkoxide in the presence of particles, so that silica nuclei generated from the silicon alkoxide deposit on the particles to form silica shells. The core particles included iodine compound nanoparticles prepared by mixing aqueous AgClO4 and KI solutions, metallic Au nanoparticles prepared by reducing HAuCl4 with citric acid, gadolinium compound nanoparticles prepared by the homogeneous precipitation method, and commercial cadmium compound nanoparticles. The prepared particle colloidal solutions showed notable imaging capabilities. For example, the computed tomography value of the silica-coated Au nanoparticle colloidal solution was higher than that of a commercial X-ray contrast agent at the same Au or iodine concentration. In this contribution, we present our studies on the imaging capability of particle colloidal solutions conducted over the past few years.

Keywords: core-shell, particles, colloid, imaging

Procedia PDF Downloads 18
25100 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision making with the data at their disposal, so the quality of these data is remarkably important, and Data Quality (DQ) is currently a highly relevant issue; the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.
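The ranking-aggregation step behind such a Delphi exercise can be sketched as follows: each expert submits a full ordering of the candidate factors, and the consensus order is obtained from mean ranks. The factor names below are shortened stand-ins; the study's actual instrument used 24 CSFs and the Q-sort technique.

```python
# Aggregating expert rankings by mean rank (lower mean rank = more important).
# Each expert provides a complete ordering, best factor first.

def mean_rank_order(rankings):
    """rankings: list of lists, each a full ordering of the same items."""
    score = {item: 0.0 for item in rankings[0]}
    for order in rankings:
        for pos, item in enumerate(order, start=1):
            score[item] += pos / len(rankings)   # accumulate mean rank
    return sorted(score, key=score.get)          # ascending mean rank

experts = [
    ["policies", "input control", "strategic plan", "culture"],
    ["policies", "strategic plan", "input control", "culture"],
    ["input control", "policies", "strategic plan", "culture"],
]
consensus = mean_rank_order(experts)
# -> ["policies", "input control", "strategic plan", "culture"]
```

A full Delphi study would iterate this over several rounds, feeding the aggregate back to the panel until the orderings stabilize.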

Keywords: critical success factors, data quality, data quality management, Delphi, Q-Sort

Procedia PDF Downloads 211
25099 Imaging Spectrum of Central Nervous System Tuberculosis on Magnetic Resonance Imaging: Correlation with Clinical and Microbiological Results

Authors: Vasundhara Arora, Anupam Jhobta, Suresh Thakur, Sanjiv Sharma

Abstract:

Aims and Objectives: Intracranial tuberculosis (TB) is one of the most devastating manifestations of TB and a challenging public health issue of considerable importance and magnitude worldwide. This study elaborates the imaging spectrum of neurotuberculosis on magnetic resonance imaging (MRI) in 29 clinically suspected cases from a tertiary care hospital. Materials and Methods: A prospective, hospital-based evaluation of the MR imaging features of neurotuberculosis in 29 clinically suspected cases was carried out in the Department of Radiodiagnosis, Indira Gandhi Medical Hospital, from July 2017 to August 2018. MR images were obtained on a 1.5 T Magnetom Avanto scanner and analyzed for abnormal meningeal enhancement or parenchymal lesions. Microbiological and biochemical CSF analyses were performed in radiologically suspected cases, and the results were compared with the imaging data. Patients started on anti-tuberculous treatment were followed up clinically to evaluate treatment response and outcome. Results: Patients ranged in age from 1 year to 73 years, with a mean age at presentation of 11.5 years. No significant difference in the distribution of cerebral tuberculosis was noted between the two genders. The imaging findings of neurotuberculosis were varied and nonspecific, ranging from leptomeningeal enhancement and cerebritis to space-occupying lesions such as tuberculomas and tubercular abscesses. Complications presenting as hydrocephalus (n = 7) and infarcts (n = 9) were noted in some of these patients. All 29 patients showed radiological suspicion of CNS tuberculosis: meningitis alone was observed in 11 cases, tuberculomas alone in 4 cases, and meningitis with parenchymal tuberculomas in 11 cases. A tubercular abscess and cerebritis were observed in one case each, and tuberculous arachnoiditis was noted in one patient. GeneXpert positivity was obtained in 11 of the 29 radiologically suspected patients; none of the patients showed culture positivity. The meningeal form of the disease alone showed the highest GeneXpert positivity rate (n = 5), followed by the combined meningeal and parenchymal form (n = 4); the parenchymal manifestation alone showed the lowest positivity (n = 3). All 29 patients were started on anti-tubercular treatment based on radiological suspicion of the disease, with clinical improvement observed in 27 treated patients. Conclusions: In our study, a higher incidence of neurotuberculosis was noted in the paediatric population, with predominance of the meningeal form of the disease. GeneXpert positivity was low owing to the paucibacillary nature of cerebrospinal fluid (CSF), with even lower positivity of CSF samples in the parenchymal form of the disease. MRI showed high accuracy in detecting CNS lesions in neurotuberculosis; given its inherent sensitivity and specificity, it plays a crucial role in diagnosis and is an indispensable imaging modality. It meets the need for early diagnosis, given the poor sensitivity of microbiological tests, particularly in the parenchymal manifestation of the disease.

Keywords: neurotuberculosis, tubercular abscess, tuberculoma, tuberculous meningitis

Procedia PDF Downloads 166
25098 Conductivity-Depth Inversion of Large Loop Transient Electromagnetic Sounding Data over Layered Earth Models

Authors: Ravi Ande, Mousumi Hazari

Abstract:

Time-domain electromagnetic (TDEM), or transient electromagnetic (TEM), sounding is a common geophysical technique for mapping subsurface geo-electrical structures, with extensive applications in hydro-geological research and in engineering and environmental geophysics. A large-loop TEM system comprises a large transmitter loop for energizing the ground and a small receiver loop or magnetometer for recording the transient voltage or magnetic field in the air or on the surface of the earth, with the receiver at the center of the loop or at an arbitrary point inside or outside the source loop. In general, data can be acquired with a large-loop source in one of several configurations: with the receiver at the center point of the loop (central-loop method), at an arbitrary in-loop point (in-loop method), coincident with the transmitter loop (coincident-loop method), or at an arbitrary offset point (offset-loop method). Because of the mathematical simplicity of the expressions for the EM fields, compared with the in-loop and offset-loop systems, the central-loop system (for ground surveys) and the coincident-loop system (for ground as well as airborne surveys) have been developed and used extensively for the exploration of mineral and geothermal resources, for mapping groundwater contaminated by hazardous waste, and for determining the thickness of permafrost layers. Because no proper analytical expression exists for the TEM response of a large-loop system over a layered earth model, the forward problem used in this inversion scheme is first formulated in the frequency domain and then transformed into the time domain using Fourier cosine or sine transforms. Specifically, the forward computation is carried out in the frequency domain with the EMLCLLER algorithm, and the forward calculation scheme in NLSTCI is modified accordingly to compute frequency-domain responses before converting them to the time domain.
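The frequency-to-time conversion step can be illustrated with a numerical inverse Fourier cosine transform applied to a known analytic pair, F(w) = a/(a² + w²), whose inverse cosine transform is e^(-at). This is a generic sketch of the transform itself under a simple trapezoidal quadrature, not the EMLCLLER/NLSTCI code:

```python
# Numerical inverse Fourier cosine transform:
#   f(t) = (2/pi) * integral_0^inf F(w) * cos(w*t) dw
# evaluated with the trapezoidal rule on a truncated frequency axis.
import math

def inv_cosine_transform(F, t, w_max=200.0, n=20000):
    dw = w_max / n
    # trapezoidal rule: half weight at the two endpoints
    total = 0.5 * (F(0.0) + F(w_max) * math.cos(w_max * t))
    for k in range(1, n):
        w = k * dw
        total += F(w) * math.cos(w * t)
    return (2.0 / math.pi) * total * dw

a = 2.0
F = lambda w: a / (a * a + w * w)   # frequency-domain response (analytic test pair)
f1 = inv_cosine_transform(F, 1.0)   # approximates exp(-a*t) = exp(-2) at t = 1
```

Production TEM codes use filter-based (digital filter) evaluations of the same transform, which are far more efficient for the smooth, decaying kernels involved.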

Keywords: time domain electromagnetic (TDEM), TEM system, geoelectrical sounding structure, Fourier cosine

Procedia PDF Downloads 88
25097 Synthesis of Size-Tunable and Stable Iron Nanoparticles for Cancer Treatment

Authors: Ambika Selvaraj

Abstract:

Magnetic iron oxide nanoparticles (IO) smaller than 20 nm (superparamagnetic) have become a promising tool in cancer therapy and in integrated nanodevices for cancer detection and screening. The obstacles include particle heterogeneity and cost, which can be overcome by developing monodispersed nanoparticles through an economical approach. We have successfully synthesized < 7 nm IO using a low-temperature controlled technique in which Fe0 is sandwiched between a stabilizer and Fe2+. Size analysis showed excellent size control, from 31 nm at 33°C to 6.8 nm at 10°C. The resulting monodispersed IO were found to be stable over > 50 reuses, proving their applicability in biomedical applications.

Keywords: low temperature synthesis, hybrid iron nanoparticles, cancer therapy, biomedical applications

Procedia PDF Downloads 336
25096 Improved Mutual Inductance of Rogowski Coil Using Hexagonal Core

Authors: S. Al-Sowayan

Abstract:

Rogowski coils are increasingly used for the measurement of AC and transient electric currents. Most Rogowski coils in use today have circular or rectangular cores. In order to increase the measurement sensitivity of the Rogowski coil and permit smooth wire winding, this paper studies the effect of increasing the mutual inductance: it presents the calculation and simulation of a Rogowski coil with an equilateral hexagonal core and compares the resulting mutual inductance with that of the commonly used core shapes.
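Under the common thin-core approximation M ≈ μ₀NA/(2πR), the mutual inductance scales with the core cross-section area A, so core shapes can be compared crudely by their area at a fixed cross-section perimeter. The sketch below (with assumed N, R, and perimeter values) shows a hexagonal core beating a square core of equal perimeter; note that a circular core is still larger under this approximation, and the paper's own field calculations are the authoritative comparison:

```python
# Back-of-the-envelope comparison of Rogowski core cross-sections at equal
# perimeter, using the thin-core approximation M = mu0 * N * A / (2*pi*R).
# All numeric parameters below are assumed for illustration.
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m

def mutual_inductance(n_turns, area, mean_radius):
    """Thin-core approximation: valid when the core is small compared to R."""
    return MU0 * n_turns * area / (2 * math.pi * mean_radius)

p = 0.06                                              # cross-section perimeter, m
area_square = (p / 4) ** 2                            # square of side p/4
area_hexagon = (3 * math.sqrt(3) / 2) * (p / 6) ** 2  # regular hexagon, side p/6
area_circle = p ** 2 / (4 * math.pi)                  # circle of circumference p

N, R = 500, 0.05                                      # turns, mean coil radius (assumed)
m_sq = mutual_inductance(N, area_square, R)
m_hex = mutual_inductance(N, area_hexagon, R)
m_circ = mutual_inductance(N, area_circle, R)
```

Per the isoperimetric ordering, area_circle > area_hexagon > area_square, so in this crude model the hexagon's advantage is over rectangular cores; any advantage over circular cores would come from winding and field-uniformity effects outside this approximation.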

Keywords: Rogowski coil, mutual inductance, magnetic flux density, communication engineering

Procedia PDF Downloads 363
25095 X-Ray Energy Release in the Solar Eruptive Flare from 6th of September 2012

Authors: Mirabbos Mirkamalov, Zavkiddin Mirtoshev

Abstract:

The M1.6-class flare occurred on 6 September 2012. Our observations correspond to active region NOAA 11560, with heliographic coordinates N04W71. The event took place between 04:00 UT and 04:45 UT, close to the solar limb in the western region. The flare temperature correlates with the flux peak: it rises impulsively over a short period (between 04:08 UT and 04:12 UT), attains a maximum value of about 17 MK at 04:12 UT, and gradually decreases after the peak. Around the peak we observe significant emission from X-ray sources. Flux profiles of the X-ray emission exhibit a progressively faster rise and decline in the higher energy channels.

Keywords: magnetic reconnection, solar atmosphere, solar flare, X-ray emission

Procedia PDF Downloads 319
25094 Data Mining in Medicine Domain Using Decision Trees and Support Vector Machine

Authors: Djamila Benhaddouche, Abdelkader Benyettou

Abstract:

In this paper, we used data mining to extract biomedical knowledge. In general, complex biomedical data collected in population studies are treated with statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. We therefore applied two learning algorithms: decision trees and the Support Vector Machine (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
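As a toy stand-in for the supervised-classification step (the paper itself uses full decision trees and SVMs on thyroid data, not reproduced here), a one-level decision tree, or stump, shows the basic mechanics on hypothetical [TSH, T4] feature rows:

```python
# One-level decision tree ("stump"): exhaustively search every (feature,
# threshold) split and keep the one with the fewest training misclassifications.
# Feature rows and labels below are hypothetical, not real thyroid data.

def fit_stump(X, y):
    best = None
    for f in range(len(X[0])):
        for thr in sorted({row[f] for row in X}):
            pred = [1 if row[f] >= thr else 0 for row in X]
            err = sum(p != t for p, t in zip(pred, y))
            err = min(err, len(y) - err)          # allow the flipped labelling
            if best is None or err < best[0]:
                best = (err, f, thr)
    return best  # (training errors, feature index, threshold)

# Hypothetical rows: [TSH level, T4 level]; label 1 = hypothyroid.
X = [[8.2, 4.1], [7.9, 3.8], [1.1, 8.0], [2.0, 7.5], [9.0, 3.5], [1.5, 8.3]]
y = [1, 1, 0, 0, 1, 0]
err, feature, threshold = fit_stump(X, y)
# finds a perfect split on feature 0 (TSH) at threshold 7.9
```

A full decision tree recurses this search on each side of the chosen split, and an SVM instead fits a maximum-margin separating hyperplane; the stump is just the smallest illustrative member of the family.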

Keywords: biomedical data, learning, classifier, algorithms decision tree, knowledge extraction

Procedia PDF Downloads 550
25093 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease

Authors: Usama Ahmed

Abstract:

Data mining is the process of analyzing data in order to extract helpful information, and it is a field of research that addresses many types of problems. Within data mining, classification is an important technique for categorizing data. Diabetes is a very common disease. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and determines which algorithm performs best. The best classification algorithm on the diabetes data is Naïve Bayes, with an accuracy of 76.31% and a model build time of 0.06 seconds.
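The winning classifier, naïve Bayes, is simple enough to sketch from scratch; the block below implements Gaussian naïve Bayes on a tiny hypothetical (glucose, BMI) dataset, not the actual diabetes data or WEKA itself:

```python
# Gaussian naive Bayes from scratch: per class, fit a mean and variance for
# each feature, then classify by the largest log-posterior under a
# feature-independence assumption. Data below are invented for illustration.
import math

def fit(X, y):
    model = {}
    for cls in set(y):
        rows = [x for x, t in zip(X, y) if t == cls]
        stats = []
        for f in range(len(X[0])):
            vals = [r[f] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-9
            stats.append((mu, var))
        model[cls] = (len(rows) / len(y), stats)   # (class prior, per-feature stats)
    return model

def predict(model, x):
    def log_post(cls):
        prior, stats = model[cls]
        lp = math.log(prior)
        for xi, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
        return lp
    return max(model, key=log_post)

# Hypothetical rows: [glucose, BMI]; label 1 = diabetic.
X = [[150, 33], [160, 35], [90, 22], [100, 24], [155, 31], [95, 23]]
y = [1, 1, 0, 0, 1, 0]
model = fit(X, y)
label = predict(model, [152, 34])   # falls squarely in the diabetic cluster
```

WEKA's NaiveBayes classifier does essentially this (with extra options such as kernel density estimates in place of the Gaussian assumption).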

Keywords: data mining, classification, diabetes, WEKA

Procedia PDF Downloads 142
25092 Identification of Clinical Characteristics from Persistent Homology Applied to Tumor Imaging

Authors: Eashwar V. Somasundaram, Raoul R. Wadhwa, Jacob G. Scott

Abstract:

The use of radiomics to measure geometric properties of tumor images, such as size, surface area, and volume, has been invaluable in assessing cancer diagnosis, treatment, and prognosis. In addition to analyzing geometric properties, radiomics would benefit from measuring topological properties using persistent homology. Intuitively, features uncovered by persistent homology may correlate with tumor structural features; one example is necrotic cavities (corresponding to 2D topological features), which are markers of very aggressive tumors. We develop a data pipeline in R that clusters tumor images based on persistent homology and use it to identify meaningful clinical distinctions between tumors, and possibly new relationships not captured by established clinical categorizations. A preliminary analysis was performed on 16 magnetic resonance imaging (MRI) breast tissue segments downloaded from the 'Investigation of Serial Studies to Predict Your Therapeutic Response with Imaging and Molecular Analysis' (I-SPY TRIAL or ISPY1) collection in The Cancer Imaging Archive. Each segment represents a patient's breast tumor prior to treatment. The ISPY1 dataset also provided estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor 2 (HER2) status data. A persistent homology matrix up to 2-dimensional features was calculated for each MRI segmentation. Wasserstein distances were then calculated between all pairs of tumor-image persistent homology matrices to create a distance matrix for each feature dimension. Since Wasserstein distances were calculated for 0-, 1-, and 2-dimensional features, three hierarchical clusterings were constructed. The adjusted Rand index was used to measure how well the clusters corresponded to the ER/PR/HER2 status of the tumors. Triple-negative cancers (negative status for all three receptors) clustered together significantly in the 2-dimensional-features dendrogram (adjusted Rand index of .35, p = .031). Triple-negative breast tumors are known to be associated with aggressive growth and poor prognosis compared with non-triple-negative tumors, and this aggressive growth may produce a distinctive structure in an MRI segmentation that persistent homology is able to identify. This preliminary analysis shows promising results for the use of persistent homology on tumor imaging to assess the severity of breast tumors. The next step is to apply this pipeline to tumor segment images from other sites in The Cancer Imaging Archive, such as the lung, kidney, and brain, and to assess whether other clinical parameters, such as overall survival, tumor stage, and tumor genotype, are also captured well by persistent homology clusters. If analyzing tumor MRI segments using persistent homology consistently identifies clinical relationships, clinicians could use persistent homology data as a noninvasive input to clinical decision making in oncology.
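Only the clustering stage of such a pipeline is sketched here (in Python rather than the paper's R): single-linkage agglomerative clustering applied to a toy pairwise distance matrix standing in for the Wasserstein distances between persistence diagrams. Computing the diagrams themselves requires a TDA library and is not shown.

```python
# Single-linkage agglomerative clustering on a precomputed distance matrix.
# Repeatedly merge the two clusters whose closest members are nearest, until
# the requested number of clusters remains.

def single_linkage(dist, n_clusters):
    """dist: symmetric matrix (list of lists); returns a cluster label per item."""
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]
        del clusters[b]
    labels = [0] * len(dist)
    for c, members in enumerate(clusters):
        for i in members:
            labels[i] = c
    return labels

# Toy 4-tumour "Wasserstein" distance matrix: items 0,1 close; items 2,3 close.
D = [[0.0, 0.1, 0.9, 0.8],
     [0.1, 0.0, 0.85, 0.9],
     [0.9, 0.85, 0.0, 0.15],
     [0.8, 0.9, 0.15, 0.0]]
labels = single_linkage(D, 2)   # groups {0,1} and {2,3}
```

In the paper's pipeline, the resulting cluster labels would then be compared against receptor status with the adjusted Rand index.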

Keywords: cancer biology, oncology, persistent homology, radiomics, topological data analysis, tumor imaging

Procedia PDF Downloads 131
25091 Nonlinear Triad Interactions in Magnetohydrodynamic Plasma Turbulence

Authors: Yasser Rammah, Wolf-Christian Mueller

Abstract:

Nonlinear triad interactions in incompressible three-dimensional magnetohydrodynamic (3D-MHD) turbulence are studied by analyzing data from high-resolution direct numerical simulations of decaying isotropic (512³ grid points) and forced anisotropic (1024² × 256 grid points) turbulence. An accurate numerical approach to analyzing the nonlinear turbulent energy transfer function and triad interactions is presented. It involves the direct numerical examination of every wavenumber triad associated with the nonlinear terms in the differential equations of MHD in the inertial range of turbulence. The technique allows us to compute the spectral energy transfer and energy fluxes, as well as the spectral locality property of the energy transfer function. To this end, the geometrical shape of each underlying wavenumber triad that contributes to the statistical transfer density function is examined to infer the locality of the energy transfer. Results show that the total energy transfer is local via nonlocal triad interactions in decaying macroscopically isotropic MHD turbulence. In anisotropic MHD turbulence subject to a strong mean magnetic field, the nonlinear transfer is generally weaker and exhibits a moderate increase of nonlocality in both the perpendicular and parallel directions compared to the isotropic case. These results support recent mathematical findings, which also claim the locality of nonlinear energy transfer in MHD turbulence.
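The geometric examination of individual triads described above can be illustrated with a toy enumeration. The sketch below is our illustration, not the authors' DNS post-processing: it lists wavevector triads satisfying k = p + q on a small integer grid and classifies each by the scale-disparity ratio max(|k|, |p|, |q|) / min(|k|, |p|, |q|), a common proxy for locality; the threshold of 2 is an arbitrary choice for the example.

```python
from itertools import product
from math import sqrt

def norm(v):
    return sqrt(sum(c * c for c in v))

def triad_locality(kmax=3, disparity_threshold=2.0):
    """Enumerate triads k = p + q on a small integer grid and classify
    each by the ratio of its largest to smallest wavenumber magnitude."""
    local, nonlocal_ = 0, 0
    vecs = [v for v in product(range(-kmax, kmax + 1), repeat=3) if any(v)]
    for p, q in product(vecs, repeat=2):
        k = tuple(pi + qi for pi, qi in zip(p, q))
        if not any(k) or max(abs(c) for c in k) > kmax:
            continue  # skip the zero mode and triads leaving the resolved grid
        mags = sorted((norm(k), norm(p), norm(q)))
        if mags[2] / mags[0] <= disparity_threshold:
            local += 1
        else:
            nonlocal_ += 1
    return local, nonlocal_

local, nonlocal_ = triad_locality()
print(local, nonlocal_, round(local / (local + nonlocal_), 2))
```

In an actual DNS analysis each triad would additionally be weighted by the nonlinear transfer it carries; the count above only shows the geometric bookkeeping.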

Keywords: magnetohydrodynamic (MHD) turbulence, transfer density function, locality function, direct numerical simulation (DNS)

Procedia PDF Downloads 381
25090 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is heavily dependent on technology that uses data as its fuel. The present study covers innovations and developments in data science and gives an idea of how to use the available data efficiently. This study will help readers understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing; its main principle was to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, data analysis, ethical concerns, knowledge of programming languages, the various fields and sources of data, skills, etc. The usage of data science has evolved over the years. In this review article, we cover one part of data science, namely machine learning. Machine learning builds on data science: machines learn from experience, which helps them do their work more efficiently. This article includes a comparative illustration of human understanding versus machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the lives of human beings. Since the advent of data science, we have seen its benefits: how it leads to a better understanding of people and how it serves individual needs. It has improved business strategies, the services businesses provide, forecasting, the ability to attain sustainable development, etc. This study also aims at a better understanding of data science, which will help us create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 78
25089 Atomistic Study of Structural and Phases Transition of TmAs Semiconductor, Using the FPLMTO Method

Authors: Rekab Djabri Hamza, Daoud Salah

Abstract:

We report first-principles calculations of the structural and magnetic properties of the TmAs compound in the zinc blende (B3) and CsCl (B2) structures, employing density functional theory (DFT) within the local density approximation (LDA). We use the full-potential linear muffin-tin orbital (FP-LMTO) method as implemented in the LMTART-MINDLAB code. Results are given for the lattice parameter (a), bulk modulus (B), and its first pressure derivative (B') in the NaCl (B1) and CsCl (B2) structures. The most important result of this work is the prediction of a possible phase transition for TmAs from the cubic rocksalt (NaCl, B1) structure to the CsCl (B2) structure at 32.96 GPa. All results were obtained within the LDA approximation.
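A structural transition pressure like the one predicted above is conventionally located where the enthalpies H = E + pV of the two competing phases cross. The sketch below is a generic illustration of that construction with made-up quadratic equations of state, not the FP-LMTO data for TmAs:

```python
def enthalpy(p, e0, v0, k):
    """Enthalpy H(p) = E + pV for a toy quadratic EOS E(V) = e0 + (k/2)(V - v0)^2,
    for which V(p) = v0 - p/k and hence H(p) = e0 + p*v0 - p^2/(2k)."""
    return e0 + p * v0 - p * p / (2.0 * k)

def transition_pressure(phase1, phase2, p_max=100.0, dp=0.001):
    """Scan pressure for the enthalpy crossing between two phases."""
    p = 0.0
    sign = enthalpy(p, *phase1) < enthalpy(p, *phase2)
    while p < p_max:
        p += dp
        if (enthalpy(p, *phase1) < enthalpy(p, *phase2)) != sign:
            return p
    return None  # no crossing found in the scanned range

b1 = (0.0, 1.0, 100.0)   # (e0, v0, k): lower energy, larger volume at p = 0
b2 = (0.5, 0.8, 100.0)   # higher energy, smaller volume: favored at high p
print(round(transition_pressure(b1, b2), 2))
```

For these toy parameters the crossing falls at p = 2.5 (in the arbitrary units of the example); in the actual calculation the E(V) curves come from the DFT total energies of the B1 and B2 phases.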

Keywords: LDA, phase transition, properties, DFT

Procedia PDF Downloads 112
25088 Synthesis of Iron Oxide Nanoparticles Using Different Stabilizers and Study of Their Size and Properties

Authors: Mohammad Hassan Ramezan Zadeh (Biomedical Engineering Department, Near East University, Nicosia, Cyprus); Majid Seifi and Hoda Hekmat Ara (Physics Department, Guilan University, P.O. Box 41335-1914, Rasht, Iran)

Abstract:

Magnetic iron oxide nanoparticles were synthesised from ferric chloride using a co-precipitation technique. For optimal results, ferric chloride at room temperature was added to different surfactants at different metal-ion/surfactant ratios. The samples were characterised using transmission electron microscopy, X-ray diffraction, and Fourier transform infrared spectroscopy to show the presence, structure, and morphology of the nanoparticles. Magnetic measurements were also carried out on the samples using a vibrating sample magnetometer. To show the effect of the surfactant on the size distribution and crystalline structure of the produced nanoparticles, surfactants of various charge, namely cationic cetyl trimethyl ammonium bromide (CTAB), anionic sodium dodecyl sulphate (SDS), and neutral Triton X-100, were employed. By changing the surfactant and the metal-ion/surfactant ratio, the size and crystalline structure of the nanoparticles were controlled. We also show that using the anionic stabilizer leads to the smallest size, the narrowest size distribution, and the most crystalline (polycrystalline) structure. In developing our production technique, many parameters were varied. Efforts at reproducing good yields indicated which of the experimental parameters were the most critical and how carefully they had to be controlled. The conditions reported here were the best that we encountered, but the range of possible parameter choices is so large that these probably represent only a local optimum. The samples for our chemical process were prepared by adding 0.675 g of ferric chloride hexahydrate (FeCl3·6H2O) to three different surfactant-in-water solutions. Each solution was sonicated for about 30 min until a transparent solution was achieved. Then 0.5 g of sodium hydroxide (NaOH) was added to the reaction drop by drop as a precipitating agent, which resulted in the precipitation of reddish-brown Fe2O3 nanoparticles. After washing with ethanol, the obtained powder was calcined at 600 °C for 2 h.
Here, sample 1 contained CTAB as the surfactant at a metal-ion/surfactant ratio of 1:2; sample 2, CTAB at 1:1; sample 3, SDS at 1:2; sample 4, SDS at 1:1; sample 5, Triton X-100 at 1:2; and sample 6, Triton X-100 at 1:1.
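The metal-ion/surfactant ratios above fix how much surfactant must be weighed out for the 0.675 g FeCl3·6H2O charge. The back-of-the-envelope calculation below is our illustration, assuming the 1:2 and 1:1 ratios are molar; the molar masses are standard literature values, and Triton X-100 is omitted because it is polydisperse:

```python
# Moles of Fe3+ from the ferric chloride hexahydrate charge, and the
# surfactant masses implied by molar metal-ion/surfactant ratios.
M_FECL3_6H2O = 270.30  # g/mol
M_CTAB = 364.45        # g/mol, cetyltrimethylammonium bromide
M_SDS = 288.38         # g/mol, sodium dodecyl sulphate

moles_fe = 0.675 / M_FECL3_6H2O  # about 2.5 mmol
for name, molar_mass in [("CTAB", M_CTAB), ("SDS", M_SDS)]:
    for ratio in (2, 1):  # metal ions : surfactant = 1:2 and 1:1
        grams = moles_fe * ratio * molar_mass
        print(f"{name} at 1:{ratio} -> {grams:.3f} g")
```

If the ratios were instead intended by mass, the weighed amounts would simply be 1.35 g and 0.675 g of each surfactant.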

Keywords: iron oxide nanoparticles, stabilizer, co-precipitation, surfactant

Procedia PDF Downloads 246
25087 Synthesis and Characterisation of Bi-Substituted Magnetite Nanoparticles by Mechanochemical Processing (MCP)

Authors: Morteza Mohri Esfahani, Amir S. H. Rozatian, Morteza Mozaffari

Abstract:

Single-phase magnetite nanoparticles and Bi-substituted ones were prepared by mechanochemical processing (MCP). The effects of Bi substitution on the structural and magnetic properties of the nanoparticles were studied by X-ray diffraction (XRD) and magnetometry, respectively. The XRD results showed that all samples have the spinel phase and that, with increasing Bi content, the main diffraction peaks shift to higher angles, meaning the lattice parameter decreases from 0.843 to 0.838 nm and then increases to 0.841 nm. Also, the results revealed that increasing the Bi content leads to a decrease in saturation magnetization (Ms) from 74.9 to 48.8 emu/g and an increase in coercivity (Hc) from 96.8 to 137.1 Oe.
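Lattice parameters like those quoted above follow from the XRD peak positions via Bragg's law and the cubic relation a = d·sqrt(h² + k² + l²). As an illustrative calculation (the 2θ value and Cu Kα wavelength below are example inputs typical for the spinel (311) reflection, not the paper's measured data):

```python
from math import sin, radians, sqrt

def cubic_lattice_parameter(two_theta_deg, hkl, wavelength_nm=0.15406):
    """Lattice parameter of a cubic phase from one Bragg peak:
    d = lambda / (2 sin(theta)), then a = d * sqrt(h^2 + k^2 + l^2)."""
    d = wavelength_nm / (2.0 * sin(radians(two_theta_deg / 2.0)))
    return d * sqrt(sum(i * i for i in hkl))

# (311) is the strongest spinel reflection; ~35.45 deg 2-theta is a
# typical magnetite position with Cu K-alpha radiation (example input)
a = cubic_lattice_parameter(35.45, (3, 1, 1))
print(f"a = {a:.4f} nm")
```

A peak shift to higher 2θ directly reduces d, and with it a, which is how the reported contraction from 0.843 to 0.838 nm is read off the diffractograms.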

Keywords: bi-substituted magnetite nanoparticles, mechanochemical processing, X-ray diffraction, magnetism

Procedia PDF Downloads 530
25086 Immiscible Polymer Blends with Controlled Nanoparticle Location for Excellent Microwave Absorption: A Compartmentalized Approach

Authors: Sourav Biswas, Goutam Prasanna Kar, Suryasarathi Bose

Abstract:

In order to obtain better materials, control over the precise location of nanoparticles is indispensable. It is shown here that an ordered arrangement of nanoparticles possessing different characteristics (electrical/magnetic dipoles) in the blend structure can result in excellent microwave absorption. This is manifested in a high reflection loss of ca. -67 dB for the best blend structure designed here. To attenuate electromagnetic radiation, the key parameters, i.e., high electrical conductivity and large dielectric/magnetic loss, are targeted here using a conducting inclusion [multiwall carbon nanotubes, MWNTs]; a ferroelectric nanostructured material with associated relaxations in the GHz frequency range [barium titanate, BT]; and lossy ferromagnetic nanoparticles [nickel ferrite, NF]. In this study, bi-continuous structures were designed using 50/50 (by wt) blends of polycarbonate (PC) and polyvinylidene fluoride (PVDF). The MWNTs were modified using an electron-acceptor molecule, a derivative of perylenediimide, which facilitates π-π stacking with the nanotubes and stimulates efficient charge transport in the blends. The nanoscopic materials have a specific affinity towards the PVDF phase. Hence, by introducing surface-active groups, an ordered arrangement can be tailored. To accomplish this, both BT and NF were first hydroxylated, followed by the introduction of amine terminal groups on the surface. The latter facilitated a nucleophilic substitution reaction with PC and resulted in their precise localization. In this study, we show for the first time that superior EM attenuation can be achieved by a compartmentalized approach. For instance, when the nanoparticles were localized exclusively in the PVDF phase or in both phases, the minimum reflection loss was ca. -18 dB (for the MWNT/BT mixture) and -29 dB (for the MWNT/NF mixture), and the shielding was primarily through reflection.
Interestingly, by adopting the compartmentalized approach, wherein the lossy materials were placed in the PC phase and the conducting inclusion (MWNT) in PVDF, an outstanding reflection loss of ca. -57 dB (for the BT and MWNT combination) and -67 dB (for the NF and MWNT combination) was noted, and the shielding was primarily through absorption. Thus, the approach demonstrates that nanoscopic structuring in the blends can be achieved under macroscopic processing conditions, and this strategy can be explored further to design microwave absorbers.
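To put the reflection-loss figures above in perspective, RL in dB relates to the reflected power fraction through RL = 10·log10(P_reflected / P_incident). A short conversion sketch (a generic dB identity, not the authors' measurement code):

```python
from math import log10

def reflected_fraction(rl_db):
    """Fraction of incident microwave power reflected, for a reflection
    loss given in dB: RL = 10*log10(P_reflected / P_incident),
    so RL = -20 dB corresponds to 1% of the power reflected."""
    return 10.0 ** (rl_db / 10.0)

for rl in (-18, -29, -57, -67):  # values reported in the abstract
    frac = reflected_fraction(rl)
    print(f"RL = {rl:4d} dB -> {frac:.2e} of incident power reflected")
```

On this scale the jump from -29 dB to -67 dB is a reduction of nearly four orders of magnitude in reflected power, which is why the compartmentalized blends qualify as excellent absorbers.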

Keywords: barium titanate, EMI shielding, MWNTs, nickel ferrite

Procedia PDF Downloads 444
25085 Application of Artificial Neural Network Technique for Diagnosing Asthma

Authors: Azadeh Bashiri

Abstract:

Introduction: The lack of proper diagnosis and inadequate treatment of asthma lead to physical and financial complications. This study aimed to use data mining techniques to create a neural network intelligent system for the diagnosis of asthma. Methods: The study population comprised patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square test was the basis for ranking the data. The neural network was trained using the backpropagation learning technique. Results: According to the SPSS analysis performed to select the top factors, 13 effective factors were selected. Across different runs, the data were mixed in various forms, so different models were made for training the data and testing the networks; in all modes, the network was able to predict all cases correctly (100%). Conclusion: Using data mining methods before designing the system structure, in order to reduce the data dimension and choose the optimal data, leads to a more accurate system. Therefore, considering data mining approaches is necessary, given the nature of medical data.
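The backpropagation training mentioned above can be sketched at its smallest scale: a single logistic neuron trained by gradient descent on a toy, linearly separable pattern. This stands in for, and is far simpler than, the study's diagnostic network; the toy data below are an illustration, not patient data:

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Toy training set: the target is the logical AND of the two inputs
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(5000):  # gradient descent on the cross-entropy loss
    for (x1, x2), target in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = y - target            # dL/dz for cross-entropy + sigmoid
        w[0] -= lr * err * x1       # backpropagated weight gradients
        w[1] -= lr * err * x2
        b -= lr * err

preds = [sigmoid(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data]
print([round(p) for p in preds])
```

A real diagnostic network adds hidden layers, so the error signal is propagated backwards through each layer's weights, but the update rule per weight has the same gradient-descent form.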

Keywords: asthma, data mining, Artificial Neural Network, intelligent system

Procedia PDF Downloads 271
25084 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With the growth of information and communication technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts owing to the failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms in economic or physical terms negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective negates the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, and autonomy, the promotion of human social relations, and the furtherance of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 193
25083 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. Data comprising 140 rows, pertaining to the performance of two batches of students, were used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula were used to generate the synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models using the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the best-fitting ML model, whereas the CTGAN with Gaussian Copula data yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy against the real data. ROC-AUC analysis was performed for all ten classes of the target variable (Grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, whereas the LR model with the Gaussian Copula data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian Copula data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
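The per-class ROC-AUC scores above are one-vs-rest: for each grade, the AUC is the probability that a randomly chosen student with that grade is ranked above a randomly chosen student without it. A minimal rank-based computation (with illustrative scores, not the study's model outputs) is:

```python
def roc_auc(labels, scores):
    """One-vs-rest AUC via the Mann-Whitney pairwise comparison:
    the fraction of (positive, negative) pairs that the score ranks
    correctly, counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy one-vs-rest task: label 1 marks the grade of interest,
# scores are a model's predicted probabilities for that grade
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1]
print(roc_auc(labels, scores))
```

On this scale 0.5 is chance level, which is why the ADA model's mean of 0.4377 indicates worse-than-chance ranking while the LR model's 0.6149 is a modest but genuine improvement.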

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, gaussian copula CTGAN

Procedia PDF Downloads 41