Search results for: digital imaging and communications in medicine (DICOM)
4810 Digital Subsistence of Cultural Heritage: Digital Media as a New Dimension of Cultural Ecology
Authors: Dan Luo
Abstract:
Because climate change can exacerbate the exposure of cultural heritage to climatic stressors, scholars pin their hopes on digital technology to help sites avoid surprises. The virtual museum has been regarded as a highly effective technology that enables people to gain an enjoyable visiting experience and immersive information about cultural heritage. The technology clearly reproduces images of tangible cultural heritage, and the aesthetic experience created by new media helps consumers escape from a realistic environment full of uncertainty. A new cultural anchor has appeared outside the cultural sites. This article synthesizes the international literature on the virtual museum by developing CiteSpace diagrams focusing on tangible cultural heritage and on the alarming situation that has emerged in the process of responding to climate change: (1) digital collections are distinct cultural assets for the public; (2) the media ecology changes people's ways of thinking about and encountering cultural heritage; (3) cultural heritage may live forever in the digital world. This article provides information on a typical practice for managing cultural heritage in a changing climate: the Dunhuang Mogao Grottoes in the far northwest of China, a world cultural heritage site famous for its remarkable and sumptuous murals. This monument is a typical synthesis of art containing 735 Buddhist temples, and it was listed by UNESCO as one of the World Cultural Heritage sites. The caves contain some extraordinary examples of Buddhist art spanning a period of 1,000 years: the architectural form, the sculptures in the caves, and the murals on the walls together constitute a wonderful aesthetic experience. Unfortunately, this magnificent treasure cave has been threatened by increasingly frequent dust storms and precipitation. The Dunhuang Academy has been using digital technology since the last century to preserve this immovable cultural heritage, especially the murals in the caves. Dunhuang culture has since become a new media culture after the art was introduced to a worldwide audience through exhibitions, VR, video, and other channels. The paper adopts a qualitative research method, using Nvivo software to code the collected material and answer these questions. The author paid close attention to fieldwork in Dunhuang City, including participation in 10 exhibitions and 20 Dunhuang-themed online salons. In addition, 308 visitors (aged 6-75 years) who are fans of the art and have experienced Dunhuang culture online were interviewed. These interviewees have been exposed to Dunhuang culture through different media, and they are acutely aware of the threat to this cultural heritage. The conclusion is that the unique halo of the cultural heritage was always emphasized, and that digital media breeds digital twins of cultural heritage. In addition, digital media make it possible for cultural heritage to reintegrate into the daily life of the masses. Visitors gain the opportunity to imitate the mural figures through enlarged or emphasized images but also lose the perspective needed to understand the whole cultural life. New media construct a new life aesthetics apart from the authorized heritage discourse. Keywords: cultural ecology, digital twins, life aesthetics, media
Procedia PDF Downloads 82
4809 Blade Runner and Slavery in the 21st Century
Authors: Bülent Diken
Abstract:
This paper sets Ridley Scott’s original film Blade Runner (1982) alongside Denis Villeneuve’s Blade Runner 2049 (2017) in order to provide an analysis of both films with respect to the new configurations of slavery in the 21st century. Both Blade Runner films present a de-politicized society that oscillates between two extremes: the spectral (the eye, optics, digital communications) and the biopolitical (the body, haptics). On the one hand, recognizing the subject only as a sign, the society of the spectacle registers, identifies, produces and reproduces the subject as a code. At the same time, though, the subject is constantly reduced to a naked body, to bare life, for biometric technologies to scan it as a biological body or body parts. Being simultaneously a pure code (word without body) and an instrument slave (body without word), the replicants are thus the paradigmatic subjects of this society. The paper focuses first on the similarity: both films depict a relationship between masters and slaves, that is, a despotic relationship. The master uses the (body of the) slave as an instrument, as an extension of his own body. Blade Runner 2019 frames the despotic relation in this classical way through its triangulation with the economy (the Tyrell Corporation) and the slave-replicants’ dissent (rejecting their reduction to mere instruments). In a counter-classical approach, in Blade Runner 2049, the focus shifts to another triangulation: despotism, economy (the Wallace Corporation) and consent (of replicants who no longer perceive themselves as slaves). Keywords: Blade Runner, the spectacle, bio-politics, slavery, instrumentalisation
Procedia PDF Downloads 69
4808 Shotcrete Performance Optimisation and Audit Using 3D Laser Scanning
Authors: Carlos Gonzalez, Neil Slatcher, Marcus Properzi, Kan Seah
Abstract:
In many underground mining operations, shotcrete is used for permanent rock support. Shotcrete thickness is a critical measure of the success of this process. 3D Laser Mapping, in conjunction with Jetcrete, has developed a 3D laser scanning system specifically for measuring the thickness of shotcrete. The system is mounted on the shotcrete spraying machine and measures the rock faces before and after spraying. The calculated difference between the two 3D surface models is measured as the thickness of the sprayed concrete. Typical work patterns for the shotcrete process required a rapid and automatic system. The scanning takes place immediately before and after the application of the shotcrete so no convergence takes place in the interval between scans. Automatic alignment of scans without targets was implemented which allows for the possibility of movement of the spraying machine between scans. Case studies are presented where accuracy tests are undertaken and automatic audit reports are calculated. The use of 3D imaging data for the calculation of shotcrete thickness is an important tool for geotechnical engineers and contract managers, and this could become the new state-of-the-art methodology for the mining industry.Keywords: 3D imaging, shotcrete, surface model, tunnel stability
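The core thickness calculation described above, taking the difference between the pre-spray and post-spray 3D surface models, can be sketched as follows. The nearest-neighbour approach and the Python/NumPy representation of the point clouds are illustrative assumptions, not the vendors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def shotcrete_thickness(before_pts, after_pts):
    """Estimate sprayed thickness as the distance from each post-spray
    point to the nearest pre-spray (rock face) point.

    before_pts, after_pts: (N, 3) arrays of scanner coordinates in metres,
    assumed to be already co-registered (aligned) in the same frame.
    """
    tree = cKDTree(before_pts)          # index the pre-spray surface
    dist, _ = tree.query(after_pts)     # nearest-neighbour distance per point
    return dist                         # per-point thickness estimate (m)

# Example with synthetic data: a flat face sprayed with ~50 mm of concrete
rng = np.random.default_rng(0)
xy = rng.uniform(0, 5, size=(1000, 2))
before = np.column_stack([xy, np.zeros(len(xy))])
after = np.column_stack([xy, np.full(len(xy), 0.05)])
t = shotcrete_thickness(before_pts=before, after_pts=after)
print(f"mean thickness: {t.mean() * 1000:.1f} mm")
```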
Procedia PDF Downloads 292
4807 Examining Influence of the Ultrasonic Power and Frequency on Microbubbles Dynamics Using Real-Time Visualization of Synchrotron X-Ray Imaging: Application to Membrane Fouling Control
Authors: Masoume Ehsani, Ning Zhu, Huu Doan, Ali Lohi, Amira Abdelrasoul
Abstract:
Membrane fouling poses severe challenges in membrane-based wastewater treatment applications. Ultrasound (US) has been considered an effective fouling remediation technique in filtration processes. Bubble cavitation in the liquid medium results from the alternating rarefaction and compression cycles during US irradiation at sufficiently high acoustic pressure. Cavitation microbubbles generated under US irradiation can cause eddy currents and turbulent flow within the medium by either oscillating or discharging energy into the system through microbubble explosion. The turbulent flow regime and shear forces created close to the membrane surface disturb the cake layer and dislodge the foulants, which in turn improves the cleaning efficiency and filtration performance. Therefore, the number, size, velocity, and oscillation pattern of the microbubbles created in the liquid medium play a crucial role in foulant detachment and permeate flux recovery. The goal of the current study is to gain an in-depth understanding of the influence of US power intensity and frequency on the dynamics and characteristics of the microbubbles generated under US irradiation. In comparison with other imaging techniques, the synchrotron in-line Phase Contrast Imaging technique at the Canadian Light Source (CLS) allows in-situ observation and real-time visualization of microbubble dynamics. At the CLS biomedical imaging and therapy (BMIT) polychromatic beamline, the effective parameters were optimized to enhance the contrast at the gas/liquid interface and ensure the accuracy of the qualitative and quantitative analysis of bubble cavitation within the system. With the high photon flux and the high-speed camera, a high projection speed was achieved, and each projection of microbubbles in water was captured in 0.5 ms. ImageJ software was used for post-processing of the raw images for the detailed quantitative analyses of microbubbles. Imaging was performed at US power intensity levels of 50 W, 60 W, and 100 W, and at US frequency levels of 20 kHz, 28 kHz, and 40 kHz. For an imaging duration of 2 seconds, the effects of US power and frequency on the average number, size, and area fraction occupied by bubbles were analyzed. Microbubble dynamics in terms of their velocity in water were also investigated. For an increase in US power from 50 W to 100 W, the average bubble number increased from 746 to 880 and the average bubble diameter increased from 36.7 µm to 48.4 µm. In terms of the influence of US frequency, fewer bubbles were created at 20 kHz (an average of 176 bubbles rather than 808 bubbles at 40 kHz), while the average bubble size was significantly larger than that at 40 kHz (almost seven times). The majority of bubbles were captured close to the membrane surface in the filtration unit. According to these observations, membrane cleaning efficiency is expected to improve at higher US power and lower US frequency due to the higher energy released into the system by increasing the number of bubbles or growing their size during oscillation (the optimum condition is expected to be 20 kHz and 100 W). Keywords: bubble dynamics, cavitational bubbles, membrane fouling, ultrasonic cleaning
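The quantitative analysis described above (bubble count, mean diameter, and area fraction per frame) was performed in ImageJ; a minimal scripted equivalent is sketched below, assuming a 2D grayscale frame in which bubbles segment cleanly with an Otsu threshold. The threshold polarity and pixel size are placeholders, not values from the study.

```python
import numpy as np
from skimage import filters, measure

def bubble_stats(frame, pixel_size_um=1.0):
    """Count and size bubbles in a single phase-contrast frame.

    frame: 2D grayscale array in which bubbles appear brighter than water
    (assumed polarity); pixel_size_um: microns per pixel (placeholder).
    Returns (bubble count, mean equivalent diameter in um, area fraction).
    """
    thresh = filters.threshold_otsu(frame)    # global Otsu threshold
    mask = frame > thresh                     # binary bubble mask
    labels = measure.label(mask)              # connected components = bubbles
    props = measure.regionprops(labels)
    diameters = [p.equivalent_diameter * pixel_size_um for p in props]
    mean_d = float(np.mean(diameters)) if diameters else 0.0
    return len(props), mean_d, mask.mean()
```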
Procedia PDF Downloads 153
4806 Challenges of Management of Subaortic Membrane in a Young Adult Patient: A Case Review and Literature Review
Authors: Talal Asif, Maya Kosinska, Lucas Georger, Krish Sardesai, Muhammad Shah Miran
Abstract:
This article presents a case review and literature review focused on the challenges of managing subaortic membranes (SAM) in young adult patients with mild aortic regurgitation (AR) or aortic stenosis (AS). The study aims to discuss the diagnosis of SAM, imaging studies used for assessment, management strategies in young patients, the risk of valvular damage, and the controversy surrounding prophylactic resection in mild AR. The management of SAM in adults poses challenges due to limited treatment options and potential complications, necessitating further investigation into the progression of AS and AR in asymptomatic SAM patients. The case presentation describes a 40-year-old male with muscular dystrophy who presented with symptoms and was diagnosed with SAM. Various imaging techniques, including CT chest, transthoracic echocardiogram (TTE), and transesophageal echocardiogram (TEE), were used to confirm the presence and severity of SAM. Based on the patient's clinical profile and the absence of surgical indications, medical therapy was initiated, and regular outpatient follow-up was recommended to monitor disease progression. The discussion highlights the challenges in diagnosing SAM, the importance of imaging studies, and the potential complications associated with SAM in young patients. The article also explores the management options for SAM, emphasizing surgical resection as the definitive treatment while acknowledging the limited success rates of alternative approaches. Close monitoring and prompt intervention for complications are crucial in the management of SAM. The concluding statement emphasizes the need for further research to explore alternative treatments for SAM in young patients.Keywords: subaortic membrane, management, case report, literature review, aortic regurgitation, aortic stenosis, left ventricular outflow obstruction, guidelines, heart failure
Procedia PDF Downloads 101
4805 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an image modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to its wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have been made to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intraindividual variability. An automatic, objective method to measure breast density could relieve the radiologist’s workload by providing a first opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to fibroglandular tissue. It is consequently hard to automatically quantify mammographic breast density. Therefore, a pre-processing step is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors’ institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of images. The algorithm uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. Firstly, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform was applied to find the limit of the pectoral muscle, followed by the active contour method. The seed of the active contour was placed on the limit of the pectoral muscle found by the Hough transform. An experienced radiologist also manually performed the pectoral muscle segmentation. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison between the manual and the developed automatic method presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods in relation to the area (mm²) of the segmented pectoral muscle. The statistics showed data within the 95% confidence interval, supporting the accuracy of the segmentation compared to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. The segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast. Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle
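A simplified sketch of the thresholding and Hough-transform stages described above, together with the Jaccard index used for validation, is given below; it assumes scikit-image rather than the authors' Matlab® implementation, and the active-contour refinement step is omitted.

```python
import numpy as np
from skimage import filters, feature, transform

def jaccard(a, b):
    """Jaccard similarity between two binary segmentation masks."""
    a, b = a.astype(bool), b.astype(bool)
    return (a & b).sum() / (a | b).sum()

def pectoral_line(mammo):
    """Rough pectoral-muscle boundary as the strongest straight edge
    (Hough line) in a thresholded MLO mammogram.

    mammo: 2D grayscale array; background/labels assumed darker than tissue.
    Returns the line in normal form (theta, rho)."""
    body = mammo > filters.threshold_otsu(mammo)   # remove non-biological background
    edges = feature.canny(body.astype(float))      # edge map of the thresholded image
    h, angles, dists = transform.hough_line(edges)
    _, best_angles, best_dists = transform.hough_line_peaks(h, angles, dists, num_peaks=1)
    return best_angles[0], best_dists[0]
```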
Procedia PDF Downloads 351
4804 O-(2-18F-Fluoroethyl)-L-Tyrosine Positron Emission Tomography/Computed Tomography in Patients with Suspicious Recurrent Low and High-Grade Glioma
Authors: Mahkameh Asadi, Habibollah Dadgar
Abstract:
A precise definition of the margin of high- and low-grade glioma is crucial for choosing the best treatment approach after surgery and radio-chemotherapy. The aim of the current study was to assess O-(2-18F-fluoroethyl)-L-tyrosine (18F-FET) positron emission tomography (PET)/computed tomography (CT) in patients with low-grade (LGG) and high-grade glioma (HGG). We retrospectively analyzed the 18F-FET PET/CT of 10 patients (age: 33 ± 12 years) with suspected recurrent LGG and HGG. The final decision on recurrence was made by magnetic resonance imaging (MRI) and the registered clinical data. While response assessment to radio-chemotherapy by MRI is often complex and difficult due to edema, necrosis, and inflammation, emerging amino acid PET leads to better interpretation and more specific differentiation of true tumor boundaries from equivocal lesions. Therefore, integrating amino acid PET into the management of glioma to complement MRI will significantly improve early therapy response assessment, treatment planning, and clinical trial design. Keywords: positron emission tomography, amino acid positron emission tomography, magnetic resonance imaging, low and high grade glioma
Procedia PDF Downloads 177
4803 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics
Authors: Hongliang Zhang
Abstract:
The use of chance procedures and randomizers in poetry writing can be traced back to surrealist works which, by appealing to Sigmund Freud's theories, were still logocentric. In the 1960s, random permutation and combination were extensively used by the Oulipo, John Cage and Jackson Mac Low, which further deconstructed the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext that has to be co-authored by readers. At the same time, the classical theories have been updated by cybernetics and media theories. N. Katherine Hayles reworked Jacques Lacan's concept of ‘the floating signifiers’ into ‘the flickering signifiers’, arguing that the technology per se has become a part of textual production. This paper makes a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands over the dual tasks of interpretation and writing to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder, but also raised the issue of the human-machine relationship. It is also a significant feature of cybertext that the productive process of the text is full of randomness. Keywords: cybertext, digital poetry, poetry generator, semiotics
Procedia PDF Downloads 175
4802 Leveraging Mobile Apps for Citizen-Centric Urban Planning: Insights from Tajawob Implementation
Authors: Alae El Fahsi
Abstract:
This study explores the ‘Tajawob’ app's role in urban development, demonstrating how mobile applications can empower citizens and facilitate urban planning. Tajawob serves as a digital platform for community feedback, engagement, and participatory governance, addressing urban challenges through innovative tech solutions. This research synthesizes data from a variety of sources, including user feedback, engagement metrics, and interviews with city officials, to assess the app’s impact on citizen participation in urban development in Morocco. By integrating advanced data analytics and user experience design, Tajawob has bridged the communication gap between citizens and government officials, fostering a more collaborative and transparent urban planning process. The findings reveal a significant increase in civic engagement, with users actively contributing to urban management decisions, thereby enhancing the responsiveness and inclusivity of urban governance. Challenges such as digital literacy, infrastructure limitations, and privacy concerns are also discussed, providing a comprehensive overview of the obstacles and opportunities presented by mobile app-based citizen engagement platforms. The study concludes with strategic recommendations for scaling the Tajawob model to other contexts, emphasizing the importance of adaptive technology solutions in meeting the evolving needs of urban populations. This research contributes to the burgeoning field of smart city innovations, offering key insights into the role of digital tools in facilitating more democratic and participatory urban environments.Keywords: smart cities, digital governance, urban planning, strategic design
Procedia PDF Downloads 60
4801 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging
Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi
Abstract:
Introduction: Thyroid nodules have an incidence of 33-68% in the general population. More than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and provide optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consisted of 70 patients (26 benign and 44 malignant) reported by a radiologist and proven by biopsy. Two slices per patient were loaded into Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within an ROI were normalized according to three normalization schemes: N1: default or original gray levels; N2: +/- 3 sigma, or dynamic intensity limited to µ +/- 3σ; and N3: intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI and per normalization scheme were computed from well-known statistical methods implemented in the Mazda software. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis. Therefore, the features were reduced to the 10 best and most effective features per normalization scheme, based on the maximum Fisher coefficient and the minimum probability of classification error and average correlation coefficients (POE+ACC). We analyzed these features under two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Non-Linear Discriminant Analysis (NDA). A 1NN classifier was used to distinguish between benign and malignant tumors. The confusion matrix and receiver operating characteristic (ROC) curve analysis were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as discriminating descriptors and on the classification results. The subset of features selected under 1%-99% normalization, POE+ACC reduction and NDA texture analysis yielded a high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, a specificity of 100%, and an accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules. Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA
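A minimal sketch of the classification stage described above (standardisation, a discriminant-analysis projection, a 1NN classifier, and confusion-matrix/ROC evaluation) is shown below using scikit-learn and placeholder feature data; the Mazda texture features and the Fisher/POE+ACC selection step are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, roc_auc_score

# X: (n_rois, 10) matrix of the 10 selected texture features per ROI
# y: 0 = benign, 1 = malignant (random placeholder data for illustration only)
rng = np.random.default_rng(42)
X = rng.normal(size=(128, 10))
y = rng.integers(0, 2, size=128)

# Standardised LDA projection followed by a 1-NN classifier,
# analogous to the "S + LDA + 1NN" configuration in the study.
clf = make_pipeline(StandardScaler(),
                    LinearDiscriminantAnalysis(n_components=1),
                    KNeighborsClassifier(n_neighbors=1))

pred = cross_val_predict(clf, X, y, cv=5)     # cross-validated predictions
print(confusion_matrix(y, pred))              # sensitivity/specificity follow from this
print("AUC (hard labels):", roc_auc_score(y, pred))
```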
Procedia PDF Downloads 281
4800 Digital Forgery Detection by Signal Noise Inconsistency
Authors: Bo Liu, Chi-Man Pun
Abstract:
A novel technique for digital forgery detection based on signal noise inconsistency is proposed in this paper. A forged area spliced from another picture contains features that may be inconsistent with the rest of the image. The noise pattern and noise level are possible factors revealing such inconsistency. To detect such noise discrepancies, the test picture is initially segmented into small pieces. The noise pattern and level of each segment are then estimated by using various filters. The noise features constructed in this step are utilized in an energy-based graph cut to expose the forged area in the final step. Experimental results show that our method provides a good illustration of regions with noise inconsistency in various scenarios. Keywords: forgery detection, splicing forgery, noise estimation, noise
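A minimal sketch of the block-wise noise estimation idea described above is given below; the median-filter residual and MAD estimator are illustrative choices (the paper uses various filters), and the energy-based graph-cut step is not reproduced.

```python
import numpy as np
from scipy.ndimage import median_filter

def blockwise_noise(img, block=64):
    """Estimate a noise level per block as the robust std of the
    high-pass residual (image minus a median-filtered copy).

    A block whose level deviates strongly from its neighbours is a
    candidate spliced region; segmentation via graph cut is omitted here.
    img: 2D grayscale array; block: block size in pixels (assumption)."""
    img = img.astype(float)
    residual = img - median_filter(img, size=3)     # high-pass residual
    h, w = img.shape
    levels = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            r = residual[i * block:(i + 1) * block, j * block:(j + 1) * block]
            levels[i, j] = 1.4826 * np.median(np.abs(r - np.median(r)))  # MAD -> sigma
    return levels
```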
Procedia PDF Downloads 462
4799 Investigating Best Strategies Towards Creating Alternative Assessment in Literature
Authors: Sandhya Rao Mehta
Abstract:
As ChatGPT and other forms of Artificial Intelligence (AI) are becoming part of our regular academic world, the consequences are being gradually discussed. The extent to which an essay written by a student is itself of any value if it has been generated by some form of AI is perhaps central to this discourse. A larger question is whether writing should be taught as an academic skill at all. In literature classrooms, this has major consequences, as writing a traditional paper is still the single most preferred form of assessment. This study suggests that it is imperative to investigate alternative forms of assessment in literature, not only because the existing forms can be written by AI, but in a larger sense because students are increasingly skeptical of the purpose of such work. The extent to which an essay actually helps students professionally is a question that academia has not yet answered. This paper suggests that using real-world tasks like creating podcasts, video tutorials, and websites is a far better way to evaluate students' critical thinking and application of ideas, as well as to develop digital skills which are important to their future careers. Using the example of a course in literature, this study will examine the possibilities and challenges of creating digital projects as a way of confronting the complexities of student evaluation in the future. The study is based on a specific university English as a Foreign Language (EFL) context. Keywords: assessment, literature, digital humanities, chatgpt
Procedia PDF Downloads 87
4798 Young People, the Internet and Inequality: What are the Causes and Consequences of Exclusion?
Authors: Albin Wallace
Abstract:
Part of the provision within educational institutions is the design, commissioning and implementation of ICT facilities to improve teaching and learning. Inevitably, these facilities focus largely on Internet Protocol (IP) based provisions including access to the World Wide Web, email, interactive software and hardware tools. Educators should be committed to the use of ICT to improve learning and teaching as well as to issues relating to the Internet and educational disadvantage, especially with respect to access and exclusion concerns. In this paper I examine some recent research into the issue of inequality and use of the Internet during which I discuss the causes and consequences of exclusion in the context of social inequality, digital literacy and digital inequality, also touching on issues of global inequality.Keywords: inequality, internet, education, design
Procedia PDF Downloads 489
4797 Decolonizing Print Culture and Bibliography Through Digital Visualizations of Artists’ Books at the University of Miami
Authors: Alejandra G. Barbón, José Vila, Dania Vazquez
Abstract:
This study seeks to contribute to the advancement of library and archival sciences in the areas of records management, knowledge organization, and information architecture, particularly focusing on the enhancement of bibliographical description through the incorporation of visual interactive designs aimed to enrich the library users’ experience. In an era of heightened awareness about the legacy of hiddenness across special and rare collections in libraries and archives, along with the need for inclusivity in academia, the University of Miami Libraries has embarked on an innovative project that intersects the realms of print culture, decolonization, and digital technology. This proposal presents an exciting initiative to revitalize the study of Artists’ Books collections by employing digital visual representations to decolonize bibliographic records of some of the most unique materials and foster a more holistic understanding of cultural heritage. Artists' Books, a dynamic and interdisciplinary art form, challenge conventional bibliographic classification systems, making them ripe for the exploration of alternative approaches. This project involves the creation of a digital platform that combines multimedia elements for digital representations, interactive information retrieval systems, innovative information architecture, trending bibliographic cataloging and metadata initiatives, and collaborative curation to transform how we engage with and understand these collections. By embracing the potential of technology, we aim to transcend traditional constraints and address the historical biases that have influenced bibliographic practices. In essence, this study showcases a groundbreaking endeavor at the University of Miami Libraries that seeks to not only enhance bibliographic practices but also confront the legacy of hiddenness across special and rare collections in libraries and archives while strengthening conventional bibliographic description. By embracing digital visualizations, we aim to provide new pathways for understanding Artists' Books collections in a manner that is more inclusive, dynamic, and forward-looking. This project exemplifies the University’s dedication to fostering critical engagement, embracing technological innovation, and promoting diverse and equitable classifications and representations of cultural heritage.Keywords: decolonizing bibliographic cataloging frameworks, digital visualizations information architecture platforms, collaborative curation and inclusivity for records management, engagement and accessibility increasing interaction design and user experience
Procedia PDF Downloads 75
4796 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus
Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo
Abstract:
The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to data from sensors about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of a building by using the collected data to help monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the GAM (Generalised Additive Model) for anomaly detection in the power consumption pattern of Air Handling Units (AHU). There is ample research on the use of GAM for the prediction of power consumption at the office-building and nation-wide level. However, there is limited illustration of its anomaly detection capabilities, of prescriptive analytics case studies, and of its integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical data of AHU power consumption and cooling load of a building between Jan 2018 and Aug 2019 from an education campus in Singapore to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM model by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to inform and identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real time to help determine the next course of action for the facilities manager. The performance of GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns, illustrated with real-world use cases. Keywords: anomaly detection, digital twin, generalised additive model, GAM, power consumption, supervised learning
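A minimal sketch of the GAM-based anomaly flagging described above is given below, assuming the pyGAM library and synthetic placeholder data; the actual feature set, model terms, and rule-based follow-up conditions from the study are not reproduced.

```python
import numpy as np
from pygam import LinearGAM, s

# X: features per timestamp, e.g. [hour_of_day, cooling_load_kW]; y: AHU power (kW).
# Synthetic placeholder data for illustration only.
rng = np.random.default_rng(1)
hour = rng.uniform(0, 24, 2000)
load = rng.uniform(50, 400, 2000)
power = 5 + 0.1 * load + 2 * np.sin(hour / 24 * 2 * np.pi) + rng.normal(0, 1, 2000)
X, y = np.column_stack([hour, load]), power

# Smooth term per predictor, as in a generalised additive model.
gam = LinearGAM(s(0) + s(1)).fit(X, y)

# Flag observations falling outside the 95% prediction interval.
lo, hi = gam.prediction_intervals(X, width=0.95).T
anomaly = (y < lo) | (y > hi)
print(f"{anomaly.sum()} of {len(y)} readings flagged for review")
```

In a deployment like the one described, the same interval check would be applied to forward-predicted ranges for incoming sensor data rather than to the training set.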
Procedia PDF Downloads 156
4795 Diagnosis of Avian Pathology in the East of Algeria
Authors: Khenenou Tarek, Benzaoui Hassina, Melizi Mohamed
Abstract:
Diagnosis requires a background of current knowledge in the field as well as complementary means, among which the laboratory occupies the central place for better investigation. A correct diagnosis allows the most appropriate treatment to be established as soon as possible and avoids both the economic losses associated with mortality and the growth retardation often observed in poultry; furthermore, it may reduce the high cost of treatment. Epidemiologic surveys, hematologic studies and histopathologic studies are three aspects of diagnosis heavily used in both human and veterinary pathology, and advanced research in human medicine could be exploited and applied in veterinary medicine with appropriate modification. However, the diagnostic methods in the east of Algeria are limited to clinical signs and necropsy findings. Therefore, the diagnosis is based simply on the success or failure of the therapeutic methods (therapeutic diagnosis). Keywords: chicken, diagnosis, hematology, histopathology
Procedia PDF Downloads 631
4794 Assessment of Frying Material by Deep-Fat Frying Method
Authors: Brinda Sharma, Saakshi S. Sarpotdar
Abstract:
Deep-fat frying is a popular standard method that has been studied mainly to clarify the complicated mechanisms of fat decomposition at high temperatures and to assess their effects on human health. The aim of this paper is to point out the application of process engineering, which has recently improved our understanding of the fundamental principles and mechanisms involved at different scales and different times throughout the process: pretreatment, frying, and cooling. It covers several aspects of deep-fat frying. New results regarding the understanding of the frying method have been obtained as a result of major breakthroughs in on-line instrumentation (heat, steam flux, and local pressure sensors), in the methodology of microstructural and imaging analysis (NMR, MRI, SEM) and in software tools for the simulation of coupled transfer and transport phenomena. Such advances have opened the way to the creation of significant knowledge of the behavior of various materials and to the development of new tools to manage frying operations via final product quality in real conditions. Lastly, this paper promotes an integrated approach to the frying method involving various competencies, such as those of chemists, engineers, toxicologists, nutritionists, and materials scientists, as well as of the catering and industrial sectors. Keywords: frying, cooling, imaging analysis (NMR, MRI, SEM), deep-fat frying
Procedia PDF Downloads 430
4793 Treatment and Diagnostic Imaging Methods of Fetal Heart Function in Radiology
Authors: Mahdi Farajzadeh Ajirlou
Abstract:
Prior evidence of normal cardiac anatomy is desirable to relieve the anxiety of patients with a family history of congenital heart disease or to offer the option of early termination of gestation or close follow-up should a cardiac anomaly be proved. Fetal heart screening plays an important part in the assessment of the fetus, and it can reflect fetal heart function, which is regulated by the central nervous system. Acquisition of ventricular volume and inflow data would be useful to quantify valve regurgitation and ventricular function and to determine the degree of cardiovascular compromise in fetal conditions at risk for hydrops fetalis. This study discusses imaging the fetal heart with transvaginal ultrasound, Doppler ultrasound, three-dimensional ultrasound (3DUS) and four-dimensional (4D) ultrasound, spatiotemporal image correlation (STIC), magnetic resonance imaging and cardiac catheterization. The Doppler ultrasound (DUS) image is a kind of real-time image with a better imaging effect on blood vessels and soft tissues. DUS imaging can observe the shape of the fetus, but it cannot show whether the fetus is hypoxic or distressed. Spatiotemporal image correlation (STIC) enables the acquisition of a volume of data concomitant with the beating heart. The automated volume acquisition is made possible by the array in the transducer performing a slow single sweep, recording a single 3D data set consisting of numerous 2D frames one behind the other. The volume acquisition can be done as a static 3D, as online 4D (direct volume scan, live 3D ultrasound, or so-called 4D (3D/4D)), or as spatiotemporal image correlation (STIC; off-line 4D, which is a cyclic volume acquisition). Fetal cardiovascular MRI would appear to be an ideal approach to the noninvasive investigation of the impact of abnormal cardiovascular hemodynamics on antenatal brain growth and development. Still, there are practical limitations to the use of conventional MRI for fetal cardiovascular assessment, including the small size and high heart rate of the human fetus, the lack of conventional cardiac gating methods to synchronize data acquisition, and the potential corruption of MRI data due to maternal respiration and unpredictable fetal movements. Fetal cardiac MRI has the potential to complement ultrasound in detecting cardiovascular malformations and extracardiac lesions. Fetal cardiac intervention (FCI), a set of minimally invasive catheter interventions, is a new and evolving approach that allows in-utero treatment of a subset of severe forms of congenital heart disease. In special cases, it may be possible to modify the natural history of congenital heart disorders. It is entirely possible that future generations will ‘repair’ congenital heart defects in utero using nanotechnologies or remote computer-guided micro-robots that work at the cellular level. Keywords: fetal, cardiac MRI, ultrasound, 3D, 4D, heart disease, invasive, noninvasive, catheter
Procedia PDF Downloads 43
4792 Lunar Exploration based on Ground-Based Radar: Current Research Progress and Future Prospects
Authors: Jiangwan Xu, Chunyu Ding
Abstract:
Lunar exploration is of significant importance in the development and utilization of in-situ lunar resources, water ice exploration, space and astronomical science, as well as in political and military strategy. In recent years, ground-based radar (GBR) has gained increasing attention in the field of lunar exploration due to its flexibility, low cost, and penetrating capabilities. This paper reviews the scientific research on lunar exploration using GBR, outlining the basic principles of GBR and the progress made in lunar exploration studies. It introduces the fundamental principles of lunar imaging using GBR, and systematically reviews studies on lunar surface layer detection, inversion of lunar regolith dielectric properties, and polar water ice detection using GBR. In particular, the paper summarizes the current development status of Chinese GBR and forecasts future development trends in China. This review will enhance the understanding of lunar exploration results using GBR radar, systematically demonstrate the main applications and scientific achievements of GBR in lunar exploration, and provide a reference for future GBR radar lunar exploration missions.Keywords: ground-based radar, lunar exploration, radar imaging, lunar surface/subsurface detection
Procedia PDF Downloads 34
4791 Estimating X-Ray Spectra for Digital Mammography by Using the Expectation Maximization Algorithm: A Monte Carlo Simulation Study
Authors: Chieh-Chun Chang, Cheng-Ting Shih, Yan-Lin Liu, Shu-Jun Chang, Jay Wu
Abstract:
With the widespread use of digital mammography (DM), radiation dose evaluation of breasts has become important. X-ray spectra are one of the key factors that influence the absorbed dose of glandular tissue. In this study, we estimated the X-ray spectrum of DM using the expectation maximization (EM) algorithm with the transmission measurement data. The interpolating polynomial model proposed by Boone was applied to generate the initial guess of the DM spectrum with the target/filter combination of Mo/Mo and the tube voltage of 26 kVp. The Monte Carlo N-particle code (MCNP5) was used to tally the transmission data through aluminum sheets of 0.2 to 3 mm. The X-ray spectrum was reconstructed by using the EM algorithm iteratively. The influence of the initial guess for EM reconstruction was evaluated. The percentage error of the average energy between the reference spectrum inputted for Monte Carlo simulation and the spectrum estimated by the EM algorithm was -0.14%. The normalized root mean square error (NRMSE) and the normalized root max square error (NRMaSE) between both spectra were 0.6% and 2.3%, respectively. We conclude that the EM algorithm with transmission measurement data is a convenient and useful tool for estimating x-ray spectra for DM in clinical practice.Keywords: digital mammography, expectation maximization algorithm, X-Ray spectrum, X-Ray
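A generic maximum-likelihood EM update for recovering a spectrum from transmission measurements, of the kind described above, can be sketched as follows; the attenuation model and variable names are illustrative assumptions rather than the authors' exact implementation.

```python
import numpy as np

def em_spectrum(measured, mu, thickness, s0, n_iter=500):
    """ML-EM update for an X-ray spectrum from transmission data.

    measured:  normalised transmission behind each aluminium thickness
    mu:        linear attenuation coefficient of Al at each energy bin (1/mm)
    thickness: aluminium thicknesses used (mm), e.g. 0.2 ... 3.0
    s0:        initial spectrum guess (e.g. from Boone's interpolating
               polynomial model, as in the abstract)
    """
    A = np.exp(-np.outer(thickness, mu))    # system matrix: A[j, i] = exp(-mu_i * t_j)
    s = s0.astype(float).copy()
    for _ in range(n_iter):
        forward = A @ s                      # predicted transmission per thickness
        s *= (A.T @ (measured / forward)) / A.sum(axis=0)   # multiplicative EM update
    return s / s.sum()                       # normalised spectrum estimate
```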
Procedia PDF Downloads 732
4790 Lower Risk of Ischemic Stroke in Hormone Therapy Users with Use of Chinese Herbal Medicine
Authors: Shu-Hui Wen, Wei-Chuan Chang, Hsien-Chang Wu
Abstract:
Background: Little is known about the benefits and risks of the use of Chinese herbal medicine (CHM) in conditions related to hormone therapy (HT) use on the risk of ischemic stroke (IS). The aim of this study is to explore the risk of IS in menopausal women treated with HT and CHM. Materials and methods: A total of 32,441 menopausal women without surgical menopause, aged 40-65 years, were selected from 2003 to 2010 using the 2-million random sample of the National Health Insurance Research Database in Taiwan. According to the medication usage of HT and CHM, we divided the current and recent users into two groups: an HT use-only group (n = 4,989) and an HT/CHM group (n = 9,265). Propensity-score matched samples (4,079 pairs) were further created to deal with confounding by indication. The adjusted hazard ratios (HR) of IS during HT or CHM treatment were estimated by the robust Cox proportional hazards model. Results: The incidence rate of IS in the HT/CHM group was significantly lower than in the HT group (4.5 vs. 12.8 per 1000 person-years, p < 0.001). Multivariate analysis indicated that additional CHM use was significantly associated with a lower risk of IS (HR = 0.3; 95% confidence interval, 0.21-0.43). Further subgroup analyses and sensitivity analyses had similar findings. Conclusion: We found that combined use of HT and CHM was associated with a lower risk for IS than HT use only. Further study is needed to examine possible mechanisms underlying this association. Keywords: Chinese herbal medicine, hormone therapy, ischemic stroke, menopause
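A minimal sketch of fitting a robust Cox proportional hazards model to a matched cohort, as described above, is shown below using the lifelines library; the file name and column names are assumptions for illustration only.

```python
import pandas as pd
from lifelines import CoxPHFitter

# df: one row per matched woman with columns such as
#   time    follow-up time to ischemic stroke or censoring (years)
#   stroke  1 = ischemic stroke event, 0 = censored
#   chm     1 = HT + CHM user, 0 = HT-only user
#   plus covariates (age, comorbidities, ...). Column names are assumptions.
df = pd.read_csv("matched_cohort.csv")  # hypothetical matched-pairs file

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="stroke", robust=True)  # robust (sandwich) errors
cph.print_summary()   # hazard ratios; the study reports HR ~ 0.3 for additional CHM use
```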
Procedia PDF Downloads 355
4789 The Findings of EEG-LORETA about Epilepsy
Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi
Abstract:
Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution. However, deep electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electro-Magnetic Tomography (LORETA) is solved for the realistic geometry based on both forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review the findings of EEG-LORETA on epilepsy. Keywords: epilepsy, EEG, EEG-LORETA
Procedia PDF Downloads 546
4788 Deep Learning to Improve the 5G NR Uplink Control Channel
Authors: Ahmed Krobba, Meriem Touzene, Mohamed Debeyche
Abstract:
The fifth-generation wireless communication system (5G) will provide more diverse applications and higher-quality services for users compared to fourth-generation long-term evolution (4G LTE). 5G uses a higher carrier frequency, which suffers from information loss within 5G coverage. Most 5G users often cannot obtain high-quality communications due to transmission channel noise and channel complexity. The Physical Uplink Control Channel (PUCCH-NR: Physical Uplink Control Channel New Radio) plays a crucial role in 5G NR telecommunication technology and is mainly used to transmit Uplink Control Information (UCI). This study evaluates the performance of the PUCCH-NR physical uplink control channel under low signal-to-noise ratios with various numbers of receive antennas. We propose an artificial intelligence approach based on deep neural networks (deep learning) to estimate the PUCCH-NR channel and compare this approach with conventional methods such as least squares (LS) and minimum mean square error (MMSE). To evaluate the channel performance, we use the block error rate (BLER) as the evaluation criterion of the communication system. The results show that the deep neural network method gives the best performance compared with MMSE and LS. Keywords: 5G network, uplink (Uplink), PUCCH channel, NR-PUCCH channel, deep learning
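A toy sketch of deep-learning-based channel estimation compared against a least-squares baseline is given below in PyTorch; the network size, pilot structure, and training data are illustrative assumptions and not the authors' PUCCH-NR model.

```python
import torch
from torch import nn

# Toy setup: estimate a 12-subcarrier channel from noisy received pilots.
n_sc, snr_db = 12, 0
pilots = torch.ones(n_sc, dtype=torch.cfloat)          # known unit pilots (assumption)

def batch(n):
    h = torch.complex(torch.randn(n, n_sc), torch.randn(n, n_sc)) / 2 ** 0.5
    noise = torch.complex(torch.randn(n, n_sc), torch.randn(n, n_sc)) * (10 ** (-snr_db / 20))
    y = h * pilots + noise                              # received pilot symbols
    to_real = lambda z: torch.cat([z.real, z.imag], dim=1)
    return to_real(y), to_real(h)

net = nn.Sequential(nn.Linear(2 * n_sc, 128), nn.ReLU(),
                    nn.Linear(128, 128), nn.ReLU(),
                    nn.Linear(128, 2 * n_sc))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):                                # training loop
    x, target = batch(256)
    loss = nn.functional.mse_loss(net(x), target)
    opt.zero_grad(); loss.backward(); opt.step()

x, target = batch(4096)
print("DNN MSE:", nn.functional.mse_loss(net(x), target).item())
print("LS  MSE:", nn.functional.mse_loss(x, target).item())  # with unit pilots, LS estimate == y
```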
Procedia PDF Downloads 88
4787 A Comparative Study between Digital Mammography, B Mode Ultrasound, Shear-Wave and Strain Elastography to Distinguish Benign and Malignant Breast Masses
Authors: Arjun Prakash, Samanvitha H.
Abstract:
BACKGROUND: Breast cancer is the commonest malignancy among women globally, with an estimated incidence of 2.3 million new cases as of 2020, representing 11.7% of all malignancies. As per Globocan data 2020, it accounted for 13.5% of all cancers and 10.6% of all cancer deaths in India. Early diagnosis and treatment can improve the overall morbidity and mortality, which necessitates the importance of differentiating benign from malignant breast masses. OBJECTIVE: The objective of the present study was to evaluate and compare the role of Digital Mammography (DM), B mode Ultrasound (USG), Shear Wave Elastography (SWE) and Strain Elastography (SE) in differentiating benign and malignant breast masses (ACR BI-RADS 3 - 5). Histo-Pathological Examination (HPE) was considered the Gold standard. MATERIALS & METHODS: We conducted a cross-sectional study on 53 patients with 64 breast masses over a period of 10 months. All patients underwent DM, USG, SWE and SE. These modalities were individually assessed to know their accuracy in differentiating benign and malignant masses. All Digital Mammograms were done using the Fujifilm AMULET Innovality Digital Mammography system and all Ultrasound examinations were performed on SAMSUNG RS 80 EVO Ultrasound system equipped with 2 to 9 MHz and 3 – 16 MHz linear transducers. All masses were subjected to HPE. Independent t-test and Chi-square or Fisher’s exact test were used to assess continuous and categorical variables, respectively. ROC analysis was done to assess the accuracy of diagnostic tests. RESULTS: Of 64 lesions, 51 (79.68%) were malignant and 13 (20.31%) (p < 0.0001) were benign. SE was the most specific (100%) (p < 0.0001) and USG (98%) (p < 0.0001) was the most sensitive of all the modalities. E max, E mean, E max ratio, E mean ratio and Strain Ratio of the malignant masses significantly differed from those of the benign masses. Maximum SWE value showed the highest sensitivity (88.2%) (p < 0.0001) among the elastography parameters. A combination of USG, SE and SWE had good sensitivity (86%) (p < 0.0001). CONCLUSION: A combination of USG, SE and SWE improves overall diagnostic yield in differentiating benign and malignant breast masses. Early diagnosis and treatment of breast carcinoma will reduce patient mortality and morbidity.Keywords: digital mammography, breast cancer, ultrasound, elastography
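The diagnostic performance measures quoted above (sensitivity, specificity, accuracy) can be computed from a confusion matrix as sketched below; the predicted labels in this example are hypothetical, and only the class sizes (51 malignant, 13 benign) come from the study.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# y_true: 1 = malignant on histopathology (gold standard), 0 = benign
# y_pred: 1 = called malignant by the modality under evaluation
y_true = np.array([1] * 51 + [0] * 13)
y_pred = np.concatenate([np.ones(44), np.zeros(7), np.zeros(13)])  # hypothetical calls

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, accuracy={accuracy:.1%}")
```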
Procedia PDF Downloads 106
4786 An Experimental Study of Bolt Inclination in a Composite Single Bolted Joint
Authors: Youcef Faci, Djillali Allou, Ahmed Mebtouche, Badredine Maalem
Abstract:
The inclination of the bolt in a fastened joint of composite material during a tensile test can be influenced by several parameters, including material properties, bolt diameter and length, the type of composite material being used, the size and dimensions of the bolt, bolt preload, surface preparation, the design and configuration of the joint, and finally the testing conditions. These parameters should be carefully considered and controlled to ensure accurate and reliable results during tensile testing of composite materials with fastened joints. Our work focuses on the effect of the stacking sequence and the geometry of the specimens. An experimental test is carried out to obtain the inclination of a bolt during a tensile test of a composite material using acoustic emission and digital image correlation. Several types of damage were obtained during loading. Digital image correlation techniques make it possible to obtain the bolt inclination angle during the tensile test. We concluded that the inclination of the bolt during a tensile test of a composite material can be related to the damage that occurs in the material. It can cause stress concentrations and localized deformation in the material, leading to damage such as delamination, fiber breakage, matrix cracking, and other forms of failure. Keywords: damage, digital image correlation, bolt inclination angle, joint
Procedia PDF Downloads 71
4785 Synthesis and Characterization of Some 1, 2, 3-Triazole Derivatives Containing the Chalcone Moiety and Evaluation for their Antimicrobial and Antioxidant Activity
Authors: Desta Gebretekle Shiferaw, Balakrishna Kalluraya
Abstract:
Triazoles are basic five-membered heterocycles with an unsaturated ring system containing six delocalized electrons. Since the dawn of click chemistry, triazoles have represented a functional heterocyclic core that has been a foundation of medicinal chemistry. Compounds with 1,2,3-triazole rings can be used in several fields, including medicine, organic synthesis, polymer chemistry, fluorescent imaging, horticulture, and industry, to name a few. Besides that, they have been found to have health applications in the prevention and reduction of the risk of diseases, with anti-cancer, antimicrobial, antiviral, and anti-inflammatory properties. Here, we present the synthesis of twelve 1,2,3-triazolyl chalcone derivatives (4a–l), which were produced in high yields by coupling substituted aldehydes and triazolyl acetophenones (3a–d) in ethanol. The title products were characterized by physicochemical, infrared, nuclear magnetic resonance, and mass spectral methods. In vitro tests were used to evaluate the antioxidant and antimicrobial activity of each of the prepared molecules. The preliminary assessment and the 2,2-diphenyl-1-picrylhydrazyl assay of the title compounds showed significantly higher antibacterial activity and moderate-to-good antifungal and antioxidant activities compared to their standards. This work presents the synthesis of triazolyl chalcone derivatives and their biological activity. Based on the findings, these compounds could be used as lead compounds in antimicrobial and antioxidant research in the future. Keywords: antibacterial activity, antifungal activity, antioxidant activity, chalcone, 1, 2, 3-triazole
Procedia PDF Downloads 128
4784 Perceptions of Cybersecurity in Government Organizations: Case Study of Bhutan
Authors: Pema Choejey, David Murray, Chun Che Fung
Abstract:
Bhutan is becoming increasingly dependent on Information and Communications Technologies (ICTs), especially the Internet, for performing the daily activities of governments, businesses, and individuals. Consequently, information systems and networks are becoming more exposed and vulnerable to cybersecurity threats. This paper highlights the findings of a survey study carried out to understand the perceptions of cybersecurity implementation among government organizations in Bhutan. About 280 ICT personnel were surveyed about the effectiveness of cybersecurity implementation in their organizations. A questionnaire based on a 5-point Likert scale was used to assess the perceptions of respondents. The questions covered cybersecurity practices such as cybersecurity policies, awareness and training, and risk management. The survey results show that less than 50% of respondents believe that cybersecurity implementation is effective: cybersecurity policy (40%), risk management (23%), training and awareness (28%), system development life cycle (34%), incident management (26%), and communications and operational management (40%). The findings suggest that many of the cybersecurity practices are inadequately implemented and that, therefore, there exists a gap in achieving the required cybersecurity posture. This study recommends that government organizations establish a comprehensive cybersecurity program with emphasis on cybersecurity policy, risk management, and awareness and training. In addition, the research study has practical implications for both government and private organizations in implementing and managing cybersecurity. Keywords: awareness and training, cybersecurity policy, risk management, security risks
Procedia PDF Downloads 347
4783 The Impact of Neuroscience Knowledge on the Field of Education
Authors: Paula Andrea Segura Delgado, Martha Helena Ramírez-Bahena
Abstract:
Research on how the brain learns has a transcendental application in the educational context. It is crucial for teacher training to understand the nature of brain changes and their direct influence on learning processes. This communication is based on a literature review focused on neuroscience, neuroeducation, and the impact of digital technology on the human brain. Information was gathered from both English- and Spanish-language sources, using online journals, books and reports. The general objective was to analyze the role of neuroscience knowledge in enriching our understanding of the learning process. In particular, the authors have focused on the impact of digital technology on the human brain as well as its influence in the field of education. Neuroscience knowledge can contribute significantly to improving the training of educators and therefore educational practices. Since education is an instrument of change and school an agent of socialization, it is necessary to understand what education aims to transform: the human brain. Understanding the functioning of the human brain has important repercussions for education: it elucidates the cognitive skills, psychological processes and elements that influence the learning process (memory, executive functions, emotions and the circadian cycle); it helps identify psychological and neurological deficits that can impede learning processes (dyslexia, autism, hyperactivity); and it allows the creation of environments that promote brain development and contribute to the advancement of brain capabilities in alignment with the stages of neurobiological development. The digital age presents diverse opportunities in every social environment. The frequent use of digital technology (DT) has had a significant and abrupt impact on both the cognitive abilities and the physico-chemical properties of the brain, significantly influencing educational processes. Hence, the educational community, with insights from advances in neuroscience, aspires to identify the positive and negative effects of digital technology on the human brain. This knowledge helps ensure the alignment of teacher training and practices with these findings. Knowledge of neuroscience enables teachers to develop teaching methods that are aligned with the way the brain works. For example, neuroscience research has shown that digital technology is having a significant impact on the human brain (addiction, anxiety, high levels of dopamine, circadian cycle disorder, decreases in attention, memory and concentration, and problems with social relationships). Therefore, it is important to understand the nature of these changes, their impact on the learning process, and how educators should effectively adapt their approaches based on these changes in the brain. Keywords: digital technology, learning process, neuroscience knowledge, neuroeducation, training professors
Procedia PDF Downloads 62
4782 Comparison Between a Droplet Digital PCR and Real Time PCR Method in Quantification of HBV DNA
Authors: Surangrat Srisurapanon, Chatchawal Wongjitrat, Navin Horthongkham, Ruengpung Sutthent
Abstract:
HBV infection causes a potentially serious public health problem. The ability to measure the HBV DNA concentration is important and has improved continuously. In quantitative Polymerase Chain Reaction (qPCR), several factors that require standardization, such as the source of material, the calibration standard curve and PCR efficiency, are inconsistent. Digital PCR (dPCR) is an alternative PCR-based technique for absolute quantification using Poisson statistics, without requiring a standard curve. Therefore, the aim of this study is to compare the HBV DNA data sets generated by the dPCR and qPCR methods. All samples were quantified by Abbott's real-time PCR, and 54 samples with 2-6 log10 HBV DNA were selected for comparison with dPCR. Of these 54 samples, two outlier samples were defined as negative by dPCR, whereas 52 samples were positive by both tests. The difference between the two assays was less than 0.25 log IU/mL in 24/52 paired samples (46%), less than 0.5 log IU/mL in 46/52 samples (88%), and less than 1 log in 50/52 samples (96%). The correlation coefficient was r = 0.788 with a P-value < 0.0001. Compared to qPCR, the data generated by dPCR tended to be overestimated in samples with low HBV DNA concentrations and underestimated in samples with high viral load. The variation in DNA measured by dPCR might be due to pre-amplification bias of the template. Moreover, a minor drawback of dPCR is the large quantity of DNA that has to be used compared to qPCR. Since the technology is relatively new, the limitations of this assay will be improved. Keywords: hepatitis B virus, real time PCR, digital PCR, DNA quantification
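The Poisson relation that lets dPCR report absolute concentration without a standard curve, as mentioned above, is sketched below; the partition volume is an assumed typical droplet volume, not a value from the paper.

```python
import numpy as np

def dpcr_concentration(n_positive, n_total, partition_volume_nl=0.85):
    """Absolute target concentration from digital PCR partition counts
    using Poisson statistics (no standard curve needed).

    partition_volume_nl: droplet volume in nanolitres; 0.85 nL is an assumed
    value typical of droplet dPCR systems, not taken from the paper.
    Returns copies per microlitre of reaction.
    """
    p_negative = 1 - n_positive / n_total
    lam = -np.log(p_negative)                   # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)   # copies per uL

print(dpcr_concentration(n_positive=4500, n_total=15000))
```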
Procedia PDF Downloads 482
4781 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism
Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun
Abstract:
The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become increasingly dependent upon computers, phones, the Internet, and Internet of Things systems to share information, communicate, conduct searches, etc. However, these network systems are at risk from different sources, both known and unknown. These risks are caused by malicious individuals, groups, organizations, or governments who take advantage of vulnerabilities in computer systems to harvest sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and ranges from one country to another. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and, in some cases, demanding financial rewards for stolen data. Hence, this study identifies many ways in which cyberterrorists utilize the Internet as a tool to advance their malicious mission, which negatively affects computer security and safety. One could identify causes of disparate anomalous behaviors and the theoretical, ideological, and current forms of the likelihood of cyberterrorism. Therefore, as a countermeasure, this paper proposes the use of previous and current computer security models, as found in the literature, to help in countering cyberterrorism. Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution
Procedia PDF Downloads 97