Search results for: real-time data visualization
25544 The Visualization of Hydrological and Hydraulic Models Based on the Platform of Autodesk Civil 3D
Authors: Xiyue Wang, Shaoning Yan
Abstract:
Cities in China today are faced with an increasingly serious river ecological crisis accompanying the development of urbanization: waterlogging on account of the fragmented urban natural hydrological system, and the limited ecological function of the hydrological system caused by the destruction of the water system and the waterfront ecological environment. Additionally, the eco-hydrological processes of rivers are affected by various environmental factors, which are more complex in the context of the urban environment. Therefore, efficient hydrological monitoring and analysis tools, together with accurate and visual hydrological and hydraulic models, are becoming an increasingly important basis for decision-makers and an important way for landscape architects to solve urban hydrological problems and formulate sustainable, forward-looking schemes. The study mainly introduces the river and flood analysis model based on the platform of Autodesk Civil 3D. Taking the Luanhe River in Qian'an City of Hebei Province as an example, 3D models of the landform, river, embankment, shoal, pond, underground stream and other land features were initially built, with which water transfer simulation analysis, river floodplain analysis, and river ecology analysis were carried out; ultimately, real-time visualized simulation and analysis of the river under various hypothetical scenarios were realized. Through the establishment of a digital hydrological and hydraulic model, hydraulic data can be accurately and intuitively simulated, which provides a basis for the design of a rational water system and a benign urban ecological system. However, the hydrological and hydraulic model based on Autodesk Civil 3D has its limitations: interoperability between the model and other data sources and software is poor, and the huge amount of 3D data together with the lack of basic data restricts its accuracy and range of application.
The hydrological and hydraulic model based on the Autodesk Civil 3D platform offers a convenient and intelligent tool for urban planning and monitoring, and a solid basis for further urban research and design.
Keywords: visualization, hydrological and hydraulic model, Autodesk Civil 3D, urban river
Procedia PDF Downloads 299
25543 TARF: Web Toolkit for Annotating RNA-Related Genomic Features
Abstract:
Genomic features, i.e., genome-based coordinates, are commonly used for the representation of biological features such as genes, RNA transcripts and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches for performing these tasks involve the manipulation of a gene database, conversion from genome-based to transcript-based coordinates, and visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed TARF, a dedicated web toolkit for annotating RNA-related genomic features. TARF intends to provide a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and specified a built-in transcript database or uploaded a customized gene database in GTF format, the tool fulfills its three main functions. First, it adds annotation on gene and RNA transcript components. For every feature provided by the user, the overlaps with RNA transcript components are identified, and the information is combined in one table available for copy and download. Summary statistics on ambiguous assignments are also provided. Second, the tool provides a convenient visualization of the features at the single gene/transcript level.
For the selected gene, the tool shows the features together with the gene model in a genome-based view, and also maps the features to transcript-based coordinates to show their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package. The distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features, related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features, and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.
Keywords: RNA-related genomic features, annotation, visualization, web server
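The coordinate conversion at the heart of the toolkit can be sketched as follows. This is a minimal illustration with toy data, not TARF's actual implementation: the exon intervals, component lengths and function names are our assumptions.

```python
# Hypothetical sketch: map a genomic position to a spliced-transcript
# coordinate and label the transcript component (5'UTR / CDS / 3'UTR).
# Exon boundaries and UTR/CDS lengths below are illustrative toy data.

def genome_to_transcript(pos, exons):
    """Map a genomic position onto the spliced transcript.

    exons: list of (start, end) genomic intervals, 0-based, end-exclusive,
    ordered 5' to 3' on the + strand. Returns None if pos is intronic.
    """
    offset = 0
    for start, end in exons:
        if start <= pos < end:
            return offset + (pos - start)
        offset += end - start
    return None

def annotate_component(tx_pos, utr5_len, cds_len):
    """Label a transcript coordinate as 5'UTR, CDS or 3'UTR."""
    if tx_pos < utr5_len:
        return "5'UTR"
    if tx_pos < utr5_len + cds_len:
        return "CDS"
    return "3'UTR"

# Toy two-exon transcript: exon 1 = [100, 200), exon 2 = [300, 450)
exons = [(100, 200), (300, 450)]
tx = genome_to_transcript(320, exons)   # 100 exon-1 bases + 20 into exon 2
component = annotate_component(tx, utr5_len=50, cds_len=150)
```

A feature falling in an intron maps to no transcript coordinate, which is exactly the kind of "ambiguous assignment" the summary statistics would report.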
Procedia PDF Downloads 212
25542 Cartographic Depiction and Visualization of Wetlands Changes in the North-Western States of India
Authors: Bansal Ashwani
Abstract:
Cartographic depiction and visualization of wetland changes is an important tool for mapping spatio-temporal information about wetland dynamics effectively and for comprehending the role of these water bodies in maintaining groundwater and the surrounding ecosystem. This is true for the states of north-western India, i.e., J&K, Himachal, Punjab, and Haryana, which are endowed with several natural wetlands in the flood plains or on the courses of their rivers. Thus, the present study documents, analyses and reconstructs the lost wetlands that existed in the flood plains of the major river basins of these states, i.e., Chenab, Jhelum, Satluj, Beas, Ravi, and Ghagar, at the beginning of the 20th century. To achieve this objective, the study has used multi-temporal datasets since the 1960s from high- to medium-resolution satellite sensors, e.g., Corona (1960s/70s), Landsat (1990s-2017) and Sentinel (2017). The Sentinel (2017) satellite imagery has been used for making the wetland inventory owing to its comparatively higher spatial resolution with multi-spectral bands. In addition, historical records, repeated photographs, historical maps and field observations, including geomorphological evidence, were also used. Water index techniques, i.e., band rationing, the normalized difference water index (NDWI) and the modified NDWI (MNDWI), have been compared and used to map the wetlands. The wetland types found in the north-western states have been categorized under the 19 classes suggested by the Space Applications Centre, India. These enable the researchers to produce a wetland inventory and a series of cartographic representations that include overlays of multiple temporal wetland extent vectors. Preliminary results show a general state of wetland shrinkage since the 1960s, with the rate of area shrinkage varying from one wetland to another. In addition, it is observed that the majority of wetlands have not been documented so far and do not even have names.
Moreover, the purpose is to draw attention to their disappearance, in addition to establishing a baseline dataset that can serve as a tool for wetland planning and management. Finally, the applicability of cartographic depiction and visualization, historical map sources, repeated photographs and remote sensing data for the reconstruction of long-term wetland fluctuations, especially in the northern part of India, will be addressed.
Keywords: cartographic depiction and visualization, wetland changes, NDWI/MNDWI, geomorphological evidence, remote sensing
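The two water indices compared in the study are simple band ratios; a minimal sketch follows. The band values are illustrative reflectances, and thresholding at zero is the usual convention for separating water from non-water pixels.

```python
# Water index sketch. NDWI (McFeeters) and MNDWI (Xu) are normalized
# differences of the green band against NIR and SWIR, respectively.
# Band values below are toy surface reflectances, not real pixels.

def ndwi(green, nir):
    """Normalized Difference Water Index: (G - NIR) / (G + NIR)."""
    return (green - nir) / (green + nir)

def mndwi(green, swir):
    """Modified NDWI: (G - SWIR) / (G + SWIR); SWIR suppresses built-up noise."""
    return (green - swir) / (green + swir)

def is_water(index_value, threshold=0.0):
    """Conventional classification: positive index values flag water."""
    return index_value > threshold

# A water pixel: strong green reflectance, weak NIR.
wet = ndwi(green=0.30, nir=0.10)   # 0.5
# A vegetated pixel: NIR dominates.
dry = ndwi(green=0.10, nir=0.40)   # -0.6
```

Applying such a function per pixel to each epoch (Corona, Landsat, Sentinel) and differencing the resulting water masks is what yields the shrinkage-rate estimates per wetland.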
Procedia PDF Downloads 269
25541 An Exhaustive All-Subsets Examination of Trade Theory on WTO Data
Authors: Masoud Charkhabi
Abstract:
We examine trade theory from a data-driven perspective. The full set of World Trade Organization data is organized into country-year pairs, each treated as a distinct entity. Topological Data Analysis reveals that among the 16 regions and 240 region-year pairs there exists in fact a distinguishable group of region-period pairs. The generally accepted periods of shifts from dissimilar-dissimilar to similar-similar trade in goods among regions are examined from this new perspective. The period breaks are treated as cumulative and are flexible. This type of all-subsets analysis is motivated by computer science and is made possible with lossy compression and graph theory. The results question many patterns in similar-similar to dissimilar-dissimilar trade. They also show indications of economic shifts that only later become evident in other economic metrics.
Keywords: econometrics, globalization, network science, topological data analysis, trade theory, visualization, world trade
Procedia PDF Downloads 379
25540 Comparison of the GlideScope Visualization and Neck Flexion with Lateral Neck Pressure Nasogastric Tube Insertion Techniques in Anaesthetized Patients: A Prospective Randomized Clinical Study
Authors: Pitchaporn Purngpiputtrakul, Suttasinee Petsakul, Sunisa Chatmongkolchart
Abstract:
Nasogastric tube (NGT) insertion in anaesthetized and intubated patients can be challenging even for experienced anaesthesiologists. Various techniques have been proposed to facilitate NGT insertion in these patients. This study aimed to compare the success rate and time required for NGT insertion between the GlideScope visualization and neck flexion with lateral neck pressure techniques. This randomized clinical trial was performed at a teaching hospital on 86 adult patients undergoing abdominal surgery under relaxant general anaesthesia who required intraoperative NGT insertion. The patients were randomized into two groups, the GlideScope group (group G) and the neck flexion with lateral neck pressure group (group F). The success rates of the first and second attempts, the duration of insertion, and complications were recorded. The total success rate was 79.1% in group G compared with 76.7% in group F (P=1). The median time required for NGT insertion was significantly longer in group G for both the first attempt (97 vs 42 seconds, P<0.001) and the second attempt (70 vs 48.5 seconds, P=0.015). Complications were reported in 23 patients (53.5%) in group G and 13 patients (30.2%) in group F. Bleeding and kinking were the most common complications with both techniques. Using GlideScope visualization to facilitate NGT insertion was comparable to the neck flexion with lateral neck pressure technique in terms of success rate, while the neck flexion with lateral neck pressure technique had fewer complications and was less time-consuming.
Keywords: anaesthesia, nasogastric tube, GlideScope, intubation
Procedia PDF Downloads 170
25539 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example
Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang
Abstract:
Background: Recent advances in high-throughput research technologies such as next-generation sequencing and multi-dimensional liquid chromatography make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyze these big data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform for users with only limited bio-computing knowledge to study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system: C-eXpress takes a simple text file that contains standard NCBI gene or protein IDs and expression levels (RPKM or fold change) as the input file to generate a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows the users to filter the datasets based on various gene features. A dynamic summary chart is generated automatically after each filtering process. Results: We implemented an integrated database that contains pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
Keywords: cancer, visualization, database, functional annotation
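The two core operations described, binning expression levels into a color gradient and filtering rows by a pre-defined annotation, can be sketched briefly. All names, gradient breaks and annotation terms below are hypothetical, not C-eXpress internals.

```python
# Hypothetical sketch of the C-eXpress pipeline: (ID, expression) rows are
# binned into a color-gradient index for the heatmap, then filtered by an
# annotation category such as GO molecular function. Toy data throughout.

def to_color_bin(value, breaks=(1.0, 10.0, 100.0)):
    """Bin an expression level (e.g. RPKM) into a gradient index 0..len(breaks)."""
    bin_index = 0
    for b in breaks:
        if value >= b:
            bin_index += 1
    return bin_index

def filter_by_annotation(rows, annotations, wanted):
    """Keep rows whose gene ID carries the wanted annotation term."""
    return [(gid, v) for gid, v in rows if wanted in annotations.get(gid, ())]

rows = [("geneA", 0.4), ("geneB", 25.0), ("geneC", 350.0)]
annotations = {"geneB": {"kinase"}, "geneC": {"transporter"}}
kinases = filter_by_annotation(rows, annotations, "kinase")
```

Re-running the binning on the filtered subset is what would drive the "dynamic summary chart after each filtering process" the abstract mentions.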
Procedia PDF Downloads 623
25538 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems
Authors: Z. Bouattou, R. Laurini, H. Belbachir
Abstract:
This paper describes a new approach for the automatic generation of visual summaries, combining cartographic visualization methods with real-time sensor data modeling. In this setting, the concept of chorems is an interesting candidate for visualizing real-time geographic database summaries. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories. However, temporal information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: by interpolating the values recorded at the same time by the available sensors, we obtain a number of distributed observations over the study area, and spatial interpolation methods are used to derive concentration fields. From these fields, and by applying spatial data mining procedures on the fly, it is possible to extract important patterns as geographic rules. Those patterns are then visualized as chorems.
Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems
Procedia PDF Downloads 405
25537 Quantitative Characterization of Single Orifice Hydraulic Flat Spray Nozzle
Authors: Y. C. Khoo, W. T. Lai
Abstract:
A single orifice hydraulic flat spray nozzle was evaluated with two global imaging techniques to characterize various aspects of the resulting spray: high-resolution flow visualization and Particle Image Velocimetry (PIV). A CCD camera with 29 million pixels was used to capture shadowgraph images revealing ligament formation and collapse as well as droplet interaction. Quantitative analysis was performed to obtain sizing information for the droplets and ligaments. The same camera was then used with a PIV system to evaluate the overall velocity field of the spray, from nozzle exit to droplet discharge. PIV images were further post-processed to determine the inclusion angle of the spray. Together, these investigations provided a significant quantitative understanding of the spray structure and behavior.
Keywords: spray, flow visualization, PIV, shadowgraph, quantitative sizing, velocity field
Procedia PDF Downloads 387
25536 Can 3D Virtual Prototyping Conquer the Apparel Industry?
Authors: Evridiki Papachristou, Nikolaos Bilalis
Abstract:
Imagine an apparel industry where fashion design does not begin with a paper-and-pen drawing that is then translated into a pattern and later into a 3D model, where the designer tries out different fabrics, colours and contrasts. Instead, imagine a fashion designer of the future who produces that initial fashion drawing in a three-dimensional space and does not leave that environment until the product is done, communicating his or her ideas with the entire development team in true-to-life 3D. Three-dimensional (3D) technology, while well established in many other industrial sectors such as automotive, aerospace, architecture and industrial design, has only just started to open up a whole range of new opportunities for apparel designers. The paper will discuss the process of 3D simulation technology enhanced by high-quality visualization of data and its capability to ensure strong competitiveness in the market. Secondly, it will underline the most frequent problems and challenges that occur in the process chain when the various partners in the production of textiles and apparel work together. Finally, it will offer a perspective on how virtual prototyping technology will change the global textile and apparel industry to a level where designs are visualized on a computer and various scenarios modeled without even having to produce a physical prototype. This state-of-the-art 3D technology has been described as transformative and "disruptive" compared to the way apparel companies develop their fashion products today. It provides the benefit of virtual sampling, not only for quick testing of design ideas but also for reducing process steps and increasing visibility: a so-called "digital asset" that can be used for other purposes such as merchandising or marketing.
Keywords: 3D visualization, apparel, virtual prototyping, prototyping technology
Procedia PDF Downloads 595
25535 GeneNet: Temporal Graph Data Visualization for Gene Nomenclature and Relationships
Authors: Jake Gonzalez, Tommy Dang
Abstract:
This paper proposes a temporal graph approach to visualize and analyze the evolution of gene relationships and nomenclature over time. An interactive web-based tool implements this temporal graph, enabling researchers to traverse a timeline and observe the coupled dynamics of network topology and naming conventions. Analysis of a real human genomic dataset reveals the emergence of densely interconnected functional modules over time, representing groups of genes involved in key biological processes. For example, the antimicrobial peptide gene DEFA1A3 shows increased connections to related alpha-defensins involved in infection response. Tracking shifts in degree and betweenness centrality over timeline iterations also quantitatively highlights the reprioritization of certain genes' topological importance as knowledge advances. Examination of the CNR1 gene, encoding the cannabinoid receptor CB1, demonstrates changing synonymous relationships and consolidating naming patterns over time, reflecting the discovery of its unique functional role. The integrated framework interconnecting these topological and nomenclature dynamics provides richer contextual insights than isolated analysis methods. Overall, this temporal graph approach enables a more holistic study of knowledge evolution to elucidate complex biology.
Keywords: temporal graph, gene relationships, nomenclature evolution, interactive visualization, biological insights
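The centrality-shift tracking can be sketched with degree centrality across two timeline snapshots. The edge lists are illustrative toy data echoing the DEFA1A3 example above (betweenness centrality, also tracked in the paper, is omitted here for brevity).

```python
# Sketch: compute normalized degree centrality for two snapshots of a gene
# network and report the per-gene change. Edges are illustrative, not real
# annotation data.

def degree_centrality(edges):
    """Degree of each node, normalized by (n - 1), the usual definition."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    denom = max(len(nodes) - 1, 1)
    return {n: d / denom for n, d in deg.items()}

t0 = [("DEFA1A3", "DEFA4"), ("DEFA4", "DEFA5"), ("DEFA5", "DEFB1")]
t1 = t0 + [("DEFA1A3", "DEFA5"), ("DEFA1A3", "DEFB1")]  # links emerge later

c0 = degree_centrality(t0)
c1 = degree_centrality(t1)
shift = {g: c1[g] - c0[g] for g in c0}  # positive = gained importance
```

A positive shift for DEFA1A3 mirrors the paper's observation that its topological importance grows as new alpha-defensin connections are added.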
Procedia PDF Downloads 68
25534 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis
Authors: Mouataz Zreika, Maria Estela Varua
Abstract:
Stock investment decisions are often made based on current events in the global economy and the analysis of historical data. Visual representation can help investors gain a deeper understanding of, and better insight into, stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm and a force-directed algorithm to overcome the scalability problem when visualizing large data. This method exemplifies the potential relationships between stocks, as well as determining the degree of strength and connectivity, which provides investors with another view of stock relationships for reference. Information derived from the visualization will also help them make informed decisions. The results of the experiments show that the proposed method is able to produce aesthetically pleasing visualizations, providing clearer views of connectivity and edge weights.
Keywords: clustering, force-directed, graph drawing, stock investment analysis
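A minimal sketch of the hybrid idea follows, under stated assumptions: edges are return correlations above a threshold (the "degree of strength and connectivity"), and a simple connected-components pass stands in for the paper's clustering step; the force-directed layout itself is omitted. The price series are toy data.

```python
# Sketch: build a correlation-thresholded stock graph, then cluster it with
# connected components. Edge weights record connection strength.

def correlation(xs, ys):
    """Pearson correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def build_graph(series, threshold=0.8):
    names = list(series)
    edges = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = correlation(series[a], series[b])
            if r >= threshold:
                edges[(a, b)] = r  # edge weight = strength of connectivity
    return edges

def components(names, edges):
    """Cluster via union-find over the thresholded edges."""
    parent = {n: n for n in names}
    def find(n):
        while parent[n] != n:
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for n in names:
        groups.setdefault(find(n), set()).add(n)
    return sorted(groups.values(), key=len, reverse=True)

series = {
    "A": [1, 2, 3, 4],
    "B": [2, 4, 6, 8],   # moves with A
    "C": [4, 3, 2, 1],   # moves against A
}
edges = build_graph(series)
clusters = components(list(series), edges)
```

Collapsing each cluster to a single super-node before running a force-directed layout is one standard way such hybrids address the scalability problem the abstract mentions.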
Procedia PDF Downloads 304
25533 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publicly Available Collective Data to Decipher Immune Regulation in Early Life
Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr
Abstract:
Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here, a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded into a custom, interactive web application called the Gene Expression Browser (GXB), designed for the visualization and querying of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently used to present data graphically in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof of principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above-mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses that can be tested in future mechanistic studies.
Keywords: early life, GEO datasets, PMCH, interactive query, systems biology
Procedia PDF Downloads 299
25532 Mining User-Generated Contents to Detect Service Failures with Topic Model
Authors: Kyung Bae Park, Sung Ho Ha
Abstract:
Online user-generated contents (UGC) significantly change the way customers behave (e.g., shop, travel), and a pressing need to handle the overwhelming amount of varied UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective at leveraging textual information to detect the problems or issues that a given business suffers from. In this paper, we employ text mining with Latent Dirichlet Allocation (LDA) on a popular online review site dedicated to complaints from users. We find that LDA efficiently detects customer complaints, and further inspection with visualization techniques is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them accordingly in a timely manner given a limited amount of resources. The findings provide managerial insights into how analytics on social media can help maintain and improve reputation management. Our interdisciplinary approach also yields several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in the R program from beginning (data collection in R) to end (LDA analysis in R), since such instruction is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers to conduct related research.
Keywords: latent dirichlet allocation, R program, text mining, topic model, user generated contents, visualization
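The paper's walkthrough is in R; as a language-neutral illustration of what LDA actually computes, here is a minimal collapsed Gibbs sampler in Python on a toy complaint corpus. It is a teaching sketch only, not the paper's code: real analyses should use a mature package, and the corpus, hyperparameters and seed below are our assumptions.

```python
# Toy collapsed Gibbs sampler for LDA. Each token is assigned a topic; the
# sampler repeatedly re-draws assignments in proportion to
# (doc-topic count + alpha) * (topic-word count + beta) / (topic size + V*beta).

import random

def lda_gibbs(docs, k, iters=200, alpha=0.1, beta=0.01, seed=42):
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    v = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    z = [[rng.randrange(k) for _ in d] for d in docs]  # token-topic assignments
    ndk = [[0] * k for _ in docs]                      # doc-topic counts
    nkw = [[0] * v for _ in range(k)]                  # topic-word counts
    nk = [0] * k                                       # topic sizes
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            t = z[d][i]
            ndk[d][t] += 1; nkw[t][wid[w]] += 1; nk[t] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]  # remove token, resample, re-add
                ndk[d][t] -= 1; nkw[t][wid[w]] -= 1; nk[t] -= 1
                weights = [(ndk[d][j] + alpha) * (nkw[j][wid[w]] + beta)
                           / (nk[j] + v * beta) for j in range(k)]
                t = rng.choices(range(k), weights=weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][wid[w]] += 1; nk[t] += 1
    return ndk, nkw, vocab

docs = [["refund", "delay", "refund"], ["delay", "service", "rude"],
        ["refund", "delay"], ["rude", "service", "service"]]
ndk, nkw, vocab = lda_gibbs(docs, k=2)
total_tokens = sum(len(d) for d in docs)
```

The doc-topic matrix `ndk` is what gets visualized to categorize complaints; in R, the `topicmodels` or `lda` packages provide the equivalent of `lda_gibbs`.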
Procedia PDF Downloads 189
25531 Still Pictures for Learning Foreign Language Sounds
Authors: Kaoru Tomita
Abstract:
This study explores how visual information helps us learn foreign language pronunciation. Visual assistance and its effect on learning a foreign language have been discussed widely. For example, simplified illustrations in textbooks are used to tell learners which parts of the articulatory organs are used for pronouncing sounds. Vowels are put into a chart that depicts the vowel space. Consonants are put into a table with two axes, place and manner of articulation. When comparing a still picture and a moving picture for visualizing learners' pronunciation, it becomes clear that the former works better than the latter. The visualization of vowels was applied to class activities in which native and non-native speakers' English was compared and the learners' feedback was collected: the positions of the six vowels did not scatter as much as they were expected to. Specifically, two vowels were not discriminated and were placed very close together in the vowel space. It was surprising for the author to find that learners liked analyzing their own pronunciation by plotting the first and second formants (F1 and F2) on a sheet of paper with a pencil. Even a simple method works well if it leads learners to think about their pronunciation analytically.
Keywords: feedback, pronunciation, visualization, vowel
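The paper-and-pencil activity described above, plotting F1/F2 pairs in the vowel space, can be mimicked numerically. The formant values below are illustrative, not the study's measurements; the point is that two vowels with similar formants land close together, just as the learners observed for the undiscriminated pair.

```python
# Sketch: distance between vowels in the F1/F2 vowel space. Toy formant
# values in Hz, not measured data.

def formant_distance(v1, v2):
    """Euclidean distance between two vowels given as (F1, F2) pairs."""
    return ((v1[0] - v2[0]) ** 2 + (v1[1] - v2[1]) ** 2) ** 0.5

# Hypothetical learner productions (F1, F2):
vowels = {
    "i": (300, 2300),
    "I": (390, 2100),   # close to "i": a pair that may not be discriminated
    "a": (850, 1200),
}
d_close = formant_distance(vowels["i"], vowels["I"])
d_far = formant_distance(vowels["i"], vowels["a"])
```

A small `d_close` relative to `d_far` is the numerical counterpart of two vowel symbols crowding together on the learner's hand-drawn chart.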
Procedia PDF Downloads 255
25530 A Small Graphic Lie: The Photographic Quality of Pierre Bourdieu’s Correspondence Analysis
Authors: Lene Granzau Juel-Jacobsen
Abstract:
The problem of beautification is an obvious concern of photography, which claims reference to reality, but it also lies at the very heart of social theory. As we become accustomed to sophisticated visualizations of statistical data in pace with the development of software programs, we should not only be inclined to ask new types of research questions, but we also need to confront social theories based on such visualization techniques with new types of questions. Correspondence analysis, GIS analysis, social network analysis, and perceptual maps are current examples of visualization techniques popular within the social sciences and neighboring disciplines. This article discusses correspondence analysis, arguing that the graphic plot of correspondence analysis is to be interpreted much like a photograph. It refers no more evidently or univocally to reality than a photograph, representing social life no more truthfully than a photograph documents it. Pierre Bourdieu's theoretical corpus, especially his theory of fields, relies heavily on correspondence analysis. While much attention has been directed towards critiquing the somewhat vague conceptualization of habitus, limited focus has been placed on the equally problematic concepts of social space and field. Based on a re-reading of Distinction, the article argues that the concepts rely on 'a small graphic lie' very similar to a photograph. Like any other piece of art, as Bourdieu himself recognized, the graphic display is a politically and morally loaded representation technique. However, correspondence analysis does not necessarily serve the purpose he intended. In fact, it tends towards the pitfalls he strove to overcome.
Keywords: data visualization, correspondence analysis, Bourdieu, field, visual representation
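Since the argument turns on what the correspondence-analysis plot actually shows, it helps to recall the computation behind it: an SVD of the standardized residuals of a contingency table, i.e. a low-rank projection, which is precisely why the plot can "beautify" like a photograph. A minimal sketch follows; the table is toy data (taste categories by class), not Bourdieu's.

```python
# Correspondence analysis in outline: standardized residuals of the
# contingency table are decomposed by SVD; plotted coordinates are the
# leading (truncated) dimensions only.

import numpy as np

def correspondence_analysis(table):
    N = np.asarray(table, dtype=float)
    P = N / N.sum()                      # correspondence matrix
    r = P.sum(axis=1)                    # row masses
    c = P.sum(axis=0)                    # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # std. residuals
    U, sing, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sing) / np.sqrt(r)[:, None]  # principal row coords
    inertia = float((sing ** 2).sum())   # total inertia = chi-square / n
    return row_coords, sing, inertia

table = [[30, 10, 5],
         [10, 20, 15],
         [5, 15, 40]]
row_coords, sing, inertia = correspondence_analysis(table)
```

Only the first two columns of `row_coords` are drawn in the familiar plane; whatever inertia the trailing dimensions carry is silently discarded, which is the technical core of the "small graphic lie".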
Procedia PDF Downloads 71
25529 Legal Judgment Prediction through Indictments via Data Visualization in Chinese
Authors: Kuo-Chun Chien, Chia-Hui Chang, Ren-Der Sun
Abstract:
Legal Judgment Prediction (LJP) is a subtask of legal AI. Its main purpose is to use the facts of a case to predict the judgment result. In Taiwan's criminal procedure, when prosecutors complete the investigation of a case, they decide whether to prosecute the suspect and which article of criminal law should be applied based on the facts and evidence of the case. In this study, we collected 305,240 indictments from the public inquiry system of the procuratorate of the Ministry of Justice, covering 169 charges and 317 articles from 21 laws. We take the crime facts in the indictments as the main input to jointly learn a prediction model for law source, article, and charge simultaneously, based on the pre-trained BERT model. For single-article cases where the frequencies of the charge and article are greater than 50, the prediction performance for law sources, articles, and charges reaches 97.66, 92.22, and 60.52 macro-F1, respectively. To understand the big performance gap between articles and charges, we used a bipartite graph to visualize the relationship between articles and charges, and found that the poor prediction performance was actually due to wording precision. Some charges use the simplest words, while others may include the perpetrator or the result to make the charge more specific. For example, Article 284 of the Criminal Law may be indicted as “negligent injury”, “negligent death”, “business injury”, “driving business injury”, or “non-driving business injury”. As another example, Article 10 of the Drug Hazard Control Regulations can be charged as “Drug Control Regulations” or “Drug Hazard Control Regulations”. In order to solve the above problems and predict the article and charge more accurately, we plan to include the article content or charge names in the input and use the sentence-pair classification method for question-answer problems in the BERT model to improve performance. We will also consider a sequence-to-sequence approach to charge prediction.
Keywords: legal judgment prediction, deep learning, natural language processing, BERT, data visualization
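The bipartite-graph inspection described above can be made concrete by counting, for each article, how many distinct charge wordings it is indicted under. The article/charge pairs below are the abstract's own examples; the function name is ours.

```python
# Sketch: measure charge-wording ambiguity per article, the pattern the
# paper's bipartite visualization exposes.

from collections import defaultdict

def charge_ambiguity(indictments):
    """Map each article to its sorted set of distinct charge names."""
    charges_by_article = defaultdict(set)
    for article, charge in indictments:
        charges_by_article[article].add(charge)
    return {a: sorted(cs) for a, cs in charges_by_article.items()}

indictments = [
    ("Criminal Law Art. 284", "negligent injury"),
    ("Criminal Law Art. 284", "negligent death"),
    ("Criminal Law Art. 284", "business injury"),
    ("Criminal Law Art. 284", "driving business injury"),
    ("Criminal Law Art. 284", "non-driving business injury"),
    ("Drug Hazard Control Regulations Art. 10", "Drug Control Regulations"),
    ("Drug Hazard Control Regulations Art. 10", "Drug Hazard Control Regulations"),
]
ambiguity = charge_ambiguity(indictments)
most_ambiguous = max(ambiguity, key=lambda a: len(ambiguity[a]))
```

Articles with many charge wordings are exactly where a classifier trained on charge labels loses macro-F1, which motivates the sentence-pair reformulation the abstract proposes.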
Procedia PDF Downloads 125
25528 Geovisualisation for Defense Based on a Deep Learning Monocular Depth Reconstruction Approach
Authors: Daniel R. dos Santos, Mateus S. Maldonado, Estevão J. R. Batista
Abstract:
Military commanders are increasingly dependent on spatial awareness: knowing where the enemy is, understanding how battle scenarios change over time, and visualizing these trends in ways that offer insights for decision-making. Thanks to advancements in geospatial technologies and artificial intelligence algorithms, commanders are now able to modernize military operations on a global scale. Thus, geovisualisation has become an essential asset in the defense sector. It has become indispensable for better decision-making in dynamic/temporal scenarios, battlefield operation planning and management, situational awareness, effective planning, monitoring, and more. For example, 3D visualization of battlefield data contributes to intelligence analysis, the evaluation of post-mission outcomes, and the creation of predictive models to enhance decision-making and strategic planning capabilities. However, old-school visualization methods are slow, expensive, and unscalable. While modern technologies such as LiDAR and stereo sensors can generate 3D point clouds, monocular depth estimation based on deep learning can offer a faster and more detailed view of the environment, transforming single images into visual information for valuable insights. We propose a dedicated monocular depth reconstruction approach via deep learning techniques for the 3D geovisualisation of satellite images. It introduces scalability in terrain reconstruction and data visualization. First, a dataset with more than 7,000 satellite images and associated digital elevation models (DEM) is created. It is based on high-resolution optical and radar imagery collected from Planet and Copernicus, with which we fuse high-resolution topographic data obtained using technologies such as LiDAR, along with the associated geographic coordinates. Second, we developed an imagery-DEM fusion strategy that combines the feature maps of two encoder-decoder networks.
One network is trained with radar and optical bands, while the other is trained with DEM features to compute dense 3D depth. Finally, we constructed a benchmark with sparse depth annotations to facilitate future research. To demonstrate the proposed method's versatility, we evaluated its performance on no annotated satellite images and implemented an enclosed environment useful for Geovisualisation applications. The algorithms were developed in Python 3.0, employing open-source computing libraries, i.e., Open3D, TensorFlow, and Pythorch3D. The proposed method provides fast and accurate decision-making with GIS for localization of troops, position of the enemy, terrain and climate conditions. This analysis enhances situational consciousness, enabling commanders to fine-tune the strategies and distribute the resources proficiently.Keywords: depth, deep learning, geovisualisation, satellite images
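As a loose illustration of the fusion step described above, here is a minimal pure-Python sketch that blends two decoder "feature maps" with a weighted sum. The maps, the `fuse_feature_maps` helper, and the weight `alpha` are all invented for illustration; the paper's actual networks are deep encoder-decoders, not a fixed blend.

```python
# Minimal sketch of late fusion of two decoder feature maps into a dense
# depth estimate. Real implementations learn the fusion inside a CNN
# (e.g. in TensorFlow or PyTorch); here the "feature maps" are plain 2D
# lists and the fusion is a fixed weighted sum, purely illustrative.

def fuse_feature_maps(optical_feat, dem_feat, alpha=0.6):
    """Weighted elementwise fusion: alpha * optical + (1 - alpha) * dem."""
    assert len(optical_feat) == len(dem_feat)
    fused = []
    for row_o, row_d in zip(optical_feat, dem_feat):
        assert len(row_o) == len(row_d)
        fused.append([alpha * o + (1 - alpha) * d for o, d in zip(row_o, row_d)])
    return fused

# Toy 2x3 "feature maps" standing in for the two decoder outputs.
optical = [[1.0, 2.0, 3.0],
           [4.0, 5.0, 6.0]]
dem     = [[10.0, 10.0, 10.0],
           [20.0, 20.0, 20.0]]

depth = fuse_feature_maps(optical, dem, alpha=0.5)
print(depth)  # [[5.5, 6.0, 6.5], [12.0, 12.5, 13.0]]
```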
Procedia PDF Downloads 19
25527 An Overview of College English Writing Teaching Studies in China Between 2002 and 2022: Visualization Analysis Based on CiteSpace
Authors: Yang Yiting
Abstract:
This paper employs CiteSpace to conduct a visualization analysis of the literature on college English writing teaching published in core journals from the CNKI database and CSSCI journals between 2002 and 2022. It aims to explore the characteristics of research and future directions in college English writing teaching. The present study yielded the following major findings: the field primarily focuses on innovative writing teaching models and methods, the integration of traditional classroom teaching and information technology, and instructional strategies to enhance students' writing skills. Future research is anticipated to involve a hybrid writing teaching approach combining online and offline teaching methods, leveraging the "Internet+" digital platform, with the aim of elevating students' writing proficiency. This paper also presents a prospective outlook for college English writing teaching research in China.
Keywords: citespace, college English, writing teaching, visualization analysis
Procedia PDF Downloads 76
25526 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring
Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie
Abstract:
Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8- and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement
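The image reconstruction step mentioned above can be illustrated with linear back-projection (LBP), the simplest ECT reconstruction scheme. The sensitivity matrix and calibration values below are invented for a two-measurement, three-pixel toy case; a real sensor would use 28 electrode-pair measurements for 8 electrodes and 120 for 16, over a much finer pixel grid.

```python
# Sketch of linear back-projection (LBP) for ECT: normalized capacitances
# are smeared back over the cross-section through a sensitivity matrix.
# Two electrode-pair measurements and three image pixels, all illustrative.

def normalize(c, c_low, c_high):
    """Normalize measured capacitances between empty- and full-pipe calibrations."""
    return [(ci - lo) / (hi - lo) for ci, lo, hi in zip(c, c_low, c_high)]

def lbp(norm_c, sensitivity):
    """g = S^T . c, rescaled by the per-pixel sum of sensitivities."""
    n_pix = len(sensitivity[0])
    image = [0.0] * n_pix
    for m, cm in enumerate(norm_c):
        for p in range(n_pix):
            image[p] += sensitivity[m][p] * cm
    sums = [sum(row[p] for row in sensitivity) for p in range(n_pix)]
    return [g / s for g, s in zip(image, sums)]

S = [[0.5, 0.3, 0.2],   # sensitivity of measurement 1 to each pixel
     [0.1, 0.4, 0.5]]   # sensitivity of measurement 2 to each pixel
c_meas, c_empty, c_full = [2.0, 3.0], [1.0, 1.0], [3.0, 5.0]

img = lbp(normalize(c_meas, c_empty, c_full), S)
print(img)  # both measurements normalize to 0.5, so a uniform 0.5 image
```

Adding electrodes enlarges `S` (more rows), which is exactly the resolution/computation trade-off the abstract compares.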
Procedia PDF Downloads 1825525 Machine Learning-Based Workflow for the Analysis of Project Portfolio
Authors: Jean Marie Tshimula, Atsushi Togashi
Abstract:
We develop a data-science approach that provides an interactive visualization and predictive models to find insights in the projects' historical data, so that stakeholders can understand unseen opportunities in the African market that might otherwise escape them behind the online project portfolio of the African Development Bank. This machine learning-based web application identifies the market trends of the fastest growing economies across the continent as well as skyrocketing sectors which have a significant impact on the future of business in Africa. Owing to this, the approach is tailored to predict where investment is most needed. Moreover, we create a corpus that includes the descriptions of more than 1,200 projects covering approximately 14 sectors across some 53 African countries. We then sift through this large amount of semi-structured data to extract small details likely to contain directions to follow. In light of the foregoing, we applied a combination of Latent Dirichlet Allocation and Random Forests in the analysis module of our methodology to highlight the most relevant topics that investors may focus on when investing in Africa.
Keywords: machine learning, topic modeling, natural language processing, big data
Procedia PDF Downloads 169
25524 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper intends to discuss how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata, thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable. 
By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by enabling better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps in identifying and addressing data quality problems promptly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by AEKIDEN experience feedback. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing their experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
Procedia PDF Downloads 140
25523 Interactive Glare Visualization Model for an Architectural Space
Authors: Florina Dutt, Subhajit Das, Matthew Swartz
Abstract:
Lighting design and its impact on indoor comfort conditions are an integral part of good interior design. The impact of lighting in an interior space is manifold, involving many subcomponents such as glare, color, tone, luminance, control, energy efficiency, and flexibility. While the other components have been researched and discussed many times, this paper discusses research done to understand the glare component of an artificial lighting source in an indoor space. Consequently, the paper discusses a parametric model that conveys the real-time glare level in an interior space to the designer/architect. Our end users are architects, and for them it is of utmost importance to know what impression the proposed lighting arrangement and proposed furniture layout will have on indoor comfort quality, especially for those furniture elements (or surfaces) which strongly reflect light around the space. Essentially, designers need to know the ramifications of discomfort glare at an early stage of the design cycle, when they can still afford to make changes to the proposed design and consider different solution routes for their client. Unfortunately, most existing lighting analysis tools perform rigorous computation and analysis on the back end, making it challenging for the designer to quickly analyze and understand the glare from interior lighting. Moreover, many of them do not focus on the glare aspect of artificial light. That is why, in this paper, we explain a novel approach to approximating interior glare data. In addition, we visualize this data in a color-coded format, expressing the implications of the proposed interior design layout. We focus on making this analysis process computationally fluid and fast, enabling complete user interaction with the capability to vary different ranges of user inputs, adding more degrees of freedom for the user.
We test our proposed parametric model on a case study, a Computer Lab space in our college facility.
Keywords: computational geometry, glare impact in interior space, info visualization, parametric lighting analysis
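One standard way to quantify discomfort glare, which a parametric model like the one described could approximate, is the CIE Unified Glare Rating (UGR). The sketch below implements the textbook UGR formula with invented luminaire values; the abstract does not state which metric the authors' model actually uses.

```python
import math

# Simplified Unified Glare Rating (UGR) computation, a common metric for
# discomfort glare from luminaires. The luminaire values below are
# illustrative; a parametric model would derive them from the design
# geometry (fixture positions, viewer position, surface reflectances).

def ugr(background_luminance, luminaires):
    """UGR = 8 * log10(0.25/Lb * sum(L^2 * omega / p^2)).

    luminaires: list of (L, omega, p) tuples with luminaire luminance
    (cd/m^2), solid angle (sr) subtended at the eye, and Guth position index.
    """
    s = sum(L * L * omega / (p * p) for L, omega, p in luminaires)
    return 8.0 * math.log10(0.25 / background_luminance * s)

# Two hypothetical ceiling luminaires seen from a desk position.
fixtures = [(5000.0, 0.01, 1.5), (3000.0, 0.005, 2.0)]
print(round(ugr(50.0, fixtures), 1))  # 22.3: noticeable but tolerable glare
```

A color-coded map like the one the paper proposes could evaluate such a metric per viewpoint and bin the result into comfort bands.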
Procedia PDF Downloads 353
25522 Meta-Review of Scholarly Publications on Biosensors: A Bibliometric Study
Authors: Nasrine Olson
Abstract:
With over 70,000 scholarly publications on the topic of biosensors, gaining an overview of the field has become a challenge. To help, there are currently over 700 expert reviews of publications on biosensors and related topics. This study focuses on these review papers in order to provide a meta-review of the area. The paper provides a statistical analysis and overview of biosensor-related review papers. Comprehensive searches were conducted in the Web of Science and PubMed databases, and the resulting empirical material was analyzed using bibliometric methods and tools. The study finds that the biosensor-related review papers can be categorized into five related subgroups, broadly denoted by (i) properties of materials and particles, (ii) analysis and indicators, (iii) diagnostics, (iv) pollutants and analytical devices, and (v) treatment/application. For easy and clear access to the findings, visualizations of clusters and networks of connections are presented. The study includes a temporal dimension and identifies trends over the years, with an emphasis on the most recent developments. This paper provides useful insights for those who wish to form a better understanding of the research trends in the area of biosensors.
Keywords: bibliometrics, biosensors, meta-review, statistical analysis, trends visualization
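The subgroup clustering described above can be loosely illustrated by grouping keywords through co-occurrence. The papers and keywords below are invented, and real bibliometric tools (CiteSpace, VOSviewer) use far more refined similarity measures and layouts; this sketch only shows the basic graph idea.

```python
from itertools import combinations

# Tiny sketch of bibliometric clustering: build a keyword co-occurrence
# graph from paper keyword sets, then take connected components as
# clusters. Paper keyword sets below are invented.

papers = [
    {"nanoparticle", "electrode"},
    {"electrode", "impedance"},
    {"diagnostics", "biomarker"},
    {"biomarker", "cancer"},
]

# Edges between keywords that co-occur in at least one paper.
edges = set()
for kw in papers:
    edges.update(frozenset(pair) for pair in combinations(sorted(kw), 2))

nodes = set().union(*papers)
adj = {n: set() for n in nodes}
for e in edges:
    a, b = tuple(e)
    adj[a].add(b)
    adj[b].add(a)

# Connected components via depth-first search.
clusters, seen = [], set()
for n in sorted(nodes):
    if n in seen:
        continue
    comp, stack = set(), [n]
    while stack:
        cur = stack.pop()
        if cur in comp:
            continue
        comp.add(cur)
        stack.extend(adj[cur] - comp)
    seen |= comp
    clusters.append(comp)

print(clusters)  # two clusters: materials/measurement terms vs diagnostics terms
```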
Procedia PDF Downloads 222
25521 Annexing the Strength of Information and Communication Technology (ICT) for Real-time TB Reporting Using TB Situation Room (TSR) in Nigeria: Kano State Experience
Authors: Ibrahim Umar, Ashiru Rajab, Sumayya Chindo, Emmanuel Olashore
Abstract:
INTRODUCTION: Kano is the most populous state in Nigeria and one of the two states with the highest TB burden in the country. The state notifies an average of more than 8,000 TB cases quarterly and has had the highest yearly notification of all the states in Nigeria from 2020 to 2022. The contribution of the state TB program to the national TB notification varied from 9% to 10% quarterly between the first quarter of 2022 and the second quarter of 2023. The Kano State TB Situation Room is an innovative platform for timely data collection, collation, and analysis for informed decisions in the health system. During the 2023 second National TB Testing Week (NTBTW), the Kano TB program aimed at early TB detection, prevention, and treatment. The state TB Situation Room provided an avenue for coordination and surveillance through real-time data reporting, review, analysis, and use during the NTBTW. OBJECTIVES: To assess the role of an innovative information and communication technology platform for real-time TB reporting during the second National TB Testing Week in Nigeria, 2023. To showcase the NTBTW data cascade analysis using the TSR as an innovative ICT platform. METHODOLOGY: The state TB program deployed a real-time virtual dashboard for NTBTW reporting, analysis, and feedback. A data room team was set up that received real-time data through a Google link. The data received were analyzed using the Power BI analytic tool with a statistical alpha level of significance of <0.05. RESULTS: At the end of the week-long activity, and using the real-time dashboard with onsite mentorship of the field workers, the state TB program screened a total of 52,054 people for TB out of 72,112 individuals eligible for screening (72% screening rate). A total of 9,910 presumptive TB clients were identified and evaluated for TB, leading to the diagnosis of 445 patients with TB (5% yield from presumptives) and the placement of 435 TB patients on treatment (98% enrolment).
CONCLUSION: The TB Situation Room (TSR) has been a great asset to the Kano State TB Control Program in meeting the growing demand for timely data reporting in TB and other global health responses. The use of real-time surveillance data during the 2023 NTBTW has in no small measure improved the TB response and feedback in Kano State. Scaling up this intervention to other disease areas, states, and nations is a positive step in the right direction towards global TB eradication.
Keywords: tuberculosis (tb), national tb testing week (ntbtw), tb situation room (tsr), information communication technology (ict)
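The cascade reported in the results can be recomputed directly from the abstract's figures. The helper below is an illustrative sketch of the kind of indicator logic a real-time dashboard would apply; the recomputed rates (72.2%, 4.5%, 97.8%) agree with the rounded figures in the text.

```python
# NTBTW care-cascade indicators recomputed from the numbers reported in
# the abstract. The helper function is a sketch, not the actual dashboard.

def cascade(eligible, screened, presumptive, diagnosed, enrolled):
    return {
        "screening_rate": screened / eligible,          # screened / eligible
        "yield_from_presumptive": diagnosed / presumptive,
        "enrolment_rate": enrolled / diagnosed,
    }

kano = cascade(eligible=72_112, screened=52_054,
               presumptive=9_910, diagnosed=445, enrolled=435)
for name, value in kano.items():
    print(f"{name}: {value:.1%}")
```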
Procedia PDF Downloads 79
25520 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and few data, it can become difficult and error-prone when large databases of images must be treated, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after some time of use in which the operator manually marks the parts of borehole images that correspond to tension regions and breakout areas, to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge data set configurations.
Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computing
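As a loose sketch of how such a knowledge database could drive suggestions, the snippet below applies a nearest-centroid classifier to invented features of segmented curves. The feature names, values, and classifier choice are all assumptions for illustration, not the paper's actual method.

```python
# Sketch of the proposed knowledge base: features extracted from segmented
# borehole-image curves (values invented) are labeled by an analyst, and a
# simple nearest-centroid classifier then suggests labels for new regions.
# A production system would use richer features and stronger classifiers.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid_predict(labeled, query):
    """labeled: {label: [feature vectors]}; returns the closest class label."""
    best_label, best_dist = None, float("inf")
    for label, vectors in labeled.items():
        c = centroid(vectors)
        dist = sum((q - ci) ** 2 for q, ci in zip(query, c))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Two hand-labeled classes of segmented-curve features (curvature, contrast).
knowledge = {
    "breakout": [[0.9, 0.8], [0.8, 0.9]],
    "intact":   [[0.1, 0.2], [0.2, 0.1]],
}
print(nearest_centroid_predict(knowledge, [0.85, 0.75]))  # breakout
```

As analysts keep labeling regions, `knowledge` grows, which is how suggestion accuracy improves over time.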
Procedia PDF Downloads 305
25519 Flow Visualization around a Rotationally Oscillating Cylinder
Authors: Cemre Polat, Mustafa Soyler, Bulent Yaniktepe, Coskun Ozalp
Abstract:
In this study, the aim was to control the flow actively by giving an oscillating rotational motion to a vertically placed cylinder, and the flow characteristics were determined. First, the flow structure around the non-rotating cylinder was investigated with dye experiments; then the flow structure around cylinders with different oscillation angles (θ = 60°, θ = 120°, and θ = 180°) and different rotation speeds (15 rpm and 30 rpm) was examined. Thus, the effectiveness of oscillation angle and rotation speed in flow control was investigated. In the dye experiments, a dye/water mixture obtained by mixing powdered Rhodamine 6G with water, which shines under laser light and allows detailed observation of the flow structure, was used. During the experiments, the dye was injected into the flow with a thin needle upstream of the cylinder, at a distance that would not disturb the flow. Images were captured at 100 frames per second with a Canon EOS M50 (24 MP) digital mirrorless camera at a resolution of 1280 x 720 pixels. The captured images were then analyzed, and pictures representing the flow structure for each experiment were obtained. As a result of the study, it was observed that no separation points formed at a 180° oscillation angle at 15 rpm, or at 120° and 180° oscillation angles at 30 rpm, and the flow was controlled relative to the fixed cylinder.
Keywords: active flow control, cylinder, flow visualization, rotationally oscillating cylinder
Procedia PDF Downloads 180
25518 The Role of Predictive Modeling and Optimization in Enhancing Smart Factory Efficiency
Authors: Slawomir Lasota, Tomasz Kajdanowicz
Abstract:
This research examines the application of predictive modeling and optimization algorithms to improve production efficiency in smart factories. Utilizing gradient boosting and neural networks, the study builds robust KPI estimators to predict production outcomes based on real-time data. Optimization methods, including Bayesian optimization and gradient-based algorithms, identify optimal process configurations that maximize availability, efficiency, and quality KPIs. The paper highlights the modular architecture of a recommender system that integrates predictive models, data visualization, and adaptive automation. Comparative analysis across multiple production processes reveals significant improvements in operational performance, laying the foundation for scalable, self-regulating manufacturing systems.
Keywords: predictive modeling, optimization, smart factory, efficiency
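The predict-then-optimize loop described above can be sketched as follows. Here a toy quadratic stands in for a trained gradient-boosting KPI estimator, and plain random search stands in for Bayesian optimization, so every name and value is illustrative, not the paper's implementation.

```python
import random

# Sketch of the predict-then-optimize loop: a KPI estimator maps process
# settings to a predicted KPI, and an optimizer searches for settings that
# maximize it. The surrogate below is a toy quadratic with a known peak at
# speed=0.6, temperature=0.4; the optimizer is plain random search.

def kpi_surrogate(speed, temperature):
    """Toy 'trained model': predicted KPI peaks at (0.6, 0.4)."""
    return 1.0 - (speed - 0.6) ** 2 - (temperature - 0.4) ** 2

def optimize(model, n_trials=5000, seed=0):
    rng = random.Random(seed)
    best_x, best_kpi = None, float("-inf")
    for _ in range(n_trials):
        x = (rng.random(), rng.random())   # sample settings in [0, 1)^2
        k = model(*x)
        if k > best_kpi:
            best_x, best_kpi = x, k
    return best_x, best_kpi

(best_speed, best_temp), best = optimize(kpi_surrogate)
print(round(best_speed, 2), round(best_temp, 2), round(best, 3))
```

A recommender system like the one described would run this loop per process and surface the best configurations to operators.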
Procedia PDF Downloads 17
25517 Synthesis of Highly Stable Near-Infrared FAPbI₃ Perovskite Doped with 5-AVA and Its Applications in NIR Light-Emitting Diodes for Bioimaging
Authors: Nasrud Din, Fawad Saeed, Sajid Hussain, Rai Muhammad Dawood Sultan, Premkumar Sellan, Qasim Khan, Wei Lei
Abstract:
The continuously increasing external quantum efficiencies of perovskite light-emitting diodes (LEDs) have received significant interest in the scientific community. The need for monitoring and medical diagnostics has grown steadily in recent years, driven primarily by aging populations and an increasing number of heart attacks, tumors, and cancer disorders among patients. Perovskite near-infrared light-emitting diodes (PeNIRLEDs) have exhibited considerable efficacy in bioimaging, particularly in the visualization and examination of blood vessels, blood clots, and tumors. PeNIRLEDs exhibit exciting potential in the field of blood vessel imaging because of their advantageous attributes, including improved depth penetration and less scattering in comparison to visible light. In this study, we synthesized FAPbI₃ perovskite doped with different concentrations (1-6 mg) of 5-aminovaleric acid (5-AVA). The incorporation of 5-AVA as a dopant during FAPbI₃ perovskite formation influences the perovskite's structural and optical properties, improving its stability, photoluminescence efficiency, and charge transport characteristics. We found a PL emission peak wavelength of 850 nm and a bandwidth of 44 nm, along with a calculated quantum yield of 75%. The incorporation of 5-AVA-modified FAPbI₃ perovskite into LEDs promises to enhance device efficiency, color purity, and stability, making them suitable for various medical applications, including subcutaneous deep vein imaging, blood flow visualization, and tumor illumination.
Keywords: perovskite light-emitting diodes, deep vein imaging, blood flow visualization, tumor illumination
Procedia PDF Downloads 64
25516 Data Management System for Environmental Remediation
Authors: Elizaveta Petelina, Anton Sizo
Abstract:
Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decision making in relation to required mitigation measures and assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. The EDMS consists of seven main components: a Geodatabase containing a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component containing tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA and measures for QC; a Data Import and Export component with tools and templates to support project data flow; a Lab Data component that connects the EDMS to laboratory information management systems; and a Reporting component with server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada.
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.
Keywords: data management, environmental remediation, geographic information system, GIS, decision making
Procedia PDF Downloads 166
25515 Geospatial Network Analysis Using Particle Swarm Optimization
Authors: Varun Singh, Mainak Bandyopadhyay, Maharana Pratap Singh
Abstract:
The shortest path (SP) problem concerns finding the shortest path from a specific origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications. Important applications of the SP problem include vehicle routing in transportation systems, particularly in the field of in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem in transportation planning. Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization, and Particle Swarm Optimization (PSO) are well-known approaches for solving complex optimization problems and overcoming the shortcomings of existing shortest path analysis methods. It has been reported by various researchers that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Further, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO for solving the shortest path problem between multiple points of interest (POI), based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the analysis results is carried out in GIS.
Keywords: particle swarm optimization, GIS, traffic data, outliers
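A common way to apply PSO to the shortest path problem, and one plausible reading of the approach described, is priority-based encoding: each particle holds a priority per node, and a path is decoded by repeatedly moving to the highest-priority unvisited neighbor. The toy graph, parameters, and encoding below are assumptions for illustration; the paper works on a real road network with GPS-derived travel costs.

```python
import random

# Sketch of priority-based PSO for the shortest path problem on a toy
# graph. Each particle is a vector of node priorities, decoded greedily
# into an origin-to-destination path whose cost PSO then minimizes.

GRAPH = {  # node: {neighbor: edge cost}; shortest A->E path costs 4
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {"E": 1},
    "E": {},
}
NODES = list(GRAPH)

def decode(priorities, origin="A", dest="E"):
    """Follow the highest-priority unvisited neighbor; return (path, cost)."""
    path, cost, current, visited = [origin], 0.0, origin, {origin}
    while current != dest:
        options = {n: c for n, c in GRAPH[current].items() if n not in visited}
        if not options:
            return path, float("inf")  # dead end: infeasible particle
        nxt = max(options, key=lambda n: priorities[NODES.index(n)])
        cost += options[nxt]
        path.append(nxt)
        visited.add(nxt)
        current = nxt
    return path, cost

def pso(n_particles=30, n_iters=60, seed=1):
    rng = random.Random(seed)
    dim = len(NODES)
    xs = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_cost = [decode(x)[1] for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            cost = decode(xs[i])[1]
            if cost < pbest_cost[i]:
                pbest[i], pbest_cost[i] = xs[i][:], cost
                if cost < gbest_cost:
                    gbest, gbest_cost = xs[i][:], cost
    return decode(gbest)

path, cost = pso()
print(path, cost)
```

On a network this small exact methods are obviously preferable; the encoding matters when combined with the large, dynamically weighted road graphs the abstract targets.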
Procedia PDF Downloads 490