Search results for: real-time data visualization
25454 The Impact of Introspective Models on Software Engineering
Authors: Rajneekant Bachan, Dhanush Vijay
Abstract:
The visualization of operating systems has refined the Turing machine, and current trends suggest that the emulation of 32-bit architectures will soon emerge. After years of technical research into Web services, we demonstrate the synthesis of gigabit switches, which embodies the robust principles of theory. Loam, our new algorithm for forward-error correction, is the solution to all of these challenges.
Keywords: software engineering, architectures, introspective models, operating systems
Procedia PDF Downloads 541
25453 Speed Breaker/Pothole Detection Using Hidden Markov Models: A Deep Learning Approach
Authors: Surajit Chakrabarty, Piyush Chauhan, Subhasis Panda, Sujoy Bhattacharya
Abstract:
A large proportion of roads in India are not maintained to the laid-down public safety guidelines, leading to loss of direction control and fatal accidents. We propose a technique to detect speed breakers and potholes using mobile sensor data captured from multiple vehicles and to provide a profile of the road. This would, in turn, help in monitoring roads and revolutionize digital maps. Incorporating randomness in the model formulation for the detection of speed breakers and potholes is crucial due to the substantial heterogeneity observed in data obtained via a mobile application from multiple vehicles driven by different drivers. This is accomplished with Hidden Markov Models, whose hidden state sequence is decoded for each time step given the observation sequence and then fed as input to an LSTM network with peephole connections. Precision scores of 0.96 and 0.63 are obtained for classifying bumps and potholes, respectively, a significant improvement over machine-learning-based models. Bumps and potholes are further visualized by converting the time series to images using Markov Transition Fields, where a clear demarcation between bumps and potholes is observed.
Keywords: deep learning, hidden Markov model, pothole, speed breaker
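The first stage of the pipeline, decoding a hidden state per time step from sensor observations, can be sketched with the hmmlearn library. This is a minimal illustration only: the sensor array, state count, and one-hot hand-off to the LSTM are assumptions for the sketch, not the authors' configuration.

```python
# Minimal sketch (assumes hmmlearn and numpy; data and shapes are illustrative).
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Accelerometer readings from a vehicle-mounted phone: (timesteps, 3 axes).
accel = np.random.randn(500, 3)

# Fit an HMM whose hidden states stand in for road conditions
# (e.g., smooth road, bump, pothole), then decode the state sequence.
hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
hmm.fit(accel)
hidden_states = hmm.predict(accel)  # Viterbi state sequence, one per time step

# The decoded states (here one-hot encoded, an assumed choice) become the
# input sequence to the LSTM classifier in the next stage.
one_hot = np.eye(3)[hidden_states]  # shape: (timesteps, n_states)
```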
Procedia PDF Downloads 152
25452 Discovering Event Outliers for Drug as Commercial Products
Authors: Arunas Burinskas, Aurelija Burinskiene
Abstract:
On average, ten percent of drugs - commercial products - are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period that is too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest using a prediction of average sales per sales day, which helps to protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers for cancer drugs, with reported positive results. Grubbs' test is applied to detect outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test represents the difference between the sample mean and the most extreme data point in units of the standard deviation, and detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs' test method. The suggested framework is applicable during shortage events and recovery periods, has practical value, and could be used to minimize the recovery period required after a shortage event occurs.
Keywords: drugs, Grubbs' test, outlier, shortage event
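The test the authors apply can be illustrated directly. A minimal sketch of a two-sided Grubbs test at the 99% confidence level (alpha = 0.01); the daily sales figures are invented placeholders, not data from the study:

```python
import numpy as np
from scipy import stats

def grubbs_max_outlier(x, alpha=0.01):
    """Return the index of the most extreme value if it is an outlier, else None."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))      # most extreme observation
    g = abs(x[idx] - mean) / sd                 # Grubbs statistic
    # Critical value from the t distribution (two-sided form).
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return idx if g > g_crit else None

daily_sales = [12, 14, 13, 15, 11, 14, 2, 13]   # hypothetical sales per day
print(grubbs_max_outlier(daily_sales))          # flags the shortage-day value (index 6)
```

Run iteratively (remove the flagged point and retest) to detect one outlier at a time, as the abstract describes.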
Procedia PDF Downloads 137
25451 A Rapid Prototyping Tool for Suspended Biofilm Growth Media
Authors: Erifyli Tsagkari, Stephanie Connelly, Zhaowei Liu, Andrew McBride, William Sloan
Abstract:
Biofilms play an essential role in treating water in biofiltration systems. Biofilm morphology and function are inextricably linked to the hydrodynamics of flow through a filter, and yet engineers rarely explicitly engineer this interaction. We develop a system that links computer simulation and 3-D printing to design and rapidly prototype filter media that optimize biofilm function, under the hypothesis that biofilm function is intimately linked to the flow passing through the filter. A computational model is developed that numerically solves the incompressible time-dependent Navier-Stokes equations coupled to a model for biofilm growth and function. The model is embedded in an optimization algorithm that allows the model domain to adapt until criteria on biofilm functioning are met. This is applied to optimize the shape of filter media in a simple flow channel to promote biofilm formation. The computer code links directly to a 3-D printer, which allows us to prototype the design rapidly. Its validity is tested in flow visualization experiments and by microscopy. As proof of concept, the code was constrained to explore a small range of potential filter media, where the medium acts as an obstacle in the flow that sheds a von Kármán vortex street, which was found to enhance the deposition of bacteria on surfaces downstream. The flow visualization and microscopy in the 3-D printed realization of the flow channel validated the predictions of the model and hence its potential as a design tool. Overall, it is shown that the combination of our computational model and 3-D printing can be effectively used as a design tool to prototype filter media that optimize biofilm formation.
Keywords: biofilm, biofilter, computational model, von Kármán vortices, 3-D printing
Procedia PDF Downloads 146
25450 Effectiveness and Efficiency of Unified Philippines Accident Reporting and Database System in Optimizing Road Crash Data Usage with Various Stakeholders
Authors: Farhad Arian Far, Anjanette Q. Eleazar, Francis Aldrine A. Uy, Mary Joyce Anne V. Uy
Abstract:
The Unified Philippine Accident Reporting and Database System (UPARDS) is a newly developed system by Dr. Francis Aldrine Uy of the Mapua Institute of Technology. Its main purpose is to provide an advanced road accident investigation tool and a record-keeping and analysis system for stakeholders such as the Philippine National Police (PNP), Metro Manila Development Authority (MMDA), Department of Public Works and Highways (DPWH), Department of Health (DOH), and insurance companies. The system is composed of two components: a mobile application for road accident investigators that takes advantage of available technology to advance data gathering, and a web application that integrates all accident data for the use of all stakeholders. The researchers, with the cooperation of the PNP's Vehicle Traffic Investigation Sector of the City of Manila, conducted field testing of the application in fifteen (15) accident cases. Simultaneously, the researchers distributed surveys to the PNP, Manila Doctors Hospital, and Charter Ping An Insurance Company to gather their insights regarding the web application. The survey was designed around the Technology Acceptance Model, an information systems theory. The results of the surveys revealed that the respondents were greatly satisfied with the visualization and functions of the applications, which proved to be effective and far more efficient than the conventional pen-and-paper method. In conclusion, the pilot study was able to address the need for improvement of the current system.
Keywords: accident, database, investigation, mobile application, pilot testing
Procedia PDF Downloads 447
25449 PRISM: An Analytical Tool for Forest Plan Development
Authors: Dung Nguyen, Yu Wei, Eric Henderson
Abstract:
Analytical tools have been used for decades to assist in the development of forest plans. In 2016, a new decision support system, PRISM, was jointly developed by the United States Forest Service (USFS) Northern Region and Colorado State University to support the forest planning process. PRISM has a friendly user interface with functionality for database management, model development, data visualization, and sensitivity analysis. The software is tailored for USFS planning, but it is flexible enough to support planning efforts by other forestland owners and managers. Here, the core capabilities of PRISM and its applications in developing plans for several United States national forests are presented. The strengths of PRISM are also discussed to show its potential as a preferable tool for managers and experts in the domain of forest management and planning.
Keywords: decision support, forest management, forest plan, graphical user interface, software
Procedia PDF Downloads 114
25448 A Wearable Fluorescence Imaging Device for Intraoperative Identification of Human Brain Tumors
Authors: Guoqiang Yu, Mehrana Mohtasebi, Jinghong Sun, Thomas Pittman
Abstract:
Malignant glioma (MG) is the most common type of primary malignant brain tumor. Surgical resection of MG remains the cornerstone of therapy, and the extent of resection correlates with patient survival. A limiting factor for resection, however, is the difficulty of differentiating the tumor from normal tissue during surgery. Fluorescence imaging is an emerging technique for real-time intraoperative visualization of MGs and their boundaries. However, most clinical-grade neurosurgical operative microscopes with fluorescence imaging capability are hampered by low adoption rates due to high cost, limited portability, limited operational flexibility, and a lack of skilled professionals with technical knowledge. To overcome these limitations, we innovatively integrated miniaturized light sources, flippable filters, and a recording camera into surgical eye loupes to create a wearable fluorescence eye loupe (FLoupe) device for intraoperative imaging of fluorescent MGs. Two FLoupe prototypes were constructed for imaging of fluorescein and 5-aminolevulinic acid (5-ALA), respectively. The wearable FLoupe devices were tested on tumor-simulating phantoms and patients with MGs. Comparable results were observed against the standard neurosurgical operative microscope (PENTERO® 900) with fluorescence kits. The affordable and wearable FLoupe devices enable visualization of both color and fluorescence images with the same quality as the large and expensive stationary operative microscopes. The wearable FLoupe device allows for a greater range of movement, less obstruction, and faster, easier operation. Thus, it reduces surgery time and is more easily adapted to the surgical environment than unwieldy neurosurgical operative microscopes.
Keywords: fluorescence guided surgery, malignant glioma, neurosurgical operative microscope, wearable fluorescence imaging device
Procedia PDF Downloads 68
25447 Applications of Big Data in Education
Authors: Faisal Kalota
Abstract:
Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.
Keywords: big data, learning analytics, analytics, big data in education, Hadoop
Procedia PDF Downloads 432
25446 Virtual Reality Technology for Employee Training in High-Risk Industries: Benefits and Advancements
Authors: Yeganeh Jabbari, Sepideh Khalatabad
Abstract:
This study explores the development of virtual reality (VR) technology for training applications. The potential benefits of VR technology for employee training and its ability to simulate real-world scenarios in a safe and controlled environment are highlighted, along with the associated cost and time savings. The adoption of VR technology in high-risk industrial organizations such as the oil and gas industry is discussed, with a focus on its ability to improve worker performance. Additionally, the use of VR technology in activities such as simulation and data visualization in the oil and gas industry is explored, leading to enhanced safety measures and collaboration between teams. The integration of advanced technologies such as robotics is mentioned as a way to further promote efficiency and sustainability. The study also notes that the digital transformation of the oil and gas industry is revolutionizing operations and promoting safety, efficiency, and sustainability through the use of VR technology.
Keywords: virtual reality training, virtual reality benefits, high-risk industries, digital transformation
Procedia PDF Downloads 93
25445 Visual Aid and Imagery Ramification on Decision Making: An Exploratory Study Applicable in Emergency Situations
Authors: Priyanka Bharti
Abstract:
Decades ago, designs were based on common sense and tradition, but with advances in visualization technology and research, we are now able to comprehend the cognitive abilities involved in decoding visual information. However, many areas of visual research still need intensive study to deliver sound explanations of the underlying events. Visuals are a mode of information representation through images, symbols, and graphics. They play an impactful role in decision making by facilitating quick recognition, comprehension, and analysis of a situation. They enhance problem-solving capabilities by enabling the processing of more data without overloading the decision maker. Research suggests that visuals offer an improved learning environment by a factor of 400 compared to textual information. Visual information engages learners at a cognitive level and triggers the imagination, which enables the user to process the information faster (visuals are claimed to be processed 60,000 times faster in the brain than text). Appropriate information, its visualization, and its presentation are known to aid and intensify the decision-making process. However, most literature discusses the role of visual aids in comprehension and decision making during normal conditions alone. Unlike emergencies, in a normal situation (e.g., day-to-day life), users are neither exposed to stringent time constraints nor face the anxiety of survival, and have sufficient time to evaluate various alternatives before making any decision. An emergency is an unexpected, possibly fatal, real-life situation which may inflict serious harm on both human life and material possessions unless corrective measures are taken instantly. The situation demands that the exposed user negotiate a dynamic and unstable scenario in the absence or lack of any preparation, yet still take swift and appropriate decisions to save lives or possessions. But the resulting stress and anxiety restrict cue sampling, decrease vigilance, reduce the capacity of working memory, cause premature closure in evaluating alternative options, and result in task shedding. Limited time, uncertainty, high stakes, and vague goals negatively affect the cognitive abilities needed to take appropriate decisions. Moreover, the theory of naturalistic decision making by experts has been understood in far more depth than that of the ordinary user. Therefore, in this study, the author aims to understand the role of visual aids in supporting rapid comprehension for appropriate decisions during an emergency situation.
Keywords: cognition, visual, decision making, graphics, recognition
Procedia PDF Downloads 272
25444 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably the so-called "hallucination" problem, the generation of outputs that are not grounded in the input data, which hinders their adoption into production. A common practice to mitigate the hallucination problem is to utilize a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, then generates a response that is based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG system is not suitable for tabular data and subsequent data analysis tasks for multiple reasons, such as information loss, data format, and the retrieval mechanism. In this study, we have explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage by abstracting the complexities for real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding in market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
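The plan-then-code-then-execute loop can be sketched as follows. This is a minimal illustration rather than the authors' system: call_llm is a hypothetical stand-in for any chat-completion API, returning canned responses here so the sketch runs end to end, and the real-estate data is invented.

```python
import io, contextlib

def call_llm(prompt: str) -> str:
    # Placeholder: a real system would call an LLM API here.
    if "Break the task" in prompt:
        return "1. Load listings\n2. Compute median price per district"
    return (
        "import pandas as pd\n"
        "df = pd.DataFrame({'district': ['A', 'A', 'B'],"
        " 'price': [300, 340, 510]})\n"
        "print(df.groupby('district')['price'].median())"
    )

def analyze(task: str) -> str:
    plan = call_llm(f"Break the task into sub-tasks: {task}")   # planning agent
    code = call_llm(f"Write Python for these steps:\n{plan}")   # code-gen agent
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):                       # execute the segment
        exec(code, {})                                          # sandbox this in production
    return buf.getvalue()                                       # grounds the final response

print(analyze("Median listing price per district"))
```

Because the final answer is composed from executed-code output rather than free generation, the numbers in the response stay grounded in the data.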
Procedia PDF Downloads 94
25443 Computer Network Applications, Practical Implementations and Structural Control System Representations
Authors: El Miloudi Djelloul
Abstract:
The computer network plays an important role in the practical implementation of many different systems. To implement a system on a network, it is above all necessary to know all the configurations that make up the system and to provide adequate information and solutions in real time. If we want to implement such a system, for example, in a school or a similar institution, the first step is to analyze the types of models that need to be configured, and another important step is to organize the work in terms of devices, as parts of the general system. Often, an important point before configuration is the description and documentation of all work in the respective process, which is then organized from a problem-solving perspective. Because the computer network is a very specific piece of critical infrastructure, the paper presents effective solutions in a structured form on the one hand, and on the other highlights the positive aspects of modeling and block-schema presentations as a better alternative for solving specific problems arising from continual distortions of the system along the line of devices, programs, and signals, or from packet collisions moving from one computer node to other nodes.
Keywords: local area networks, LANs, block schema presentations, computer network system, computer node, critical infrastructure, packet collisions, structural control system representations, computer network, implementations, modeling structural representations, companies, computers, context, control systems, internet, software
Procedia PDF Downloads 368
25442 Behavioral Finance in Hundred Keywords
Authors: Ramon Hernán, Maria Teresa Corzo
Abstract:
This study examines the impact and contribution of the main journals in the discipline of behavioral finance to determine the state of the art of the discipline and the lines of growth and concepts studied to date. This is a unique and novel study, given that a review of the discipline has not previously been carried out through the keywords of its articles, a component of research that makes visible the main topics of discussion and the relationships that arise between the concepts discussed. To carry out this study, 3,876 articles have been taken as a reference, comprising 15,859 keywords from the main journals responsible for the growth of the discipline: Journal of Behavioral Finance, Review of Behavioral Finance, and Journal of Behavioral and Experimental Economics. The results indicate which topics have been most covered in the discipline throughout the period from 2000 to 2020, how these concepts have been treated on a recurring basis alongside others throughout that period, and how the different concepts are grouped based on the keywords established by the authors for the classification of their articles, completing the analysis with a network diagram.
Keywords: behavioral finance, keywords, co-words, top journals, data visualization
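The co-word analysis underlying such a study reduces to counting keyword co-occurrences across articles; the pair counts become the edge weights of the network diagram. A minimal sketch with invented keyword lists, not the study's data:

```python
from itertools import combinations
from collections import Counter

papers = [
    ["overconfidence", "herding", "momentum"],
    ["herding", "sentiment"],
    ["overconfidence", "sentiment", "herding"],
]

cooccurrence = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):  # each unordered pair once
        cooccurrence[(a, b)] += 1

# Edge weights for a co-word network diagram (e.g., drawn with networkx).
for (a, b), w in cooccurrence.most_common():
    print(f"{a} -- {b}: {w}")
```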
Procedia PDF Downloads 197
25441 FlameCens: Visualization of Expressive Deviations in Music Performance
Authors: Y. Trantafyllou, C. Alexandraki
Abstract:
Music interpretation refers to the way musicians shape their performance by deliberately deviating from the composer's intentions, which are commonly communicated via some form of music transcription, such as a music score. For transcribed and non-improvised music, musical expression is manifested by introducing subtle deviations in tempo, dynamics, and articulation over the course of a performance. This paper presents an application, named FlameCens, which, given two recordings of the same piece of music, presumably performed by different musicians, allows visualizing deviations in tempo and dynamics during playback. The application may also compare a certain performance to the music score of that piece (i.e., a MIDI file), which may be thought of as an expression-neutral representation of the piece, hence depicting the expressive cues employed by certain performers. FlameCens uses the Dynamic Time Warping algorithm to compare two audio sequences, based on CENS (Chroma Energy distribution Normalized Statistics) audio features. Expressive deviations are illustrated in a moving flame, which is generated by an animation of particles. The length of the flame is mapped to deviations in dynamics, while the slope of the flame is mapped to tempo deviations, so that faster tempo tilts the slope to the right and slower tempo tilts it to the left. A constant slope signifies no tempo deviation. The detected deviations in tempo and dynamics can additionally be recorded in a text file, which allows for offline investigation. Moreover, in the case of monophonic music, the color of the particles is used to convey the pitch of the notes during performance. FlameCens has been implemented in Python, and it is openly available via GitHub. The application has been experimentally validated for different music genres, including classical, contemporary, jazz, and popular music. These experiments revealed that FlameCens can be a valuable tool for music specialists (i.e., musicians or musicologists) to investigate the expressive performance strategies employed by different musicians, as well as for music audiences to enhance their listening experience.
Keywords: audio synchronization, computational music analysis, expressive music performance, information visualization
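The alignment core can be sketched with the librosa library, which implements both CENS features and DTW. The file names are placeholders, and this is our reconstruction of the described step, not the FlameCens source:

```python
import librosa

y1, sr1 = librosa.load("performance_a.wav")
y2, sr2 = librosa.load("performance_b.wav")

c1 = librosa.feature.chroma_cens(y=y1, sr=sr1)   # CENS chroma features
c2 = librosa.feature.chroma_cens(y=y2, sr=sr2)

# Dynamic Time Warping over the two feature sequences.
D, wp = librosa.sequence.dtw(X=c1, Y=c2)

# The local slope of the warping path wp approximates tempo deviation:
# steeper than 1 means performance A is locally slower than B, and vice versa.
```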
Procedia PDF Downloads 134
25440 Monomial Form Approach to Rectangular Surface Modeling
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Geometric modeling plays an important role in the construction and manufacturing of curve, surface, and solid models. Their algorithms are critically important not only in the automobile, ship, and aircraft manufacturing businesses, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics, and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach, and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong, and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation, and degree reduction.
Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications
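In the monomial (coefficient-matrix) form, a rectangular patch is evaluated as S(u, v) = U A V^T, where entry a_ij of A multiplies u^i v^j. A minimal sketch for a scalar-valued patch with illustrative coefficients (a real surface would use one such matrix per coordinate):

```python
import numpy as np

A = np.array([[0.0, 1.0, -1.0],     # a_ij multiplies u**i * v**j
              [2.0, 0.5,  0.0],
              [1.0, 0.0,  0.3]])

def surface_point(u: float, v: float) -> float:
    U = np.array([u**i for i in range(A.shape[0])])   # monomial basis in u
    V = np.array([v**j for j in range(A.shape[1])])   # monomial basis in v
    return U @ A @ V

print(surface_point(0.5, 0.25))
```

Basis-specific surfaces (Said-Ball, Wang-Ball, etc.) reduce to this form once their basis-conversion matrix is folded into A.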
Procedia PDF Downloads 149
25439 Data Analysis Tool for Predicting Water Scarcity in Industry
Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse
Abstract:
Water is a fundamental resource for industry. It is taken from the environment, either from municipal distribution networks or from various natural water sources such as the sea, rivers, aquifers, etc. Once used, water is discharged into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts can concern the quantity of water available or the quality of the water used, or be more complex to measure and less direct, such as the health of the population downstream of a watercourse. Based on the analysis of data (meteorological data, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectural decrees, which can impact the performance of plants; to propose improvement solutions; to help industrialists in their choice of location for a new plant; to visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions; and to set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires its specific functional constraints to be made explicit. The system must be able to store a large amount of data from sensors (the main type of data in plants and their environment). In addition, manufacturers need 'near-real-time' processing of information in order to make the best decisions (to be rapidly notified of an event that would have a significant impact on water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions. In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), a set of loosely coupled but tightly integrated open-source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the case study of learning the level of a river with a 7-day horizon. The management of water and of the activities within the plants - which depend on this resource - should be considerably improved thanks, on the one hand, to the learning that allows the anticipation of periods of water stress, and on the other hand, to the information system, which is able to warn decision-makers with alerts created from the formalization of prefectural decrees.
Keywords: data mining, industry, machine learning, shortage, water resources
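As an illustration of the storage layer, here is a minimal sketch of writing one time-stamped sensor reading with the InfluxDB 1.x Python client, which matches the TICK stack named above. The host, database, measurement, and field names are placeholders, not the study's configuration:

```python
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="water")
point = {
    "measurement": "river_level",
    "tags": {"station": "upstream_plant_A"},
    "fields": {"level_m": 1.27},
}
client.write_points([point])   # Kapacitor alert rules can then fire on thresholds
```

Telegraf would normally handle this ingestion automatically; the explicit client call shows the data shape that Chronograf dashboards and Kapacitor alerts consume.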
Procedia PDF Downloads 126
25438 Analysis of Big Data
Authors: Sandeep Sharma, Sarabjit Singh
Abstract:
Given user demand and the growth trends of large volumes of free data, storage solutions are becoming increasingly challenged to protect, store, and retrieve data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging a huge amount for its storage and protection. On the other hand, environmental conditions make it challenging to maintain and establish new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the big challenge now is how to manage the exponential growth of data. In this paper, we have analyzed the growth trend of big data and its future implications. We have also focused on the impact of unstructured data on various concerns, and we have suggested some possible remedies to streamline big data.
Keywords: big data, unstructured data, volume, variety, velocity
Procedia PDF Downloads 552
25437 Flow Visualization and Mixing Enhancement in Y-Junction Microchannel with 3D Acoustic Streaming Flow Patterns Induced by Trapezoidal Triangular Structure using High-Viscous Liquids
Authors: Ayalew Yimam Ali
Abstract:
The Y-shaped microchannel is used to mix miscible or immiscible fluids with different viscosities. However, mixing at the entrance of the Y-junction microchannel can be difficult due to micro-scale laminar flow, particularly with two miscible high-viscosity water-glycerol fluids. One of the most promising methods to improve mixing performance and diffusive mass transfer under laminar flow is acoustic streaming (AS): a time-averaged, second-order steady streaming that can produce rolling motion in the microchannel by oscillating a low-frequency acoustic transducer and inducing an acoustic wave in the flow field. The 3D trapezoidal triangular structure developed in this study was created using CNC machine cutting tools, which produced a microchannel mold with a 3D trapezoidal triangular spine along the Y-junction longitudinal mixing region. The molds for the 3D trapezoidal structure, with sharp-edge tip angles of 30° and a tip depth of 0.3 mm, were cut from PMMA (polymethyl methacrylate) glass with an advanced CNC machine, and the channel was manufactured from PDMS (polydimethylsiloxane) grown longitudinally on the top surface of the Y-junction microchannel using soft-lithography nanofabrication strategies. Micro-particle image velocimetry (μPIV) was used to study the 3D steady acoustic streaming flow patterns and the mixing enhancement of high-viscosity miscible fluids for different trapezoidal triangular structure longitudinal lengths, channel widths, volume flow rates, oscillation frequencies, and amplitudes. The streaming velocity fields and vorticity fields show 16 times higher vorticity maps than in the absence of acoustic streaming, and mixing performance was evaluated at various amplitudes, flow rates, and frequencies using the grayscale value of pixel intensity with MATLAB software. Mixing experiments were performed with a fluorescent green dye solution with de-ionized water on one inlet side of the channel and the de-ionized water-glycerol mixture on the other inlet side of the Y-channel; the degree of mixing was found to improve greatly, from 67.42% without acoustic streaming to 96.83% with acoustic streaming. The results show that mixing is enhanced by the formation of a new, three-dimensional, intense steady streaming rolling motion at high volume flow rates around the entrance junction mixing zone with the two miscible high-viscosity fluids, which are governed by laminar-flow transport phenomena.
Keywords: microfabrication, 3D acoustic streaming flow visualization, micro-particle image velocimetry, mixing enhancement
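A grayscale mixing degree of the kind reported above is commonly computed as a coefficient-of-variation index. Here is a minimal sketch of one standard formulation, with synthetic stand-ins for pixel-intensity profiles; the authors' MATLAB computation may differ in detail:

```python
import numpy as np

def mixing_degree(mixed: np.ndarray, unmixed: np.ndarray) -> float:
    # 1.0 = fully mixed (uniform intensity); 0.0 = as segregated as the
    # reference unmixed image.
    return 1.0 - mixed.std() / unmixed.std()

unmixed = np.concatenate([np.zeros(500), np.ones(500)])  # dye on one side only
mixed = np.random.normal(0.5, 0.05, 1000).clip(0, 1)     # nearly uniform field
print(f"{mixing_degree(mixed, unmixed):.2%}")
```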
Procedia PDF Downloads 27
25436 Enhancement of Road Defect Detection Using First-Level Algorithm Based on Channel Shuffling and Multi-Scale Feature Fusion
Authors: Yifan Hou, Haibo Liu, Le Jiang, Wandong Su, Binqing Wang
Abstract:
Road defect detection is crucial for modern urban management and infrastructure maintenance. Traditional road defect detection methods mostly rely on manual labor, which is not only inefficient but also hard to make reliable. Existing deep-learning-based road defect detection models, meanwhile, perform poorly in complex environments and lack robustness to multi-scale targets. To address this challenge, this paper proposes a distinct detection framework based on a one-stage network structure. We design a deep feature extraction network based on RCSDarknet, which applies channel shuffling to enhance information fusion between tensors. Through repeated stacking of RCS modules, the information flow between channels of adjacent-layer features is enhanced, improving the model's ability to capture target spatial features. In addition, a multi-scale feature fusion mechanism with weighted dual flow paths is adopted to fuse spatial features of different scales, further improving the detection performance of the model across scales. To validate the performance of the proposed algorithm, we tested it on the RDD2022 dataset. The experimental results show that the enhanced algorithm achieved 84.14% mAP, which is 1.06% higher than the currently advanced YOLOv8 algorithm. Visualization analysis of the results also shows that the proposed algorithm performs well in detecting targets of different scales in complex scenes. These experimental results demonstrate the effectiveness and superiority of the proposed algorithm, providing valuable insights for advancing real-time road defect detection methods.
Keywords: roads, defect detection, visualization, deep learning
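The channel-shuffle operation at the heart of this design can be sketched in a few lines. This follows the well-known ShuffleNet-style formulation; the group count and tensor shape are illustrative, and this is not the authors' RCSDarknet implementation:

```python
import torch

def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    n, c, h, w = x.shape
    x = x.view(n, groups, c // groups, h, w)  # split channels into groups
    x = x.transpose(1, 2).contiguous()        # interleave channels across groups
    return x.view(n, c, h, w)                 # restore the tensor layout

feat = torch.randn(1, 8, 4, 4)
shuffled = channel_shuffle(feat, groups=2)    # channels now mixed between groups
```

Shuffling lets subsequent grouped convolutions see information from every group, which is the "information fusion between tensors" the abstract refers to.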
Procedia PDF Downloads 21
25435 Transcriptomine: The Nuclear Receptor Signaling Transcriptome Database
Authors: Scott A. Ochsner, Christopher M. Watkins, Apollo McOwiti, David L. Steffen, Lauren B. Becnel, Neil J. McKenna
Abstract:
Understanding signaling by nuclear receptors (NRs) requires an appreciation of their cognate ligand- and tissue-specific transcriptomes. While target gene regulation data are abundant in this field, they reside in hundreds of discrete publications in formats refractory to routine query and analysis, and, accordingly, their full value to the NR signaling community has not been realized. One of the mandates of the Nuclear Receptor Signaling Atlas (NURSA) is to facilitate the community's access to existing public datasets. Pursuant to this mandate, we are developing a freely accessible community web resource, Transcriptomine, to bring together the sum total of available expression array and RNA-Seq data points generated by the field in a single location. Transcriptomine currently contains over 25,000,000 gene fold-change data points from over 1,200 contrasts relevant to over 100 NRs, ligands, and coregulators in over 200 tissues and cell lines. Transcriptomine is designed to accommodate a spectrum of end users, ranging from the bench researcher to those with advanced bioinformatic training. Visualization tools allow users to build custom charts to compare and contrast patterns of gene regulation across different tissues and in response to different ligands. Our resource affords an entirely new paradigm for leveraging gene expression data in the NR signaling field, empowering users to run queries of gene fold changes across diverse regulatory molecules, tissues and cell lines, target genes, biological functions, and disease associations that would otherwise be prohibitive in terms of time and effort. Transcriptomine will be regularly updated with gene lists from future genome-wide expression array and expression-sequencing datasets in the NR signaling field.
Keywords: target gene database, informatics, gene expression, transcriptomics
Procedia PDF Downloads 278
25434 A Multi-Role Oriented Collaboration Platform for Distributed Disaster Reduction in China
Authors: Linyao Qiu, Zhiqiang Du
Abstract:
With rapid urbanization, economic development, and steady population growth in China, the widespread devastation, economic damage, and loss of human life caused by numerous forms of natural disasters are becoming increasingly serious every year. Disaster management requires the available and effective cooperation of different roles and organizations throughout the whole process, including mitigation, preparedness, response, and recovery. Due to the imbalance of regional development in China, the disaster management capabilities of the national and provincial disaster reduction centers are uneven. When an undeveloped area suffers from disaster, the local reduction department can neither independently obtain first-hand information, such as high-resolution remote sensing images from satellites and aircraft, nor rely on a sharing mechanism to access data resources deployed elsewhere. Most existing disaster management systems operate in a typical passive, data-centric mode and serve a single department, so resources cannot be fully shared. This impediment blocks local departments and groups from quick emergency response and decision-making. In this paper, we introduce a collaborative platform for distributed disaster reduction. To address the imbalance in the sharing of data sources and technology in the disaster reduction process, we propose a multi-role-oriented collaboration business mechanism, capable of scheduling and allocating multiple resources for optimal utilization, to link various roles for collaborative reduction business in different places. The platform fully considers the differing equipment conditions across provinces and provides several service modes to satisfy the technological needs of disaster reduction. An integrated collaboration system based on a focusing-services mechanism is designed and implemented for resource scheduling, functional integration, data processing, task management, collaborative mapping, and visualization. Actual applications illustrate that the platform supports data sharing and business collaboration between national and provincial departments well. It can significantly improve the capability of disaster reduction in China.
Keywords: business collaboration, data sharing, distributed disaster reduction, focusing service
Procedia PDF Downloads 296
25433 Research of Data Cleaning Methods Based on Dependency Rules
Authors: Yang Bao, Shi Wei Deng, WangQun Lin
Abstract:
This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes the key steps of a typical cleaning process, putting forward a data cleaning framework with good scalability and versatility. For data with attribute dependency relations, it designs several violation-data discovery algorithms, expressed as formal formulas, which can find inconsistent data in all target columns with conditional attribute dependencies, whether the data is structured (SQL) or unstructured (NoSQL), and gives six data cleaning methods based on these algorithms.
Keywords: data cleaning, dependency rules, violation data discovery, data repair
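Dependency-based violation discovery can be illustrated with a functional dependency zip_code → city. A minimal sketch that flags every row in a group whose determinant maps to more than one dependent value; the data and the dependency are invented placeholders, not the paper's formal algorithms:

```python
import pandas as pd

df = pd.DataFrame({
    "zip_code": ["10001", "10001", "20002", "20002"],
    "city": ["New York", "New York", "Washington", "Baltimore"],  # conflict
})

# Groups where one zip_code maps to more than one city violate the
# dependency; every row in such a group is flagged for repair.
violations = df[df.groupby("zip_code")["city"].transform("nunique") > 1]
print(violations)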
Procedia PDF Downloads 566
25432 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From the data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (0.5-2.5 minute SPECT image minus 5-10 minute SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5-10 minute SPECT image minus liver-only image). Time subtraction of the liver was possible in both the phantom and the clinical study, and visualization of the inferior myocardium was improved. In past reports, higher accumulation in the myocardium overlapped by the liver was undiagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector
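The subtraction itself is simple image arithmetic on reconstructed volumes. A minimal sketch, where the random arrays stand in for the early and late reconstructions and the clipping to non-negative values is our assumption rather than a detail from the study:

```python
import numpy as np

early = np.random.rand(64, 64, 64)  # 0.5-2.5 min reconstruction (liver dominates)
late = np.random.rand(64, 64, 64)   # 5-10 min reconstruction (liver + myocardium)

liver_only = np.clip(early - late, 0, None)      # approximate liver-only image
corrected = np.clip(late - liver_only, 0, None)  # myocardium with liver suppressed
```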
Procedia PDF Downloads 339
25431 A Quantitative Analysis of Rural to Urban Migration in Morocco
Authors: Donald Wright
Abstract:
The ultimate goal of this study is to reinvigorate the philosophical underpinnings of the study of urbanization with scientific data, with the goal of circumventing what seems an inevitable future clash between rural and urban populations. To that end, urban infrastructure must be sustainable economically, politically, and ecologically over the course of several generations as cities continue to grow with the incorporation of climate refugees. Our research will provide data concerning the projected increase in population in Morocco over the coming two decades, during which the population will shift from rural areas to urban centers. As a result, urban infrastructure will need to be adapted, developed, or built to meet the demand of future internal migrations from rural areas to urban centers in Morocco. This paper will also examine how the past experiences of internally displaced people give insight into the challenges faced by future migrants and, beyond the gathering of data, how people react to internal migration. This study employs four different sets of research tools. First, a large part of this study is archival, which involves compiling the relevant literature on the topic and its complex history. This step also includes gathering data about migrations in Morocco from public data sources. Once the datasets are collected, the next part of the project involves populating the attribute fields and preprocessing the data to make it understandable and usable by machine learning algorithms. In tandem with the mathematical interpretation of data and projected migrations, this study benefits from a theoretical understanding of the critical apparatus surrounding urban development in the 20th and 21st centuries, which gives us insight into past infrastructure development and the rationale behind it. Once the data is ready to be analyzed, different machine learning algorithms will be experimented with (k-means clustering, support vector regression, random forest analysis) and the results compared for visualization of the data. The final computational part of this study involves analyzing the data and determining what we can learn from it. This paper helps us to understand future trends of population movements within and between regions of North Africa, which will have an impact on various sectors such as urban development, food distribution, and water purification, not to mention the creation of public policy in the countries of this region. One of the strengths of this project is its multi-pronged and cross-disciplinary methodology, which enables an interchange of knowledge and experiences to facilitate innovative solutions to this complex problem. Multiple and diverse intersecting viewpoints allow an exchange of methodological models that provide fresh and informed interpretations of otherwise objective data.
Keywords: climate change, machine learning, migration, Morocco, urban development
Procedia PDF Downloads 161
25430 Volcanoscape Space Configuration Zoning Based on Disaster Mitigation by Utilizing GIS Platform in Mt. Krakatau Indonesia
Authors: Vega Erdiana Dwi Fransiska, Abyan Rai Fauzan Machmudin
Abstract:
Space configuration zoning is the very first stage of complete space configuration and regional planning. Zoning aims to codify discrete knowledge based on local wisdom: ancient predecessors studied the signs of natural disasters, and an ethnographic approach allows this knowledge to be put to scientific use. There are three main functions of space zoning: a control function, a guidance function, and an additional function. The control function refers to an instrument for development control and is one of the essentials in controlling land use. The guidance function serves as guidance for proposing operational planning and technical development or land usage. The additional function is useful as a supplement to detailed regional or provincial planning. This phase likewise serves to define boundaries in open space based on geographical appearance. The informants, categorized as elders, live in an earthquake-prone area, namely the surroundings of Mount Krakatau, and the collected data are analyzed with a thematic model and later verified. In space zoning, a long-range distance sensor is applied to visualize the area to be zoned before the survey step that validates the data. The data obtained from the long-range distance sensor and the site survey are overlaid using a GIS platform. Comparing this with the local-wisdom knowledge held by the elderly in the area, some of it proves relevant to the research while some does not. The site survey, the interpretation of the long-range distance sensor, and the determination of space zoning under various considerations resulted in a space zoning pattern map. This map can be integrated with mitigation planning for disasters caused by volcanic eruption.
Keywords: elderly, GIS platform, local wisdom, space zoning
Procedia PDF Downloads 257
25429 The Modification of the Mixed Flow Pump with Respect to Stability of the Head Curve
Authors: Roman Klas, František Pochylý, Pavel Rudolf
Abstract:
This paper is focused on CFD simulation of a radiaxial pump (i.e., a mixed flow pump) with the aim of detecting the reasons for the instability of the Y-Q characteristic. The main causes of pressure pulsations were identified by analyzing the velocity and pressure fields within the pump, combined with a theoretical approach. Consequently, modifications of the spiral case and the pump suction area were made based on the knowledge of flow conditions and the shape of the dissipation function. The primary pump geometry was created as the base model, serving for comparison of the influence of the individual modifications; basic experimental data are available for this geometry. This approach replaced the more complicated calculation for compressible liquid flow, which is also more difficult with respect to the convergence of all computational tasks. The modification of the primary pump consisted of inserting three types of fins. Subsequently, the evaluation of pressure pulsations, specific energy curves, and visualization of the velocity fields were chosen as the criteria for a successful design.
Keywords: CFD, radiaxial pump, spiral case, stability
Procedia PDF Downloads 400
25428 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity
Authors: Hoda A. Abdel Hafez
Abstract:
Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.
Keywords: mining big data, big data, machine learning, telecommunication
Procedia PDF Downloads 415
25427 Predictive Maintenance: Machine Condition Real-Time Monitoring and Failure Prediction
Authors: Yan Zhang
Abstract:
Predictive maintenance is a technique to predict when an in-service machine will fail so that maintenance can be planned in advance. Analytics-driven predictive maintenance is gaining increasing attention in many industries, such as manufacturing, utilities, and aerospace, along with the emerging demand for Internet of Things (IoT) applications and the maturity of technologies that support Big Data storage and processing. This study aims to build an end-to-end analytics solution that includes both real-time machine condition monitoring and machine-learning-based predictive analytics capabilities. The goal is to showcase a general predictive maintenance solution architecture, which suggests how the data generated from field machines can be collected, transmitted, stored, and analyzed. We use a publicly available aircraft engine run-to-failure dataset to illustrate the streaming analytics component and the batch failure prediction component. We outline the contributions of this study in four aspects. First, we compare predictive maintenance problems from the view of the traditional reliability-centered maintenance field and from the view of IoT applications. In evolving to the IoT era, predictive maintenance has shifted its focus from ensuring reliable machine operations to improving production and maintenance efficiency via any maintenance-related task. It covers a variety of topics, including but not limited to failure prediction, fault forecasting, failure detection and diagnosis, and recommendation of maintenance actions after failure. Second, we review the state-of-the-art technologies that enable a machine or device to transmit data all the way to the Cloud for storage and advanced analytics. These technologies vary drastically, mainly based on the power source and functionality of the devices. For example, a consumer machine such as an elevator uses completely different data transmission protocols than the sensor units in an environmental sensor network: the former may transfer data to the Cloud directly via WiFi, while the latter usually uses radio communication inherent to the network, with data stored in a staging data node before it can be transmitted to the Cloud when necessary. Third, we illustrate how to formulate a machine learning problem to predict machine faults and failures. By showing a step-by-step process of data labeling, feature engineering, model construction, and evaluation, we share the following experiences: (1) which specific data quality issues have a crucial impact on predictive maintenance use cases; and (2) how to train and evaluate a model when the training data contains inter-dependent records. Fourth, we review the tools available to build a data pipeline that digests the data and produces insights. We show the tools we use for data injection, streaming data processing, and machine learning model training, along with the tool that coordinates and schedules different jobs. In addition, we show the visualization tool that creates rich data visualizations for both real-time insights and prediction results. To conclude, there are two key takeaways from this study: (1) it summarizes the landscape and challenges of predictive maintenance applications, and (2) it takes an aerospace example with publicly available data to illustrate each component in the proposed data pipeline and showcases how the solution can be deployed as a live demo.
Keywords: Internet of Things, machine learning, predictive maintenance, streaming data
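The data-labeling step for run-to-failure data can be made concrete. A minimal sketch assuming C-MAPSS-style columns (unit, cycle, sensor readings); the values and the horizon w are invented for illustration:

```python
import pandas as pd

df = pd.DataFrame({
    "unit": [1, 1, 1, 2, 2],
    "cycle": [1, 2, 3, 1, 2],
    "sensor_1": [641.8, 642.1, 642.5, 641.9, 642.3],
})

# Remaining useful life (RUL) at each cycle = last observed cycle for that
# engine minus the current cycle, since each unit is run to failure.
df["rul"] = df.groupby("unit")["cycle"].transform("max") - df["cycle"]

# A binary "fails within w cycles" label for classification models.
w = 1
df["fail_soon"] = (df["rul"] <= w).astype(int)
print(df)
```

Because all rows of one engine are inter-dependent, train/test splits should be made by unit, not by row, which is the evaluation caveat noted above.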
Procedia PDF Downloads 390
25426 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators with interactive visualization of data and procedures, making potential equipment and system states understandable. Operators remotely communicate and coordinate with each other on continuous tasks, exchanging information and data between the control room and the work site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience of cooperative work by applying augmented reality in traditional industrial fields. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on comparing the different tasks and environmental factors that influence information processing. Three experiments use interface and interaction design, with start-up, maintenance, and shutdown content embedded in the mobile application. With evaluation criteria of time demands and human errors, and analysis of the mental processes and behavioral actions during multiple tasks, heuristic evaluation was used to assess operators' performance under different situational factors and to record information processing during recognition, interpretation, judgment, and reasoning. The research will identify the functional properties of MAR and constrain the development of the cognitive model. Conclusions can be drawn suggesting that MAR is easy to use and useful for operators in remote collaborative work.
Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 198
25425 A World Map of Seabed Sediment Based on 50 Years of Knowledge
Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès
Abstract:
Production of a global sedimentological seabed map was initiated in 1995 to provide a necessary tool for searches for aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This original approach had already been initiated one century earlier, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments along the French coasts, and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments presented here was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a unified sediment classification to present all types of sediments: from beaches to the deep seabed, and from glacial deposits to tropical sediments. To allow good visualization and suit the different applications, only the granularity of sediments is represented. Published seabed maps are studied and, if they present an interest, the nature of the seabed is extracted from them, transcribed into the sediment classification, and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large hydrographic surveys of the deep ocean. These allow very high-quality mapping of areas that until then had been represented as homogeneous. The third and principal source of data comes from the integration of regional maps produced specifically for this project. These regional maps are produced using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations applied in the case of over-precise data. Eighty-six regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing and yields a digital version every two years, with the integration of new maps. This article describes the choices made in terms of sediment classification, the scale of source data, and the zonation of the variability of quality. This map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million point and surface data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000, and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress in knowledge made in the field of seabed characterization during the last decades. Thus, the arrival of new seafloor classification systems has improved the recent seabed maps, and the compilation of these new maps with those previously published allows gradual enrichment of the world sedimentary map. But there is still much work to do to enhance some regions, which are still based on data acquired more than half a century ago.
Keywords: marine sedimentology, seabed map, sediment classification, world ocean
Procedia PDF Downloads 236