Search results for: data visualization
25135 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral Imaging (HSI) is an attractive technique for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capabilities that cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detecting fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving two-pattern problems. The conventional FKT has been improved with kernel machines to increase its nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper segments the fat content of the ground meat by regarding fat as the target class to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
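As a rough illustration of the two-pattern machinery behind the paper, the sketch below implements the linear FKT core on synthetic two-class spectra; the kernelized (KFKT) variant described in the abstract would replace the covariance step with kernel Gram matrices, which is not shown here, and the toy "fat"/"lean" data are invented for illustration.

```python
import numpy as np

def fkt_basis(X_target, X_clutter):
    """Linear Fukunaga-Koontz transform: eigenvectors that best represent
    the target class are, by construction, the worst for the clutter class
    (their eigenvalues in the whitened space sum to 1)."""
    S1 = np.cov(X_target, rowvar=False)    # target covariance
    S2 = np.cov(X_clutter, rowvar=False)   # clutter covariance
    d, Phi = np.linalg.eigh(S1 + S2)       # diagonalize the summed covariance
    keep = d > 1e-10                       # guard against rank deficiency
    P = Phi[:, keep] / np.sqrt(d[keep])    # whitening operator
    lam, V = np.linalg.eigh(P.T @ S1 @ P)  # in whitened space S1' + S2' = I
    order = np.argsort(lam)[::-1]          # target-dominant axes first
    return P @ V[:, order], lam[order]

# Invented toy data: 200 "fat" and 200 "lean" pixel spectra with 50 bands.
rng = np.random.default_rng(0)
fat = rng.normal(0.8, 0.1, (200, 50))
lean = rng.normal(0.4, 0.1, (200, 50))
basis, lam = fkt_basis(fat, lean)
# Energy in the leading target-dominant subspace acts as a detection score.
score = np.abs((fat - fat.mean(0)) @ basis[:, :3]).sum(axis=1)
```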
Procedia PDF Downloads 431

25134 Business Intelligence for Profiling of Telecommunication Customer
Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro
Abstract:
Business Intelligence is a methodology that exploits data to produce information and knowledge systematically; it can support the decision-making process. Two methods in business intelligence are data warehousing and data mining. A data warehouse can store historical data from transactional data; for data modelling in the warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For profiling telecommunication customers, we use customer segmentation according to customers' usage of services, invoices, and payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation, using the RFM (Recency, Frequency and Monetary) model as its input variables. For the whole data mining process, we use IBM SPSS Modeler.
Keywords: business intelligence, customer segmentation, data warehouse, data mining
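The segmentation step described above maps naturally onto a few lines of scikit-learn; a minimal sketch, assuming a toy RFM table in place of the real invoice and payment data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical RFM table: one row per customer, derived from service usage,
# invoice, and payment records.
rfm = np.array([
    [5,  40, 1200.0],    # recency (days), frequency (uses/month), monetary
    [90,  3,   80.0],
    [12, 25,  640.0],
    [60,  8,  150.0],
])
X = StandardScaler().fit_transform(rfm)   # K-means is scale-sensitive
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label in np.unique(model.labels_):
    # Mean RFM per segment profiles the cluster (e.g. profitable customers).
    print(label, rfm[model.labels_ == label].mean(axis=0))
```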
Procedia PDF Downloads 483

25133 Web Map Service for Fragmentary Rockfall Inventory
Authors: M. Amparo Nunez-Andres, Nieves Lantada
Abstract:
One of the most harmful geological risks is rockfalls. They cause both economic losses, through damage to buildings and infrastructure, and personal ones. Therefore, in order to estimate the risk to exposed elements, it is necessary to understand the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several rockfall inventories were compiled along the northeast of the Spanish peninsula and the island of Mallorca. These inventories contain general information about the events, but most importantly detailed information about fragmentation. Specifically, the IBSD (In-situ Block Size Distribution) is obtained by photogrammetry from a drone or TLS (Terrestrial Laser Scanner), and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit, measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and following interoperability standards is necessary. Open-source software was used throughout the process: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information. We used the INSPIRE data specifications for natural risks, adding specific and detailed data about the fragmentation distribution. The next step was to develop a WMS with GeoServer. A preliminary phase was the creation of several views in PostGIS to show the information at different scales of visualization and with different degrees of detail. In the first view, the sites are identified with a point, and basic information about the rockfall event is provided. At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, and the source of the event and the fragments are represented by symbols; queries at this level offer greater detail about the movement. Finally, the third level shows all elements (deposit, source, and blocks) in their real size, where possible, and at their real locations. The last task was the publication of all information on a web mapping site (www.rockdb.upc.edu) with data classified by levels, using JavaScript libraries such as OpenLayers.
Keywords: geological risk, web mapping, WMS, rockfalls
Procedia PDF Downloads 160

25132 Visualization of Malaysia Universities Websites Based On Social Network Analysis
Authors: N. A. Ismail, Abdul Arif, Sharul Hafiz, Lu S. J., Tham W. S., Wong S. K.
Abstract:
This paper investigates the visualization of Malaysian university websites. Twenty (20) public university websites in Malaysia were chosen as samples to explore and visualize the link relationships between the academic websites, using social network analysis measures such as inlinks, degree, weight, betweenness, and modularity class. The connections and relations demonstrate the power to influence, the comprehensive strength, and the variety of subject types present in the universities. The experimental results also show that University Malaysia Sabah (UMS) is the biggest backlink provider.
Keywords: academic websites, link analysis, social network analysis, experimental result
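A minimal sketch of this kind of link analysis with NetworkX, using a hypothetical four-site inlink graph in place of the twenty crawled websites:

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical directed inlink graph among a few university websites.
G = nx.DiGraph([
    ("ums.edu.my", "um.edu.my"), ("ums.edu.my", "usm.my"),
    ("um.edu.my", "usm.my"), ("utm.my", "ums.edu.my"),
])
indegree = dict(G.in_degree())               # inlink counts per site
betweenness = nx.betweenness_centrality(G)   # brokerage between sites
# Modularity classes via greedy community detection on the undirected view.
classes = community.greedy_modularity_communities(G.to_undirected())
# The site providing the most outgoing links ("biggest backlink provider").
provider = max(G.out_degree(), key=lambda kv: kv[1])
print(provider, indegree, [sorted(c) for c in classes])
```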
Procedia PDF Downloads 471

25131 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis
Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski
Abstract:
The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, like climate change, intelligent transport, and immigration studies. These developments call for better methods to deliver raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets that have strict requirements for effective handling of time series. The same approach and methodologies can also be applied to managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolution, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches. Analysis processes based on interactive visual exploration can be carried out effectively, as the resolution level closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on the resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application. The cloud service-based approach applied in the GeoCubes Finland repository enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible, and the developed GeoCubes API supports this kind of online analysis. The use of cloud-optimized file structures in data storage enables fast extraction of subareas. The access API allows the use of vector-formatted administrative areas and user-defined polygons as definitions of subareas for data retrieval; administrative areas of the country at four levels are readily available from the GeoCubes platform. In addition to direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is first downloaded. The text file contains links to the raster content on the service platform, and the actual raster data are downloaded on demand, for the spatial area and resolution level required at each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository enables expert users to introduce new kinds of interactive online analysis operations.
Keywords: cloud service, geodata cube, multiresolution, raster geodata
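A minimal sketch of the resolution-selection idea, assuming a hypothetical list of stored resolution levels and the usual scale-to-ground-resolution conversion; the actual GeoCubes API is not reproduced here:

```python
# Hypothetical stored resolution levels (metres per pixel) in the repository.
LEVELS = [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000]

def pick_level(scale_denominator: float, screen_dpi: float = 96.0) -> int:
    """Return the stored resolution closest to the current visual scale,
    which keeps access times near-constant regardless of zoom."""
    metres_per_pixel = scale_denominator * 0.0254 / screen_dpi
    return min(LEVELS, key=lambda lvl: abs(lvl - metres_per_pixel))

print(pick_level(50_000))   # ~13 m/px on screen -> use the 10 m level
```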
Procedia PDF Downloads 135

25130 Integrated Mathematical Modeling and Advance Visualization of Magnetic Nanoparticle for Drug Delivery, Drug Release and Effects to Cancer Cell Treatment
Authors: Norma Binti Alias, Che Rahim Che The, Norfarizan Mohd Said, Sakinah Abdul Hanan, Akhtar Ali
Abstract:
This paper discusses the transport of magnetically targeted drugs through the blood within vessels, tissues, and cells. Three integrated mathematical models are discussed and analyzed for the concentration of drug and blood flow through magnetic nanoparticles. Cell therapy has brought advancement in the field of nanotechnology to the fight against tumors, and the systematic therapeutic effect on single cells can reduce the growth of cancer tissue. This nanoscale phenomenon can be measured and modelled by identifying some parameters and applying fundamental principles of mathematical modeling and simulation. The mathematical modeling of single-cell growth depends on three types of cell densities: proliferative, quiescent, and necrotic cells. The aim of this paper is to enhance the simulation of these three types of models. The first model represents the transport of drugs by coupled partial differential equations (PDEs) of 3D parabolic type in a cylindrical coordinate system. This model is integrated with non-Newtonian flow equations describing the blood liquid flow as the transport medium, and with the magnetic force on the magnetic nanoparticles. The interaction between the magnetic force and the drug's magnetic properties produces induced currents, and the applied magnetic field yields forces that tend to slow the movement of the blood and bring the drug to the cancer cells. Nanoscale devices allow the drug to leave the blood vessels, spread through the tissue, and reach the cancer cells. The second model covers the transport of drug nanoparticles from the vascular system to a single cell. The treatment of the vascular system requires the identification of parameters such as magnetic nanoparticle targeted delivery, blood flow, momentum transport, density and viscosity of the drug and blood medium, intensity of the magnetic fields, and the radius of the capillary. Based on two discretization techniques, the finite difference method (FDM) and the finite element method (FEM), the set of integrated models is transformed into a series of grid points yielding a large system of equations. The third model is a single-cell density model involving three sets of first-order PDEs describing how proliferating, quiescent, and necrotic cells change over time and space in Cartesian coordinates under different rates of nutrient consumption. The model presents proliferative and quiescent cell growth depending on some parameter changes, with the necrotic cells emerging as the tumor core. Several numerical schemes for solving the system of equations are compared and analyzed. Simulation and computation of the discretized model are supported by Matlab and C programming languages on a single processing unit. Numerical results and analyses of the algorithms are presented through informative tables, multiple graphs, and multidimensional visualization. In conclusion, the integration of the three types of mathematical models and the comparison of their numerical performance indicate a superior tool and analysis for solving the complete magnetic drug delivery system, which has significant effects on the growth of the targeted cancer cells.
Keywords: mathematical modeling, visualization, PDE models, magnetic nanoparticle drug delivery model, drug release model, single cell effects, avascular tumor growth, numerical analysis
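As a flavour of the FDM discretization mentioned above, here is a minimal explicit finite-difference sketch for a 1-D advection-diffusion model of drug concentration; the coefficients and boundary conditions are illustrative assumptions, far simpler than the paper's coupled 3-D cylindrical system:

```python
import numpy as np

# dc/dt = D * d2c/dx2 - v * dc/dx : diffusion plus advection by blood flow.
D, v = 1e-3, 0.05                 # hypothetical diffusivity and flow speed
nx_, nt = 101, 2000               # grid points and time steps
dx, dt = 1.0 / (nx_ - 1), 1e-4    # dt respects the stability limit dx**2/(2D)
c = np.zeros(nx_)
c[0] = 1.0                        # drug injected at the inlet
for _ in range(nt):
    lap = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2   # central 2nd derivative
    adv = (c[2:] - c[:-2]) / (2.0 * dx)              # central 1st derivative
    c[1:-1] += dt * (D * lap - v * adv)
    c[0], c[-1] = 1.0, c[-2]      # Dirichlet inlet, zero-gradient outlet
```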
Procedia PDF Downloads 428

25129 Improving Performance and Progression of Novice Programmers: Factors Considerations
Authors: Hala Shaari, Nuredin Ahmed
Abstract:
Teaching computer programming is recognized to be difficult and a real challenge. The biggest problem faced by novice programmers is their lack of understanding of basic programming concepts. A visualized learning tool was developed and used by volunteering first-year students for two semesters. The purposes of this paper are, firstly, to identify the factors that directly and negatively affect our students' performance, and secondly, to examine whether the proposed tool improves their performance and learning progression. The results of adopting this tool were evaluated using pre- and post-survey questionnaires. As a result, students who used the learning tool showed better performance in their programming subject.
Keywords: factors, novice, programming, visualization
Procedia PDF Downloads 363

25128 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge facing bioinformaticians, due to the complexity of applying statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important analysis processes on a microarray data set is feature selection, which finds the most important genes affecting a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics
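A minimal sketch of the impute-then-select pipeline with scikit-learn, using KNN imputation and univariate F-test selection as stand-ins for the paper's specific technique, on invented data:

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

# Invented microarray matrix: rows are samples, columns are gene expressions,
# with missing spots encoded as NaN.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 200))
X[rng.random(X.shape) < 0.05] = np.nan     # roughly 5% missing values
y = rng.integers(0, 2, size=30)            # disease / control labels

X_filled = KNNImputer(n_neighbors=5).fit_transform(X)  # impute first
selector = SelectKBest(f_classif, k=20).fit(X_filled, y)
top_genes = selector.get_support(indices=True)         # selected gene indices
```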
Procedia PDF Downloads 574

25127 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data, so aggregating and filtering sensor data is significantly important when designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
Keywords: big data, clustering, tree topology, data aggregation, sensor networks
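The abstract does not spell out the PDDA rules, but a minimal sketch of the general idea (suppress redundant low-priority readings at the sensor before transmission) might look like the following; the priority rule and threshold are assumptions:

```python
# Forward a reading only if it is high-priority or differs enough from the
# last transmitted value (redundancy suppression at the sensor layer).
def aggregate(readings, priority, last_sent=None, eps=0.5):
    out = []
    for r in readings:
        if priority(r) == "high" or last_sent is None or abs(r - last_sent) > eps:
            out.append(r)
            last_sent = r
    return out

temps = [20.1, 20.2, 20.1, 25.3, 25.4, 20.0]
sent = aggregate(temps, priority=lambda r: "high" if r > 24 else "low")
print(sent)   # [20.1, 25.3, 25.4, 20.0]: near-duplicates are suppressed
```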
Procedia PDF Downloads 346

25126 Improving the Flow Capacity (CV) of the Valves
Authors: Pradeep A. G, Gorantla Giridhar, Vijay Turaga, Vinod Srinivasa
Abstract:
The major problem in a flow control valve is a low Cv, which reduces the overall efficiency of the flow circuit. Designers continuously work to improve the Cv of a valve, but they need to validate their design ideas. The traditional method of prototyping and testing takes a lot of time; CFD, by contrast, offers quick and accurate validation along with flow visualization, which is not possible with the traditional testing method. We have developed a method to predict the Cv value using CFD analysis by iterating over various boundary conditions and solver settings, and by carrying out grid convergence studies, to establish the correlation between the CFD model and test data. The present study investigates three different ideas put forward by the designers for improving the flow capacity of the valves: reducing the cage thickness, changing the port position, and using a parabolic plug to guide the flow. Using CFD and the established methodology, we analyzed all design changes and evaluated their effect on the valve Cv. We further optimized the wetted surface of the valve by suggesting a design modification to its lower part to make the flow more streamlined. We found that changing the cage thickness and port position has little impact on the valve Cv, whereas the combination of the optimized wetted surface and the introduction of the parabolic plug improved the flow capacity (Cv) of the valve significantly.
Keywords: flow control valves, flow capacity (Cv), CFD simulations, design validation
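For reference, the flow coefficient itself comes from the standard liquid valve sizing relation, which a CFD run supplies with the simulated flow rate at a fixed pressure drop; a minimal sketch with hypothetical test values:

```python
from math import sqrt

def flow_coefficient(q_gpm: float, dp_psi: float, sg: float = 1.0) -> float:
    """Standard liquid valve sizing relation: Cv = Q * sqrt(SG / dP)."""
    return q_gpm * sqrt(sg / dp_psi)

# Hypothetical test point: 120 gpm of water at a 4 psi pressure drop.
print(flow_coefficient(120, 4))   # Cv = 60; a change that raises Q at the
                                  # same dP (e.g. the parabolic plug) raises Cv
```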
Procedia PDF Downloads 164

25125 The Malfatti’s Problem in Reuleaux Triangle
Authors: Ching-Shoei Chiang
Abstract:
The Malfatti’s problem asks for three circles fitted into a right triangle such that they are tangent to each other and each circle is tangent to a pair of the triangle's sides. The problem has been extended to any triangle (the general Malfatti’s problem) and, further, to 1+2+…+n circles; we call this the extended general Malfatti’s problem. Here, the circles' tangency graph, which uses the circle centers as vertices and connects two centers by an edge if the corresponding circles are tangent to each other, has the structure of Pascal's triangle, and the exterior circles are tangent to the three sides of the triangle. The extended general Malfatti’s problem has closed-form solutions for n=1 and 2, and becomes complex when n is greater than 2. To solve the extended general Malfatti’s problem (n>2), we initially assign values to the radii of all circles. From the tangency graph and the current radii, we can compute the angle between two vectors that go from the center of a circle to its tangency points with surrounding elements, where these surrounding elements may be the boundary of the triangle or other circles. For each circle C, there are vectors from its center c to its tangency points pi, i=0,1,…,n, with its neighbors (counted clockwise). We add all the angles between cpi and cp((i+1) mod (n+1)), i=0,1,…,n, and call the total sumangle(C) for circle C. Using sumangle(C), we reduce or enlarge the radii of all circles in the next iteration, until sumangle(C) equals 2π for all circles. With a similar idea, this paper proposes an algorithm to find the radii of circles whose tangency graph has the structure of Pascal's triangle and whose exterior circles are tangent to the unit Reuleaux triangle.
Keywords: Malfatti’s problem, geometric constraint solver, computer-aided geometric design, circle packing, data visualization
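A minimal sketch of the angle-sum iteration for interior circles, using the law-of-cosines form familiar from circle-packing algorithms; tangencies with the Reuleaux boundary, which the paper also handles, are omitted here, and the update step size is an assumption:

```python
import numpy as np

def angle_at_center(r, ri, rj):
    """Angle subtended at a circle of radius r by two mutually tangent
    neighbours of radii ri and rj (law of cosines on the triangle of
    centres, whose sides are r+ri, r+rj and ri+rj)."""
    a, b, c = r + ri, r + rj, ri + rj
    return np.arccos(np.clip((a * a + b * b - c * c) / (2 * a * b), -1.0, 1.0))

def sumangle(r, neighbour_radii):
    """Total angle around an interior circle, given its cyclically ordered
    neighbour radii (tangencies with the Reuleaux sides omitted)."""
    n = len(neighbour_radii)
    return sum(angle_at_center(r, neighbour_radii[i], neighbour_radii[(i + 1) % n])
               for i in range(n))

def sweep(radii, neighbour_ids, step=0.2):
    """One relaxation pass: enlarge a circle whose angle sum exceeds 2*pi
    (it is too small for its neighbours), shrink it otherwise."""
    for k, r in enumerate(radii):
        s = sumangle(r, [radii[j] for j in neighbour_ids[k]])
        radii[k] = r * (1.0 + step * (s - 2.0 * np.pi) / (2.0 * np.pi))
    return radii
```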
Procedia PDF Downloads 132

25124 Interactive Multiple Functions User Interface
Authors: Manjit Singh Sidhu, Waleed Maqableh, Jee Geak Ying
Abstract:
Tangible user interfaces (TUI) that employ markers in the augmented reality (AR) environment have hampered interactivity between the user and the software application. The user lacks focus when visualizing the contents because of interaction mechanisms in which multiple markers may be needed to perform a particular function. In this research, we have designed a novel TUI user interface in which multiple functions can be triggered similarly to a natural keyboard, thus allowing the user to focus more on the digital contents such as 2D/3D, text input, animation, and sound. Test results of the user interface with potential users and HCI experts revealed that the multiple-functions user interface was perceived as new, and was preferred and appreciated more than the marker-based user interface.
Keywords: multimedia, augmented reality, engineering, user interface, visualization
Procedia PDF Downloads 448

25123 Harnessing Emerging Creative Technology for Knowledge Discovery of Multiwavelength Datasets
Authors: Basiru Amuneni
Abstract:
Astronomy is one domain experiencing a rise in data. Traditional tools for data management have been employed in the quest for knowledge discovery, but these traditional tools become limited in the face of big data. One means of maximizing knowledge discovery for big data is scientific visualisation. The aim of this work is to explore the possibilities offered by the emerging creative technologies of Virtual Reality (VR) systems and game engines to visualize multiwavelength datasets. Game engines are primarily used for developing video games; however, their advanced graphics can be exploited for scientific visualization, which provides a means to graphically illustrate scientific data to ease human comprehension. Modern astronomy is now in the era of multiwavelength data, where a single galaxy, for example, is captured by telescopes several times at different electromagnetic wavelengths to build a more comprehensive picture of its physical characteristics. Visualising this in an immersive environment is more intuitive and natural for an observer. This work presents a standalone VR application that accesses galaxy FITS files. The application was built using the Unity game engine for the graphics underpinning and the OpenXR API for the VR infrastructure. The work used a methodology known as Design Science Research (DSR), which entails the act of 'using design as a research method or technique'. The key stages of the galaxy modelling pipeline are FITS data preparation, galaxy modelling, Unity 3D visualisation, and VR display. The FITS data format cannot be read by the Unity game engine directly, so a DLL (CSHARPFITS) that provides native support for reading and writing FITS files was used. The galaxy modeller integrates cleaned FITS image pixels into the graphics pipeline of the Unity 3D game engine: the cleaned FITS images are input to the galaxy modelling pipeline phase, where a pre-processing script extracts the pixels and their galaxy world positions and colour-maps the FITS image pixels. The user can visualise galaxy images in different light bands, control the blend of the image with similar images from different sources, or fuse images for a holistic view. The framework will allow users to build tools that realise complex workflows for public outreach, and possibly scientific work, with increased scalability, near-real-time interactivity, and ease of access. The application is presented in an immersive environment and can use any commercially available headset built on the OpenXR API. The user can select galaxies in the scene, teleport to a galaxy, pan, zoom in and out, and change the colour gradients of the galaxy. The findings and design lessons learnt in the implementation of different use cases will contribute to the development and design of game-based visualisation tools in immersive environments by enabling informed decisions to be made.
Keywords: astronomy, visualisation, multiwavelength dataset, virtual reality
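The Unity-side modeller is written against the CSHARPFITS DLL, but the pre-processing step it describes (clean pixels, map them to world positions, colour-map intensities) can be sketched in Python with astropy; the file name, threshold, and intensity-to-depth mapping are assumptions:

```python
import numpy as np
from astropy.io import fits
from matplotlib import cm

# Read one band, clean it, and map pixels to positions plus colours for a
# point-cloud renderer. File name, threshold, and depth scaling are invented.
with fits.open("galaxy_band1.fits") as hdul:
    img = np.nan_to_num(hdul[0].data.astype(float))

lo, hi = np.percentile(img, (1, 99))            # clip outliers, then normalise
norm = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
ys, xs = np.nonzero(norm > 0.05)                # keep pixels above threshold
positions = np.c_[xs, ys, norm[ys, xs] * 10.0]  # intensity mapped to depth
colours = cm.inferno(norm[ys, xs])              # RGBA colour per pixel
```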
Procedia PDF Downloads 91

25122 Assessment of Barriers Influencing the Adoption of Building Information Modelling in the Construction Industry, Lagos State, Nigeria
Authors: Tosin Deborah Akanbi, Adeyemi Oluwaseun Adepoju, Hameed Olusegun Adebambo, Akinloye Fatai Lawal
Abstract:
Building information modelling (BIM) is a process that starts with the development of a sequential 3D design and supports data administration, organization, and visualization throughout the life span of a facility (drawings, construction, and supervision). The implementation of building information modelling has been slow in recent years, due to some prominent barriers that hinder its adoption. In this regard, the study aims to examine the significant barriers that influence the adoption of building information modelling in the Lagos State construction industry. Data were gathered through a questionnaire survey of 332 construction professionals in the study area, and three online structured interviews were conducted to support and validate the findings of the quantitative analysis. The results revealed that interest barriers (lack of awareness and understanding of BIM, absence of in-house BIM-competent professionals, and unavailability of BIM-competent professionals in the labour market), legal barriers (lack of policies and regulations on copyright ownership and lack of enforcement from government agencies and industry leadership), and professional barriers (people's inability or refusal to learn new technologies and processes, waste of time and human resources, and lack of clarity of professional roles in BIM) are the major barriers influencing the adoption of BIM. The results also revealed six final themes: finance barriers, industry barriers, interest barriers, leadership barriers, legal barriers, and professional barriers. Thus, there is a need for policymakers to design and implement policies (regulatory, economic, and informational) that promote financial schemes to support construction firms and professionals and reduce financial barriers. It is also important for the government to lay down rules and regulations to be enforced among the construction professionals and firms in the Lagos State construction industry.
Keywords: BIM barriers, BIM adoption characteristics, construction industry, Lagos State Nigeria
Procedia PDF Downloads 50

25121 Spatial Differentiation of Elderly Care Facilities in Mountainous Cities: A Case Study of Chongqing
Abstract:
In this study, a web crawler was used to collect POI sample data from the 38 districts and counties of Chongqing in 2022, and ArcGIS was used for coordinate and projection conversion and data visualization. Kernel density analysis and spatial correlation analysis were used to explore the spatial distribution characteristics of elderly care facilities in Chongqing, and K-means cluster analysis was carried out with GeoDa to study the spatial concentration of elderly care resources in the 38 districts and counties. Finally, the driving forces of the spatial differentiation of elderly care facilities across the districts and counties of Chongqing were studied using the geographical detector method. The results show that: (1) In terms of spatial distribution structure, the distribution of elderly care facilities in Chongqing is unbalanced, showing a pattern of 'large dispersion and small agglomeration', with a prominent asymmetric pattern of 'dense in the west, sparse in the east; dense in the north, sparse in the south'. (2) In terms of the spatial matching between elderly care resources and the elderly population, there is weak coordination between the input of elderly care resources and the distribution of the elderly population at the county level in Chongqing. (3) The geographical detector analysis shows that the main single-factor influences are the size of the elderly population, public financial revenue, and district and county GDP. The influence of the factors on the spatial distribution of elderly care facilities is not simply additive but shows nonlinear enhancement or two-factor enhancement effects; it is necessary to strengthen the synergistic effect of pairs of factors and promote the synergy of multiple factors.
Keywords: aging, elderly care facilities, spatial differentiation, geographical detector, driving force analysis, mountain city
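A minimal sketch of the kernel density step with scikit-learn, on hypothetical projected POI coordinates in place of the crawled data:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# Hypothetical projected coordinates (metres) of elderly care facility POIs.
pts = np.array([[0, 0], [100, 50], [120, 60], [5000, 4000], [5100, 3900.0]])
kde = KernelDensity(bandwidth=500.0, kernel="gaussian").fit(pts)

# Evaluate the density surface on a coarse grid, analogous to a kernel
# density raster in a desktop GIS.
gx, gy = np.meshgrid(np.linspace(0, 6000, 61), np.linspace(0, 5000, 51))
density = np.exp(kde.score_samples(np.c_[gx.ravel(), gy.ravel()]))
density = density.reshape(gx.shape)
```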
Procedia PDF Downloads 38

25120 An Approach for Vocal Register Recognition Based on Spectral Analysis of Singing
Authors: Aleksandra Zysk, Pawel Badura
Abstract:
Recognizing and controlling vocal registers during singing is a difficult task for beginner vocalists. It requires, among other things, identifying which part of the natural resonators is being used when a sound propagates through the body. Thus, an application has been designed that allows sound recording, automatic vocal register recognition (VRR), and a graphical user interface providing real-time visualization of the signal and recognition results. Six spectral features are determined for each time frame and passed to a support vector machine classifier, yielding a binary decision on the head or chest register assignment of the segment. The classification training and testing data were recorded by ten professional female singers (sopranos, aged 19-29) performing sounds in both chest and head register. The classification accuracy exceeded 93% in each of various validation schemes. Apart from the hard two-class assignment, the support vector classifier also returns the distance between a particular feature vector and the discrimination hyperplane in the feature space. This information reflects the level of certainty of the vocal register classification in a fuzzy way. The designed recognition and training application is thus able to assess and visualize the continuous trend in singing in a user-friendly graphical mode, providing an easy way to control the vocal emission.
Keywords: classification, singing, spectral analysis, vocal emission, vocal register
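A minimal sketch of the classifier and its distance-based certainty score with scikit-learn, using random stand-ins for the six spectral features:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Random stand-ins for the six per-frame spectral features of two registers.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (300, 6)), rng.normal(1.5, 1.0, (300, 6))])
y = np.array([0] * 300 + [1] * 300)      # 0 = chest register, 1 = head register

clf = SVC(kernel="rbf").fit(X, y)
print(cross_val_score(clf, X, y, cv=5).mean())   # validation accuracy

# Signed distance to the separating hyperplane serves as the fuzzy
# certainty score that drives the real-time visualization.
certainty = clf.decision_function(X[:5])
```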
Procedia PDF Downloads 304

25119 Information Tree: Establishment of Lifestyle-Based IT Visual Model
Authors: Chiung-Hui Chen
Abstract:
Traditional service channels are losing their edge to emerging service technologies. To establish interaction with clients, the service industry is using effective mechanisms that give clients direct access to services through emerging technologies. As service science receives attention, special and unique consumption patterns evolve, leading to new market mechanisms and influencing attitudes toward life and consumption. The market demand for customized services is valued due to the emphasis on personal value, and it is gradually changing the demand-and-supply relationship in the traditional industry. In traditional interior design service, a designer converts the concept generated from the ideas and needs dictated by a user (client) into a concrete form, using professional knowledge and drawing tools. The final product is generated through iterations of communication and modification, which is a very time-consuming process. Although this process has been accelerated by computer graphics software, repeated discussions and confirmations with users are still required to complete the task. In consideration of the above, a space user's life model is analyzed with visualization techniques to create an interaction system modeled after interior design knowledge. The space user intuitively documents personal life experience in a model requirement chart, allowing a researcher to analyze the interrelations between analysis documents and identify the logic and substance of the data conversion. The repeatedly documented data are then transformed into design information for reuse and sharing. A professional interior designer may sort out the correlations among the user's preferences, life pattern, and design specification, thus deciding the critical design elements in the process of service design.
Keywords: information design, life model-based, aesthetic computing, communication
Procedia PDF Downloads 298

25118 Control the Flow of Big Data
Authors: Shizra Waris, Saleem Akhtar
Abstract:
Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have grown enormously within a short period of time. Consequently, this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. The paper presents a complete discussion of state-of-the-art big data technologies based on batch and stream data processing, and analyzes the strengths and weaknesses of these technologies. The study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies, based on important limitations, are also investigated. Emerging technologies are suggested as a solution for big data problems.
Keywords: computer, IT community, industry, big data
Procedia PDF Downloads 194

25117 Third Eye: A Hybrid Portrayal of Visuospatial Attention through Eye Tracking Research and Modular Arithmetic
Authors: Shareefa Abdullah Al-Maqtari, Ruzaika Omar Basaree, Rafeah Legino
Abstract:
A pictorial representation of hybrid forms in science-art collaboration has become a crucial issue in the course of developing a new painting technique. This is directly related to the reception of an invisible-recognition phenomenology. In the hybrid pictorial representation of invisible-recognition phenomenology, the challenging issue is how to depict the pictorial features of indescribable objects from their mental source, modality, and transparency. This paper proposes the hybrid painting technique Demonstrate, Resemble, and Synthesize (DRS), through a combination of the hybrid aspect-recognition representation of picture understanding, the demonstrative mode, number theory, patterns in the modular arithmetic system, and the coherence theory of visual attention in dynamic scene representation. Multi-method digital gaze data analyses, a pattern-modular table operation design, and a rotation parameter were used for the visualization. In the scientific process, eye tracking of video sections was conducted using Tobii T60 remote eye-tracking hardware and Tobii Studio analysis software to collect and analyze the eye movements of ten participants watching the video clip of Alexander Paulikevitch's performance 'Tajwal'. Results: the fixation count in section one was positively and moderately correlated with section two, Pearson's (r=.10, p < .05, 2-tailed), as was fixation duration, Pearson's (r=.10, p < .05, 2-tailed). However, a paired-samples t-test indicates that scores were significantly higher for section one (M = 2.2, SD = .6) than for section two (M = 1.93, SD = .6), t(9) = 2.44, p < .05, d = 0.87. In the visual process, the exported gaze-count data N were resembled into the hybrid forms of visuospatial attention using the table-mod-analysis operation. The explored hybrid guideline was simple to apply, and it could be an alternative approach to the sustainability of contemporary visual arts.
Keywords: science-art collaboration, hybrid forms, pictorial representation, visuospatial attention, modular arithmetic
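The reported statistics can be reproduced with SciPy; a minimal sketch on invented per-participant values, chosen so the section means match the reported 2.2 and 1.93 (the remaining statistics will differ from the paper's):

```python
import numpy as np
from scipy import stats

# Invented per-participant fixation counts for the two video sections.
s1 = np.array([2.9, 2.1, 2.4, 1.8, 2.6, 2.0, 2.3, 2.7, 1.9, 1.3])
s2 = np.array([2.3, 1.8, 2.1, 1.5, 2.4, 1.7, 2.0, 2.4, 1.6, 1.5])

r, p_r = stats.pearsonr(s1, s2)        # correlation between sections
t, p_t = stats.ttest_rel(s1, s2)       # paired-samples t-test, df = 9
diff = s1 - s2
d = diff.mean() / diff.std(ddof=1)     # Cohen's d for paired samples
print(f"r={r:.2f} (p={p_r:.3f}); t(9)={t:.2f} (p={p_t:.3f}); d={d:.2f}")
```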
Procedia PDF Downloads 364

25116 High Performance Computing and Big Data Analytics
Authors: Branci Sarra, Branci Saadia
Abstract:
Because of the explosive growth of data, many computer science tools have been developed to process and analyze these big data. High-performance computing architectures have been designed to meet the processing needs of big data, from the standpoints of transaction processing and strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on the recent trends in high-performance computing architectures, especially those relating to analytics and data mining.
Keywords: high performance computing, HPC, big data, data analysis
Procedia PDF Downloads 520

25115 Micro-CT Imaging Of Hard Tissues
Authors: Amir Davood Elmi
Abstract:
From the earliest light microscopes to the most innovative X-ray imaging techniques, imaging tools have refined and improved our knowledge of the organization and composition of living tissues. The old techniques are time-consuming and ultimately destructive to the tissues under examination. In recent decades, thanks to the boost of technology, non-destructive visualization techniques, such as X-ray computed tomography (CT), magnetic resonance imaging (MRI), selective plane illumination microscopy (SPIM), and optical projection tomography (OPT), have come to the forefront. Among these techniques, CT is excellent for mineralized tissues such as bone or dentine, and it is faster than the other aforementioned techniques while leaving the sample intact. In this article, the applications, advantages, and limitations of micro-CT are discussed, along with some information about micro-CT of soft tissue.
Keywords: Micro-CT, hard tissue, bone, attenuation coefficient, rapid prototyping
Procedia PDF Downloads 142

25114 A Landscape of Research Data Repositories in Re3data.org Registry: A Case Study of Indian Repositories
Authors: Prashant Shrivastava
Abstract:
The purpose of this study is to explore the re3data.org registry to identify the registration workflow process for research data repositories. A further objective is to depict the present development of research data repositories in India. The study starts with an approach to understanding the re3data.org registry framework and schema design, and then proceeds to explore the status of Indian research data repositories in the registry. Research data repositories are gaining wider relevance due to e-research concepts, and the now-available re3data.org registry is a good tool for users and researchers to identify research data repositories appropriate to their research requirements. In the Indian environment, a compatible National Research Data Policy is the need of the hour to boost the management of research data. A registry for research data repositories is a crucial tool for discovering specific information in a specific domain. Research data repositories in India have not previously been studied; both the re3data.org registry and the status of Indian research data repositories are discussed in this study.
Keywords: research data, research data repositories, research data registry, re3data.org
Procedia PDF Downloads 324

25113 A Study of Cloud Computing Solution for Transportation Big Data Processing
Authors: Ilgin Gökaşar, Saman Ghaffarian
Abstract:
The need for fast processing of transportation big data, such as ridership data (e.g., smartcard data) and traffic operation data (e.g., traffic detector data), which requires a lot of computational power, is incontrovertible in Intelligent Transportation Systems. Nowadays, cloud computing is one of the important and popular information technology solutions for data processing: it enables users to process enormous amounts of data without having their own computing infrastructure. Thus, it can also be a good choice for transportation big data processing. This paper examines how cloud computing can enhance transportation big data processing by contrasting its advantages and disadvantages and discussing cloud computing features.
Keywords: big data, cloud computing, Intelligent Transportation Systems, ITS, traffic data processing
Procedia PDF Downloads 467

25112 Frequency- and Content-Based Tag Cloud Font Distribution Algorithm
Authors: Ágnes Bogárdi-Mészöly, Takeshi Hashimoto, Shohei Yokoyama, Hiroshi Ishikawa
Abstract:
The spread of Web 2.0 has caused an explosion of user-generated content. Users can tag resources to describe and organize them, and tag clouds provide a rough impression of the relative importance of each tag within the overall cloud, in order to facilitate browsing among numerous tags and resources. The goal of our paper is to enrich the visualization of tag clouds. A font distribution algorithm is proposed that calculates a novel metric based on frequency and content, and classifies tags into size classes from this metric based on a power-law distribution and percentages. The suggested algorithm has been validated and verified on the tag cloud of a real-world thesis portal.
Keywords: tag cloud, font distribution algorithm, frequency-based, content-based, power law
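The paper's exact metric is not given, but a minimal sketch of a frequency- and content-weighted font classifier with power-law-aware (logarithmic) class cuts might look like this; the weights, class count, and sizes are assumptions:

```python
import numpy as np

def font_sizes(freq, weight, n_classes=5, sizes=(10, 13, 16, 20, 26)):
    """Combine frequency with a content weight, then cut classes on a log
    scale, since tag frequencies typically follow a power law."""
    metric = np.asarray(freq, float) * np.asarray(weight, float)
    edges = np.quantile(np.log1p(metric), np.linspace(0, 1, n_classes + 1))
    cls = np.clip(np.searchsorted(edges, np.log1p(metric)) - 1, 0, n_classes - 1)
    return [sizes[c] for c in cls]

print(font_sizes([500, 120, 40, 9, 3], [1.0, 1.2, 0.8, 1.0, 1.1]))
```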
Procedia PDF Downloads 505

25111 Harmonic Data Preparation for Clustering and Classification
Authors: Ali Asheibi
Abstract:
The rapid increase in the size of the databases required to store power quality monitoring data has demanded new techniques for analysing and understanding the data. One suggested technique to assist in the analysis is data mining. Preparing raw data to be ready for data mining exploration takes up most of the effort and time spent in the whole data mining process. Clustering is an important technique in data mining and machine learning in which underlying and meaningful groups of data are discovered. Large amounts of harmonic data were collected from an actual harmonic monitoring system in a distribution system in Australia over three years. This amount of acquired data makes it difficult to identify operational events that significantly impact the harmonics generated in the system. In this paper, harmonic data preparation processes for a better understanding of the data are presented. Underlying classes in the data are then identified using a clustering technique based on the Minimum Message Length (MML) method. The underlying operational information contained within the clusters can be rapidly visualised by engineers. The C5.0 algorithm was used for classification and interpretation of the generated clusters.
Keywords: data mining, harmonic data, clustering, classification
Procedia PDF Downloads 248

25110 Linguistic Summarization of Structured Patent Data
Authors: E. Y. Igde, S. Aydogan, F. E. Boran, D. Akay
Abstract:
Patent data play an increasingly important role in economic growth, innovation, technical advantage, and business strategy, and even in competition between countries. Analyzing patent data is crucial, since patents cover a large part of all the technological information in the world. In this paper, we have used the linguistic summarization technique to verify the validity of hypotheses about patent data stated in the literature.
Keywords: data mining, fuzzy sets, linguistic summarization, patent data
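A minimal sketch of a Yager-style linguistic summary, the classic formulation behind this technique, with illustrative membership functions that are assumptions rather than the paper's definitions:

```python
import numpy as np

def mu_recent(year, lo=2005, hi=2015):   # summarizer S: "recent"
    return np.clip((year - lo) / (hi - lo), 0.0, 1.0)

def mu_most(p):                          # quantifier Q: "most"
    return np.clip((p - 0.3) / 0.5, 0.0, 1.0)   # ramps up between 0.3 and 0.8

# Truth degree of the summary "Most patents are recent":
# T = mu_Q(mean of mu_S over all records).
years = np.array([2012, 2016, 2003, 2014, 2009, 2018])
truth = mu_most(mu_recent(years).mean())
print(f"T('Most patents are recent') = {truth:.2f}")   # about 0.73
```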
Procedia PDF Downloads 272

25109 Proposal of Data Collection from Probes
Authors: M. Kebisek, L. Spendla, M. Kopcek, T. Skulavik
Abstract:
In our paper, we describe the security capabilities of data collection. Data are collected with probes located in the near and distant surroundings of the company. Considering the numerous obstacles, e.g., forests, hills, and urban areas, the data collection is realized in several ways: via wireless communication, the LAN network, the GSM network, and, in certain areas, by using vehicles. In order to ensure the connection to the server, most of the probes are able to communicate in several ways. The collected data are archived and subsequently used in supervisory applications. To ensure the collection of the required data, it is necessary to propose algorithms that allow the probes to select a suitable communication channel.
Keywords: communication, computer network, data collection, probe
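A minimal sketch of such a channel-selection algorithm, assuming a fixed preference order with vehicle-based collection as the fallback; the channel names and availability check are illustrative:

```python
CHANNELS = ["LAN", "wireless", "GSM"]     # assumed preference order

def pick_channel(available):
    """Return the first available channel in preference order; fall back
    to vehicle-based collection when no link can be established."""
    for ch in CHANNELS:
        if ch in available:
            return ch
    return "vehicle"

print(pick_channel({"GSM"}))    # -> GSM
print(pick_channel(set()))      # -> vehicle
```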
Procedia PDF Downloads 360

25108 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity), lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Use as a planning basis for restructuring measures in factories only succeeds if the BIM model has adequate data quality. Under this aspect and given the industrial requirements, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied on the construction site at short notice during conversion measures? An as-is analysis of how BIM models and digital factory models (including laser scans) are currently kept up to date is performed: industrial companies are interviewed and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for carrying out cost-effective and time-saving updating processes. The availability of low-cost hardware and the simplicity of the process are important for enabling service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and the insertion into the overall digital twin. Finally, an overview of visualization possibilities suitable for construction sites is compiled. An augmented reality application based on an updated BIM model of a factory is created and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed, and the user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
Procedia PDF Downloads 114

25107 A Review on Big Data Movement with Different Approaches
Authors: Nay Myo Sandar
Abstract:
With the growth of technologies and applications, a large amount of data has been produced at an increasing rate by various resources such as social media networks, sensor devices, and other information-serving devices. This large, complex, and exponentially growing collection of data is called big data. Traditional database systems cannot store and process such data due to its size and complexity. Consequently, cloud computing is a potential solution for data storage and processing, since it can provide a pool of resources for servers and storage. However, moving large amounts of data to and from the cloud is a challenging issue, since it can incur high latency due to the large data size. With respect to the big data movement problem, this paper reviews the literature of previous works, discusses research issues, and identifies approaches for dealing with the problem.
Keywords: Big Data, Cloud Computing, Big Data Movement, Network Techniques
Procedia PDF Downloads 86

25106 Optimized Approach for Secure Data Sharing in Distributed Database
Authors: Ahmed Mateen, Zhu Qingsheng, Ahmad Bilal
Abstract:
In the current age of technology, information is the most precious asset of a company, and today companies hold large amounts of data. As the data become larger, access to particular information becomes slower day by day, and processing data fast enough to shape it into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution; the security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. The technique gives better results for secure data distribution from multiple heterogeneous sources, facilitating companies in sharing data securely, efficiently, and quickly.
Keywords: ER-schema, electronic record, P2P framework, API, query formulation
Procedia PDF Downloads 333