Search results for: image and telemetric data
25343 Study on the Thermal Mixing of Steam and Coolant in the Hybrid Safety Injection Tank
Authors: Sung Uk Ryu, Byoung Gook Jeon, Sung-Jae Yi, Dong-Jin Euh
Abstract:
In passive safety injection systems of nuclear power plants, such as the Core Makeup Tank (CMT) and the Hybrid Safety Injection Tank, various thermal-hydraulic phenomena occur, including the direct contact condensation of steam and the thermal stratification of the coolant. These phenomena are closely related to the performance of the system. Depending on the condensation rate of the steam injected into the tank, the coolant injection and pressure-equalizing timings of the tank are decided. The steam injected into the tank from the upper nozzle penetrates the coolant and induces direct contact condensation. In the present study, the direct contact condensation of steam and the thermal mixing between the steam and coolant were examined by using the Particle Image Velocimetry (PIV) technique. In particular, by altering the size of the nozzle from which the steam is injected, the influence of steam injection velocity on the thermal mixing with the coolant and on condensation was examined, while also investigating the influence of condensation on the pressure variation inside the tank. Even though the amounts of steam injected were the same in the three nozzle size conditions, it was found that the rate of pressure rise becomes lower as the steam injection area decreases. Also, as the steam injection area increases, the thickness of the zone within which the coolant's temperature changes decreases; thereby, the amount of steam condensed by direct contact condensation also decreases. The results derived from the present study can be utilized for the detailed design of a passive safety injection system, as well as for modeling the direct contact condensation triggered by the steam jet's penetration into the coolant.
Keywords: passive safety injection systems, steam penetration, direct contact condensation, particle image velocimetry
Procedia PDF Downloads 397
25342 Distributed Perceptually Important Point Identification for Time Series Data Mining
Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung
Abstract:
In the field of time series data mining, the concept of the Perceptually Important Point (PIP) identification process was first introduced in 2001. This process was originally developed for financial time series pattern matching and was then found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of the time series by identifying the salient points in it. With the rise of Big Data, time series data contributes a major proportion, especially data generated by sensors in the Internet of Things (IoT) environment. Given the nature of PIP identification and its successful applications, it is worth further exploring the opportunity to apply PIP to time series 'Big Data'. However, the performance of PIP identification has always been considered the limitation when dealing with 'Big' time series data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches remove the bottleneck of running the PIP identification process on a standalone computer. Improvement in terms of speed is obtained by the distributed versions.
Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining
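A minimal serial sketch of the PIP identification loop in Python may help readers unfamiliar with the process: the endpoints are the first two PIPs, and each iteration adds the point farthest (here by vertical distance, one of the common PIP distance measures) from the chord joining its neighbouring PIPs. This is an illustration only; it does not reproduce the paper's SB-Tree structure or the distributed variants, and the function name and distance choice are assumptions.

```python
import numpy as np

def pip_identify(series, n_pips):
    # Serial PIP identification with the vertical-distance criterion.
    n = len(series)
    pips = [0, n - 1]                  # the endpoints are the first two PIPs
    while len(pips) < min(n_pips, n):
        best_d, best_i = -1.0, None
        for a, b in zip(pips, pips[1:]):          # each inter-PIP segment
            for i in range(a + 1, b):
                # vertical distance from point i to the chord (a, b)
                y_line = series[a] + (series[b] - series[a]) * (i - a) / (b - a)
                d = abs(series[i] - y_line)
                if d > best_d:
                    best_d, best_i = d, i
        if best_i is None:             # no interior points left
            break
        pips.append(best_i)
        pips.sort()
    return pips

ts = np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.05 * np.random.randn(200)
print(pip_identify(ts, 10))            # indices of the 10 most salient points
```

The quadratic rescan in this loop is exactly the standalone-computer bottleneck the paper targets; the SB-Tree and the distributed versions reorganize this search.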
Procedia PDF Downloads 438
25341 Statistical Shape Analysis of the Human Upper Airway
Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar
Abstract:
The main objective of this project is to develop a statistical shape model using principal component analysis that could be used for analyzing the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed to perform PCA on the segmented airway, in order to analyze the shape of the airway and to find the relationship between the shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 was related to the anterior-posterior width of the retroglossal region, Mode 3 was related to the lateral dimension of the oropharyngeal region, and Mode 4 was related to the anterior-posterior width of the oropharyngeal region. All these regions are associated with the risk factors of OSA.
Keywords: medical imaging, image processing, FEM/BEM, statistical modelling
Procedia PDF Downloads 517
25340 Knowledge Discovery and Data Mining Techniques in Textile Industry
Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler
Abstract:
This paper addresses issues in the textile industry using data mining techniques. Data mining has been applied to stitching data for garment products obtained from a textile company. The techniques applied to the data were the CHAID algorithm, the CART algorithm, regression analysis, and artificial neural networks. Classification-based analyses were used in the data mining, and a decision model of production per person, together with the variables affecting production, was derived by this method. In the study, the results show that as the daily working time increases, the production per person decreases; the relationship between total daily working time and production per person is negative, and this pair shows the strongest negative relationship.
Keywords: data mining, textile production, decision trees, classification
Procedia PDF Downloads 356
25339 Balance of Natural Resources to Manage Land Use Changes in Subosukawonosraten Area
Authors: Sri E. Wati, D. Roswidyatmoko, N. Maslahatun, Gunawan, Andhika B. Taji
Abstract:
Natural resources are the main source for fulfilling human needs. Their utilization must consider not only human prosperity but also sustainability. A balance of natural resources is a tool to manage natural wealth and to control land use change. This tool is needed to organize land use planning as stated in the spatial plan of a certain region. A balance of natural resources can be calculated by comparing two series of natural resource data obtained in different years. In this case, land and forest data four years apart (2010 and 2014) were used. Land use data were acquired through satellite image interpretation and field checking. By means of GIS analysis, the result was then assessed against the land use plan. This is intended to evaluate whether the existing land use is consistent with the land use plan and, if not, to determine what efforts and policies must be undertaken to overcome the situation. Subosukawonosraten is a rapidly developing area in Central Java Province. This region consists of seven regencies/cities: Sukoharjo Regency, Boyolali Regency, Surakarta City, Karanganyar Regency, Wonogiri Regency, Sragen Regency, and Klaten Regency. The region comprises several former areas under Karesidenan Surakarta, and their location is adjacent to Surakarta. The balance of forest resources shows that the extent of the forest area has not significantly changed. Some land uses within the area are slightly changed: some rice field areas have been converted into settlements (0.03%), whereas water bodies have become vacant areas (0.09%). On the other hand, the balance of land resources shows that there are many land use changes in this region. The area of rice fields has decreased by 428 hectares; more than 50% of it has been transformed into settlement areas, and 11.21% has been converted into buildings such as factories, hotels, and other infrastructure. This occurs mostly in Sragen, Sukoharjo, and Karanganyar Regency. The results illustrate that land use change in this region is mostly driven by population growth. Agricultural land has been converted into built-up areas as demand for settlements, industrial areas, and other infrastructure increases. Unfortunately, the current utilization of more than half of the total area is not consistent with the land use plan declared in the spatial planning document. This means that local government must develop strict regulation and law enforcement against any violation in land use management.
Keywords: balance, forest, land, spatial plan
Procedia PDF Downloads 322
25338 Investigation of Delivery of Triple Play Data in GE-PON Fiber to the Home Network
Authors: Ashima Anurag Sharma
Abstract:
Optical fiber based networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This research paper demonstrates the simultaneous delivery of triple play services (data, voice, and video). A comparison between various data rates is presented. It is demonstrated that as the data rate increases, the number of supportable users decreases due to the increase in bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
Procedia PDF Downloads 530
25337 Microarray Gene Expression Data Dimensionality Reduction Using PCA
Authors: Fuad M. Alkoot
Abstract:
Different experimental technologies, such as microarray sequencing, have been proposed to generate high-resolution genetic data in order to understand the complex dynamic interactions between complex diseases and the biological system components of genes and gene products. However, the generated samples have a very large dimension, reaching thousands, hindering all attempts to design a classifier system that can identify diseases based on such data. Additionally, the high overlap in the class distributions makes the task more difficult. The data we experiment with were generated for the identification of autism. They include 142 samples, which is small compared to the large dimension of the data. Classifier systems trained on these data yield very low classification rates that are almost equivalent to a guess. We aim to reduce the data dimension and improve its suitability for classification. Here, we experiment with applying a multistage PCA to the genetic data to reduce its dimensionality. Results show a significant improvement in the classification rates, which increases the possibility of building an automated system for autism detection.
Keywords: PCA, gene expression, dimensionality reduction, classification, autism
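As an illustration of the staging idea, the sketch below applies PCA twice with scikit-learn on synthetic data shaped like the autism set (142 samples, thousands of features): block-wise reductions first, then a final PCA over the concatenated scores. The block-wise staging, block sizes, and component counts are assumptions; the abstract does not specify the paper's exact multistage configuration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(142, 10000))   # stand-in: 142 samples, 10k gene features

# Stage 1: reduce each feature block independently.
n_blocks, per_block = 10, 20
blocks = np.array_split(X, n_blocks, axis=1)
stage1 = np.hstack([PCA(n_components=per_block).fit_transform(b) for b in blocks])

# Stage 2: a final PCA on the concatenated block scores.
X_reduced = PCA(n_components=30).fit_transform(stage1)
print(X_reduced.shape)              # (142, 30), down from (142, 10000)
```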
Procedia PDF Downloads 563
25336 Enhanced Acquisition Time of a Quantum Holography Scheme within a Nonlinear Interferometer
Authors: Sergio Tovar-Pérez, Sebastian Töpfer, Markus Gräfe
Abstract:
This work proposes a technique that decreases the detection acquisition time of quantum holography schemes down to one-third, opening up the possibility of imaging moving objects. Since its invention, quantum holography with undetected photons has gained interest in the scientific community, mainly due to its ability to tailor the detected wavelengths according to the needs of the scheme's implementation. Yet while this wavelength flexibility grants the scheme a wide range of possible applications, an important matter remained to be addressed. Since the scheme uses digital phase-shifting techniques to retrieve the object information from the interference pattern, it is necessary to acquire a set of at least four images of the interference pattern with well-defined phase steps to recover the full object information. Hence, the imaging method requires longer acquisition times to produce well-resolved images. As a consequence, the measurement of moving objects remained out of reach of the imaging scheme. This work presents the use and implementation of a spatial light modulator along with a digital holographic technique called quasi-parallel phase-shifting. This technique uses the spatial light modulator to build a structured phase image consisting of a chessboard pattern containing the different phase steps for digitally calculating the object information. Depending on the reduction in the number of needed frames, the acquisition time is reduced by a significant factor. This technique opens the door to implementing the scheme for moving objects; in particular, the application of the scheme to imaging live specimens comes one step closer.
Keywords: quasi-parallel phase shifting, quantum imaging, quantum holography, quantum metrology
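The four-frame requirement comes from the classical four-step phase-shifting relation phi = arctan2(I_{3π/2} − I_{π/2}, I_0 − I_π); quasi-parallel phase shifting multiplexes these four intensities spatially (via the chessboard SLM pattern) instead of acquiring them sequentially. Below is a minimal numerical sketch of the standard four-step retrieval, with the spatial demultiplexing itself omitted:

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3):
    # Four-step retrieval for phase steps 0, pi/2, pi, 3pi/2.
    return np.arctan2(I3 - I1, I0 - I2)

# Synthetic test object: a smooth phase ramp.
y, x = np.mgrid[0:64, 0:64]
phi_true = 0.05 * (x + y)
frames = [1 + np.cos(phi_true + k * np.pi / 2) for k in range(4)]

phi = four_step_phase(*frames)
# Agreement with the true phase, up to 2*pi wrapping:
print(np.allclose(np.angle(np.exp(1j * (phi - phi_true))), 0, atol=1e-9))
```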
Procedia PDF Downloads 117
25335 New Insights Into Fog Role In Atmospheric Deposition Using Satellite Images
Authors: Suruchi
Abstract:
This study aims to examine the spatial and temporal patterns of fog occurrence across the Czech Republic, utilizing satellite imagery and other data sources. The main objective is to understand the role of fog in atmospheric deposition processes and its potential impact on the environment and ecosystems. Through satellite image analysis, the study will identify and categorize different types of fog, including radiation fog, orographic fog, and mountain fog. Fog detection algorithms and cloud type products will be evaluated to assess the frequency and distribution of fog events throughout the Czech Republic. Furthermore, the regions covered by fog will be classified based on their fog type and associated pollution levels. This will provide insights into the variability of fog characteristics and its implications for atmospheric deposition. Spatial analysis techniques will be used to pinpoint areas prone to frequent fog events and evaluate their pollution levels. Statistical methods will be employed to analyze patterns in fog occurrence over time and its connection with environmental factors. The ultimate goal of this research is to offer fresh perspectives on fog's role in atmospheric deposition processes, enhancing our understanding of its environmental significance and informing future research and environmental management initiatives.
Keywords: pollution, GIS, fog, satellite, atmospheric deposition
Procedia PDF Downloads 25
25334 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest; any higher resolution is lost in this resampling. When the topographic features are computed through regression performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point; the number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance: any doubling of window size in each direction takes only a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated on 1-meter DEM data and on artificial data constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm: the resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within each region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques.
Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
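A compact way to see the variance-minimizing idea is to fit a local plane at doubling window sizes and keep, per pixel, the slope from the window with the smallest residual variance. The sketch below uses running uniform filters on a regular grid rather than the paper's exact 2x2 aggregation bookkeeping; the window sizes, level count, and edge padding are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def scale_adaptive_slope(dem, cell=1.0, n_levels=5):
    """Fit a local plane z = a + b*x + c*y at doubling window sizes and,
    per pixel, keep the slope from the window with minimal residual
    variance. Running means work because regression sums are additive."""
    ny, nx = dem.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float) * cell
    best_var = np.full(dem.shape, np.inf)
    best_slope = np.zeros(dem.shape)
    best_scale = np.zeros(dem.shape)
    for level in range(1, n_levels + 1):
        w = 2 ** level + 1                     # odd window, centred on the pixel
        f = lambda a: uniform_filter(a, size=w, mode="nearest")
        mx, my, mz = f(x), f(y), f(dem)
        var_x = f(x * x) - mx * mx             # x and y are uncorrelated over a
        var_y = f(y * y) - my * my             # full rectangular window
        b = (f(x * dem) - mx * mz) / var_x     # plane coefficient d(z)/dx
        c = (f(y * dem) - my * mz) / var_y     # plane coefficient d(z)/dy
        resid = (f(dem * dem) - mz * mz) - b * b * var_x - c * c * var_y
        better = resid < best_var
        best_var[better] = resid[better]
        best_slope[better] = np.degrees(np.arctan(np.hypot(b, c)))[better]
        best_scale[better] = w                 # reported length scale ~ w / 2
    return best_slope, best_scale, best_var

dem = np.random.rand(128, 128)                 # stand-in for a 1 m DEM tile
slope_deg, scale_px, resid_var = scale_adaptive_slope(dem)
```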
Procedia PDF Downloads 132
25333 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research ascertains the major risk factors for diabetes and designs a predictive model for risk assessment. The project aims to improve diabetes early detection and management by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Correlation values of each attribute were used to analyze and choose the attributes that might influence diabetes risk, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
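As a sketch of the evaluation protocol, the fragment below scores two of the eight algorithm families under 5-fold cross-validation with the five stated metrics. The data are a synthetic stand-in for the Kaggle set; the model choices, imbalance ratio, and fold count are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the Kaggle indicators data (imbalanced binary target).
X, y = make_classification(n_samples=2000, n_features=21, weights=[0.86],
                           random_state=0)

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]  # the 5 metrics
models = {
    "logreg": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    means = {m: round(cv[f"test_{m}"].mean(), 3) for m in scoring}
    print(name, means)
```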
Procedia PDF Downloads 81
25332 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0
Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini
Abstract:
Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies are still pursuing digitalization, because by embracing advanced technologies they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue of data being stored in data silos with different schemas and structures. Conventional approaches to addressing this issue involve utilizing data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates Asset Administration Shell technology to model and map the company's data and utilizes a knowledge graph for data storage and exploration.
Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling
Procedia PDF Downloads 97
25331 Marine Ecosystem Mapping of Taman Laut Labuan: The First Habitat Mapping Effort to Support Marine Parks Management in Malaysia
Authors: K. Ismail, A. Ali, R. C. Hasan, I. Khalil, Z. Bachok, N. M. Said, A. M. Muslim, M. S. Che Din, W. S. Chong
Abstract:
The marine ecosystem in Malaysia holds invaluable potential in terms of economics, food security, pharmaceutical components, and protection from natural hazards. Although the oil and gas industry and fisheries are active within Malaysian waters, knowledge of the seascape and ecological functioning of benthic habitats is still extremely poor in the marine parks around Malaysia due to the lack of detailed seafloor information. Consequently, it is difficult to manage marine resources effectively, protect ecologically important areas, and set legislation to safeguard the marine parks. The limited baseline data hinder the scientific linkage needed to support effective marine spatial management in Malaysia. This became the main driver behind the first seabed mapping effort at the national level. Taman Laut Labuan (TLL) is located off the west coast of Sabah, on the eastern side of the South China Sea. The total area of the TLL is approximately 158.15 km2; it comprises three islands, namely Pulau Kuraman, Rusukan Besar, and Rusukan Kecil, and is characterised by shallow fringing reef with a few submerged shallow reefs. The unfamiliar rocky shorelines limited the multibeam echosounder survey to areas with depths greater than 10 m, whereas singlebeam and side-scan sonar systems were used to acquire data for areas with depths less than 10 m. By integrating multibeam bathymetry and backscatter data with singlebeam bathymetry and side-scan sonar images, we produce a substrate map and a coral coverage map for the TLL using i) a marine landscape mapping technique and ii) the RSOBIA ArcGIS toolbar (developed by T. Le Bas). We also take the initiative to explore the ability of aerial drones and satellite imagery (WorldView-3) to derive depths and substrate type within the intertidal and subtidal zones, where acoustic mapping is not accessible. Although the coverage was limited, the outcome showed a promising technique to be incorporated towards establishing a guideline facilitating a standard practice for efficient marine spatial management in Malaysia.
Keywords: habitat mapping, marine spatial management, South China Sea, national seabed mapping
Procedia PDF Downloads 229
25330 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses
Abstract:
This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby addresses the need for a computational encryption model that can enhance the security of big data in terms of privacy, confidentiality, and availability for users. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes based on number theory and algebra that can easily be integrated and leveraged in cloud computing, with detailed theoretical mathematical concepts for the fully homomorphic encryption models. This contribution supports the full implementation of a big data analytics-based cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme
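To make "computing on ciphertexts without decrypting" concrete, the toy below implements the Paillier cryptosystem. Paillier is only additively (partially) homomorphic, whereas the paper concerns fully homomorphic schemes that also support multiplication and bootstrapping, so treat this strictly as a stand-in for the core property. The tiny primes are for illustration and provide no security.

```python
from math import gcd
import secrets

p, q = 1789, 1867                  # toy primes; real keys use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1)            # any multiple of lcm(p-1, q-1) works here
g = n + 1

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(123), encrypt(456)
c_sum = (c1 * c2) % n2             # multiplying ciphertexts adds plaintexts
print(decrypt(c_sum))              # 579, computed without decrypting c1 or c2
```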
Procedia PDF Downloads 383
25329 Emotion Recognition Using Artificial Intelligence
Authors: Rahul Mohite, Lahcen Ouarbya
Abstract:
This paper focuses on the interplay between humans and computer systems and the ability of these systems to understand and respond to human emotions, including non-verbal communication. Current emotion recognition systems are based solely on either facial or verbal expressions; their limitation is that they require large training data sets. The paper proposes a system for recognizing human emotions that combines both speech and facial emotion recognition. The system utilizes advanced techniques such as deep learning and image recognition to identify facial expressions and comprehend emotions. The results show that the proposed system, based on the combination of facial expression and speech, outperforms existing ones that rely solely on either facial or verbal expressions. The proposed system detects human emotion with an accuracy of 86%, whereas the existing systems have an accuracy of 70% using verbal expression only and 76% using facial expression only. The increasing significance of and demand for facial recognition technology in emotion recognition are also discussed.
Keywords: facial recognition, expression recognition, deep learning, image recognition, facial technology, signal processing, image classification
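The abstract does not specify how the two modalities are combined. One minimal option is late fusion of per-modality class probabilities, sketched below; the class list, weights, and fusion rule are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

EMOTIONS = ["angry", "happy", "neutral", "sad"]   # assumed label set

def late_fusion(p_face, p_speech, w_face=0.5):
    # Weighted average of the two softmax outputs, then argmax.
    p = w_face * np.asarray(p_face) + (1 - w_face) * np.asarray(p_speech)
    return EMOTIONS[int(np.argmax(p))], p

# Face model is torn between happy/neutral; speech strongly says happy.
face = [0.05, 0.45, 0.40, 0.10]
speech = [0.05, 0.75, 0.10, 0.10]
print(late_fusion(face, speech))                  # ('happy', ...)
```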
Procedia PDF Downloads 127
25328 The Image of Future Spouse in Indonesian Folktale: Man's Acceptance of Woman and vice Versa
Authors: Titik Wahyuningsih
Abstract:
The folktale discussed here is Ande-Ande Lumut, a story believed to reflect the history of two kingdoms in East Java, Indonesia. The title refers to the main male character in the story. This library-based research aims to reveal patriarchal values in Indonesia. The data for the research is the song in the story, which is in fact the conversation between Ande-Ande Lumut and the mother who adopts him. The lines tell that many beautiful girls come to propose to Ande-Ande Lumut, but he does not want to accept them and keeps staying in his upstairs room. Finally, he says yes to Klething Kuning, whom his mother describes as a girl with an ugly face. Ande-Ande Lumut's decision is made because Klething Kuning is the only girl who does not let Yuyu Kangkang help her. Yuyu Kangkang is described as a very big crab that helps the girls cross the river but asks them to kiss him. Through the lens of a feminist approach, Ande-Ande Lumut shows men's preference and dominance in making the final decision over women. Even though the girls actively propose to their future husband, they do so without setting any requirements, whereas the future husband chooses a girl on the criterion that no male has ever touched her, even if that male is a crab.
Keywords: future spouse, Indonesian folktale, acceptance, patriarchal
Procedia PDF Downloads 298
25327 Evidence of Groundwater Reservoirs Associated with Fault Structures and Magmatic Dyke Intrusions: Insights from Geophysics and Well Data Analysis in Central Cameroon
Authors: Mbida Yem, Alessandra Ribodetti, Joseph Quentin Yéné Atangana, Fabrice Jouffray, Dieudonné Bisso
Abstract:
Central Cameroon is a complex mosaic of Proterozoic litho-tectonic units, with structural deformations mainly inherited from the Pan-African orogeny. It consists of a para-derived series with epicontinental affinity, structured as successive nappes thrusting southward onto the Ntem complex, considered the northern margin of the Congo Craton. A well-developed prograde metamorphic gradient is described from the SE (Dja and Yokadouma meta-detritic series) to the NW (gneiss and migmatites of the Yaounde series), with ages estimated at 600-620 Ma. Syn- to late-phase Pan-African deformations crosscut the nappes, structuring them with large mylonitic shear zones (Sanaga fault, Adamawa fault, Tcholiré-Banyo fault) coeval with intrusive granitoids. The scientific and industrial communities interested in exploring the groundwater resources of these litho-tectonic units using geophysical and hydrogeological methods have grown steadily since the 1970s. In this paper, we present shallow and deep geophysical cross-sections that describe the most productive groundwater targets of the Central Cameroon litho-tectonic units. This study also discusses the geological factors that control groundwater occurrence. The data analyzed were gathered from public and private groundwater surveys conducted in recent years and include 18 well-controlled resistivity sections and hydrogeological parameters of 150 drilling points. The depth of the well records extends from 40 to 180 m. One of the challenges of the geophysical investigations was to image groundwater reservoirs located above 120 m depth; therefore, the resistivity data were acquired using a 1200 m long digital streamer with a 10 m electrode spacing at the selected sites. The modelled sections derived from these data show that the most productive groundwater targets of the study area include lithological contacts and dyke fault zones. The average width of the dyke fault zones ranges between 40 and 380 m. These structures display a significant lateral extent, and their spatial distribution often correlates with mountainous terranes and regional fault zones trending from SW-NE to NNW-SSE. Following these observations, transboundary aquifers associated with fractured magmatic rocks can be found in the study area.
Keywords: Proterozoic, resistivity sections, dyke fault-zones, groundwater target
Procedia PDF Downloads 14
25326 Protecting Privacy and Data Security in Online Business
Authors: Bilquis Ferdousi
Abstract:
With the exponential growth of online business, threats to consumers' privacy and data security have become a serious challenge. This literature review-based study focuses on developing a better understanding of those threats and of the legislative measures that have been taken to address them. Research shows that people are increasingly engaged in online business using different digital devices and platforms, although this practice varies across age groups. The threat to consumers' privacy and data security is a serious hindrance to developing consumer trust in online business. Some legislative measures have been taken at the federal and state levels to protect consumers' privacy and data security. The study was based on an extensive review of the current literature on these protections and the relevant legislation.
Keywords: privacy, data security, legislation, online business
Procedia PDF Downloads 110
25325 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm
Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan
Abstract:
This study presents a new parallel approach to clustering GPS data. Evaluation was performed by comparing the execution times of various clustering algorithms on GPS data. The paper proposes a neighborhood-based parallel K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS data point represents a vehicle, and vehicles close to each other communicate after they are clustered. This parallelization approach was examined on continuously changing GPS data of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrated that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
Keywords: parallel K-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data
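What makes K-means parallelizable is that the update step reduces to per-cluster coordinate sums and counts, which workers can compute on disjoint chunks and a coordinator can merge. The sketch below is a generic data-parallel K-means over processes, not the paper's neighborhood-based variant; the chunking, worker count, and synthetic GPS points are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def partial_step(args):
    # Assign one chunk to nearest centroids; return per-cluster sums/counts.
    chunk, centroids = args
    d = np.linalg.norm(chunk[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    k = len(centroids)
    sums = np.zeros_like(centroids)
    counts = np.zeros(k)
    for j in range(k):
        sums[j] = chunk[labels == j].sum(axis=0)
        counts[j] = (labels == j).sum()
    return sums, counts

def parallel_kmeans(points, k=4, n_workers=4, n_iter=20):
    centroids = points[np.random.choice(len(points), k, replace=False)]
    chunks = np.array_split(points, n_workers)
    with Pool(n_workers) as pool:
        for _ in range(n_iter):
            parts = pool.map(partial_step, [(c, centroids) for c in chunks])
            sums = sum(p[0] for p in parts)
            counts = sum(p[1] for p in parts)
            centroids = sums / np.maximum(counts, 1)[:, None]  # merge step
    return centroids

if __name__ == "__main__":
    gps = np.random.rand(10000, 2) * [0.5, 0.3] + [41.0, 29.0]  # lat/lon-like
    print(parallel_kmeans(gps))
```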
Procedia PDF Downloads 227
25324 An Analysis of Privacy and Security for Internet of Things Applications
Authors: Dhananjay Singh, M. Abdullah-Al-Wadud
Abstract:
The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the things in the IoT: those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges for IoT technologies. To address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before propagation into IoT networks.
Keywords: Internet of Things (IoT), message authentication, privacy, security
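As one concrete, deliberately generic instance of message authentication for IoT traffic, the sketch below tags sensor payloads with HMAC-SHA256 under a pre-shared key using only the Python standard library. The paper's actual mechanism is not detailed in the abstract; the key handling and message format here are illustrative assumptions.

```python
import hashlib
import hmac
import json
import time

SHARED_KEY = b"device-42-secret"   # illustrative pre-shared key

def sign(payload: dict) -> dict:
    # Canonicalize the payload, then attach an HMAC-SHA256 tag.
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "tag": tag}

def verify(msg: dict) -> bool:
    body = json.dumps(msg["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])   # constant-time compare

reading = {"sensor": "temp-7", "value": 21.4, "ts": int(time.time())}
msg = sign(reading)
print(verify(msg))                 # True: accepted by the gateway
msg["body"]["value"] = 99.9        # tampering in transit...
print(verify(msg))                 # False: rejected before propagation
```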
Procedia PDF Downloads 386
25323 Effect of Al on Glancing Angle Deposition Synthesized In₂O₃ Nanocolumn for Photodetector Application
Authors: Chitralekha Ngangbam, Aniruddha Mondal, Naorem Khelchand Singh
Abstract:
An aluminium (Al)-doped In2O3 (indium oxide) nanocolumn array was synthesized by the glancing angle deposition (GLAD) technique on an Si (n-type) substrate for photodetector application. The sample was characterized by scanning electron microscopy (SEM). The average diameter of the nanocolumns, calculated from the top-view SEM image, was found to be ∼80 nm. The length of the nanocolumns (~500 nm) was calculated from the cross-sectional SEM image, which shows that the nanocolumns are perpendicular to the substrate. EDX analysis confirmed the presence of the elements Al (aluminium), In (indium), and O (oxygen) in the samples. The XRD patterns show the (222) and (622) reflections of the Al-doped In2O3 nanocolumns. Three different peaks were observed in the PL analysis of the Al-doped In2O3 nanocolumns, at 365 nm, 415 nm, and 435 nm, respectively. The PL emission peak at 365 nm can be attributed to the near-band-gap transition of In2O3, whereas the peaks at 415 nm and 435 nm can be attributed to trap-state emissions due to oxygen vacancies and oxygen-indium vacancy centres in the Al-doped In2O3 nanocolumns. The current-voltage (I-V) characteristics of the Al-doped In2O3 nanocolumn based detector were measured through the Au Schottky contact. The devices were then examined under halogen light (20 W) illumination for photocurrent measurement. The Al-doped In2O3 nanocolumn based optical detector showed high conductivity and a low turn-on voltage of 0.69 V under white light illumination. A maximum photoresponsivity of 82 A/W at 380 nm was observed for the device. The device shows a high internal gain of ~267 in the UV region (380 nm) and ∼127 in the visible region (760 nm). The rise and fall times of the device at 650 nm are 0.15 and 0.16 s, respectively, which makes it suitable as a fast-response detector.
Keywords: glancing angle deposition, nanocolumn, semiconductor, photodetector, indium oxide
Procedia PDF Downloads 181
25322 Methodology and Credibility of Unmanned Aerial Vehicle-Based Cadastral Mapping
Authors: Ajibola Isola, Shattri Mansor, Ojogbane Sani, Olugbemi Tope
Abstract:
The cadastral map is the basis of city management, planning, and development. For years, cadastral maps have been produced by ground and photogrammetric platforms. Recent evolution in photogrammetry and remote sensing sensors has spurred the use of Unmanned Aerial Vehicle systems (UAVs) for cadastral mapping. Despite the time savings and multi-dimensional cost-effectiveness of the UAV platform, issues related to cadastral map accuracy remain a hindrance to the wide applicability of UAV cadastral mapping. This study presents an approach for generating UAV cadastral maps and assessing their credibility. Different sets of red, green, and blue (RGB) photos were obtained from a Tarot 680 hexacopter UAV platform flown over the Universiti Putra Malaysia campus sports complex at altitudes of 70 m, 100 m, and 250 m. Before flying the UAV, twenty-eight ground control points were evenly established in the study area with a real-time kinematic differential global positioning system. The second phase of the study utilizes an image-matching algorithm for photo alignment, wherein camera calibration parameters and ten of the established ground control points were used for estimating the inner, relative, and absolute orientations of the photos. The resulting orthoimages were exported to ArcGIS software for digitization. Visual, tabular, and graphical assessments of the resulting cadastral maps showed different levels of accuracy. The results of the study lay out a step-by-step approach for generating UAV cadastral maps and show that the cadastral map acquired at 70 m altitude produced the best results.
Keywords: aerial mapping, orthomosaic, cadastral map, flying altitude, image processing
Procedia PDF Downloads 89
25321 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation
Authors: Miguel Contreras, David Long, Will Bachman
Abstract:
Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, these imaging experiments present challenges that can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and a limit on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline to predict cellular component morphology for virtual-cell generation based on fluorescence cell membrane confocal z-stacks. Methods: Registered confocal z-stacks of the nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image has a mean pixel intensity value of 0.5. An open-source machine learning algorithm, originally developed to predict fluorescence labels in unlabeled transmitted-light microscopy cell images, was trained using this set of normalized z-stacks on a single-CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell membrane fluorescence images as input. Predictions were compared to the ground-truth fluorescence nuclei images. Results: After one week of training using one cell membrane z-stack (20 images) and the corresponding nuclei label, results showed qualitatively good predictions on the training set. The algorithm was able to accurately predict nuclei locations as well as shape when fed only fluorescence membrane images. Similar training sessions with improved membrane image quality, including clear lining and shape of the membrane clearly showing the boundaries of each cell, proportionally improved nuclei predictions, reducing errors relative to ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology using relatively small amounts of data and training time, eliminating the need for multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict different labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for the generation of virtual-cell mechanical models.
Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models
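The normalization step is simple enough to sketch. The fragment below rescales each slice of a z-stack to a mean pixel intensity of 0.5; whether the pipeline normalizes per slice or per stack, and whether it rescales multiplicatively, is not stated in the abstract, so both choices are assumptions.

```python
import numpy as np

def normalize_stack(zstack):
    # Scale each slice so its mean intensity is 0.5 (assumes floats in [0, 1];
    # clipping guards slices whose rescaling overshoots the valid range).
    out = np.empty_like(zstack, dtype=float)
    for i, img in enumerate(zstack):
        m = img.mean()
        out[i] = np.clip(img * (0.5 / m), 0.0, 1.0) if m > 0 else 0.5
    return out

stack = np.random.rand(20, 256, 256) * 0.3   # stand-in for 20 membrane slices
norm = normalize_stack(stack)
print(norm.mean(axis=(1, 2))[:3])            # each slice mean is ~0.5
```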
Procedia PDF Downloads 209
25320 Cognitive Science Based Scheduling in Grid Environment
Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya
Abstract:
Grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging when the data sets are very large. Only two solutions exist to tackle this issue: the computation that requires huge data sets can be transferred to the data site, or the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred, since the servers are storage/data servers with little or no computational capability; hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents in the CE help in analyzing requests and creating the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them. Agents predict the next request so as to serve the requesting site with data sets in advance, reducing data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence
Procedia PDF Downloads 395
25319 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption
Authors: Darusalam, Jorish Hulstijn, Marijn Janssen
Abstract:
The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether the institution has been working economically, efficiently, and effectively. Open Data offers solutions for implementing good governance in organizations that want to be more transparent. In addition, Open Data can create transparency and accountability towards the community. The objective of this paper is to build a framework for using open data for accountability in combating corruption. The paper investigates the relationship between open data and accountability as part of anti-corruption initiatives, as well as the impact of open data implementation on public organizations.
Keywords: open data, accountability, anti-corruption, framework
Procedia PDF Downloads 339
25318 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance has become more and more popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geolocation features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted for supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.
Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet Allocation (LDA), influenza
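One common way to adapt LDA for supervised classification, sketched below, is to use per-document topic proportions as features for a conventional classifier and report the cross-validated confusion matrix. The toy corpus, topic count, and classifier choice are assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

# Tiny stand-in corpus: tweets labelled flu-related (1) or not (0).
tweets = [
    "fever and cough all week, staying home", "flu shot queue at the clinic",
    "terrible headache and chills tonight", "sore throat, think i caught the flu",
    "sunny day at the beach in dubai", "traffic is awful on the highway again",
    "great coffee at the new cafe", "watching the football match tonight",
] * 10
labels = np.array(([1] * 4 + [0] * 4) * 10)

# LDA topic proportions become the feature vectors for the classifier.
counts = CountVectorizer().fit_transform(tweets)
topics = LatentDirichletAllocation(n_components=5, random_state=0).fit_transform(counts)

preds = cross_val_predict(LogisticRegression(), topics, labels, cv=5)
print(confusion_matrix(labels, preds))
print("recall on flu tweets:", (preds[labels == 1] == 1).mean())
```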
Procedia PDF Downloads 121
25317 Characterization and Monitoring of the Yarn Faults Using Diametric Fault System
Authors: S. M. Ishtiaque, V. K. Yadav, S. D. Joshi, J. K. Chatterjee
Abstract:
The DIAMETRIC FAULTS system has been developed to capture a bi-directional image of yarn continuously and sequentially and to provide a detailed classification of faults. A novel mathematical framework developed on the acquired bi-directional images forms the basis of fault classification into four broad categories, namely Thick1, Thick2, Thin, and Normal Yarn. A discretised version of the Radon transformation has been used to convert the bi-directional images into one-dimensional signals. Images were divided into training and test sample sets. A Karhunen-Loève Transformation (KLT) basis is computed from the signals of the training-set images for each fault class, taking the six highest-energy eigenvectors. The fault class of a test image is identified by taking the Euclidean distance of its signal from its projection onto the KLT basis of each sample realization and fault class in the training set. The Euclidean distance, applied using various techniques, is used for classifying an unknown fault class, and an accuracy of about 90% is achieved in detecting the correct fault class with these techniques. The four broad fault classes were further subclassified into four subgroups based on user-set boundary limits for fault length and fault volume. The fault cross-sectional area and the fault length define the total volume of a fault. A distinct distribution of faults is found in terms of their volume and physical dimensions, which can be used for monitoring yarn faults. It has been shown from the configuration-based characterization and classification that spun yarn faults arising out of mass variation exhibit distinct characteristics in terms of their contours, sizes, and shapes, apart from their frequency of occurrence.
Keywords: Euclidean distance, fault classification, KLT, Radon transform
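The classification rule (project the Radon-domain signal onto each class's KLT basis and choose the class with the smallest Euclidean reconstruction residual) can be sketched directly. The six-component basis follows the abstract; the synthetic images, single projection angle, and two-class set are illustrative assumptions.

```python
import numpy as np
from skimage.transform import radon

def klt_basis(signals, k=6):
    # Per-class KLT basis: the k highest-energy eigenvectors
    # (the top-k right singular vectors of the centred signals).
    mean = signals.mean(axis=0)
    _, _, vt = np.linalg.svd(signals - mean, full_matrices=False)
    return mean, vt[:k]

def classify(signal, bases):
    # Pick the class whose KLT subspace reconstructs the signal best.
    best, best_err = None, np.inf
    for cls, (mean, basis) in bases.items():
        coeff = (signal - mean) @ basis.T
        err = np.linalg.norm(signal - (mean + coeff @ basis))
        if err < best_err:
            best, best_err = cls, err
    return best

def to_signal(image):
    # A single-angle Radon projection turns the 2-D image into a 1-D signal.
    return radon(image, theta=[0.0], circle=False).ravel()

rng = np.random.default_rng(1)
make = lambda level: to_signal(np.full((32, 32), level) + 0.02 * rng.random((32, 32)))
normal = np.stack([make(0.2) for _ in range(20)])   # stand-in "Normal Yarn"
thick = np.stack([make(0.5) for _ in range(20)])    # stand-in "Thick" faults
bases = {"normal": klt_basis(normal), "thick": klt_basis(thick)}
print(classify(thick[0], bases))                    # 'thick'
```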
Procedia PDF Downloads 267
25316 Spectroscopic Study of Tb³⁺ Doped Calcium Aluminozincate Phosphor for Display and Solid-State Lighting Applications
Authors: Sumandeep Kaur, Allam Srinivasa Rao, Mula Jayasimhadri
Abstract:
In recent years, rare earth (RE) ion-doped inorganic luminescent materials have attracted great attention due to their excellent physical and chemical properties. These materials offer high thermal and chemical stability and exhibit good luminescence properties due to the presence of RE ions; their luminescence is attributed to the intra-configurational f-f transitions of the RE ions. A series of Tb³⁺ doped calcium aluminozincate phosphors has been synthesized via the sol-gel method. The structural and morphological studies have been carried out by recording X-ray diffraction patterns and an SEM image. Luminescence spectra have been recorded for a comprehensive study of the luminescence properties. The XRD profile reveals a single-phase orthorhombic crystal structure with an average crystallite size of 65 nm, as calculated using the Debye-Scherrer equation. The SEM image exhibits a completely random, irregular morphology of micron-size particles in the prepared samples. The luminescence was optimized by varying the dopant Tb³⁺ concentration within the range from 0.5 to 2.0 mol%. The as-synthesized phosphors exhibit intense emission at 544 nm when pumped at a 478 nm excitation wavelength. The optimal Tb³⁺ concentration was found to be 1.0 mol% in the present host lattice. The decay curves show bi-exponential fitting for the as-synthesized phosphor. Colorimetric studies show green emission with CIE coordinates (0.334, 0.647) lying in the green region for the optimal Tb³⁺ concentration. This report reveals the potential utility of Tb³⁺ doped calcium aluminozincate phosphors for display and solid-state lighting devices.
Keywords: concentration quenching, phosphor, photoluminescence, XRD
Procedia PDF Downloads 155
25315 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media that can attach a user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation data and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize existing places based on visits by the same individuals. Then, we combine the Twitter data from the tracking results with questionnaire data to capture Twitter user profiles; to do so, we use frequency distribution analysis to determine the percentages of visitors. To validate the hypothesis, we compare these figures with local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and the questionnaire data. Thus, integrating Twitter data and survey data can reveal the profiles of social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
Procedia PDF Downloads 316
25314 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining
Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser
Abstract:
Coronary Artery Disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and support vector machines (SVM) are used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input (predictor) variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for evaluating and predicting CAD patients as compared to non-CAD ones. The application of data mining techniques to the analysis of coronary artery disease is a good method for investigating the existing relationships between variables.
Keywords: classification, coronary artery disease, data mining, knowledge discovery, rule extraction
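A minimal sketch of the reported comparison: train the three model families and derive sensitivity, specificity, and accuracy from each confusion matrix. The data below are a synthetic stand-in for the 4948-patient set, and the hyperparameters are library defaults rather than the study's settings.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 4948 patients, 24 predictors, binary CAD target.
X, y = make_classification(n_samples=4948, n_features=24, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "ann": MLPClassifier(max_iter=500, random_state=0),
    "svm": SVC(),
}
for name, model in models.items():
    y_pred = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    sens = tp / (tp + fn)            # recall on CAD patients
    spec = tn / (tn + fp)            # recall on non-CAD patients
    acc = (tp + tn) / (tp + tn + fp + fn)
    print(f"{name}: sensitivity={sens:.3f} specificity={spec:.3f} accuracy={acc:.3f}")
```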
Procedia PDF Downloads 662