Search results for: data mapping
24887 Autonomic Threat Avoidance and Self-Healing in Database Management System
Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik
Abstract:
Databases are key components of software systems. Due to the exponential growth of data, there is concern that data remain accurate and available. Data in databases are vulnerable to internal and external threats, especially when they contain sensitive information, as in medical or military applications. Whenever data are changed with malicious intent, analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. To guarantee the accuracy and availability of data, we propose a technique that, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode so that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.
Keywords: autonomic computing, self-healing, threat avoidance, security
Procedia PDF Downloads 504
24886 Information Extraction Based on Search Engine Results
Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk
Abstract:
Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, representing a new method of data gathering compared with traditional methods. Submitting queries for many keywords manually takes considerable time and effort, so we developed a user interface program that searches automatically, taking multiple keywords at the same time, and collects the wanted data without intervention. The collected raw data are then processed using mathematical and statistical methods to eliminate unwanted data and convert the remainder into usable form.
Keywords: search engines, information extraction, agent system
Procedia PDF Downloads 430
24885 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography
Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya
Abstract:
In today’s era, data security is an important and demanding concern because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve it; steganography, in contrast, hides the data in some cover file so that the very presence of communication is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies, etc., where highly confidential data are transferred. Finally, comparisons of the two techniques are given in tabular form.
Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography
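The least-significant-bit (LSB) embedding idea behind image and audio steganography can be sketched in a few lines. This is a generic illustration, not the authors' MATLAB implementation: the cover here is a plain byte string standing in for pixel or sample data, and the 32-bit length header is my own convention.

```python
def lsb_embed(cover: bytes, message: bytes) -> bytes:
    """Hide `message` in the least significant bits of `cover`.

    A 32-bit big-endian length header is embedded first so the
    message can be recovered without out-of-band information.
    """
    payload = len(message).to_bytes(4, "big") + message
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the LSB
    return bytes(stego)

def lsb_extract(stego: bytes) -> bytes:
    def read_bits(start, n):
        value = 0
        for i in range(n):
            value = (value << 1) | (stego[start + i] & 1)
        return value
    length = read_bits(0, 32)
    return bytes(read_bits(32 + 8 * k, 8) for k in range(length))

cover = bytes(range(256)) * 4          # stand-in for image pixel data
secret = b"top secret"
stego = lsb_embed(cover, secret)
print(lsb_extract(stego))              # b'top secret'
```

Because each cover byte changes by at most 1, the stego data is perceptually indistinguishable from the cover; combining this with RSA or DES, as the paper does, means the hidden payload is also unreadable if discovered.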
Procedia PDF Downloads 290
24884 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India
Authors: Anushtha Saxena
Abstract:
This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collection, storage, and analysis of consumers' data in order to use the generated data for profits, revenue, etc. It enables e-commerce companies to obtain better business opportunities, innovative products and services, a competitive edge over others, and millions in revenue. This paper analyses the issues and challenges that arise from data monetisation, some of which pertain to the right to privacy and the protection of e-commerce consumers' data. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India, and the Supreme Court of India recognized it as such in the landmark judgment of Justice K. S. Puttaswamy (Retd.) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals' right to privacy by using the data they collect and store for economic gain. The researcher has mainly focused on e-commerce companies such as online shopping websites to analyse the legal issue of data monetisation. In the age of the Internet of Things, people have shifted to online shopping because it is convenient, easy, flexible, comfortable, and time-saving. But at the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or by generating more data from the data already held. This violates individuals' right to privacy because consumers know little about what happens to the data they provide online; many times, data are collected without individuals' consent.
Data can be structured, unstructured, etc., and are used by analytics for monetisation. Indian legislation such as the Information Technology Act, 2000 does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could make a significant impact on data monetisation. It further studies the European Union General Data Protection Regulation and how that legislation could be helpful in the Indian scenario concerning e-commerce businesses and data monetisation.
Keywords: data monetization, e-commerce companies, regulatory framework, GDPR
Procedia PDF Downloads 120
24883 Liquid Illumination: Fabricating Images of Fashion and Architecture
Authors: Sue Hershberger Yoder, Jon Yoder
Abstract:
“The appearance does not hide the essence, it reveals it; it is the essence.”—Jean-Paul Sartre, Being and Nothingness Three decades ago, transarchitect Marcos Novak developed an early form of algorithmic animation he called “liquid architecture.” In that project, digitally floating forms morphed seamlessly in cyberspace without claiming to evolve or improve. Change itself was seen as inevitable. And although some imagistic moments certainly stood out, none was hierarchically privileged over another. That project challenged longstanding assumptions about creativity and artistic genius by posing infinite parametric possibilities as inviting alternatives to traditional notions of stability, originality, and evolution. Through ephemeral processes of printing, milling, and projecting, the exhibition “Liquid Illumination” destabilizes the solid foundations of fashion and architecture. The installation is neither worn nor built in the conventional sense, but—like the sensual art forms of fashion and architecture—it is still radically embodied through the logics and techniques of design. Appearances are everything. Surface pattern and color are no longer understood as minor afterthoughts or vapid carriers of dubious content. Here, they become essential but ever-changing aspects of precisely fabricated images. Fourteen silk “colorways” (a term from the fashion industry) are framed selections from ongoing experiments with intricate pattern and complex color configurations. Whether these images are printed on fabric, milled in foam, or illuminated through projection, they explore and celebrate the untapped potentials of the surficial and superficial. Some components of individual prints appear to float in front of others through stereoscopic superimpositions; some figures appear to melt into others due to subtle changes in hue without corresponding changes in value; and some layers appear to vibrate via moiré effects that emerge from unexpected pattern and color combinations. 
The liturgical atmosphere of Liquid Illumination is intended to acknowledge that, like the simultaneously sacred and superficial qualities of rose windows and illuminated manuscripts, artistic and religious ideologies are also always malleable. The intellectual provocation of this paper pushes the boundaries of current thinking concerning viable applications for fashion print designs and architectural images—challenging traditional boundaries between fine art and design. The opportunistic installation of digital printing, CNC milling, and video projection mapping in a gallery that is normally reserved for fine art exhibitions raises important questions about cultural/commercial display, mass customization, digital reproduction, and the increasing prominence of surface effects (color, texture, pattern, reflection, saturation, etc.) across a range of artistic practices and design disciplines.
Keywords: fashion, print design, architecture, projection mapping, image, fabrication
Procedia PDF Downloads 88
24882 Experiments on Weakly-Supervised Learning on Imperfect Data
Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler
Abstract:
Supervised predictive models require labeled data for training. Complete and accurately labeled data, i.e., a 'gold standard', are not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments demonstrating that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation
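The paper's central simulation result, that a model trained on imperfect labels can exceed the accuracy of those labels, can be reproduced with a toy stand-in. The sketch below is not the authors' SVM pipeline: it fits a simple 1-D threshold classifier to labels corrupted at a 40% error rate and evaluates it against the true labels.

```python
import random

random.seed(0)

# True concept: label is 1 iff x > 0.5. Training labels are flipped
# with probability `noise` (a 40% error rate, as in the simulations).
def make_data(n, noise):
    xs = [random.random() for _ in range(n)]
    true = [int(x > 0.5) for x in xs]
    noisy = [y ^ (random.random() < noise) for y in true]
    return xs, true, noisy

xs, true, noisy = make_data(5000, noise=0.40)

# "Train": pick the threshold that best fits the *noisy* labels.
candidates = [i / 100 for i in range(101)]
def fit(th):
    return sum((x > th) == bool(y) for x, y in zip(xs, noisy))
best = max(candidates, key=fit)

# Evaluate against the *true* labels.
test_acc = sum((x > best) == bool(y) for x, y in zip(xs, true)) / len(xs)
label_acc = sum(n == t for n, t in zip(noisy, true)) / len(xs)
print(f"label accuracy {label_acc:.2f}, model accuracy {test_acc:.2f}")
```

Because the label noise is symmetric, the noisy labels still agree with the true concept slightly more often near the correct threshold, so with enough samples the fitted model recovers a near-optimal decision rule despite training data that is only about 60% accurate.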
Procedia PDF Downloads 199
24881 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security
Authors: Kenneth Harper
Abstract:
Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs
Procedia PDF Downloads 19
24880 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads
Authors: Dražen Cvitanić, Biljana Maljković
Abstract:
This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed on continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speed data in the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed on spot speed data. Thus, it can be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.
Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency
Procedia PDF Downloads 452
24879 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly; it carries out model-based clustering using competitive learning and a self-steadying mechanism in a dynamic environment without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
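ART2 itself is fairly involved, but the general cluster-then-impute idea the abstract describes can be sketched with a simpler nearest-centroid scheme. The data and centroids below are made up, and ART2's unsupervised competitive learning is replaced by a fixed assignment step; the point is only the mechanism of filling gaps from a record's cluster.

```python
import math

# Toy numeric dataset with missing values (None). A cluster-based
# imputer assigns each row to its nearest centroid using only the
# observed attributes, then fills gaps from that cluster's values.
data = [
    [1.0, 2.0, None],
    [1.1, 1.9, 3.0],
    [0.9, 2.1, 2.8],
    [8.0, 8.2, 9.0],
    [7.9, None, 9.1],
    [8.1, 8.0, 8.9],
]
centroids = [[1.0, 2.0, 2.9], [8.0, 8.1, 9.0]]   # assume found by clustering

def nearest(row):
    def dist(c):
        # distance over observed attributes only
        return math.sqrt(sum((v - c[j]) ** 2
                             for j, v in enumerate(row) if v is not None))
    return min(centroids, key=dist)

def impute(row):
    c = nearest(row)
    return [c[j] if v is None else v for j, v in enumerate(row)]

completed = [impute(r) for r in data]
print(completed[0])   # [1.0, 2.0, 2.9]
print(completed[4])   # [7.9, 8.1, 9.1]
```

Imputing from the local cluster rather than the global mean is what makes this family of methods robust to heterogeneous data: each gap is filled with a value typical of similar records.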
Procedia PDF Downloads 274
24878 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast
Authors: Ruixia Liu
Abstract:
The Qinghai-Tibet Plateau has an important influence on precipitation in its lower reaches. Remote sensing data have their own advantages, and a numerical prediction model that assimilates RS data will perform better than one that does not. We obtained assimilated MHS, terrestrial, and sounding data from GSI, introduced the result into WRF, and obtained relative humidity (RH) and precipitation forecasts. By comparing the 1 h, 6 h, 12 h, and 24 h results, we found that assimilating MHS, terrestrial, and sounding data made the forecast of precipitation amount, area, and center more accurate. Analyzing the difference in the initial field, we found that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.
Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI
Procedia PDF Downloads 234
24877 Resource Allocation Scheme for IEEE 802.16 Networks
Authors: Elmabruk Laias
Abstract:
IEEE Standard 802.16 provides QoS (Quality of Service) for applications such as Voice over IP, video streaming, and high-bandwidth file transfer. In an IEEE 802.16 broadband wireless access system, a WiMAX TDD frame contains one downlink subframe and one uplink subframe. The capacity allocated to each subframe is a system parameter that should be determined based on the expected traffic conditions, so a proper resource allocation scheme for packet transmissions is imperatively needed. In this paper, we present a new resource allocation scheme, called additional bandwidth yielding (ABY), to improve the transmission efficiency of an IEEE 802.16-based network. Our proposed scheme can be adopted alongside existing scheduling algorithms and the multi-priority scheme without any change. The experimental results show that with ABY, packet queuing delay can be significantly improved, especially for service flows of higher-priority classes.
Keywords: IEEE 802.16, WiMAX, OFDMA, resource allocation, uplink-downlink mapping
Procedia PDF Downloads 475
24876 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies
Authors: Sook Ching Yee, Angela Siew Hoong Lee
Abstract:
Big data technologies have become a means to exploit business opportunities and provide valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM) to look into several constructs in the TAM together with additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study will be tested for the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Empirical research is used in this study by conducting a survey to collect data.
Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)
Procedia PDF Downloads 362
24875 Big Data Analysis with Rhipe
Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim
Abstract:
Rhipe, which integrates the R and Hadoop environments, makes it possible to process and analyze massive amounts of data using a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages available on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing that increases the number of map tasks as the size of the data grows. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that fully-distributed mode was faster than pseudo-distributed mode, and the computing speed of fully-distributed mode increased with the number of data nodes.
Keywords: big data, Hadoop, parallel regression analysis, R, Rhipe
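The reason regression parallelizes so well under map-reduce is that the least-squares normal equations decompose into per-chunk sufficient statistics: each map task computes its partition's contribution to X'X and X'y, and a single reduce sums them before solving. A pure-Python sketch for simple linear regression (not Rhipe itself; the "map" and "reduce" here are ordinary functions over hypothetical partitions):

```python
# Least squares decomposes over chunks: X'X and X'y are sums of
# per-chunk contributions, so each "map task" can process one
# partition and a single "reduce" combines the sufficient statistics.
def map_task(chunk):
    # sufficient statistics for y = a + b*x on one partition
    n = len(chunk)
    sx = sum(x for x, _ in chunk)
    sy = sum(y for _, y in chunk)
    sxx = sum(x * x for x, _ in chunk)
    sxy = sum(x * y for x, y in chunk)
    return (n, sx, sy, sxx, sxy)

def reduce_task(stats):
    return tuple(sum(col) for col in zip(*stats))

def solve(n, sx, sy, sxx, sxy):
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

data = [(x, 2.0 * x + 1.0) for x in range(1000)]         # exact line y = 1 + 2x
chunks = [data[i:i + 100] for i in range(0, 1000, 100)]  # ten "map" partitions
a, b = solve(*reduce_task([map_task(c) for c in chunks]))
print(round(a, 6), round(b, 6))   # 1.0 2.0
```

Because the combined statistics are identical to those computed on the full data, the distributed fit is exact, not an approximation; adding map tasks only changes how the sums are partitioned.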
Procedia PDF Downloads 497
24874 Assessment of Agricultural Damage under Different Simulated Flood Conditions
Authors: M. N. Kadir, M. M. H. Oliver, T. Naher
Abstract:
The study assesses the areal extent of riverine flooding in the flood-prone area of Faridpur District of Bangladesh using a hydrological model and a Geographic Information System (GIS). To prepare the inundation maps, flood frequency analysis was carried out to assess flooding for different flood magnitudes. Flood inundation maps were prepared based on the DEM and river discharge using the Delft3D model. LANDSAT satellite images were used to develop a land cover map of the study area, which was then used to map the cropland area. By overlaying the inundation maps on the land cover map, agricultural damage was assessed. Present monetary values of crop damage were collected through a field survey of actual floods in the study area. Two different inundation maps were produced from the model, for the years 2000 and 2016. In 2000, the flood began in July, whereas in 2016 it started in August. In both cases, most areas were found to be flooded in September, followed by flood recession. To prepare the land cover maps, four land cover categories were considered: cropland, water bodies, trees, and rivers. Of the 755,791 acres of Faridpur District, croplands accounted for 334,589 acres, followed by water bodies (279,900 acres), trees (101,930 acres), and rivers (39,372 acres). Damage assessment data revealed that 40% of the total cropland area was affected by the flood in 2000, whereas only 19% was affected by the 2016 flood. The study concluded that September is the critical month for cropland protection, since the highest flood is expected at this time of the year in Faridpur. The northwestern and southwestern parts of the district were categorized as most vulnerable to flooding.
Keywords: agricultural damage, Delft3D, flood management, land cover map
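The abstract does not state which distribution its flood frequency analysis used; a common choice for annual peak discharges is the Gumbel (EV1) distribution fitted by the method of moments. The sketch below uses hypothetical discharge values, not the Faridpur record.

```python
import math

# Hypothetical annual peak discharges (m^3/s); the real study would
# use gauged records for the study reach.
peaks = [2100, 1850, 2600, 1990, 3100, 2450, 2800, 2300, 2700, 2050,
         1900, 2950, 2500, 2200, 2400]

n = len(peaks)
mean = sum(peaks) / n
std = math.sqrt(sum((q - mean) ** 2 for q in peaks) / (n - 1))

def design_flood(T):
    """Gumbel (EV1) quantile by the method of moments:
    Q_T = mean + K_T * std, with frequency factor
    K_T = -(sqrt(6)/pi) * (0.5772 + ln(ln(T / (T - 1))))."""
    k = -(math.sqrt(6) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1))))
    return mean + k * std

for T in (2, 10, 50, 100):
    print(f"{T:>3}-year flood: {design_flood(T):.0f} m^3/s")
```

Design discharges for chosen return periods are what drive the hydraulic model: each quantile becomes a boundary condition for a separate inundation scenario.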
Procedia PDF Downloads 102
24873 Social Sustainability Quotient of Vertical Habitats
Authors: Abdullah Mohamed, Raipat Vaidehi
Abstract:
With increasing immigration to urban areas, every city is experiencing a shortage of housing. Vertical habitats are the only solution to this problem; it is hence important to make sure that these habitats are environmentally, socially, and economically sustainable. A lot of work on vertical habitats has already been carried out in terms of environmental and economic sustainability, so this research aims to study the social sustainability of vertical habitats. Being the least studied topic, it opens many realms of novelty and uniqueness. In this research, a user perception survey and various mapping methods have been used to study the social sustainability of existing vertical habitats in the selected cities. The aspects that can be used to define the social sustainability of any place include safety, equity, accessibility, legibility, imageability, readability, memorability, and ease of movement. This research would help evolve new strategies, in the form of design and/or guidelines, to make existing vertical habitats socially sustainable.
Keywords: user lifestyle, user perception, social sustainability, vertical habitats
Procedia PDF Downloads 309
24872 Security in Resource Constraints Network Light Weight Encryption for Z-MAC
Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy
Abstract:
A wireless sensor network is formed by a combination of nodes that systematically transmit data to their base stations. This transmitted data can easily be compromised, given the limited processing power of these nodes and the need for data consistency, and there is an ongoing discussion about how to achieve secure data transfer in real time. We present a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, utilizing the battery resources available in the sensor node. Our methodology takes advantage of the Z-MAC protocol for its efficiency and provides a unique key through a sharing mechanism that uses the neighbor node's MAC address. We present a lightweight data integrity layer embedded in the Z-MAC protocol and show that our protocol performs better than Z-MAC when different attack scenarios are introduced.
Keywords: hybrid MAC protocol, data integrity, lightweight encryption, neighbor-based key sharing, sensor node data processing, Z-MAC
Procedia PDF Downloads 144
24871 Survival Data with Incomplete Missing Categorical Covariates
Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar
Abstract:
Survival censored data with incomplete covariate data are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires the estimation of the parameters of the distribution of the covariates. In this paper, we consider some clinical trials with five covariates, four of which have some missing values, clearly showing that the data were fully censored.
Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull distribution
Procedia PDF Downloads 406
24870 A Study of Blockchain Oracles
Authors: Abdeljalil Beniiche
Abstract:
A limitation of smart contracts is that they cannot access external data that might be required to control the execution of business logic. Oracles can be used to provide external data to smart contracts. An oracle is an interface that delivers data from sources outside the blockchain to a smart contract to consume. Oracles can deliver different types of data depending on the industry and requirements. In this paper, we study and describe the widely used blockchain oracles. We then elaborate on their potential role, technical architecture, and design patterns. Finally, we discuss the human oracle and its key role in solving the truth problem by reaching a consensus about a certain inquiry or task.
Keywords: blockchain, oracles, oracles design, human oracles
Procedia PDF Downloads 136
24869 Groundwater Recharge Suitability Mapping Using an Analytical Hierarchy Process-Based Approach
Authors: Aziza Barrek, Mohamed Haythem Msaddek, Ismail Chenini
Abstract:
Excessive groundwater pumping due to increasing water demand, especially in the agricultural sector, causes groundwater scarcity. Groundwater recharge is the most important process contributing to the durability of the resource. This paper uses Analytic Hierarchy Process (AHP) multicriteria analysis to establish a groundwater recharge susceptibility map. To delineate aquifer suitability for groundwater recharge, eight parameters were used: soil type, land cover, drainage density, lithology, NDVI, slope, transmissivity, and rainfall. The impact of each factor was weighted, and the method was applied to the shallow aquifer of the El Fahs plain. Results suggest that 37% of the aquifer area has very good or good recharge suitability. The results were validated by the Receiver Operating Characteristic curve, and the prediction accuracy obtained was 89.3%.
Keywords: AHP, El Fahs aquifer, empirical formula, groundwater recharge zone, remote sensing, semi-arid region
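In AHP, the criterion weights come from the principal eigenvector of a reciprocal pairwise comparison matrix on Saaty's 1-9 scale. The sketch below uses a hypothetical 3-criterion matrix (the study weighted eight parameters, and its actual comparison judgments are not given here), with power iteration in place of a full eigensolver.

```python
# AHP weights come from the principal eigenvector of a reciprocal
# pairwise comparison matrix (Saaty's 1-9 scale). Hypothetical
# 3-criterion example: rainfall vs. slope vs. lithology.
A = [
    [1.0, 3.0, 5.0],   # rainfall moderately-to-strongly preferred
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):                       # power iteration
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]                   # normalize to sum to 1
    return w

w = ahp_weights(A)
print([round(x, 3) for x in w])     # roughly [0.637, 0.258, 0.105]

# Weighted overlay: suitability of one map cell, criteria scored 0-1.
cell = {"rainfall": 0.8, "slope": 0.5, "lithology": 0.9}
score = sum(wi * si for wi, si in zip(w, cell.values()))
print(round(score, 3))
```

In the GIS workflow, the last two lines are applied per raster cell: each reclassified criterion layer is multiplied by its AHP weight and the layers are summed into the suitability map.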
Procedia PDF Downloads 121
24868 Application of Lean Manufacturing in Brake Shoe Manufacturing Plant: A Case Study
Authors: Anees K. Ahamed, Aakash Kumar R. G., Raj M. Mohan
Abstract:
The main objective is to apply lean tools to identify and eliminate waste in and among the work stations so as to improve process speed and quality. Of the seven wastes in the lean concept, we consider the movement of materials, defects, and inventory for improvement, since these have the greatest impact on the performance measures. The layout was improved to reduce the movement of materials, and the reduction in movement among the work stations was quantified. Value stream mapping was used for the identification of waste. A cause-and-effect diagram and 5W analysis were used to identify the reasons for defects and to provide countermeasures. Some cycle time reduction techniques are also proposed to improve productivity. A lean audit check sheet was used to identify the current position of the industry and the gap to be closed to make the industry lean.
Keywords: cause and effect diagram, cycle time reduction, defects, lean, waste reduction
Procedia PDF Downloads 385
24867 Multi Data Management Systems in a Cluster Randomized Trial in Poor Resource Setting: The Pneumococcal Vaccine Schedules Trial
Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie
Abstract:
A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention; large-scale, cluster-randomized trials, however, are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary. Robust data management systems are crucial for optimizing and validating the quality, accuracy, and dependability of trial data. There is a scarcity of literature on the difficulties of data gathering in clinical trials in low-resource areas, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures, and publicizing the creative clinical data management techniques used in clinical trials should boost public confidence in study conclusions and encourage replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedule trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET, and developed data synchronization tools to integrate data from these systems into a central server for reporting. Clinician, laboratory, and field data validation systems and methodologies are the main topics of this report. Our process development efforts across all domains were driven by the complexity of research project data collected in real time, online reporting, data synchronization, and methods for cleaning and verifying data. We effectively used multiple data management systems, demonstrating the value of creative approaches in enhancing the consistency, accuracy, and reporting of trial data in a poor-resource setting.
Keywords: data management, data collection, data cleaning, cluster-randomized trial
Procedia PDF Downloads 27
24866 Engaging Students in Spatial Thinking through Design Education: Case Study of a Biomimicry Design Project in the Primary Classroom
Authors: Caiwei Zhu, Remke Klapwijk
Abstract:
Spatial thinking, a way of thinking based on the understanding and reasoning of spatial concepts and representations, is embedded in science, technology, engineering, arts, and mathematics (STEAM) learning. Aside from many studies that successfully used targeted training to improve students' spatial thinking skills, few have closely examined how spatial thinking can be trained in classroom settings. Design and technology education, which is receiving increasing attention towards its integration into formal curriculums, inherently encompasses a wide range of spatial activities, such as constructing mental representations of design ideas, mentally transforming objects and materials to form designs, visually communicating design plans through annotated drawings, and creating 2D and 3D design artifacts. Among design topics, biomimicry offers a unique avenue for students to recognize and analyze the shapes and structures in nature. By mapping the forms of plants and animals onto functions, students gain inspiration to solve human design challenges. This study is one of the first to highlight opportunities for training spatial thinking in a biomimicry design project for primary school students. Embracing the methodological principles of educational design-based research, this case study was conducted through iterations in the design of the intervention and collaboration with teachers. Data were harvested from small groups of 10- to 12-year-olds at an international school in the Netherlands. Classroom videos, semi-structured interviews with students, design drawings and artifacts, formative assessment, and the pre- and post-intervention spatial tests triangulate evidence of students' spatial thinking.
In addition to contributing to a theory of integrating spatial thinking in the primary curriculum, mechanisms underlying such improvement in spatial thinking are explored and discussed.Keywords: biomimicry, design and technology education, primary education, spatial thinking
Procedia PDF Downloads 76
24865 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering
Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining
Abstract:
DNA microarray technology is used to analyze thousands of gene expression values simultaneously and is very important for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used to analyze gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identify structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to obtain a normalized matrix of gene expression data, followed by the Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data. Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)
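The SVD-based normalization step can be sketched generically as a low-rank approximation that suppresses noise before clustering. This is an illustrative sketch with a random matrix standing in for real expression data, not the authors' Mixed-Clustering or Lift implementation:

```python
import numpy as np

# Toy gene expression matrix: rows = genes, columns = conditions/samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))

# Thin SVD: X = U @ diag(s) @ Vt, singular values s sorted descending.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def rank_k_approx(k):
    """Keep only the k largest singular values: a simple 'normalized'
    matrix that retains dominant expression patterns and drops noise."""
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Approximation error (Frobenius norm) shrinks as more components are kept.
err1 = np.linalg.norm(X - rank_k_approx(1))
err4 = np.linalg.norm(X - rank_k_approx(4))
```

Biclustering would then operate on the low-rank matrix, where coherent gene/condition submatrices are easier to isolate.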
Procedia PDF Downloads 279
24864 An Efficient Traceability Mechanism in the Audited Cloud Data Storage
Authors: Ramya P, Lino Abraham Varghese, S. Bose
Abstract:
With cloud storage services, data can be stored in the cloud and shared across multiple users. Unexpected hardware/software failures and human errors can easily cause data stored in the cloud to be lost or corrupted, affecting its integrity. Some mechanisms have been designed to allow both data owners and public verifiers to efficiently audit cloud data integrity without retrieving the entire data from the cloud server. However, public auditing of the integrity of shared data with existing mechanisms will unavoidably reveal confidential information, such as the identity of the signer, to public verifiers. Here a privacy-preserving mechanism is proposed to support public auditing of shared data stored in the cloud. It uses group signatures to compute the verification metadata needed to audit the correctness of shared data. The identity of the signer of each block in the shared data is kept confidential from public verifiers, who can easily verify shared data integrity without retrieving the entire file; on demand, the signer of each block is revealed to the owner alone. The group private key is generated once by the owner in a static group, whereas in a dynamic group the group private key changes when users are revoked from the group. When users leave the group, blocks they have already signed are re-signed by the cloud service provider instead of the owner, which is handled by an efficient proxy re-signature scheme. Keywords: data integrity, dynamic group, group signature, public auditing
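The core idea of auditing integrity without downloading the whole file can be shown with a much-simplified challenge-response sketch. Real schemes like the one above use group signatures and homomorphic authenticators; this toy version uses per-block HMAC tags and offers none of their anonymity or public-verifiability properties:

```python
import hashlib
import hmac
import secrets

# Owner side: split the file into blocks and tag each block with a keyed MAC.
key = secrets.token_bytes(32)                 # owner's secret auditing key
blocks = [b"block-%d" % i for i in range(10)]
tags = [hmac.new(key, blk, hashlib.sha256).digest() for blk in blocks]
# blocks + tags are uploaded to the cloud; the owner retains only `key`.

def cloud_respond(index):
    """Cloud server returns only the challenged block and its stored tag."""
    return blocks[index], tags[index]

def audit(index):
    """Auditor (holding the key) spot-checks one block, not the whole file."""
    block, tag = cloud_respond(index)
    expected = hmac.new(key, block, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)
```

Random spot-checks of a few block indices give probabilistic assurance that the stored file is intact, which is the efficiency argument behind such auditing schemes.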
Procedia PDF Downloads 392
24863 Land Use Changes and Its Implications on Livelihood Activities in Msaranga Peri-Urban Settlement in Moshi Municipality, Tanzania
Authors: Magigi Wakuru, Gaudensi Kapinga
Abstract:
This study examines land use changes and their implications for the livelihood activities of peri-urban settlements in Msaranga, Moshi Municipality. Specifically, it analyses the historical development of the settlement, socioeconomic characteristics, and land use changes over time; identifies existing livelihood activities and how they have been changing in the context of urbanization; and highlights the implications of land use change for residents' livelihood activities. Interviews, observations, documentary reviews, and mapping were the data collection tools employed. The study shows that housing, urban agriculture, road infrastructure, recreational areas, open spaces, and institutions are some of the land use types existing in the settlement. On-farm and off-farm livelihood activities were identified in the settlement, including crop cultivation, livestock keeping, trading, and formal employment, and these have been changing over time. Urbanisation was observed to be a catalyst of this change, affecting livelihood activities over time. Resorting to off-farm livelihood activities, including engaging in retail business and seeking employment in the formal and informal sectors, are some of the coping strategies documented. The study winds up by pointing out the roles of different actors and the issues requiring particular attention from different stakeholders towards reducing the impact of land use changes on livelihood strategies in the settlement. Unresolved issues for future research and the policy development agenda are also highlighted.
The study concludes that addressing the impact of land use changes on livelihood activities needs the collaborative effort of different stakeholders, policy enforcement, and public-private partnership in issue-based implementation in cities like Moshi, where land use is rapidly changing within urban planning cycles due to increasing population demand in the cities of Sub-Saharan Africa. Keywords: land use, land use changes, livelihood activities, peri-urban settlement, Moshi, Tanzania
Procedia PDF Downloads 325
24862 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption
Authors: Jerlin George, R. Chitra
Abstract:
Devices with sensors that can monitor temperature, heart rate, and other vital signs and link to the internet, known as the Internet of Things (IoT), have completely transformed the way we manage health. By providing real-time health data, these sensors improve diagnostics and treatment outcomes. Security and privacy matter when IoT comes into play in healthcare, and cyberattacks on centralized database systems are also a problem. To solve these challenges, the study uses blockchain technology coupled with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Transparency and data integrity are ensured by the blockchain, and secure data sharing among authorized users is made possible by proxy re-encryption. This results in a health monitoring system that preserves the accuracy and confidentiality of data while reducing the safety risks of IoT-driven healthcare applications. Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security
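Proxy re-encryption lets an untrusted proxy transform a ciphertext encrypted for one party into one decryptable by another, without ever seeing the plaintext. A toy number-theoretic sketch in the spirit of the classic ElGamal-based BBS98 scheme (tiny, insecure parameters for illustration only; the paper's actual construction is not specified here):

```python
# Toy BBS98-style proxy re-encryption over a small prime field.
# Demo-only parameters; real deployments use large groups or elliptic curves.
p = 467                 # small prime modulus
g = 2                   # fixed base element
a = 5                   # Alice's secret key (invertible mod p-1)
b = 9                   # Bob's secret key (invertible mod p-1)

def encrypt_for_alice(m, r):
    # Ciphertext (m * g^r, g^(a*r)); only Alice, or a re-encrypted copy, opens it.
    return (m * pow(g, r, p) % p, pow(g, a * r % (p - 1), p))

# Re-encryption key rk = b/a mod (p-1); the proxy learns neither a nor b alone.
rk = b * pow(a, -1, p - 1) % (p - 1)

def reencrypt(c):
    c1, c2 = c
    return (c1, pow(c2, rk, p))          # transforms g^(a*r) into g^(b*r)

def decrypt_by_bob(c):
    c1, c2 = c
    gr = pow(c2, pow(b, -1, p - 1), p)   # recover g^r using Bob's key
    return c1 * pow(gr, -1, p) % p       # strip the mask g^r to get m

ciphertext = encrypt_for_alice(42, r=123)
shared = reencrypt(ciphertext)
```

In the health-monitoring setting, the cloud plays the proxy: a patient's sensor data stays encrypted end to end, and re-encryption keys grant individual clinicians access without handing the cloud any decryption capability.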
Procedia PDF Downloads 18
24861 A Socio-Spatial Analysis of Financialization and the Formation of Oligopolies in Brazilian Basic Education
Authors: Gleyce Assis Da Silva Barbosa
Abstract:
In recent years, we have witnessed the vertiginous growth of large education companies. Daughters of national and global capital, these companies expand both through consolidated physical networks, in the form of branches spread across the territory, and through institutional networks, such as business networks formed by mergers, acquisitions, the creation of new companies, and influence. They do this by incorporating small, medium, and large schools and universities, teaching systems, and other products and services. They are also able to weave their webs, directly or indirectly, in philanthropic circles, limited partnerships, family businesses, and even public education, through various mechanisms of outsourcing, privatization, and the commercialization of products for the sector. Although the growth of these groups in basic education seems to be a recent phenomenon in peripheral countries such as Brazil, their diffusion is closely linked to higher education conglomerates and other sectors of the economy forming oligopolies, which began to expand in the 1990s with strong state support and through political reforms that redefined the state's role, transforming it into a fundamental agent in the formulation of guidelines to boost the incorporation of neoliberal logic. This expansion occurred through the objectification of education, commodifying it and transforming students into consumer-clients. Financial power combined with the neoliberalization of state public policies allowed the spread of social exclusion, the increase of individuals without access to basic services, deindustrialization, automation, capital volatility, and the indetermination of the economy; in addition, this process causes capital to be valued and devalued at rates never seen before, which together generates various impacts, such as the precariousness of work. Understanding the connection between these processes, which engender the economy, allows us to see their consequences in labor relations and in the territory.
In this sense, it is necessary to analyze the geographic-economic context and the role of the agents facilitating this process, which can give us clues about the ongoing transformations and the direction of education on the national and even international scene, since this process is linked to the multiple scales of financial globalization. The present research therefore has the general objective of analyzing the socio-spatial impacts of financialization and the formation of oligopolies in Brazilian basic education. The methodology combined a survey of laws, data, and public policies on the subject; data from these companies available on investor-relations websites; a survey of information from global and national companies operating in Brazilian basic education; and a mapping of the expansion of educational oligopolies using public data on school locations. With this, the research intends to provide information about the ongoing commodification process in the country and to discuss the consequences of the oligopolization of education, considering the impacts that financialization can bring to teaching work. Keywords: financialization, oligopolies, education, Brazil
Procedia PDF Downloads 64
24860 Rodriguez Diego, Del Valle Martin, Hargreaves Matias, Riveros Jose Luis
Authors: Nathainail Bashir, Neil Anderson
Abstract:
The objective of this study was to investigate the current state of practice with regard to karst detection methods and to recommend the best method and pattern of arrays to acquire the desired results. Proper site investigation in karst-prone regions is extremely valuable in determining the location of possible voids. Two geophysical techniques were employed: multichannel analysis of surface waves (MASW) and electrical resistivity tomography (ERT). The MASW data were acquired at each test location using different array lengths and different array orientations (to increase the probability of getting interpretable data in karst terrain). The ERT data were acquired using a dipole-dipole array consisting of 168 electrodes. The MASW data were interpreted (re: estimated depth to the physical top of rock) and used to constrain and verify the interpretation of the ERT data. The ERT data indicate that poorer-quality MASW data were acquired in areas with significant local variation in the depth to the top of rock. Keywords: dipole-dipole, ERT, karst terrains, MASW
Procedia PDF Downloads 315
24859 Data Science in Military Decision-Making: A Semi-Systematic Literature Review
Authors: H. W. Meerveld, R. H. A. Lindelauf
Abstract:
In contemporary warfare, data science is crucial for the military in achieving information superiority. Yet, to the authors' knowledge, no extensive literature survey on data science in military decision-making has been conducted so far. In this study, 156 peer-reviewed articles were analysed through an integrative, semi-systematic literature review to gain an overview of the topic. The study examined to what extent the literature focuses on the opportunities or the risks of data science in military decision-making, differentiated per level of war (i.e. the strategic, operational, and tactical levels). A relatively large focus on the risks of data science was observed in the social science literature, implying that political and military policymakers are disproportionately influenced by a pessimistic view of the application of data science in the military domain. The perceived risks of data science are, however, hardly addressed in the formal science literature. This means that concerns about the military application of data science are not addressed to the audience that can actually develop and enhance data science models and algorithms. Cross-disciplinary research on both the opportunities and risks of military data science can address the observed research gaps. Considering the levels of war, relatively little attention to the operational level compared to the other two levels was observed, suggesting a research gap with respect to military operational data science. Opportunities for military data science mostly arise at the tactical level. By contrast, studies examining strategic issues mostly emphasise the risks of military data science. Consequently, domain-specific requirements for military strategic data science applications are hardly expressed. Lacking such applications may ultimately lead to suboptimal strategic decisions in today's warfare. Keywords: data science, decision-making, information superiority, literature review, military
Procedia PDF Downloads 168
24858 Studying Methodological Maps on the Engineering Education Program
Authors: Elsaed Elsaed
Abstract:
Despite the constant progress information and communication technology brings to our daily lives, and the abundance of research activity on the associated hardware and software and on developing and improving their performance, there is still a need to bring the combined solutions together in one place. A systematic mapping study was conducted to investigate the contributions that have been made, the areas of knowledge that are explored further, and the aspects of research used to build a common understanding of the latest technology in software engineering education, which we have categorized into a well-defined engineering framework. The study provides an overview of current research topics and trends and their distribution by type of research and scope of application. In addition, the topics were grouped, and a list of proposed methods, frameworks, and tools was compiled. The map shows that the current research impact is limited to a few areas of knowledge, and that further work is needed to chart a future path that fills the gaps in instruction activities. Keywords: methodological maps, engineering education program, literature survey, communication technology
Procedia PDF Downloads 140