Search results for: named data networking

24365 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland

Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi

Abstract:

Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive health care services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications for theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, the potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from the case study and the economic properties of the healthcare industry in Finland.

Keywords: ecosystem, business model, personal data, preventive healthcare

Procedia PDF Downloads 249
24364 Psychological Variables of Sport Participation and Involvement among Student-Athletes of Tertiary Institutions in South-West, Nigeria

Authors: Mayowa Adeyeye

Abstract:

This study was conducted to investigate the psychological variables motivating sport participation and involvement among student-athletes of tertiary institutions in south-west Nigeria. One thousand three hundred and fifty (N = 1350) student-athletes were randomly selected across all sports from nine tertiary institutions in south-west Nigeria. These tertiary institutions include University of Lagos, Lagos State University, Obafemi Awolowo University, Osun State University, University of Ibadan, University of Agriculture Abeokuta, Federal University of Technology Akungba, University of Ilorin, and Kwara State University. The descriptive survey research method was adopted, while a self-developed, validated Likert-type questionnaire named the Sport Participation Scale (SPS) was used to elicit opinions from respondents. The test-retest reliability value obtained for the instrument, using the Pearson Product Moment Correlation Coefficient, was 0.96. Of the one thousand three hundred and fifty (N = 1350) questionnaires administered, only one thousand two hundred and eighty-six (N = 1286) were correctly filled, coded and analysed using the inferential statistics of chi-square (χ²), with all tested hypotheses set at the .05 alpha level. Based on the findings of this study, the results revealed that several psychological factors influence student-athletes to continue participating in sport, including love for the game, famous athletes as role models, and family support. However, the analysis further revealed that the stipends the student-athletes receive from their universities have no influence on their participation and involvement in sport.

Keywords: sport participation, involvement, student-athletes, role model, family, peer

Procedia PDF Downloads 427
24363 Enhancing of Flame Retardancy and Hydrophobicity of Cotton by Coating a Phosphorous, Silica, Nitrogen Containing Bio-Flame Retardant Liquid for Upholstery Application

Authors: Li Maksym, Prabhakar M. N., Jung-Il Song

Abstract:

In this study, a flame-retardant and hydrophobic cotton textile was prepared by utilizing a renewable, halogen-free, bio-based solution of chitosan, urea, and phytic acid, named bio-flame retardant liquid (BFL), through a facile dip-coating technology. Deposition of BFL on the surface of the cotton was confirmed by Fourier-transform infrared spectroscopy and scanning electron microscopy coupled with energy-dispersive X-ray spectrometry. The thermal and flame-retardant properties of the cottons were studied with thermogravimetric analysis, differential scanning calorimetry, the vertical flame test, and the cone calorimeter test. With only 8.8% dry weight gain, the treated cotton showed self-extinguishing behaviour during the fire test. The cone calorimeter test revealed a reduction of the peak heat release rate from 203.2 to 21 kW/m² and of the total heat release from 20.1 to 2.8 MJ/m². Notably, BFL remarkably improved the thermal stability of the flame-retardant cotton, expressed as an enhanced amount of char at 700 °C (6.7 vs. 33.5%). BFL initiates the formation of a phosphorus- and silica-containing char layer which restrains the propagation of heat and oxygen to the unburned material, strengthened by the liberation of non-combustible gases that reduce the concentration of flammable volatiles and oxygen, hence reducing the flammability of cotton. In addition, hydrophobicity and a specific ignition test for upholstery application were performed. Taken together, the proposed flame-retardant cotton is potentially suitable for use as an upholstery material in public transport.

Keywords: cotton fabric, flame retardancy, surface coating, intumescent mechanism

Procedia PDF Downloads 92
24362 Design of an Instrumentation Setup and Data Acquisition System for a GAS Turbine Engine Using Suitable DAQ Software

Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman

Abstract:

An engine test-bed system is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analyzed data. In this paper, we present the design of a digital Data Acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analyzed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to the vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens but also records the data for further processing and archiving.

Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation

Procedia PDF Downloads 123
24361 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics, applying a range of mathematical, statistical and rule-based approaches, can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study aims to combine these two concepts to address the water end-use disaggregation problem by applying a wide range of advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The obtained results show that the recognition accuracies of all end-uses have significantly increased, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
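The abstract does not spell out the specific network architecture or features, but the general idea of classifying water end-use events from paired water-energy measurements can be sketched as below; the feature names, synthetic data and MLP classifier are illustrative assumptions, not the study's actual model.

```python
# Hypothetical sketch: classifying water end-use events from paired water-energy
# features with a small neural network. Features, categories, and data are
# illustrative; the paper's actual model and dataset are not reproduced here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Each event: [duration (s), volume (L), peak flow (L/min), energy (Wh)]
def synth_events(n, mean, spread, label):
    X = rng.normal(mean, spread, size=(n, 4))
    return X, np.full(n, label)

X_cw, y_cw = synth_events(300, [2400, 60, 9, 500], [300, 10, 1, 60], 0)   # clothes washer
X_dw, y_dw = synth_events(300, [4500, 15, 4, 900], [400, 3, 0.5, 80], 1)  # dishwasher
X_sh, y_sh = synth_events(300, [420, 50, 8, 1500], [90, 10, 1, 200], 2)   # shower

X = np.vstack([X_cw, X_dw, X_sh])
y = np.concatenate([y_cw, y_dw, y_sh])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("accuracy:", clf.score(scaler.transform(X_test), y_test))
```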

Keywords: deep learning network, smart metering, water end use, water-energy data

Procedia PDF Downloads 306
24360 The Term of Intellectual Property and Artificial Intelligence

Authors: Yusuf Turan

Abstract:

The World Intellectual Property Organization defines intellectual property as follows: "Intellectual property (IP) refers to creations of the mind, such as inventions; literary and artistic works; designs; and symbols, names and images used in commerce." There are two important points in this definition: intellectual property is the result of intellectual activities carried out by one or more PERSONS, and it takes the form of INNOVATION. When the history and development of the relevant definitions are briefly examined, it becomes clear that these two points have remained constant and that intellectual property law and rights have been shaped around them. With the expansion of the scope of the term intellectual property as a result of the development of technology, especially in the field of artificial intelligence, questions such as "Can artificial intelligence be an inventor?" need to be resolved within this expanding scope. In recent years, it was ruled in the USA that the artificial intelligence named DABUS did not meet the definition of an "individual" and therefore could not be recognized as an inventor. With developing technology, it is obvious that we will encounter such situations much more frequently in the field of intellectual property. While expanding the scope, we must clearly determine the boundaries of how to decide who performs the mental activity or creativity that we regard as indispensable for inventorship. As a result of all these problems and innovative situations, it is clear that not only intellectual property law and rights but also their definitions need to be updated and improved. Ignoring situations that fall outside the scope of the current term of intellectual property is not enough to solve the problem and creates uncertainty. The fact that laws and definitions that have been operating on the same theories for years exclude today's innovative technologies from their scope contradicts intellectual property, which is described as a new and innovative field. Today, as a result of the creation of poetry, painting, animation, music and even theatre works with artificial intelligence, it must be recognized that the definition of intellectual property has to be revised.

Keywords: artificial intelligence, innovation, the term of intellectual property, right

Procedia PDF Downloads 70
24359 Strategy Management of Soybean (Glycine max L.) for Dealing with Extreme Climate through the Use of Cropsyst Model

Authors: Aminah Muchdar, Nuraeni, Eddy

Abstract:

The aims of the research are: (1) to verify the CropSyst crop model against experimental field data for soybean and (2) to predict the planting time and potential yield of soybean with the use of the CropSyst model. The research is divided into several stages: (1) the calibration stage, conducted in the field from June until September 2015, and (2) the model application stage, where the data obtained from the field calibration are entered into the CropSyst model. The data required by the model are climate data, soil data, and crop genetic data. The agreement between the field observations and the CropSyst simulation is indicated by an Efficiency Index (EF) of 0.939, showing that the model performs well. The calculated RRMSE of 1.922% shows that the prediction error of the simulation relative to the results obtained in the field is about 1.92%. It is concluded that the CropSyst-based prediction of soybean planting time is valid for use, and that the appropriate planting time for soybean, mainly on rain-fed land, is at the end of the rainy season; in this study, the first planting time (June 2, 2015) gave the highest production, because at that time there was still some rain. The Tanggamus variety is more tolerant of delayed planting, as its percentage yield decrease per decade is lower than the average of all varieties.
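For readers unfamiliar with the two goodness-of-fit statistics quoted above, the sketch below computes an efficiency index (in its common Nash-Sutcliffe form) and the relative RMSE in percent; the paper's exact formulations may differ, and the yield values used here are illustrative only.

```python
# A minimal sketch of the goodness-of-fit statistics mentioned in the abstract,
# assuming the common definitions of the efficiency index and relative RMSE.
import numpy as np

def efficiency_index(observed, simulated):
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def rrmse_percent(observed, simulated):
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return 100.0 * rmse / observed.mean()

# Illustrative soybean yield data (t/ha), not the study's measurements
obs = [2.1, 2.4, 1.9, 2.6, 2.2]
sim = [2.0, 2.5, 1.8, 2.7, 2.2]
print(f"EF = {efficiency_index(obs, sim):.3f}, RRMSE = {rrmse_percent(obs, sim):.2f}%")
```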

Keywords: soybean, CropSyst, calibration, efficiency index, RRMSE

Procedia PDF Downloads 180
24358 Cross-Language Variation and the ‘Fused’ Zone in Bilingual Mental Lexicon: An Experimental Research

Authors: Yuliya E. Leshchenko, Tatyana S. Ostapenko

Abstract:

Language variation is a widespread linguistic phenomenon which can affect different levels of a language system: phonological, morphological, lexical, syntactic, etc. It is obvious that the scope of possible standard alternations within a particular language is limited by a variety of its norms and regulations which set more or less clear boundaries for what is possible and what is not possible for the speakers. The possibility of lexical variation (alternate usage of lexical items within the same contexts) is based on the fact that the meanings of words are not clearly and rigidly defined in the consciousness of the speakers. Therefore, lexical variation is usually connected with unstable relationship between words and their referents: a case when a particular lexical item refers to different types of referents, or when a particular referent can be named by various lexical items. We assume that the scope of lexical variation in bilingual speech is generally wider than that observed in monolingual speech due to the fact that, besides ‘lexical item – referent’ relations it involves the possibility of cross-language variation of L1 and L2 lexical items. We use the term ‘cross-language variation’ to denote a case when two equivalent words of different languages are treated by a bilingual speaker as freely interchangeable within the common linguistic context. As distinct from code-switching which is traditionally defined as the conscious use of more than one language within one communicative act, in case of cross-language lexical variation the speaker does not perceive the alternate lexical items as belonging to different languages and, therefore, does not realize the change of language code. In the paper, the authors present research of lexical variation of adult Komi-Permyak – Russian bilingual speakers. The two languages co-exist on the territory of the Komi-Permyak District in Russia (Komi-Permyak as the ethnic language and Russian as the official state language), are usually acquired from birth in natural linguistic environment and, according to the data of sociolinguistic surveys, are both identified by the speakers as coordinate mother tongues. The experimental research demonstrated that alternation of Komi-Permyak and Russian words within one utterance/phrase is highly frequent both in speech perception and production. Moreover, our participants estimated cross-language word combinations like ‘маленькая /Russian/ нывка /Komi-Permyak/’ (‘a little girl’) or ‘мунны /Komi-Permyak/ домой /Russian/’ (‘go home’) as regular/habitual, containing no violation of any linguistic rules and being equally possible in speech as the equivalent intra-language word combinations (‘учöтик нывка’ /Komi-Permyak/ or ‘идти домой’ /Russian/). All the facts considered, we claim that constant concurrent use of the two languages results in the fact that a large number of their words tend to be intuitively interpreted by the speakers as lexical variants not only related to the same referent, but also referring to both languages or, more precisely, to none of them in particular. Consequently, we can suppose that bilingual mental lexicon includes an extensive ‘fused’ zone of lexical representations that provide the basis for cross-language variation in bilingual speech.

Keywords: bilingualism, bilingual mental lexicon, code-switching, lexical variation

Procedia PDF Downloads 148
24357 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful in reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential in facilitating the management of a patient with a specific disease. Therefore, health interventions or lifestyle changes can be implemented based on these models to improve the health conditions of the individuals at risk.
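As a rough illustration of the comparison described above (outside SPSS/Clementine), the following sketch trains a decision tree and a small neural network on synthetic binary risk-factor data and reports accuracy, sensitivity and precision; the features echo the abstract, but the data and models are placeholders, not the hospital dataset.

```python
# Hedged sketch comparing a decision tree and a neural network on tabular
# risk-factor data; the labels below are generated synthetically.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, recall_score, precision_score

rng = np.random.default_rng(1)
n = 350
# Binary risk factors: hypertension, dyslipidemia, smoking, diabetes, A+ blood group
X = rng.integers(0, 2, size=(n, 5))
logit = -2.0 + X @ np.array([1.2, 0.9, 1.0, 1.1, 0.4])
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # synthetic MI label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)

models = [("decision tree", DecisionTreeClassifier(max_depth=4, random_state=1)),
          ("neural network", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1))]
for name, model in models:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name,
          "accuracy", round(accuracy_score(y_te, pred), 3),
          "sensitivity", round(recall_score(y_te, pred), 3),
          "precision", round(precision_score(y_te, pred, zero_division=0), 3))
```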

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 429
24356 Prediction of Antibacterial Peptides against Propionibacterium acnes from the Peptidomes of Achatina fulica Mucus Fractions

Authors: Suwapitch Chalongkulasak, Teerasak E-Kobon, Pramote Chumnanpuen

Abstract:

Acne vulgaris is a common skin disease mainly caused by the Gram-positive pathogenic bacterium Propionibacterium acnes. This bacterium stimulates the inflammation process in human sebaceous glands. The giant African snail (Achatina fulica) is an alien species that reproduces rapidly and seriously damages agricultural products in Thailand. There have been several research reports on the medical and pharmaceutical benefits of the peptides and proteins in this snail's mucus. This study aimed to predict in silico multifunctional bioactive peptides from the A. fulica mucus peptidome using several bioinformatic tools for the determination of antimicrobial (iAMPpred), anti-biofilm (dPABBs), cytotoxic (Toxinpred), cell-membrane-penetrating (CPPpred) and anti-quorum-sensing (QSPpred) peptides. Three candidate peptides with the highest predictive scores were selected and re-designed/modified to improve the required activities. The structural and physicochemical properties of the six anti-P. acnes (APA) peptide candidates were determined with the PEP-FOLD3 program and the five aforementioned tools. All candidates had a random-coil structure and were named APA1-ori, APA2-ori, APA3-ori, APA1-mod, APA2-mod and APA3-mod. To validate the APA activity, these peptide candidates were synthesized and tested against six isolates of P. acnes. The modified APA peptides showed high APA activity against some isolates. Therefore, our biomimetic mucus peptides could be useful for preventing acne vulgaris and should be further examined for other activities important to medical and pharmaceutical applications.

Keywords: Propionibacterium acnes, Achatina fulica, peptidomes, antibacterial peptides, snail mucus

Procedia PDF Downloads 133
24355 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning

Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul

Abstract:

In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes, or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracting features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which is then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power-line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that our proposed algorithm achieves a classification accuracy of 92%.
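A minimal sketch of the dictionary-learning idea is shown below: a complete dictionary is learned directly from raw signal segments and the resulting sparse codes are fed to a classifier. The synthetic waveforms, dictionary size and logistic-regression classifier are assumptions for illustration; the paper's actual pipeline on the QT database is not reproduced.

```python
# Sketch of data-driven dictionary learning and sparse-coding-based classification
# of heartbeat segments; synthetic waveforms stand in for real ECG records.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 128)

def beats(n, freq, label):
    X = np.sin(2 * np.pi * freq * t) + 0.2 * rng.normal(size=(n, t.size))
    return X, np.full(n, label)

X0, y0 = beats(200, 3.0, 0)   # "normal" morphology (illustrative)
X1, y1 = beats(200, 5.0, 1)   # "abnormal" morphology (illustrative)
X, y = np.vstack([X0, X1]), np.concatenate([y0, y1])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)

# Learn a complete dictionary directly from the raw segments (no hand-crafted features)
dico = MiniBatchDictionaryLearning(n_components=128, alpha=1.0, random_state=2)
dico.fit(X_tr)

clf = LogisticRegression(max_iter=1000).fit(dico.transform(X_tr), y_tr)
print("accuracy:", clf.score(dico.transform(X_te), y_te))
```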

Keywords: electrocardiogram, dictionary learning, sparse coding, classification

Procedia PDF Downloads 386
24354 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data was proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes were calculated, and the minimum deletion cost was compared with a pre-defined threshold. If the former was greater than or equal to the latter, all remaining nodes were retained and the curve's compression process was finished. Otherwise, the node with the minimal deletion cost was deleted, the deletion costs of its two neighbours were updated, and the same loop was repeated on the compressed curve until termination. In several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experimental results showed that DCA outperformed DPA in both compression accuracy and execution efficiency.
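The deletion loop described above can be sketched as follows. The abstract does not define the deletion-cost function, so the triangle area formed by a node and its two neighbours is used here as a plausible stand-in (an assumption on our part); the threshold and sample curve are likewise illustrative.

```python
# Sketch of the deletion-cost loop; for brevity all costs are recomputed each pass,
# whereas the paper updates only the two neighbours of the deleted node.
def triangle_area(a, b, c):
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def compress_curve(points, threshold):
    pts = list(points)
    while len(pts) > 2:
        # deletion cost of every middle node (assumed cost: triangle area)
        costs = [triangle_area(pts[i - 1], pts[i], pts[i + 1]) for i in range(1, len(pts) - 1)]
        i_min = min(range(len(costs)), key=costs.__getitem__)
        if costs[i_min] >= threshold:   # minimum cost reaches the threshold: keep all remaining nodes
            break
        del pts[i_min + 1]              # otherwise delete the cheapest node and loop again
    return pts

curve = [(0, 0), (1, 0.1), (2, -0.05), (3, 2.0), (4, 2.1), (5, 0.0)]
print(compress_curve(curve, threshold=0.5))
```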

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 251
24353 Multimedia Container for Autonomous Car

Authors: Janusz Bobulski, Mariusz Kubanek

Abstract:

The main goal of the research is to develop a multimedia container structure containing three types of images: RGB, lidar and infrared, properly calibrated to each other. An additional goal is to develop program libraries for creating and saving this type of file and for restoring it. It will also be necessary to develop a method of synchronizing data from the lidar, RGB and infrared cameras. Autonomous cars are increasingly entering our consciousness, and no one seems to have any doubts that self-driving cars are the future of motoring. Manufacturers promise that the first of them will reach showrooms within the next few years. Many experts believe that creating a network of communicating autonomous cars will make it possible to completely eliminate accidents. However, to make this possible, it is necessary to develop effective methods of detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone; therefore, in such situations, detection should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that individual types of devices return, as well as the synchronization and formatting of these data. The goal of the project is therefore to develop a file structure that can contain different types of data. Such a file is called a multimedia container: a container that holds many data streams, allowing complete multimedia material to be stored in one file. The data streams located in such a container should include streams of images, films, sounds and subtitles, as well as additional information, i.e., metadata. This type of file could be used in autonomous vehicles, which would certainly facilitate data processing by the intelligent autonomous vehicle management system. As preliminary studies have shown, combining RGB and infrared images with lidar data allows for easier data analysis; thanks to this, it will be possible to display the distance to an object in a colour photo. Such information can be very useful for drivers and for the systems in autonomous cars.
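As an illustration of what such a container might look like at the data-structure level, the sketch below bundles time-stamped RGB, infrared and lidar frames into one file; the layout, field names and use of np.savez are assumptions for demonstration, not the format actually developed in the project.

```python
# Illustrative "multimedia container" holding time-synchronized RGB, infrared and
# lidar frames in one file. The real container layout is not specified in the abstract.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Frame:
    timestamp: float        # common time base used for synchronization
    rgb: np.ndarray         # H x W x 3 image
    infrared: np.ndarray    # H x W thermal image
    lidar: np.ndarray       # N x 4 point cloud (x, y, z, intensity)

@dataclass
class MultimediaContainer:
    frames: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)

    def save(self, path):
        arrays = {"n": np.array(len(self.frames))}
        for i, f in enumerate(self.frames):
            arrays[f"ts_{i}"] = np.array(f.timestamp)
            arrays[f"rgb_{i}"], arrays[f"ir_{i}"], arrays[f"lidar_{i}"] = f.rgb, f.infrared, f.lidar
        np.savez_compressed(path, **arrays)

container = MultimediaContainer(metadata={"vehicle": "test-car"})
container.frames.append(Frame(0.0,
                              np.zeros((480, 640, 3), np.uint8),
                              np.zeros((480, 640), np.uint16),
                              np.zeros((1000, 4), np.float32)))
container.save("sample_container.npz")
```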

Keywords: an autonomous car, image processing, lidar, obstacle detection

Procedia PDF Downloads 226
24352 Protection of Steel Bars in Reinforced Concrete with Zinc Based Coverings

Authors: Hamed Rajabzadeh Gatabi, Soroush Dastgheibifard, Mahsa Asnafi

Abstract:

There is no doubt that reinforced concrete is one of the most significant materials used in the construction industry, and it has been used for many years. However, environmental exposure can contribute to its corrosion or failure, one form of which is failure of the reinforcing bars. To combat this problem, one of the oxidation prevention methods investigated was the barrier protection method, implemented through the application of an organic coating, specifically fusion-bonded epoxy. In this study, a comparative investigation was carried out on two different kinds of coated bars (zinc-rich epoxy and polyamide epoxy coated bars) as well as uncoated bars. To evaluate these reinforced concretes, the adhesion, toughness, thickness and corrosion performance of the coatings were compared using tools such as Cu/CuSO4 electrodes, EIS, etc. The different types of concrete were exposed to a salty environment (NaCl 3.5%) and their durability was measured. According to the experiments, the thick epoxy coatings have acceptable adhesion and strength. The adhesion of the polyamide epoxy coating to the bars was slightly better than that of the zinc-rich epoxy coating; nonetheless, it was stiffer than the zinc-rich epoxy coating. Conversely, bars coated with zinc-rich epoxy showed more negative oxidation potentials, which reflects the sacrificial protection of the bars by zinc particles. On the whole, zinc-rich epoxy coatings are more corrosion-resistant than polyamide epoxy coatings due to the sacrificial consumption of zinc, among other factors. Additionally, if epoxy coatings without surface defects are applied carefully to the rebar surface, the life of steel structures can be expected to increase dramatically.

Keywords: surface coating, epoxy polyamide, reinforced concrete bars, salty environment

Procedia PDF Downloads 289
24351 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm

Authors: Monojit Manna, Arpan Adhikary

Abstract:

Mobile crowdsensing is an emerging paradigm in which users' devices are recruited to perform sensing tasks. Nowadays, in urban areas, mobile vehicles are adopted to perform data sensing and data collection because of their ubiquity and mobility. In this work, we focus on optimally selecting mobile nodes so as to collect the maximum amount of data from urban areas and fulfil the required data demand for a future period within a couple of minutes. We formulate the vehicle recruitment requirement as a data-maximization optimization problem under a budget. The implementation models a realistic online platform in which vehicles move continuously in real time, and the data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. For the future time perspective, this work proposes a deep learning-based offline algorithm to predict mobility. Since selecting the optimal subset of vehicles under a limited budget is an NP-complete problem, we propose a greedy online algorithm that recruits a subset of vehicles step by step. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The results not only demonstrate the efficiency of our proposed solution but also prove the validity of DLMV and show an improvement in the quantity of collected sensing data compared with other algorithms.
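The greedy recruitment step can be sketched as below: given each candidate vehicle's predicted coverage and recruitment cost, vehicles are picked by marginal coverage gain per unit cost until the budget is exhausted. The coverage sets and costs are illustrative; the DLMV mobility predictions themselves are not modelled here.

```python
# Minimal sketch of greedy, budget-constrained vehicle recruitment: maximize the
# number of newly covered urban cells per unit cost. Values are illustrative only.
def greedy_recruit(coverage, cost, budget):
    selected, covered, remaining = [], set(), budget
    candidates = set(coverage)
    while candidates:
        best, best_ratio = None, 0.0
        for v in candidates:
            if cost[v] > remaining:
                continue
            gain = len(coverage[v] - covered)       # marginal coverage gain
            ratio = gain / cost[v]
            if ratio > best_ratio:
                best, best_ratio = v, ratio
        if best is None or best_ratio == 0.0:        # nothing affordable or useful left
            break
        selected.append(best)
        covered |= coverage[best]
        remaining -= cost[best]
        candidates.remove(best)
    return selected, covered

coverage = {"v1": {1, 2, 3}, "v2": {3, 4}, "v3": {5, 6, 7, 8}, "v4": {1, 8}}
cost = {"v1": 3, "v2": 1, "v3": 5, "v4": 2}
print(greedy_recruit(coverage, cost, budget=7))
```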

Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection

Procedia PDF Downloads 77
24350 Sentiment Mapping through Social Media and Its Implications

Authors: G. C. Joshi, M. Paul, B. K. Kalita, V. Ranga, J. S. Rawat, P. S. Rawat

Abstract:

As inhabitants of the global village, every place has established connections through the strength and power of social media, piercing through political boundaries. Social media is a digital platform where people across the world can interact, with the advantages of being universal, anonymous, easily accessible and indirect, and of enabling the gathering and sharing of information. The power of social media lies in the intensity of sharing extreme opinions or feelings, in contrast to personal interactions, and this can be mapped in the form of sentiment mapping. Easy access to social networking sites such as Facebook, Twitter and blogs has created unprecedented opportunities for citizens to voice their opinions loaded with the dynamics of emotions. These further influence human thoughts, in which social media plays a very active role. A recent incident of public importance was selected as a case study to map the sentiments of people through Twitter. Understanding those dynamics through the eyes of ordinary people can be challenging. With the help of the R programming language and GIS techniques, sentiment maps have been produced. The emotions flowing worldwide in the form of tweets were extracted and analyzed. The number of tweets diminished by 91% from 25/08/2017 to 31/08/2017. A surge of sentiments emerged near the origin of the case, i.e., Delhi, Haryana and Punjab, and the capital showed the maximum influence, resulting in a spillover effect near Delhi. After calculating the sentiment scores of the tweets, the prevailing sentiment trend was neutral (45.37%), followed by negative (28.6%) and positive (21.6%). The results can also be used to describe the spatial distribution of digital penetration in India, where the highest concentration lies in Mumbai and the lowest in North East India and Jammu and Kashmir.

Keywords: sentiment mapping, digital literacy, GIS, R statistical language, spatio-temporal

Procedia PDF Downloads 151
24349 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed; however, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of previously proposed models, basing its security on the infeasibility of polynomial reconstruction.

Keywords: fingerprint, template protection, bio-cryptography, minutiae protection

Procedia PDF Downloads 170
24348 Improving Digital Data Security Awareness among Teacher Candidates with Digital Storytelling Technique

Authors: Veysel Çelik, Aynur Aker, Ebru Güç

Abstract:

Developments in information and communication technologies have increased both the speed of producing information and the speed of accessing new information, and accordingly, the daily lives of individuals have started to change. New concepts such as e-mail, e-government, e-school and e-signature have emerged. For this reason, prospective teachers, who will be future teachers or school administrators, are expected to have a high awareness of digital data security. The aim of this study is to reveal the effect of the digital storytelling technique on the data security awareness of pre-service teachers in computer and instructional technology education departments. For this purpose, participants were selected on the principle of volunteering among third-grade students studying in the Computer and Instructional Technologies Department of the Faculty of Education at Siirt University. In the research, the pretest/posttest quasi-experimental research model, one of the experimental research models, was used. Within this framework, a 6-week lesson plan on digital data security awareness was prepared in accordance with the digital storytelling technique. Students in the experimental group formed groups of 3-6 people among themselves, and the groups were asked to prepare short videos or animations on digital data security awareness. The completed videos were watched and evaluated together with the prospective teachers during the evaluation process, which lasted approximately 2 hours. In the research, both quantitative and qualitative data collection tools were used: the digital data security awareness scale and a semi-structured interview form consisting of open-ended questions developed by the researchers. According to the data obtained, the digital storytelling technique was effective in creating data security awareness and producing permanent behavior changes in computer and instructional technology students.

Keywords: digital storytelling, self-regulation, digital data security, teacher candidates, self-efficacy

Procedia PDF Downloads 126
24347 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon

Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba

Abstract:

In developing countries such as Lebanon, demographic data are hardly available due to the absence of a mechanized population registration system. The aim of this study is to evaluate, using only remote sensing data, the correlations between population and the characteristics of the road network (length of primary roads, length of secondary roads, total length of roads, density and percentage of roads, and the number of intersections). In order to find the influence of the different factors on the demographic data, we studied the degree of correlation between each factor and population. The results of this study have shown a strong correlation between population and both the density of roads and the number of intersections.
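A minimal sketch of the correlation analysis is given below, assuming Pearson correlation between population and each road-network factor; the numbers are illustrative placeholders rather than the Lebanese measurements used in the study.

```python
# Sketch of correlating population with road-network characteristics (illustrative data).
import pandas as pd

df = pd.DataFrame({
    "population":        [1200, 5400, 800, 15000, 9700, 3100],
    "primary_road_km":   [2.1, 6.5, 1.0, 14.2, 9.8, 4.0],
    "secondary_road_km": [3.4, 9.0, 1.8, 22.5, 15.1, 6.2],
    "road_density":      [0.8, 2.1, 0.5, 4.9, 3.3, 1.4],
    "intersections":     [12, 45, 7, 130, 88, 27],
})

# Pearson correlation of every road-network factor with population
print(df.corr()["population"].drop("population").round(3))
```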

Keywords: population, road network, statistical correlations, remote sensing

Procedia PDF Downloads 162
24346 A Multicopy Strategy for Improved Security in Wireless Sensor Networks

Authors: Tuğçe Yücel

Abstract:

A Wireless Sensor Network (WSN) is a collection of sensor nodes deployed randomly in an area for surveillance. Efficient utilization of the sensors' limited battery energy for increased network lifetime, as well as the secure transmission of sensed data to a base station for further processing, are major design objectives for a WSN. Producing multiple copies of data packets and sending them on different paths is one strategy for this purpose, but it leads to redundant energy consumption and hence reduced network lifetime. In this work, we develop a restricted multi-copy, multipath strategy in which data moving through ‘frequently’ or ‘heavily’ used sensors are copied by the sensors incident to such central nodes and sent on node-disjoint paths. We develop a mixed integer programming (MIP) model and a heuristic approach, and present some preliminary test results.

Keywords: MIP, sensor, telecommunications, WSN

Procedia PDF Downloads 510
24345 Perspectives of Charitable Organisations on the Impact of the COVID-19 Pandemic on Family Carers of People with Profound and Multiple Intellectual Disabilities

Authors: Mark Linden, Trisha Forbes, Michael Brown, Lynne Marsh, Maria Truesdale, Stuart Todd, Nathan Hughes

Abstract:

Background: The COVID-19 pandemic resulted in a reduction of health care services for many family carers of people with profound and multiple intellectual disabilities (PMID). Due to the lack of services, family carers turned to charities for support during the pandemic. We explored the views of charity workers across the UK and Ireland who supported family carers during the COVID-19 pandemic, including their views on effective online support programmes for family carers. Methods: This was a qualitative study using online focus groups with participants (n = 24) from five charities across the UK and Ireland. Questions focused on challenges, supports, coping and resources which helped during lockdown restrictions. Focus groups were audio recorded, transcribed verbatim, and analysed through thematic analysis. Findings: Four themes were identified: (i) ‘mental and emotional health’, (ii) ‘they who shout the loudest’ (fighting for services), (iii) ‘lack of trust in statutory services’ and (iv) ‘creating an online support programme’. Mental and emotional health emerged as the most prominent theme and included three subthemes, named ‘isolation’, ‘fear of COVID-19’ and ‘the exhaustion of caring’. Conclusions: The withdrawal of many services during the COVID-19 pandemic further isolated and placed strain on family carers. Even after the end of the pandemic, family carers continue to report struggling to receive adequate support. There is a critical need to design services, including online support programmes, in partnership with family carers so that they adequately address carers' needs.

Keywords: intellectual disability, family carers, COVID-19, charities

Procedia PDF Downloads 74
24344 Wikipedia World: A Computerized Process for Cultural Heritage Data Dissemination

Authors: L. Rajaonarivo, M. N. Bessagnet, C. Sallaberry, A. Le Parc Lacayrelle, L. Leveque

Abstract:

TCVPYR is a European FEDER (European Regional Development Fund) project which aims to promote tourism in the French Pyrenees region by leveraging its cultural heritage. It involves scientists from various domains (geographers, historians, anthropologists, computer scientists...). This paper presents a fully automated process to publish any dataset as Wikipedia articles as well as the corresponding linked information on Wikidata and Wikimedia Commons. We validate this process on a sample of geo-referenced cultural heritage data collected by TCVPYR researchers in different regions of the Pyrenees. The main result concerns the technological prerequisites, which are now in place. Moreover, we demonstrated that we can automatically publish cultural heritage data on Wikimedia.

Keywords: cultural heritage dissemination, digital humanities, open data, Wikimedia automated publishing

Procedia PDF Downloads 127
24343 Adaptive Decision Feedback Equalizer Utilizing Fixed-Step Error Signal for Multi-Gbps Serial Links

Authors: Alaa Abdullah Altaee

Abstract:

This paper presents an adaptive decision feedback equalizer (ADFE) for multi-Gbps serial links utilizing a fixed-step error signal extracted from the cross-points of received data symbols. The extracted signal is generated based on the violation of received data symbols with respect to minimum detection requirements at the clock and data recovery (CDR) stage. The iterations of the adaptation process search for the optimum feedback tap coefficients to maximize the data eye opening and minimize the adaptation convergence time. The effectiveness of the proposed architecture is validated using simulation results for a serial link designed in an IBM 130 nm 1.2 V CMOS technology. The data link, with variable channel lengths, is analyzed using Spectre from Cadence Design Systems with BSIM4 device models.
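A behavioural sketch of fixed-step adaptation is shown below, assuming a sign-sign LMS update of the feedback taps on a simple post-cursor channel; the channel model, tap count and step size are illustrative, and the circuit-level cross-point error extraction of the paper is not reproduced.

```python
# Behavioural sketch of fixed-step (sign-sign LMS) adaptation of DFE feedback taps.
# Channel, tap count and step size are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(3)
n_bits, step, n_taps = 20000, 0.001, 2
h = np.array([1.0, 0.35, 0.1])          # main cursor + two post-cursors (assumed channel)

tx = rng.choice([-1.0, 1.0], n_bits)
rx = np.convolve(tx, h)[:n_bits] + 0.02 * rng.normal(size=n_bits)

taps = np.zeros(n_taps)                 # DFE feedback tap coefficients
decisions = np.zeros(n_bits)
for n in range(n_bits):
    fb = sum(taps[k] * decisions[n - 1 - k] for k in range(n_taps) if n - 1 - k >= 0)
    eq = rx[n] - fb                      # equalized sample after feedback cancellation
    decisions[n] = 1.0 if eq >= 0 else -1.0
    err = eq - decisions[n]              # fixed-step update uses only the sign of the error
    for k in range(n_taps):
        if n - 1 - k >= 0:
            taps[k] += step * np.sign(err) * decisions[n - 1 - k]

print("adapted feedback taps:", np.round(taps, 3), "(channel post-cursors:", h[1:], ")")
```

With these settings the taps converge toward the post-cursor values of the assumed channel, which is the behaviour an ADFE adaptation loop is expected to show.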

Keywords: adaptive DFE, CMOS equalizer, error detection, serial links, timing jitter, wire-line communication

Procedia PDF Downloads 120
24342 Data-Driven Market Segmentation in Hospitality Using Unsupervised Machine Learning

Authors: Rik van Leeuwen, Ger Koole

Abstract:

Within hospitality, marketing departments use segmentation to create tailored strategies and ensure personalized marketing. This study provides a data-driven approach by segmenting guest profiles via hierarchical clustering based on an extensive set of features. The industry requires understandable outcomes that contribute to adaptability, allowing marketing departments to make data-driven decisions and ultimately drive profit. A marketing department specified a business question that guides the unsupervised machine learning algorithm. Features of guests change over time; therefore, there is a probability that guests transition from one segment to another. The purpose of the study is to provide steps in the process from raw data to actionable insights, which serve as a guideline for how hospitality companies can adopt an algorithmic approach.
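A brief sketch of the hierarchical-clustering step is given below, using made-up guest features and Ward linkage; the actual feature set, business question and number of segments belong to the hotel data in the study and are not reproduced here.

```python
# Sketch of hierarchical (agglomerative) segmentation of guest profiles; feature
# names and values are fabricated for illustration.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(4)
# Illustrative guest features: lead time (days), stay length, spend per night, channel code
guests = np.column_stack([
    rng.gamma(2.0, 20.0, 500),
    rng.poisson(2.5, 500) + 1,
    rng.gamma(3.0, 40.0, 500),
    rng.integers(0, 3, 500),
]).astype(float)

X = StandardScaler().fit_transform(guests)
segments = AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(X)

for s in range(4):
    members = guests[segments == s]
    print(f"segment {s}: n={len(members)}, mean spend/night={members[:, 2].mean():.0f}")
```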

Keywords: hierarchical cluster analysis, hospitality, market segmentation

Procedia PDF Downloads 108
24341 Online vs. in vivo Workshops in a Master's Degree Course in Mental Health Nursing: Students' Views and Opinions

Authors: Evmorfia Koukia, Polyxeni Mangoulia

Abstract:

Workshops tend to be a vivid and productive in vivo teaching method. Due to the COVID-19 pandemic, university courses were conducted over the internet. Method: Online art therapy workshops were integrated for the first time into a core course named “Special Themes of Mental Health Nursing” in an MSc program in Mental Health. The duration of the course is 3 hours per week for 11 weeks in a single semester. The course has a main instructor, a professor of psychiatric nursing experienced in arts therapies workshops, and visiting art therapists, each of whom was given a certain topic to cover. Students were encouraged to keep a logbook, which was evaluated at the end of the semester and submitted as part of the examination process of the course. An interview of 10 minutes was conducted with each student at the end of the course by an independent investigator (an assistant professor). Participants: The students (sample) of the program were nurses, psychologists, and social workers. Results: All students who participated in the course found that the learning process was vivid, encouraged participation and self-motivation, and showed no major differences from in vivo learning. The students identified their personal needs and felt a personal connection with the learning experience. As a result of this personalized learning, students discovered their strengths and weaknesses and developed skills such as critical thinking. All students stated that the workshops were the optimal way for them to comprehend the course content, their capability to become therapists, and their obstacles and weaknesses when working with mental health patients. Conclusion: There were no important differences between the students' views of the online and in vivo workshop teaching methods. The results show that workshops in mental health can contribute equally to the learning experience in either format.

Keywords: mental health, workshops, students, nursing

Procedia PDF Downloads 209
24340 Application of Adaptive Architecture in Building Technologies: A Case Study of Neuhoff Site in Nashville, Tennessee

Authors: Shohreh Moshiri, Hossein Alimohammadi

Abstract:

Building construction has a great impact on climate change. Adaptive design strategies have been developed to give new life and purpose to old buildings and create new environments, with economic benefits, that meet residents' needs. The role of smart material systems in providing adaptivity to architectural environments, and their effect in creating better adaptive building environments, is undeniable. In this research, a case study named the Neuhoff site, located near the Cumberland River in the Germantown neighborhood of Nashville, Tennessee, was considered. The building was constructed in the early 1920s as a meat-packing facility and later served as a mixed-use space; New City has since partnered with world-class architects to reinvent the site as a mixed-use waterfront development. The future office space will be designed with LEED certification as a goal, and environmentally sensitive materials and designs will be offered for the adaptive reuse of the building. Smart materials and their applications, especially in the field of building technology and architecture, were emphasized in developing a renovation plan for the site. This research explores the advantages and qualities of smart material systems in the field of architecture. It also helps to better understand the effects of smart material systems on the construction and design processes, to explore ways of making architecture with better adaptive characteristics, and to provide optimal environmental conditions for users, which is reflected in the climatic, structural, and architectural performance.

Keywords: adaptive architecture, building technology, case study, smart material systems

Procedia PDF Downloads 72
24339 Geographic Information System for Simulating Air Traffic By Applying Different Multi-Radar Positioning Techniques

Authors: Amara Rafik, Mostefa Belhadj Aissa

Abstract:

Radar data is one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept the signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft and retransmit them to the ATM system. For greater reliability, these radars are positioned in such a way that their coverage areas overlap, so that an aircraft will be detected by at least one of them. However, the position coordinates of the same aircraft sent by different radars are not necessarily identical. Therefore, the ATM system must calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e., a geographical database together with geographical processing. The objective of this work is to propose a GIS for traffic simulation which reconstructs the evolution of aircraft positions over time from a multi-source radar dataset by applying these different positioning techniques.
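As a toy illustration of turning several overlapping radar reports into a single system track, the sketch below uses a variance-weighted average of the plots; the actual multi-radar positioning techniques compared in the work (and their parameters) are not detailed in the abstract and are not reproduced here.

```python
# Toy sketch: fuse overlapping radar plots of the same aircraft into one track
# position via a variance-weighted average (an assumed, simple fusion rule).
import numpy as np

def fuse_plots(positions, variances):
    """positions: (x, y) reports of one aircraft from different radars;
    variances: per-radar measurement variance used as the weighting."""
    positions = np.asarray(positions, float)
    weights = 1.0 / np.asarray(variances, float)
    return (weights[:, None] * positions).sum(axis=0) / weights.sum()

reports = [(102.4, 55.1), (101.9, 55.6), (102.8, 54.9)]   # same aircraft, three radars
sigmas2 = [0.5, 1.5, 1.0]
print("fused track position:", fuse_plots(reports, sigmas2))
```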

Keywords: ATM, GIS, radar data, simulation

Procedia PDF Downloads 118
24338 A Case Study on Problems Originated from Critical Path Method Application in a Governmental Construction Project

Authors: Mohammad Lemar Zalmai, Osman Hurol Turkakin, Cemil Akcay, Ekrem Manisali

Abstract:

In public construction projects, determining the contract period in the award phase is one of the most important factors. The contract period establishes the baseline for creating the cash flow curve and progress payment planning in the post-award phase. If overestimated, the project duration causes losses for both the owner and the contractor. Therefore, it is essential to base construction project duration on reliable forecasting. In Turkey, schedules are usually built using the bar chart (Gantt) schedule, especially by governmental construction agencies, and the usage of these schedules is largely limited to bidding purposes. Although the bar-chart schedule is useful in some cases, it lacks logical connections between activities, which makes it harder to identify the activities that have the greatest effect on the project's total duration, especially in large complex projects. In this study, a construction schedule is prepared with the Critical Path Method (CPM), which addresses the above-mentioned discrepancies. CPM is a simple and effective method that displays the project time and critical paths, showing the results of forward and backward calculations while considering the logical relationships between activities; it is a powerful tool for planning and managing all kinds of construction projects and a very convenient method for the construction industry. CPM provides a much more useful and precise approach than the traditional bar-chart diagrams that form the basis of construction planning and control. CPM has two main application utilities in the construction field. The first is obtaining the project duration through the as-planned schedule, which includes as-planned activity durations and the relationships between subsequent activities. The other applies during project execution, where each activity is tracked and its duration recorded in order to obtain the as-built schedule, which serves as the black box of the project; the latter is more useful for delay analysis and conflict resolution. These features of CPM have made it popular around the world; however, it has not yet been extensively used in Turkey. In this study, a real construction project is investigated as a case study, and CPM-based scheduling is used for establishing both the as-planned and as-built schedules. Problems that emerged during the construction phase are identified and categorized, and solutions are suggested. Two scenarios were considered. In the first scenario, CPM was used to track and manage progress based on real-time data. In the second scenario, project progress was assumed to be tracked using the Gantt chart. The S-curves of the two scenarios are plotted and interpreted. Comparing the results, the possible faults of the latter scenario are highlighted and solutions suggested. The importance of CPM implementation is emphasized, and it is proposed that preparing construction schedules based on CPM be made mandatory for public construction project contracts.
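For readers unfamiliar with the forward and backward calculations mentioned above, the sketch below runs them on a toy activity network and extracts the critical path; the activities and durations are illustrative, not those of the case-study project.

```python
# Compact sketch of CPM forward/backward passes on a toy activity network.
def cpm(activities):
    """activities: {name: (duration, [predecessors])}, assumed topologically ordered."""
    es, ef = {}, {}
    order = list(activities)
    for a in order:                                # forward pass: earliest start/finish
        dur, preds = activities[a]
        es[a] = max((ef[p] for p in preds), default=0)
        ef[a] = es[a] + dur
    project_end = max(ef.values())
    ls, lf = {}, {}
    for a in reversed(order):                      # backward pass: latest start/finish
        succs = [s for s in order if a in activities[s][1]]
        lf[a] = min((ls[s] for s in succs), default=project_end)
        ls[a] = lf[a] - activities[a][0]
    critical = [a for a in order if ls[a] - es[a] == 0]   # zero total float
    return es, ef, ls, lf, critical

acts = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (2, ["B", "C"])}
es, ef, ls, lf, critical = cpm(acts)
print("project duration:", max(ef.values()), "critical path:", critical)
```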

Keywords: as-built, case-study, critical path method, Turkish government sector projects

Procedia PDF Downloads 119
24337 Exploring Gaming-Learning Interaction in MMOG Using Data Mining Methods

Authors: Meng-Tzu Cheng, Louisa Rosenheck, Chen-Yen Lin, Eric Klopfer

Abstract:

The purpose of this research is to explore some of the ways in which gameplay data can be analyzed to yield results that feed back into the learning ecosystem. Back-end data for all users as they played an MMOG, The Radix Endeavor, were collected, and this study reports the analyses of a specific genetics quest using data mining techniques, including the decision tree method. The study revealed different reasons for quest failure between participants who eventually succeeded and those who never succeeded. Regarding in-game tool use, the trait examiner was a key tool in the quest completion process. Subsequently, the results of the decision tree showed that a lack of trait examiner usage can be made up for with additional Punnett square uses, displaying multiple pathways to success in this quest. The methods of analysis used in this study and the resulting usage patterns indicate some useful ways in which gameplay data can provide insights in two main areas. The first is for game designers, to know how players are interacting with and learning from their game. The second is for players themselves as well as their teachers, to get information on how they are progressing through the game and to provide help they may need based on the strategies and misconceptions identified in the data.
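A hedged sketch of how such gameplay features could feed a decision tree is given below; the tool-use feature names follow the abstract, but the values and success rule are fabricated for illustration rather than drawn from The Radix Endeavor logs.

```python
# Sketch of a decision tree over gameplay tool-use counts; data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n = 400
trait_examiner_uses = rng.poisson(2, n)
punnett_square_uses = rng.poisson(3, n)
# Assumed rule: success with trait-examiner use, or with extra Punnett-square use instead
success = ((trait_examiner_uses >= 2) | (punnett_square_uses >= 5)).astype(int)

X = np.column_stack([trait_examiner_uses, punnett_square_uses])
tree = DecisionTreeClassifier(max_depth=3, random_state=5).fit(X, success)
print(export_text(tree, feature_names=["trait_examiner_uses", "punnett_square_uses"]))
```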

Keywords: MMOG, decision tree, genetics, gaming-learning interaction

Procedia PDF Downloads 357
24336 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

Map-Reduce is a programming model which is widely used to extract valuable information from enormous volumes of data. Map-Reduce is designed to support heterogeneous datasets, and Apache Hadoop Map-Reduce is used extensively to uncover hidden patterns in applications such as data mining, SQL processing, etc. The most important operation for data analysis is the join operation; however, the Map-Reduce framework does not directly support join algorithms. This paper explains and compares two-way and multi-way Map-Reduce join algorithms; we also implement the MR join algorithms and show the performance of each phase. Our experimental results show that, among the two-way join algorithms, the map-side join and map-merge join take the longest time, owing to the preprocessing step of sorting the data, while among the multi-way join algorithms, the reduce-side cascade join takes the longest time.
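To make the map, shuffle and reduce phases of a reduce-side (repartition) two-way join concrete, the single-process simulation below tags records with their source relation and joins them per key; this is a didactic sketch, not the Hadoop implementation evaluated in the paper.

```python
# Toy, single-process simulation of a reduce-side (repartition) two-way join.
from collections import defaultdict

orders = [(1, "o-100"), (2, "o-101"), (1, "o-102")]      # (customer_id, order_id)
customers = [(1, "Alice"), (2, "Bob"), (3, "Carol")]      # (customer_id, name)

# Map phase: tag each record with its source relation, keyed by the join key
mapped = [(cid, ("orders", oid)) for cid, oid in orders] + \
         [(cid, ("customers", name)) for cid, name in customers]

# Shuffle phase: group by key (done by the framework in real Map-Reduce)
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: cross-product of the two tagged lists per key
joined = []
for key, values in groups.items():
    left = [v for tag, v in values if tag == "orders"]
    right = [v for tag, v in values if tag == "customers"]
    joined += [(key, o, c) for o in left for c in right]

print(joined)   # [(1, 'o-100', 'Alice'), (1, 'o-102', 'Alice'), (2, 'o-101', 'Bob')]
```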

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 487