Search results for: Information Hiding.
2908 Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile
Authors: Enrico Rukzio, George N. Prezerakos, Giovanni Cortese, Eleftherios Koutsoloukas, Sofia Kapellaki
Abstract:
The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are subsequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles, components and component instances, coupled with descriptions of the respective ubiquitous applications.
Keywords: 3GPP, context, context-awareness, context model, information model, user model, XML
2907 A Blind SLM Scheme for Reduction of PAPR in OFDM Systems
Authors: K. Kasiri, M. J. Dehghani
Abstract:
In this paper we propose a blind algorithm for peak-to-average power ratio (PAPR) reduction in OFDM systems, based on the selected mapping (SLM) algorithm as a distortionless method. The main drawback of the conventional SLM technique is the need to transmit several side-information bits for each data block, which results in a loss of data rate. In the proposed method, a limited number of carriers in the OFDM frame is reserved to be rotated with one of the possible phases, according to the number of phase sequence blocks in the SLM algorithm. Reserving this limited number of carriers does not affect the PAPR reduction of the OFDM signal. Simulation results show that using the maximum-likelihood (ML) criterion at the receiver leads to the same system performance as the conventional SLM algorithm, while no side information needs to be sent to the receiver.
Keywords: Orthogonal Frequency Division Multiplexing (OFDM), Peak-to-Average Power Ratio (PAPR), Selected Mapping (SLM), Blind SLM (BSLM).
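As an illustration of the SLM selection step this scheme builds on, here is a minimal NumPy sketch of conventional SLM: generate several random phase sequences, apply each to the frequency-domain block, and keep the candidate with the lowest PAPR. The blind, side-information-free selection and the ML receiver described in the abstract are not reproduced; block size, phase alphabet and the number of candidates are illustrative assumptions.
```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a time-domain block, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm_select(data_symbols, num_candidates=8, rng=np.random.default_rng(0)):
    """Conventional SLM: rotate the block by U random phase sequences,
    keep the candidate whose IFFT has the lowest PAPR."""
    n = data_symbols.size
    best_signal, best_papr, best_index = None, np.inf, -1
    for u in range(num_candidates):
        phases = np.exp(1j * rng.choice([0, np.pi / 2, np.pi, 3 * np.pi / 2], size=n))
        candidate = np.fft.ifft(data_symbols * phases)
        value = papr_db(candidate)
        if value < best_papr:
            best_signal, best_papr, best_index = candidate, value, u
    return best_signal, best_papr, best_index

# Example: one QPSK OFDM block with 256 subcarriers (illustrative parameters)
rng = np.random.default_rng(1)
qpsk = (2 * rng.integers(0, 2, 256) - 1 + 1j * (2 * rng.integers(0, 2, 256) - 1)) / np.sqrt(2)
_, papr, idx = slm_select(qpsk)
print(f"selected phase sequence {idx}, PAPR = {papr:.2f} dB")
```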
2906 Natural Language Database Interface for Selection of Data Using Grammar and Parsing
Authors: N. D. Karande, G. A. Patil
Abstract:
Databases have become ubiquitous. Almost all IT applications store information into and retrieve it from databases. Retrieving information from a database requires knowledge of technical languages such as Structured Query Language (SQL). However, the majority of users who interact with databases do not have a technical background and are intimidated by the idea of using languages such as SQL. This has led to the development of a few Natural Language Database Interfaces (NLDBIs). An NLDBI allows the user to query the database in a natural language. This paper presents the architecture of a new NLDBI system, describes its implementation, and discusses the results obtained. In most typical NLDBI systems, the natural language statement is converted into an internal representation based on the syntactic and semantic knowledge of the natural language. This representation is then converted into queries using a representation converter. A natural language query is translated into an equivalent SQL query after processing through various stages. The work has been evaluated on primitive database queries with certain constraints.
Keywords: Natural language database interface, representation converter, syntactic and semantic knowledge
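A toy sketch of the representation-converter idea: one fixed sentence pattern stands in for the syntactic and semantic analysis stages, and a matching sentence is rewritten as an SQL SELECT. The pattern, table and column names are hypothetical; the paper's actual grammar and internal representation are not reproduced here.
```python
import re

# One hard-coded pattern standing in for the parsing stage:
# "show <columns> of <table> where <column> is <value>"
PATTERN = re.compile(
    r"show (?P<cols>[\w, ]+) of (?P<table>\w+) where (?P<col>\w+) is (?P<val>\w+)",
    re.IGNORECASE,
)

def to_sql(sentence):
    """Convert one restricted natural-language query into an SQL SELECT."""
    m = PATTERN.match(sentence.strip())
    if not m:
        raise ValueError("sentence does not match the supported pattern")
    cols = ", ".join(c.strip() for c in m.group("cols").split(","))
    return f"SELECT {cols} FROM {m.group('table')} WHERE {m.group('col')} = '{m.group('val')}'"

print(to_sql("show name, salary of employees where department is sales"))
# SELECT name, salary FROM employees WHERE department = 'sales'
```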
2905 3D CAD Models and its Feature Similarity
Authors: Elmi Abu Bakar, Tetsuo Miyake, Zhong Zhang, Takashi Imamura
Abstract:
Knowing the geometric pose of products on a manufacturing line before robot manipulation is required and reduces the time needed for overall shape measurement. To perform this, information on shape representation and the matching of objects is required. Objects are compared through their descriptors, which are conceptually subtracted from each other to form a scalar metric; the smaller the metric value, the closer the objects are considered to be. Rotating an object away from its static pose in some direction changes the scalar metric value of the boundary information obtained after feature extraction of the related object. In this paper, an indexing technique for the retrieval of 3D geometric models based on the similarity between boundary shapes is proposed, in order to measure the 3D CAD object pose using object shape feature matching for a Computer Aided Testing (CAT) system in a production line. Experimental results show the effectiveness of the proposed method.
Keywords: CAD, rendering, feature extraction, feature classification.
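A minimal sketch of the descriptor-subtraction idea described above, using a Fourier descriptor of a 2D boundary as a generic stand-in for the paper's (unspecified) boundary feature: two descriptors are subtracted and reduced to one scalar metric, which stays near zero for the same shape and grows for a different shape.
```python
import numpy as np

def fourier_descriptor(boundary_xy, num_coeffs=16):
    """Translation- and scale-normalised Fourier descriptor of a closed 2D boundary
    (a generic stand-in for the boundary feature used in the paper)."""
    z = boundary_xy[:, 0] + 1j * boundary_xy[:, 1]
    coeffs = np.fft.fft(z - z.mean())                 # mean removal -> translation invariance
    mags = np.abs(coeffs[1:num_coeffs + 1])
    return mags / (mags[0] + 1e-12)                   # divide by fundamental -> scale invariance

def shape_distance(desc_a, desc_b):
    """Scalar metric: descriptors 'subtracted' and reduced to one number."""
    return float(np.linalg.norm(desc_a - desc_b))

# Example: a circle compared with a translated copy and with a squarish contour
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)]
squarish = np.c_[np.clip(np.cos(theta) * 1.5, -1, 1), np.clip(np.sin(theta) * 1.5, -1, 1)]
print(shape_distance(fourier_descriptor(circle), fourier_descriptor(circle + 0.3)))   # ~0
print(shape_distance(fourier_descriptor(circle), fourier_descriptor(squarish)))       # larger
```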
2904 Energy Intensity of a Historical Downtown: Estimating the Energy Demand of a Budapest District
Authors: Viktória Sugár, Attila Talamon, András Horkai, Michihiro Kita
Abstract:
The dense urban fabric of the 7th district of Budapest, known as the former Jewish Quarter, contains mainly historical-style, multi-story tenement houses with courtyards. The high population density and the unsatisfactory energetic state of the buildings result in high energy consumption. As a preliminary survey for a complex rehabilitation plan, the authors aim to determine the energy demand of the area. The energy demand was calculated by analyzing the structure and the energy consumption of each building using Geographic Information System (GIS) methods. The carbon dioxide emission was also calculated, to assess the potential for reducing the present value through complex structural and energetic rehabilitation. As the main output of the survey, an energy intensity map of the area has been created.
Keywords: Carbon dioxide, energy intensity map, geographic information system, GIS, Hungary, Jewish quarter, rehabilitation.
2903 Degeneracy of MIS under the Conditions of Instability: A Mathematical Formulation
Authors: Nazar Younis, Raied Salman
Abstract:
It has always been observed that the effectiveness of MIS as a support tool for management decisions degenerates some time after implementation, despite the substantial investments being made. This is true for organizations at the initial stages of MIS implementation, manual or computerized. A survey of a sample of middle to top managers in business and government institutions was conducted. A large proportion indicated that the MIS has lost its impact on day-to-day operations, and the response lag time sometimes expands indefinitely. The data indicate an infant-mortality phenomenon of the bathtub model. Reasons may include the monotonous nature of MIS delivery, irrelevance, irreverence, timeliness, and lack of adequate detail. All these reasons combine to create a degree of degeneracy. We investigate and model, as a bathtub model, the phenomenon of MIS degeneracy that afflicts MIS systems and renders them ineffective. A degeneracy index is developed to identify the status of the MIS system and possible remedies to prevent the onset of total collapse of the system to the point of being useless.
Keywords: MIS, management theory, information technology, information systems, IS, organizational environment, organizations, degeneracy, organizational change.
2902 SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids
Authors: Jean-Alain Grunchec, Jules Hernández-Sánchez, Sara Knott
Abstract:
Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts, due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing the job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction of execution time in production conditions.
Keywords: Grid computing, multiple simultaneous requests, fault tolerance, GridQTL.
2901 Re-Optimization MVPP Using Common Subexpression for Materialized View Selection
Authors: Boontita Suchyukorn, Raweewan Auepanwiriyakul
Abstract:
A data warehouse is a repository of information integrated from source data. Information stored in a data warehouse is materialized in order to provide better performance for answering queries. Deciding which views are appropriate to materialize is an important problem. To achieve this, constructing a search space close to optimal is a necessary task, as it provides an effective basis for selecting the views to be materialized. In this paper we propose an approach to re-optimize the Multiple View Processing Plan (MVPP) by using global common subexpressions. Merged queries whose query processing cost is not close to optimal are rewritten. The experiments show that our approach can help to improve the total query processing cost of the MVPP, and that the sum of the query processing cost and the materialized view maintenance cost is also reduced after views are selected to be materialized.
Keywords: Data Warehouse, materialized views, query rewriting, common subexpressions.
2900 Study on Wireless Transmission for Reconnaissance UAV with Wireless Sensor Network and Cylindrical Array of Microstrip Antennas
Authors: Chien-Chun Hung, Chun-Fong Wu
Abstract:
It is important for a commander to have real-time information in order to maintain situational awareness and make decisions on the battlefield. Modern technological developments have brought this kind of information into military use. The unmanned aerial vehicle (UAV) is one of the means of gathering intelligence, owing to its widespread applications. It is still not clear whether a mini UAV with a short-range wireless transmission system is used as a reconnaissance system in Taiwan. In this paper, previous experience from research on this sort of aerial vehicle has been applied, together with a data-relay system using a ZigBee module. The mini UAV developed is expected to be able to collect certain data in appropriate theaters. A high-gain omni-directional antenna is also integrated into the mini UAV to fit the size-reducing trend of airborne sensors. Two advantages are so far obvious. First, the mini UAV can fly higher than usual to avoid being attacked by ground fire. Second, data can be gathered during almost all maneuvering attitudes.
Keywords: Mini UAV, reconnaissance, wireless transmission, ZigBee module.
2899 Interstate Comparison of Environmental Performance using Stochastic Frontier Analysis: The United States Case Study
Authors: Alexander Y. Vaninsky
Abstract:
Environmental performance of the U.S. States is investigated for the period 1990–2007 using Stochastic Frontier Analysis (SFA). The SFA accounts for both the efficiency measure and the stochastic noise affecting a frontier. The frontier is formed using indicators of GDP, energy consumption, population, and CO2 emissions. For comparability, all indicators are expressed as ratios to the total. Statistical information of the Energy Information Agency of the United States is used. The obtained results reveal bell-shaped dynamics of the environmental efficiency scores. The average efficiency scores rise from 97.6% in 1990 to 99.6% in 1999, and then fall to 98.4% in 2007. The main factor is an insufficient decrease in the rate of growth of CO2 emissions with regard to the growth of GDP, population and energy consumption. Data for 2008, following the research period, allow for the assumption that the environmental performance of the U.S. States has improved in recent years.
Keywords: Stochastic frontier analysis, environmental performance, interstate comparisons.
2898 The Influence of Website Quality on Customer E-Satisfaction in Low Cost Airline
Authors: Zainab bt Khalifah, Wong Chiet Bing, Noor Hazarina Hashim
Abstract:
The evolution of customer behavior in purchasing products or services through the Internet has led airline companies to engage in the e-ticketing process in order to maintain their business. A well-designed website is vitally important for airline companies to provide effective communication, support, and competitive advantage. This study was conducted to identify the dimensions of website quality for a low cost airline and to investigate the relationship between website quality and customer e-satisfaction at a low cost airline. A total of 381 responses were collected by convenience sampling among local passengers at the Low Cost Carrier Terminal, Kuala Lumpur via questionnaire distribution. This study found that the five determinant factors of website quality for AirAsia were Information Content, Navigation, Responsiveness, Personalization, and Security and Privacy. The results of this study revealed that there is a positive relationship between the five dimensions of website quality and customer e-satisfaction, with Information Content being the most significant contributor to customer e-satisfaction.
Keywords: Website Quality, Customer E-Satisfaction, Low Cost Airline.
2897 Traumatic Ankle Pain: Adequacy of Clinical Information in X-Ray Request with Reference to the Ottawa Ankle Rule
Authors: Rania Mustafa
Abstract:
This audit was conducted at Manchester University NHS Foundation Trust, Wythenshawe Hospital Radiology and Accident and Emergency [A&E] Department to assess the appropriateness of clinical information in X-ray requests, specifically in cases of acute ankle injuries. As per the Ottawa Ankle Rules and the recommendations of the National Institute for Health and Care Excellence [NICE] and the Royal College of Radiology, we aimed to evaluate the appropriateness of referrals and the thoroughness of clinical information provided by Emergency Department [ED] clinicians for ankle radiography. Our goal was to achieve 100% compliance with these guidelines. The audit involved a comprehensive analysis spanning the period from August 2022 to January 2023, encompassing patient records, radiographic orders, and clinical assessments. Data collection included patient demographics, presenting complaints, clinical assessments, adherence to the Ottawa Ankle Rules criteria, and subsequent radiography orders. We conducted two audit cycles, involving 38 patients in the first cycle and 86 patients in the second cycle. The data were further filtered to include all patients referred from the ED for an ankle X-ray with a history of acute trauma and aged over 18 years. The key finding was that in August 2022, 60% of cases met the Ottawa Ankle Rules criteria accurately, indicating a need for improvement in adherence. However, by January 2023, there was a notable improvement, with 95% of cases accurately meeting the criteria. This significant change reflects increased alignment with best practices for ankle radiography referrals.
Keywords: Ankle, injuries, Ottawa Ankle Rule, X-rays.
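A simplified encoding of the Ottawa Ankle Rules criteria that this kind of audit checks requests against: an ankle X-ray series is indicated only when there is malleolar-zone pain together with bony tenderness at either malleolus or an inability to bear weight. The field names are illustrative, the weight-bearing and tenderness details are compressed into single flags, and this sketch is not a clinical decision tool.
```python
def ankle_xray_indicated(pain_in_malleolar_zone,
                         tenderness_posterior_lateral_malleolus,
                         tenderness_posterior_medial_malleolus,
                         unable_to_bear_weight_four_steps):
    """Simplified Ottawa Ankle Rules: X-ray indicated only if there is malleolar-zone
    pain plus bony tenderness at the posterior edge/tip of either malleolus, or an
    inability to bear weight both immediately and on examination."""
    if not pain_in_malleolar_zone:
        return False
    return (tenderness_posterior_lateral_malleolus
            or tenderness_posterior_medial_malleolus
            or unable_to_bear_weight_four_steps)

# Example referrals (values illustrative)
print(ankle_xray_indicated(True, False, False, True))   # True  -> referral meets the criteria
print(ankle_xray_indicated(True, False, False, False))  # False -> X-ray not indicated
```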
2896 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process
Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander
Abstract:
The culture of continuous improvement in organizations is very important as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies which formed cross-functional teams and used a dynamic model for knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness" (which is defined as an obstacle to the transfer of knowledge). The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to generating continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created and thus contribute to an increase in sharing. The concept of adapting knowledge to new environments will be highlighted, as it is essential for companies to translate and modify information so that such information can fit the context of receiving organizations. Adaptation will ensure that the transfer process is carried out smoothly by preventing "stickiness". When developing the transfer process on cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. These types of cross-functional teams also generate synergy due to the array of different work backgrounds of their individuals. When synergy is established, a culture of continuous improvement is created.
Keywords: Knowledge transfer, continuous improvement, teamwork, cognitive assets.
2895 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines
Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma
Abstract:
Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, with the aim of avoiding possible accidents in rural and urban areas. The analysis makes use of several methodologies such as data integration, support vector machines (SVM), correlation machines and multinomial goodness. The datasets were imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn help avoid unnecessary memory lapses. Since the data are expected to grow continuously over time, this work primarily proposes a new framework model which can be trained, adapt itself to new data and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classification on the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology, which is appropriate for this kind of research work.
Keywords: Road accident, machine learning, support vector machines.
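A minimal scikit-learn sketch of the SVM classification step described above. The feature columns, the toy severity label and all parameter values are illustrative assumptions, not the authors' UK accident schema or results.
```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative feature matrix: [speed_limit, hour_of_day, urban(1)/rural(0), vehicles_involved]
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.choice([30, 40, 60, 70], 500),
    rng.integers(0, 24, 500),
    rng.integers(0, 2, 500),
    rng.integers(1, 4, 500),
])
y = (X[:, 0] >= 60).astype(int)   # toy severity label standing in for the real outcome

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```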
2894 Characterization of Microroughness Parameters in Cu and Cu2O Nanoparticles Embedded in Carbon Film
Authors: S. Solaymani, T. Ghodselahi, N. B. Nezafat, H. Zahrabi, A. Gelali
Abstract:
The morphological parameters of a thin film surface can be characterized by power spectral density (PSD) functions, which provide a better description of the topography than the RMS roughness and impart several useful pieces of information about the surface, including fractal and superstructure contributions. In the present study, nanoparticle copper/carbon composite films were prepared by co-deposition using the RF sputtering and RF-PECVD method from acetylene gas and a copper target. The surface morphology of the thin films is characterized using atomic force microscopy (AFM). The carbon content of our films was obtained by Rutherford Backscattering (RBS) and varied from 0.4% to 78%. The power values of the power spectral density (PSD) for the AFM data were determined by fast Fourier transform (FFT) algorithms. We investigate the effect of carbon on the roughness of the thin film surface. Using such information, roughness contributions of the surface have been successfully extracted.
Keywords: Atomic force microscopy, Fast Fourier transform, Power spectral density, RBS.
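As a one-dimensional sketch of turning a height profile into a PSD with the FFT, the snippet below uses a synthetic AFM-like line scan; real analyses are typically two-dimensional and instrument-calibrated, and the sampling step, profile and normalisation here are illustrative.
```python
import numpy as np

def psd_1d(height_nm, dx_nm):
    """One-dimensional power spectral density of a surface height profile.
    Returns spatial frequencies (1/nm) and PSD values (nm^3)."""
    n = height_nm.size
    h = height_nm - height_nm.mean()              # remove the DC offset
    spectrum = np.fft.rfft(h)
    freqs = np.fft.rfftfreq(n, d=dx_nm)
    psd = (np.abs(spectrum) ** 2) * dx_nm / n     # simple periodogram normalisation
    return freqs, psd

# Synthetic rough profile: long-wavelength waviness plus fine-scale roughness
x = np.arange(1024) * 2.0                          # 2 nm sampling step
profile = 5 * np.sin(2 * np.pi * x / 400) + np.random.default_rng(0).normal(0, 1, x.size)
f, p = psd_1d(profile, dx_nm=2.0)
print("dominant spatial frequency (1/nm):", f[np.argmax(p[1:]) + 1])
```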
2893 Pulse Oximeter Concept for Vascular Occlusion Test
Authors: Fatanah M. Suhaimi, J. Geoffrey Chase, Christopher G. Pretty, Rodney Elliott, Geoffrey M. Shaw
Abstract:
Microcirculatory dysfunction is very common in sepsis and may result in organ failure and an increased risk of death. Analyzing oxygen utilization can potentially assess the microcirculation function of an individual. In this study, a modified pulse oximeter is used to extract information signals due to the absorption of red (R) and infrared (IR) light. The IR and R signals are related to the overall blood volume and to reduced hemoglobin, respectively. The difference between these two signals thus represents the amount of oxygenated hemoglobin. A vascular occlusion test has been conducted on healthy individuals to validate the pulse oximeter concept. In this test, both R and IR signals changed rapidly according to the occlusion process. The pulse oximeter concept presented is capable of extracting valuable information to assess the microcirculation condition. Implementing this concept on ICU patients has the potential to aid sepsis diagnosis and provide more accurate tracking of patient state and sepsis status.
Keywords: Microcirculation, sepsis, sepsis diagnosis, oxygen extraction.
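A small sketch of the signal bookkeeping described above: the IR channel tracks overall blood volume, the R channel tracks reduced hemoglobin, and their difference is taken as a proxy for the oxygenated-hemoglobin contribution during the occlusion test. The signals, timings and amplitudes below are synthetic and purely illustrative.
```python
import numpy as np

def oxygenated_component(ir_signal, r_signal):
    """Difference between overall blood volume (IR) and reduced hemoglobin (R),
    used here as a proxy for the oxygenated-hemoglobin contribution."""
    return np.asarray(ir_signal) - np.asarray(r_signal)

# Synthetic 3-phase vascular occlusion test: baseline, cuff occlusion, release
t = np.linspace(0, 300, 3000)                       # 5 minutes, 10 Hz
ir = 1.0 + 0.05 * np.sin(2 * np.pi * 1.2 * t)       # pulsatile volume-related signal
r = 0.6 + 0.05 * np.sin(2 * np.pi * 1.2 * t)
r[(t > 100) & (t < 200)] += 0.15                    # reduced Hb rises during occlusion
oxy = oxygenated_component(ir, r)
print("mean oxygenated component before / during occlusion:",
      oxy[t < 100].mean().round(3), oxy[(t > 100) & (t < 200)].mean().round(3))
```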
2892 Composite Kernels for Public Emotion Recognition from Twitter
Authors: Chien-Hung Chen, Yan-Chun Hsing, Yung-Chun Chang
Abstract:
The Internet has grown into a powerful medium for information dispersion and social interaction that leads to a rapid growth of social media which allows users to easily post their emotions and perspectives regarding certain topics online. Our research aims at using natural language processing and text mining techniques to explore the public emotions expressed on Twitter by analyzing the sentiment behind tweets. In this paper, we propose a composite kernel method that integrates tree kernel with the linear kernel to simultaneously exploit both the tree representation and the distributed emotion keyword representation to analyze the syntactic and content information in tweets. The experiment results demonstrate that our method can effectively detect public emotion of tweets while outperforming the other compared methods.
Keywords: Public emotion recognition, natural language processing, composite kernel, sentiment analysis, text mining.
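A minimal sketch of the composite-kernel idea: combine a precomputed structural kernel with a linear content kernel and feed the resulting Gram matrix to an SVM. The "tree" kernel here is only a placeholder positive semi-definite matrix, since the paper's parse-tree kernel is not reproduced; the data, labels and mixing weight are illustrative.
```python
import numpy as np
from sklearn.svm import SVC

def composite_gram(structure_gram, content_features, alpha=0.5):
    """Convex combination of a precomputed structural kernel and a linear
    content kernel: K = alpha * K_tree + (1 - alpha) * X X^T."""
    linear_gram = content_features @ content_features.T
    return alpha * structure_gram + (1 - alpha) * linear_gram

# Toy data: 6 tweets, a hypothetical precomputed "tree" similarity and bag-of-words features
rng = np.random.default_rng(0)
X_words = rng.random((6, 20))
A = rng.random((6, 6))
K_tree = A @ A.T                      # any PSD matrix works as a stand-in structural kernel
y = np.array([0, 0, 1, 1, 2, 2])      # emotion labels (illustrative)

K = composite_gram(K_tree, X_words, alpha=0.4)
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.predict(K[:2]))             # predict for the first two tweets
```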
2891 Phishing Attacks Facilitated by Open-Source Intelligence
Authors: Urva Maryam
Abstract:
Private data are more often breached by clever social engineering than by exploiting technical vulnerabilities in systems. Complete information security requires good data safety practices to go along with technical solutions. Hackers often begin their operation by simply sending spoofed emails or fraudulent URLs to their targets and tricking them into providing sensitive information such as passwords or bank account details. This technique is called phishing. Phishing attacks can be launched against email addresses, open ports and unsecured web browsers. This study uses a quantitative research method, executing phishing experiments on participants to test their response to phishing emails. These experiments were run on the Kali Linux distribution, which comes bundled with multiple open-source intelligence (OSINT) tools that were used in the study. The aim of this research is to see how successful phishing attacks can be launched using OSINT and to test the response of people to spoofed emails.
Keywords: OSINT, phishing, spear phishing, email spoofing, theHarvester, Maltego.
2890 Place Recommendation Using Location-Based Services and Real-time Social Network Data
Authors: Kanda Runapongsa Saikaew, Patcharaporn Jiranuwattanawong, Patinya Taearak
Abstract:
Currently, information about places on Facebook, the largest social network, is growing rapidly, but such information is not explicitly organized and ranked. Therefore, users cannot exploit such data to obtain place recommendations conveniently and quickly. This paper proposes a Facebook application and an Android application that recommend places based on the number of check-ins of those places, the distance of those places from the current location, the number of people who like the Facebook page of those places, and the number of people talking about those places. Related Facebook data are gathered via Facebook API requests. The experimental results of the developed applications show that the applications can recommend places and rank interesting places from the most to the least interesting. We have found that the average satisfaction score of the proposed Facebook application is 4.8 out of 5. User satisfaction could be increased by adding features that support personalization in terms of interests and preferences.
Keywords: Mobile computing, location-based services, recommendation system, social network analysis.
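One possible way to combine the four ranking signals named in the abstract (check-ins, distance, page likes, "talking about") into a single score is sketched below; the weights, log scaling, distance decay and the sample places are assumptions for illustration, not the authors' actual formula or data.
```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

def place_score(place, here, w=(0.4, 0.3, 0.2, 0.1)):
    """Weighted combination of check-ins, proximity, likes and talking-about counts."""
    dist = haversine_km(here[0], here[1], place["lat"], place["lon"])
    return (w[0] * math.log1p(place["checkins"])
            + w[1] / (1 + dist)                     # closer places score higher
            + w[2] * math.log1p(place["likes"])
            + w[3] * math.log1p(place["talking_about"]))

places = [
    {"name": "Cafe A", "lat": 16.43, "lon": 102.83, "checkins": 1200, "likes": 5000, "talking_about": 40},
    {"name": "Park B", "lat": 16.50, "lon": 102.80, "checkins": 300, "likes": 900, "talking_about": 5},
]
here = (16.44, 102.82)
for p in sorted(places, key=lambda p: place_score(p, here), reverse=True):
    print(p["name"], round(place_score(p, here), 2))
```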
2889 Satisfying and Frustrating Aspects of ICT Teaching: A Comparison Based On Self-Efficacy
Authors: Deniz Deryakulu, Sener Buyukozturk, Sirin Karadeniz, Sinan Olkun
Abstract:
The purpose of this study was to determine the most satisfying and frustrating aspects of ICT (Information and Communications Technologies) teaching in Turkish schools. Another aim was to compare these aspects based on ICT teachers' self-efficacy. Participants were 119 ICT teachers from different geographical areas of Turkey. Participants were asked to list salient satisfying and frustrating aspects of ICT teaching, and to fill out the Self-Efficacy Scale for ICT Teachers. Results showed that the high self-efficacy teachers listed more positive and negative aspects of ICT teaching than did the low self-efficacy teachers. The satisfying aspects of ICT teaching were the dynamic nature of the ICT subject, higher student interest, having the opportunity to help other subject teachers, and lecturing in well-equipped labs, whereas the most frequently cited frustrating aspects of ICT teaching were ICT-related extra work for schools and colleagues, shortages of hardware and technical problems, indifferent students, insufficient teaching time, and the status of the ICT subject in the school curriculum. This information could be useful in redesigning ICT teachers' roles and responsibilities as well as their job environment in schools.
Keywords: ICT teachers, frustrating aspects of ICT teaching, satisfying aspects of ICT teaching, teacher self-efficacy.
2888 Real-Time Land Use and Land Information System in Homagama Divisional Secretariat Division
Authors: Kumara Jayapathma J. H. M. S. S., Dampegama S. D. P. J.
Abstract:
Land is a valuable and limited resource which constantly changes with the growth of the population. An efficient and good land management system is essential to avoid conflicts associated with land. This paper aims to design a prototype model of a mobile GIS land use and land information system operating in real time. The Homagama Divisional Secretariat Division, situated in the Western Province of Sri Lanka, was selected as the study area. The prototype model was developed after reviewing the related literature. The methodology consisted of designing and modeling the prototype as an application running on a mobile platform. The system architecture mainly consists of a Google mapping app for real-time updates with Firebase support tools. The implementation consists of front-end and back-end components. The software tools used in designing the application are Android Studio with Java, based on the GeoJSON file structure. Android Studio with Java, with GeoJSON files synchronized to Firebase, was found to be a suitable mobile solution for continuously updating the land use and Land Information System (LIS) in real time in the present scenario. The mobile-based land use and LIS developed in this study is a multi-user application catering to different hierarchy levels such as basic users, supervisory managers, and database administrators. This mobile mapping application will help public sector field officers without GIS expertise to overcome land use planning challenges, with land use updated in real time.
Keywords: Android, Firebase, GeoJSON, GIS, JAVA, JSON, LIS, mobile GIS, real-time, REST API.
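A minimal sketch of the kind of GeoJSON record such a system would synchronize (the abstract names GeoJSON files synced to Firebase). The parcel geometry, property names and values are illustrative, not the project's actual schema, and the Firebase synchronization itself is not shown.
```python
import json

# One land-parcel record in GeoJSON form (attribute names and values are illustrative)
parcel = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [80.002, 6.844], [80.004, 6.844], [80.004, 6.846], [80.002, 6.846], [80.002, 6.844]
        ]],
    },
    "properties": {
        "parcel_id": "HOM-2021-000123",
        "land_use": "residential",
        "updated_by": "field_officer_07",
        "updated_at": "2021-06-15T09:30:00Z",
    },
}

feature_collection = {"type": "FeatureCollection", "features": [parcel]}
with open("homagama_land_use.geojson", "w") as f:
    json.dump(feature_collection, f, indent=2)
print(json.dumps(parcel["properties"], indent=2))
```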
2887 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns which are used to classify individuals into groups. The result of the analysis is a pattern which can be used to identify a data set without the need to obtain the input data used to create this pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity written in genetic data holds relatively useful information to identify animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and identifying an individual.
Keywords: Genetic data, Pinzgau cattle, supervised learning.
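A compact sketch of the supervised-learning step described above: genotypes coded as allele counts, a classifier trained on animals of known origin, then applied to an animal without pedigree information. The synthetic SNP data, the breed labels and the choice of a random forest (one common supervised learner; the abstract does not fix a specific algorithm) are all illustrative assumptions.
```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy genotype matrix: rows = animals, columns = SNP markers coded 0/1/2 (allele counts)
rng = np.random.default_rng(42)
n_animals, n_snps = 120, 50
X = rng.integers(0, 3, size=(n_animals, n_snps))
y = np.array(["Pinzgau"] * 60 + ["OtherBreed"] * 60)     # known origins (illustrative)
X[:60, :5] = rng.integers(0, 2, size=(60, 5))            # group 1: lower allele counts at marker SNPs
X[60:, :5] = rng.integers(1, 3, size=(60, 5))            # group 2: higher allele counts at the same SNPs

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))

# Assign an animal of unknown pedigree to its most likely population of origin
clf.fit(X, y)
unknown = rng.integers(0, 3, size=(1, n_snps))
print("predicted origin:", clf.predict(unknown)[0])
```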
2886 Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels
Authors: Marwa Ben Abdessalem, Amin Zribi, Ammar Bouallègue
Abstract:
In this paper, a joint source channel coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes, one to compress a correlated source and the second to protect it against channel degradations. The original information can be reconstructed at the receiver by a joint decoder, in which the source decoder and the channel decoder run in parallel by transferring extrinsic information. We investigate the performance of the JSC LDPC code in terms of Bit-Error Rate (BER) in the case of transmission over an Additive White Gaussian Noise (AWGN) channel, and for different source and channel rate parameters. We emphasize how the JSC LDPC scheme presents a performance tradeoff depending on the channel state and on the source correlation. We show that JSC LDPC is an efficient solution for a relatively low Signal-to-Noise Ratio (SNR) channel, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.
Keywords: AWGN channel, belief propagation, joint source channel coding, LDPC codes.
2885 COVID_ICU_BERT: A Fine-tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, and other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task as opposed to dealing with structured information such as physiological vital signs, images and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) techniques and machine learning techniques applied to clinician notes can assist in doctors’ decision making in Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful to influence the judgement of clinical sentiment in ICU clinical notes. This paper presents two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of a clinical transformer model that can reliably predict clinical sentiment for notes of COVID patients in ICU. We train the model on clinical notes for COVID-19 patients, ones not previously seen by Bio_ClinicalBERT or Bio_Discharge_Summary_BERT. The model which was based on Bio_ClinicalBERT achieves higher predictive accuracy than the one based on Bio_Discharge_Summary_BERT (Acc 93.33%, AUC 0.98, and Precision 0.96). Second, we perform data augmentation using clinical contextual word embedding that is based on a pre-trained clinical model to balance the samples in each class in the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and Precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation.
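A short sketch of how a clinical transformer is applied to a note for binary clinical-sentiment prediction with the Hugging Face transformers API. The checkpoint below is the public Bio_ClinicalBERT base model named in the abstract, not the authors' fine-tuned COVID_ICU_BERT weights, so the two-way classification head is freshly initialised here and the output probabilities are only illustrative; the example note is invented.
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Public base checkpoint named in the abstract; the classification head is untrained in this sketch
checkpoint = "emilyalsentzer/Bio_ClinicalBERT"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

note = ("Patient remains on high-flow oxygen, inflammatory markers trending down, "
        "tolerating proning sessions, plan to wean FiO2 tomorrow.")
inputs = tokenizer(note, truncation=True, max_length=512, return_tensors="pt")

model.eval()
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print({"negative": round(probs[0, 0].item(), 3), "positive": round(probs[0, 1].item(), 3)})
```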
2884 Direct Growth Rates of the Information Model for Traffic at the Service of Sustainable Development of Tourism in Dubrovacko-Neretvanska County 2014-2020
Authors: V. Vidučić, J. Žanić Mikuličić, M. Račić, K. Sladojević
Abstract:
The research presented in this paper has been focused on analysing the impact of traffic on the sustainable development of tourism in Croatia's Dubrovacko-Neretvanska County by the year 2020, based on the figures and trends reported in 2014 and using the relevant variables that characterise the synergy of traffic and tourism in what is, from the geographic viewpoint, the most problematic county in the Republic of Croatia. The basic hypothesis has been confirmed through scientifically obtained research results, through the quantification of the model's variables and the direct growth rates of the designed model. On the basis of scientific insights into the sustainable development of traffic and tourism in Dubrovacko-Neretvanska County, it is possible to propose a new information model for traffic at the service of the sustainable development of tourism in the County for the period 2014-2020.
Keywords: Environment protection, hotel industry, private sector, quantification.
2883 Using Data Mining Technique for Scholarship Disbursement
Authors: J. K. Alhassan, S. A. Lawal
Abstract:
This work is on decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules to be used to disburse the scholarship. Based on the rules defined from the tree, the system is able to determine the class (status) to which an applicant belongs, whether Granted or Not Granted. Applicants that fall into the Granted class have successfully acquired the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can be used to classify the applicants based on the rules from the tree-based classification was also developed. Tree-based classification is adopted because of its efficiency, effectiveness, and easy-to-comprehend features. The system was tested with data from the National Information Technology Development Agency (NITDA) Abuja, a parastatal of the Federal Ministry of Communication Technology mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
Keywords: Decision tree, classification, data mining, scholarship.
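A minimal scikit-learn sketch of tree-based classification into Granted / Not Granted as described above; the applicant features, sample records and resulting thresholds are illustrative, not NITDA's actual criteria or data.
```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Illustrative applicant records: [CGPA, household_income_level (1 low .. 3 high), is_science_field]
X = [
    [4.5, 1, 1], [3.2, 3, 0], [4.8, 2, 1], [2.9, 1, 0],
    [4.1, 1, 0], [3.9, 2, 1], [2.5, 3, 1], [4.7, 1, 1],
]
y = ["Granted", "Not Granted", "Granted", "Not Granted",
     "Granted", "Granted", "Not Granted", "Granted"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The generic rules referred to in the abstract can be read straight off the fitted tree
print(export_text(tree, feature_names=["cgpa", "income_level", "science_field"]))
print(tree.predict([[4.0, 1, 1]]))   # -> class for a new applicant
```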
2882 Digital Geomatics Trends for Production and Updating Topographic Map by Using Digital Generalization Procedures
Authors: O. Z. Jasim
Abstract:
An accurate digital map must satisfy users with respect to two main requirements: first, the map must be visually readable, and second, all the map elements must be well represented. These two requirements hold especially true for map generalization, which aims at simplifying the representation of cartographic data. Maps at different scales are very important for decision making, for example master plans and all the infrastructure maps in civil engineering. A cartographer cannot simply project the data onto a piece of paper; its readability must also be considered. The map layout of any geodatabase is very important, as this layout helps to read, analyze or extract information from the map. There are many principles and guidelines of generalization that can be found in the cartographic literature. A manual reduction method for generalization depends on the experience of the map maker and therefore produces inconsistent results. Digital generalization, rooted in conventional cartography, has become an increasing concern in both Geographic Information System (GIS) and mapping fields. This project is intended to review the state of the art of the new technology, help to understand the needs and plans for the implementation of a digital generalization capability, and increase the knowledge of producing topographic maps.
Keywords: Cartography, digital generalization, mapping, GIS.
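A standard illustration of digital generalization is line simplification with the Douglas-Peucker algorithm, sketched below via Shapely; this is a generic example of reducing detail as the target scale shrinks, not the specific procedure the paper implements, and the centreline coordinates and tolerances are invented.
```python
from shapely.geometry import LineString

# A detailed road centreline (coordinates in metres, illustrative)
detailed = LineString([
    (0, 0), (12, 3), (25, 4), (37, 2), (50, 0),
    (63, -3), (75, -4), (88, -2), (100, 0),
])

# Douglas-Peucker simplification: a larger tolerance means stronger generalization (smaller scale)
for tolerance in (1, 3, 5):
    simplified = detailed.simplify(tolerance, preserve_topology=False)
    print(f"tolerance {tolerance} m: {len(detailed.coords)} -> {len(simplified.coords)} vertices")
```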
2881 Comparison of Hough Transform and Mean Shift Algorithm for Estimation of the Orientation Angle of Industrial Data Matrix Codes
Authors: Ion-Cosmin Dita, Vasile Gui, Franz Quint, Marius Otesteanu
Abstract:
In the automatic manufacturing and assembly of mechanical, electrical and electronic parts, one needs to reliably identify the position of components and to extract the information carried by these components. Data Matrix Codes (DMC) are nowadays established in many areas of industrial manufacturing thanks to their concentration of information in small spaces. In today's usually order-related industry, where increased tracing requirements prevail, they offer further advantages over other identification systems. This underlines in an impressive way the necessity of a robust code reading system for detecting DMC on components in factories. This paper compares two methods for estimating the orientation angle of Data Matrix Codes: one method based on the Hough Transform and the other based on the Mean Shift Algorithm. We concentrate on Data Matrix Codes in an industrial environment, punched, milled, lasered or etched on different materials in arbitrary orientation.
Keywords: Industrial data matrix code, Hough transform, mean shift.
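A compact OpenCV sketch of the Hough-transform half of the comparison: detect edges, accumulate straight lines, and take the most frequent line angle as the orientation estimate (the angle returned is the line-normal direction, so the code orientation follows modulo 90 degrees). The Canny and accumulator thresholds and the synthetic test image are illustrative, and the mean-shift variant is not shown.
```python
import cv2
import numpy as np

def dominant_angle_deg(gray_image):
    """Estimate the dominant Hough line angle (degrees) with the standard Hough transform."""
    edges = cv2.Canny(gray_image, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 60)   # (rho, theta) per detected line
    if lines is None:
        return None
    thetas = lines[:, 0, 1]
    hist, bin_edges = np.histogram(np.degrees(thetas), bins=180, range=(0, 180))
    return float(bin_edges[np.argmax(hist)])

# Synthetic test: a bright square (stand-in for a DMC) rotated by 30 degrees
img = np.zeros((200, 200), np.uint8)
cv2.rectangle(img, (60, 60), (140, 140), 255, thickness=-1)
m = cv2.getRotationMatrix2D((100, 100), 30, 1.0)
rotated = cv2.warpAffine(img, m, (200, 200))
print("estimated dominant angle:", dominant_angle_deg(rotated))
```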
2880 Identifying Network Subgraph-Associated Essential Genes in Molecular Networks
Authors: Efendi Zaenudin, Chien-Hung Huang, Ka-Lok Ng
Abstract:
Essential genes play an important role in the survival of an organism. It has been shown that cancer-associated essential genes are genes necessary for cancer cell proliferation, and these genes are potential therapeutic targets. It has also been demonstrated that mutations of cancer-associated essential genes give rise to resistance to immunotherapy in patients with tumors. In the present study, we focus on studying the biological effects of essential genes from a network perspective. We hypothesize that one can analyze a biological molecular network by decomposing it into both three-node and four-node digraphs (subgraphs). These network subgraphs encode the regulatory interaction information among the network's genetic elements. In this study, the frequency of occurrence of subgraph-associated essential genes in a molecular network was quantified using the statistical parameter of the odds ratio. The biological effects of subgraph-associated essential genes are discussed. In summary, the subgraph approach provides a systematic method for analyzing molecular networks, and it can capture useful biological information for biomedical research.
Keywords: Biological molecular networks, essential genes, graph theory, network subgraphs.
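A small sketch of the odds-ratio quantification described above: build a 2x2 table of essential versus non-essential genes inside versus outside a given subgraph type and compute the odds ratio, here together with Fisher's exact test. The counts are invented for illustration and do not come from the study.
```python
from scipy.stats import fisher_exact

# 2x2 contingency table for one network subgraph type (counts are illustrative):
#                      essential   non-essential
# in the subgraph           30            70
# not in the subgraph       50           850
table = [[30, 70],
         [50, 850]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, Fisher exact p = {p_value:.3g}")
# An odds ratio well above 1 suggests essential genes are enriched in this subgraph type.
```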
2879 Confidence Intervals for the Coefficients of Variation with Bounded Parameters
Authors: Jeerapa Sappakitkamjorn, Sa-aat Niwitpong
Abstract:
In many practical applications in various areas, such as engineering, science and social science, it is known that there exist bounds on the values of unknown parameters. Examples include values of some measurements for controlling machines in an industrial process, the weight or height of subjects, the blood pressure of patients, and the retirement ages of public servants. When interval estimation is considered in a situation where the parameter to be estimated is bounded, it has been argued that the classical Neyman procedure for setting confidence intervals is unsatisfactory. This is due to the fact that the information regarding the restriction is simply ignored. It is, therefore, of significant interest to construct confidence intervals for the parameters that include the additional information on the parameter values being bounded, to enhance the accuracy of the interval estimation. Therefore, in this paper, we propose a new confidence interval for the coefficient of variation where the population mean and standard deviation are bounded. The proposed interval is evaluated in terms of coverage probability and expected length via Monte Carlo simulation.
Keywords: Bounded parameters, coefficient of variation, confidence interval, Monte Carlo simulation.
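A sketch of the evaluation procedure named at the end of the abstract: estimate coverage probability and expected length of a coefficient-of-variation interval by Monte Carlo simulation. The interval used below is a rough large-sample normal-approximation stand-in, not the authors' proposed bounded-parameter interval, and the simulation settings are illustrative.
```python
import numpy as np
from scipy.stats import norm

def cv_interval(sample, level=0.95):
    """Rough normal-approximation CI for the coefficient of variation
    (a stand-in interval, used only to demonstrate the evaluation)."""
    n = sample.size
    cv = sample.std(ddof=1) / sample.mean()
    se = cv * np.sqrt(1.0 / (2 * (n - 1)) + cv**2 / n)   # large-sample approximation
    z = norm.ppf(0.5 + level / 2)
    return cv - z * se, cv + z * se

def coverage_and_length(mu=10.0, sigma=2.0, n=30, reps=20000, seed=0):
    """Monte Carlo estimate of coverage probability and expected length."""
    rng = np.random.default_rng(seed)
    true_cv = sigma / mu
    hits, lengths = 0, 0.0
    for _ in range(reps):
        lo, hi = cv_interval(rng.normal(mu, sigma, n))
        hits += (lo <= true_cv <= hi)
        lengths += hi - lo
    return hits / reps, lengths / reps

cov, length = coverage_and_length()
print(f"estimated coverage = {cov:.3f}, expected length = {length:.3f}")
```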