Search results for: dissemination of information
9889 Mechanical Prosthesis Controlled by Brain-Computer Interface
Authors: Tianyu Cao, KIRA (Ruizhi Zhao)
Abstract:
The purpose of our research is to study the possibility of people with physical disabilities manipulating mechanical prostheses through brain-computer interface (BCI) technology. The BCI of a neural prosthesis records signals from neurons and decodes them with mathematical models, converting intended movements into movements of the prosthesis. To improve the patient's neural control, the prosthesis is also given a natural sense of touch: data recorded from sensitive areas of the prosthetic limb are encoded as electrical stimulation delivered to the brain. In our research, the BCI is a bridge between the patient's cognition and the real world, allowing information to flow in both directions through external devices. In one direction, the BCI records and decodes neuronal signals and converts them into device control; in the other, information can be encoded and sent back to the brain through electrical stimulation, which has significant medical applications.
Keywords: biomedical engineering, brain-computer interface, prosthesis, neural control
Procedia PDF Downloads 181
9888 FlexPoints: Efficient Algorithm for Detection of Electrocardiogram Characteristic Points
Authors: Daniel Bulanda, Janusz A. Starzyk, Adrian Horzyk
Abstract:
The electrocardiogram (ECG) is one of the most commonly used medical tests and is essential for the correct diagnosis and treatment of the patient. While ECG devices generate a huge amount of data, only a small part of it carries valuable medical information. To deal with this problem, many compression algorithms and filters have been developed over the years. However, the rapid development of new machine learning techniques poses new challenges. To address this class of problems, we created the FlexPoints algorithm, which searches for characteristic points in the ECG signal and ignores all other points that do not carry relevant medical information. The conducted experiments proved that the presented algorithm can significantly reduce the number of data points representing the ECG signal without losing valuable medical information. These sparse but essential characteristic points (flex points) can be a perfect input for modern machine learning models, which work much better using flex points instead of raw data or data compressed by many popular algorithms.
Keywords: characteristic points, electrocardiogram, ECG, machine learning, signal compression
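The abstract does not describe the FlexPoints algorithm itself; as a rough illustration of the general idea of keeping only sparse characteristic points of an ECG trace, a minimal Python sketch is given below. The peak and curvature rules, the thresholds, and the synthetic signal are assumptions for illustration only, not the authors' method.

```python
import numpy as np
from scipy.signal import find_peaks

def find_characteristic_points(ecg, fs):
    """Illustrative sketch: keep only sparse 'flex points' of an ECG trace.

    This is NOT the FlexPoints algorithm from the paper, just a toy
    reduction that keeps R-peak candidates plus points of high curvature.
    """
    # R-peak candidates: prominent local maxima at least 0.3 s apart
    r_peaks, _ = find_peaks(ecg, distance=int(0.3 * fs),
                            prominence=0.5 * np.std(ecg))

    # Curvature proxy: a large second difference marks bends in the waveform
    curvature = np.abs(np.diff(ecg, n=2))
    bends = np.where(curvature > 4 * curvature.mean())[0] + 1

    idx = np.unique(np.concatenate([r_peaks, bends]))
    return idx, ecg[idx]

# Usage on a synthetic signal: a noisy sine stands in for a real ECG record.
fs = 250  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)
idx, values = find_characteristic_points(signal, fs)
print(f"kept {idx.size} of {signal.size} samples "
      f"({100 * idx.size / signal.size:.1f}%)")
```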
Procedia PDF Downloads 162
9887 Corporate Governance, Performance, and Financial Reporting Quality of Listed Manufacturing Firms in Nigeria
Authors: Jamila Garba Audu, Shehu Usman Hassan
Abstract:
The widespread failure in financial information quality has created the need to improve that quality and to strengthen the control of managers by setting up good firm structures. Published accounting information in financial statements is required to provide various users - shareholders, employees, suppliers, creditors, financial analysts, stockbrokers and government agencies - with timely and reliable information useful for making prudent, effective and efficient decisions. Examining the relationship of corporate governance and performance to financial reporting quality is imperative because, despite rapid research in this area, the findings remain inconclusive. Data for the study were extracted from the firms' annual reports and accounts. Multiple OLS regression was employed as the technique for data analysis, and a robustness test was conducted to validate the statistical inferences. The results revealed a negative association between all the regressors and financial reporting quality except the performance of listed manufacturing firms in Nigeria. This indicates that corporate governance plays a significant role in mitigating earnings management and improving financial reporting quality, while performance does not. The study recommended, among others, that the composition of the audit committee should follow the provisions of the code of corporate governance: not more than six (6) members, with at least one (1) financial expert.
Keywords: corporate governance, financial reporting quality, manufacturing firms, Nigeria, performance
Procedia PDF Downloads 246
9886 The Pro-Active Public Relations of Faculty of Management Science, Suan Sunandha Rajabhat University
Authors: Kanyakorn Sujarittnetikarn, Surangkana Pipatchokchaiyo
Abstract:
The objective of this research was to study the pro-active public relations of the Faculty of Management Science, Suan Sunandha Rajabhat University. The sample group consisted of students from the four-year curriculum and the continued/extended curriculum, randomly distributed in proportion into two groups: 400 students who work while studying and a group of non-working students. The tool used in this research was a questionnaire on the acknowledgement of public relations information of the Faculty of Management Science in the academic year 2007. The results found that friends were the most influential factor in choosing the educational institute. The methods of receiving information also differed between the two groups: working students were most interested in entertainment magazines and preferred to search for information on the website after 24:00, whereas non-working students preferred the 21:00-24:00 period the most.
Keywords: development guidelines systems, faculty of management science, public relation planning, proactive public relations
Procedia PDF Downloads 288
9885 Hybrid Feature Selection Method for Sentiment Classification of Movie Reviews
Authors: Vishnu Goyal, Basant Agarwal
Abstract:
Sentiment analysis research provides methods for identifying people's opinions written in blogs, reviews, social networking websites, etc. Its goal is to understand what opinion people hold about a given entity, object or thing. Sentiment analysis approaches can be broadly categorised into three types: semantic orientation, machine learning and lexicon-based approaches. Feature selection methods improve the performance of machine learning algorithms by eliminating irrelevant features. The information gain feature selection method has been considered the best method for sentiment analysis; however, it has the drawback of threshold selection. Therefore, in this paper, we propose a hybrid feature selection method comprising information gain and a proposed feature selection method. Initially, features are selected using Information Gain (IG), and then noisy features are eliminated using the proposed feature selection method. Experimental results show the efficiency of the proposed feature selection methods.
Keywords: feature selection, sentiment analysis, hybrid feature selection
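As an illustration of the two-stage idea described above, a minimal sketch using scikit-learn is shown below: Information Gain is approximated by mutual information in the first stage, and a simple redundancy filter stands in for the proposed second stage. The redundancy rule, the thresholds, and the toy corpus are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import mutual_info_classif

# Toy review corpus; a real experiment would use a movie-review dataset.
reviews = ["great acting and a moving story", "dull plot and terrible pacing",
           "wonderful direction, loved it", "boring, predictable and weak"]
labels = np.array([1, 0, 1, 0])  # 1 = positive, 0 = negative

vec = CountVectorizer()
X = vec.fit_transform(reviews)
terms = np.array(vec.get_feature_names_out())

# Stage 1: rank features by information gain (approximated here by mutual information).
ig = mutual_info_classif(X, labels, discrete_features=True, random_state=0)
ranked = np.argsort(ig)[::-1]

# Stage 2 (assumed stand-in): drop near-redundant features whose occurrence pattern
# is almost identical to an already selected feature; keep at most k features.
k, selected = 10, []
dense = X.toarray().astype(bool)
for j in ranked[:50]:
    if all(np.mean(dense[:, j] == dense[:, s]) < 0.99 for s in selected):
        selected.append(j)
    if len(selected) == k:
        break

print("selected features:", terms[selected])
```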
Procedia PDF Downloads 339
9884 Development of a Secured Telemedical System Using Biometric Feature
Authors: O. Iyare, A. H. Afolayan, O. T. Oluwadare, B. K. Alese
Abstract:
Access to advanced medical services has been one of the medical challenges faced by our present society, especially in distant geographical locations which may be inaccessible. Hence the need for telemedicine, through which live video of a doctor can be streamed to a patient located anywhere in the world at any time. Patients' medical records contain very sensitive information which should not be made accessible to unauthorized people in order to protect privacy, integrity and confidentiality. This research work focuses on a more robust security measure, biometrics (fingerprint), as a form of access control to patients' data by the medical specialist/practitioner.
Keywords: biometrics, telemedicine, privacy, patient information
Procedia PDF Downloads 289
9883 New Types of Fitness Equipment for Seniors - Based on Beginning Movement Load Training
Authors: Chia-Chi Chen, Tai-Sheng Huang
Abstract:
The ageing society has spread around the world. The global population is not only ageing but also declining, and this change in population structure has a significant impact on both economies and industries. How to remain a healthy senior citizen and relieve the burden on family and society will therefore be an important issue. Although the fitness equipment manufacturing industry is mature, the ageing population is still increasing. Therefore, this study aims to design an innovative style of fitness equipment for senior citizens based on Beginning Movement Load Training (BMLT), presented by Dr. Koyama Hirofumi. An analysis of current fitness equipment on the market and of future trends is applied in the study. With the coming of the information age, senior citizens of the future will certainly be users of information products, so the new style of fitness equipment will be combined with information technology as well. Through this study, we expect to design an innovative style of fitness equipment for seniors and help them live heartier and happier lives.
Keywords: aging society, BMLT (Beginning Movement Load Training), seniors, new style of fitness equipment
Procedia PDF Downloads 215
9882 Cytology Is a Promising Tool for the Diagnosis of High-Grade Serous Ovarian Carcinoma from Ascites
Authors: Miceska Simona, Škof Erik, Frković Grazio Snježana, Jeričević Anja, Smrkolj Špela, Cvjetićanin Branko, Novaković Srdjan, Grčar Kuzmanov Biljana, Kloboves-Prevodnik Veronika
Abstract:
Objectives: High-grade serous ovarian cancer (HGSOC) is characterized by the dissemination of tumor cells (TC) in the peritoneal cavity, forming malignant ascites at the time of diagnosis or recurrence. Still, cytology itself has been underutilized as a modality for the diagnosis of HGSOC from ascites, and histological examination of tumor tissue is still the only validated method used. The objective of this study was to evaluate the reliability of cytology in the diagnosis of HGSOC in relation to histopathological examination. Methods: The study included 42 patients with histologically confirmed HGSOC accompanied by malignant ascites. To confirm the malignancy of the TC in the ascites and to define their immunophenotype, immunohistochemical (IHC) reactions for the following antigens were evaluated on ascites cytospins and tissue blocks: Calretinin, MOC, WT1, PAX8, p53, p16 and Ki-67. For complete cytological determination of HGSOC, BRCA1/2 gene mutation was determined from ascites, tissue block, and blood. BRCA1/2 mutation testing from blood was performed to define the type of mutation, somatic vs. germline. Results: Among the 42 patients, the immunophenotype of HGSOC from ascites was confirmed in 36 cases (86%). For a more profound analysis, the patients were divided into three groups according to the proportion of TC present in the ascites: less than 10% TC, 10% TC, and more than 10% TC. In the group with less than 10% TC, there were 10 cases, and only 5 of them (50%) showed the HGSOC phenotype; 12 cases had exactly 10% TC, and 11 of them (92%) showed the HGSOC phenotype; 20 cases had more than 10% TC, and all of them (100%) confirmed the HGSOC immunophenotype from ascites. Only 33 patients were eligible for further BRCA1/2 analysis. Eleven BRCA1/2 mutations were detected from the tissue block: 6 germline and 5 somatic. In 2 cases with less than 10% TC, the BRCA1/2 mutation was not detected; 4 cases had 10% TC, and 2 of them (50%) confirmed the mutation; 4 cases had more than 10% TC, and all showed 100% concordance with the tumor tissue. Conclusions: Cytology is a highly reliable method for determining the immunophenotype of HGSOC and the BRCA1/2 mutation if more than 10% of tumor cells are present in the ascites. This may present an additional non-invasive clinical approach for fast and effective diagnosis in the future, especially in inoperable conditions or relapses.
Keywords: cytology, ascites, high-grade serous ovarian cancer, immunophenotype, BRCA1/2
Procedia PDF Downloads 188
9881 Cell-Cell Interactions in Diseased Conditions Revealed by Three Dimensional and Intravital Two Photon Microscope: From Visualization to Quantification
Authors: Satoshi Nishimura
Abstract:
Although much information has been garnered from the genomes of humans and mice, it remains difficult to extend that information to explain physiological and pathological phenomena. This is because the processes underlying life are by nature stochastic and fluctuate with time. Thus, we developed a novel "in vivo molecular imaging" method based on single- and two-photon microscopy. We visualized and analyzed many life phenomena, including common adult diseases. We integrated the knowledge obtained and established new models that will serve as the basis for new minimally invasive therapeutic approaches.
Keywords: two photon microscope, intravital visualization, thrombus, artery
Procedia PDF Downloads 373
9880 A Bibliometric Analysis on Filter Bubble
Authors: Misbah Fatma, Anam Saiyeda
Abstract:
This analysis charts the introduction and expansion of research into the filter bubble phenomenon over the last 10 years using a large dataset of academic publications. The bibliometric study demonstrates how interdisciplinary filter bubble research is, and the identification of the key authors and organizations leading it sheds light on collaborative networks and knowledge transfer. Through a systematic examination of the literature, relevant papers are organized around themes including algorithmic bias, polarisation, social media, and ethical implications, and the study plots how these patterns have changed over time. The study also looks at how research is distributed globally, showing geographic patterns and discrepancies in scholarly output. The results of this bibliometric analysis give a full picture of the development and reach of filter bubble research. By exposing dominant themes, interdisciplinary collaborations, and geographic patterns, the study offers insights into the ongoing discussion surrounding information personalization and its implications for societal discourse, democratic participation, and the potential risks to an informed citizenry. Such an analysis is essential for scholars and researchers seeking to solve the problems caused by filter bubbles and to advance a more diverse and inclusive information environment.
Keywords: bibliometric analysis, social media, social networking, algorithmic personalization, self-selection, content moderation policies and limited access to information, recommender system and polarization
Procedia PDF Downloads 118
9879 Usability Testing on Information Design through Single-Lens Wearable Device
Authors: Jae-Hyun Choi, Sung-Soo Bae, Sangyoung Yoon, Hong-Ku Yun, Jiyoung Kwahk
Abstract:
This study was conducted to investigate the effect of ocular dominance on recognition performance using a single-lens smart display designed for cycling. A total of 36 bicycle riders who cycle consistently were recruited and participated in the experiment. For safety reasons, the participants performed the tasks riding a bicycle on a stationary stand. Independent variables of interest included ocular dominance, bike usage, age group, and information layout. Recognition time (the time required to identify specific information, measured with an eye-tracker), error rate (a false answer or failure to identify the information within 5 seconds), and user preference scores were measured, and statistical tests were conducted to identify significant results. Recognition time and error ratio showed significant differences by the ocular dominance factor, while the preference score did not. Recognition time was faster when the single-lens see-through display was worn on the dominant eye (average 1.12 s) than on the non-dominant eye (average 1.38 s). The error ratio of the information recognition task was significantly lower when the see-through display was worn on the dominant eye (average 4.86%) than on the non-dominant eye (average 14.04%). The interaction effect of ocular dominance and age group was significant with respect to recognition time and error ratio: the recognition time of users in their 40s was significantly longer than that of the other age groups when the display was placed on the non-dominant eye, while no difference was observed on the dominant eye, and the error ratio showed the same pattern. Although no difference was observed for the main effects of ocular dominance and bike usage on preference, the interaction effect between the two variables was significant with respect to the preference score: the preference score of daily bike users was higher when the display was placed on the dominant eye, whereas participants who use bikes for leisure purposes showed the opposite pattern. Overall, it was found more effective and efficient to wear a see-through display on the dominant eye than on the non-dominant eye, although user preference was not affected by ocular dominance. It is recommended to wear a see-through display on the dominant eye, since it helps the user recognize the presented information faster and more accurately and is therefore safer, even if the user may not notice the difference.
Keywords: eye tracking, information recognition, ocular dominance, smart headware, wearable device
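A sketch of how the reported two-way interaction tests (for example, ocular dominance x age group on recognition time) could be run with statsmodels is shown below; the data frame and its values are illustrative stand-ins, not the study's measurements.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Illustrative, balanced toy data; the real study measured 36 riders with an eye-tracker.
df = pd.DataFrame({
    "recognition_time": [1.05, 1.10, 1.12, 1.18, 1.15, 1.20,
                         1.25, 1.30, 1.35, 1.40, 1.70, 1.80],
    "display_eye": ["dominant"] * 6 + ["non_dominant"] * 6,
    "age_group": ["20s", "20s", "30s", "30s", "40s", "40s"] * 2,
})

# Two-way ANOVA with an interaction term, mirroring the reported
# ocular dominance x age group effect on recognition time.
model = smf.ols("recognition_time ~ C(display_eye) * C(age_group)", data=df).fit()
print(anova_lm(model, typ=2))
```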
Procedia PDF Downloads 272
9878 Analyzing Semantic Feature Using Multiple Information Sources for Reviews Summarization
Authors: Yu Hung Chiang, Hei Chia Wang
Abstract:
Nowadays, tourism has become a part of life. Before reserving hotels, customers need information to help them make decisions, and the most important source of such information is online reviews. Due to the dramatic growth of online reviews, it is impossible for tourists to read all of them manually. Therefore, designing an automatic review analysis system that summarizes the reviews is necessary. The main purpose of the system is to understand the opinion of the reviews, which may be positive or negative; in other words, the system analyzes whether the customers who visited the hotel liked it or not. Sentiment analysis methods help the system achieve this purpose. In these methods, the targets of an opinion (here called features) should be recognized in order to clarify the polarity of the opinion, which may otherwise be ambiguous. Hence, the study proposes an unsupervised method using Part-Of-Speech patterns and multi-lexicon sentiment analysis to summarize all reviews. We expect this method to help customers find the information they want and make decisions efficiently.
Keywords: text mining, sentiment analysis, product feature extraction, multi-lexicons
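A minimal sketch of the kind of pipeline the abstract describes is given below: candidate hotel features are extracted with a simple Part-Of-Speech pattern (an adjective modifying a noun), and the opinion word is scored against a lexicon. The tiny lexicon and the single adjective-noun pattern are assumptions for illustration; the paper combines several lexicons and richer POS patterns.

```python
import nltk  # requires: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')

# Tiny illustrative sentiment lexicon; the paper merges several larger lexicons.
LEXICON = {"clean": 1, "friendly": 1, "great": 1, "noisy": -1, "dirty": -1, "rude": -1}

def extract_opinions(review):
    """Return (feature, opinion_word, polarity) triples using a simple
    adjective-noun POS pattern, e.g. 'noisy room' -> ('room', 'noisy', -1)."""
    tokens = nltk.word_tokenize(review.lower())
    tagged = nltk.pos_tag(tokens)
    triples = []
    for (w1, t1), (w2, t2) in zip(tagged, tagged[1:]):
        if t1.startswith("JJ") and t2.startswith("NN"):  # adjective followed by noun
            triples.append((w2, w1, LEXICON.get(w1, 0)))
    return triples

reviews = ["The staff were friendly but we got a noisy room.",
           "Great location and a clean bathroom."]
for r in reviews:
    print(extract_opinions(r))
```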
Procedia PDF Downloads 331
9877 Flood-Prone Urban Area Mapping Using Machine Learning, a Case Study of M'sila City (Algeria)
Authors: Medjadj Tarek, Ghribi Hayet
Abstract:
This study aims to develop a flood sensitivity assessment tool using machine learning (ML) techniques and a geographic information system (GIS). The importance of this study lies in integrating GIS and ML techniques for mapping flood risks, which helps decision-makers identify the most vulnerable areas and take the necessary precautions against this type of natural disaster. To reach this goal, we study the case of the city of M'sila, which is among the areas most vulnerable to floods. The study produced a map of flood-prone areas based on a methodology in which three machine learning algorithms were compared: the XGBoost model, the Random Forest algorithm and the K Nearest Neighbour algorithm, which achieved accuracies of 97.92%, 95% and 93.75%, respectively. In the process of mapping flood-prone areas, the first model (XGBoost) was relied upon, as it gave the greatest accuracy.
Keywords: geographic information systems (GIS), machine learning (ML), emergency mapping, flood disaster management
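A sketch of the reported three-model comparison with scikit-learn and xgboost is shown below; the synthetic features standing in for GIS layers (elevation, slope, distance to river, rainfall) and all hyperparameters are assumptions, not the study's data or settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier  # pip install xgboost

rng = np.random.default_rng(0)
# Synthetic stand-ins for GIS-derived layers: elevation, slope, distance to river, rainfall.
X = rng.normal(size=(1000, 4))
y = (X[:, 0] - 0.5 * X[:, 1] + 0.8 * X[:, 2]
     + rng.normal(scale=0.3, size=1000) < 0).astype(int)  # 1 = flood-prone cell
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=200, eval_metric="logloss"),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "K Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy = {100 * acc:.2f}%")
```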
Procedia PDF Downloads 95
9876 Building Information Modeling and Its Application in the State of Kuwait
Authors: Michael Gerges, Ograbe Ahiakwo, Martin Jaeger, Ahmad Asaad
Abstract:
Recent advances in Building Information Modeling (BIM), especially in the Middle East, have been remarkable. Dubai has been taking a lead on this by making BIM adoption mandatory for all projects that involve complex architectural designs. This is because BIM is a dynamic process that assists all stakeholders in monitoring the project status throughout the different project phases with great transparency. It focuses on utilizing information technology to improve collaboration among project participants during the entire life cycle of the project, from the initial design to the supply chain, resource allocation, construction and all productivity requirements. In view of this trend, the paper examines the extent of BIM application in the State of Kuwait by exploring practitioners' perspectives on BIM, especially their views on the main barriers and main advantages. To this end, structured interviews based on questionnaires were carried out with a range of different construction professionals. The results revealed that practitioners perceive improved communication and mitigated project risks through the collaboration encouraged between project participants. However, it was also observed that the full implementation of BIM in the State of Kuwait requires concerted efforts to make clients demand BIM, to counteract resistance to change among construction professionals, and to offer more training for design team members. This paper forms part of an ongoing research effort on BIM and its application in the State of Kuwait, and it is on this basis that further research on the topic is proposed.
Keywords: building information modeling, BIM, construction industry, Kuwait
Procedia PDF Downloads 378
9875 Flood Disaster Prevention and Mitigation in Nigeria Using Geographic Information System
Authors: Dinebari Akpee, Friday Aabe Gaage, Florence Fred Nwaigwu
Abstract:
Natural disasters like floods affect many parts of the world, including developing countries like Nigeria. As a result, many human lives are lost, properties are damaged, and much money is lost to infrastructure damage. These hazards and losses can be mitigated and reduced by providing reliable spatial information about flood risks to the general public through flood inundation maps. Flood inundation maps are crucial for emergency action plans, urban planning, ecological studies and insurance rates. Nigeria experienced the worst flood in its entire history this year: many cities were completely submerged due to torrential rainfall. Poor city planning and a lack of effective development control, among other factors, contribute to the problem. A geographic information system (GIS) can be used to visualize the extent of flooding and to analyze flood maps to produce flood damage estimation maps and flood risk maps. In this research, the following steps were taken to prepare flood risk maps for the study area: (1) digitization of topographic data and preparation of a digital elevation model using ArcGIS; (2) flood simulation using a hydraulic model; and (3) integration of the first two steps to produce flood risk maps. The results show that GIS can play a crucial role in flood disaster control and mitigation.
Keywords: flood disaster, risk maps, geographic information system, hazards
Procedia PDF Downloads 227
9874 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings
Authors: Oluwatosin Adewale
Abstract:
The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for the human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past, with their associated aesthetic and communal benefits. These identified values have increased the importance of conservation and of the lifecycle management of such buildings. Recent developments in digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM), a reverse engineering process for the digital documentation of heritage assets that draws upon information management processes similar to those of BIM. However, despite the several scientific and technical contributions made to the development of the HBIM process, it remains difficult to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem. A purposive sample of heritage industry experts and professionals was selected to take part in semi-structured interviews to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings.
Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management
Procedia PDF Downloads 98
9873 Enterprise Information Portal Features: Results of Content Analysis Literature Review
Authors: Michal Krčál
Abstract:
Since their introduction in the 1990s, Enterprise Information Portals (EIPs) have been investigated from different perspectives (e.g., project management, technology acceptance, IS success). However, no systematic literature review has been produced to systematize both the research efforts and the technology itself. This paper reports the first results of an extensive systematic literature review focused on EIP research and its categorization; specifically, it reports a conceptual model of EIP features. The previous attempt to categorize EIP features was published in 2002. For the purpose of the literature review, the content of 89 articles was analyzed in order to identify and categorize features of EIPs. The methodology of the literature review was as follows. First, search queries were run in major indexing databases (Web of Science and SCOPUS). The results of the queries were analyzed according to their usability for the goal of the study. Then, full texts were coded in Atlas.ti according to a previously established coding scheme. The codes were categorized, and the conceptual model of EIP features was created.
Keywords: enterprise information portal, content analysis, features, systematic literature review
Procedia PDF Downloads 298
9872 Extraction of Urban Building Damage Using Spectral, Height and Corner Information
Authors: X. Wang
Abstract:
Timely and accurate information on urban building damage caused by an earthquake is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery, containing abundant fine-scale information, offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since damaged buildings, intact buildings and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that the height or geometric structure of buildings generally changes dramatically in devastated areas, a novel multi-stage urban building damage detection method using bi-temporal spectral, height and corner information was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and by errors in extracting height information, ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event data was used for detecting building damage showing significant height change, while the difference in corner density between pre- and post-event was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining these two results. Finally, a post-processing procedure was adopted to refine the initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image and post-event LiDAR data. The results showed that the proposed method significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse and is also applicable to relevant applications.
Keywords: building damage, corner, earthquake, height, very high resolution (VHR)
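A highly simplified raster sketch of the two damage cues described above (a per-pixel height drop and a drop in local corner density, applied inside a building mask) is given below; the thresholds, the window size and the random stand-in layers are illustrative assumptions, not values or data from the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def detect_damage(pre_height, post_height, pre_corners, post_corners,
                  mask, height_drop=3.0, corner_ratio=0.5, window=15):
    """Toy per-pixel damage map combining the two cues from the paper.

    pre_corners / post_corners are binary corner maps (e.g. from a corner
    detector); mask is True where buildings may exist (ground, shadow and
    vegetation already removed). All thresholds are illustrative.
    """
    # Cue 1: significant loss of height between pre- and post-event surfaces.
    height_cue = (pre_height - post_height) > height_drop

    # Cue 2: strong decrease in local corner density (collapse destroys the
    # regular geometric structure of roofs and facades).
    pre_density = uniform_filter(pre_corners.astype(float), size=window)
    post_density = uniform_filter(post_corners.astype(float), size=window)
    corner_cue = post_density < corner_ratio * pre_density

    return mask & (height_cue | corner_cue)

# Usage with random stand-ins for the real raster layers.
shape = (200, 200)
rng = np.random.default_rng(1)
damage = detect_damage(pre_height=rng.uniform(0, 20, shape),
                       post_height=rng.uniform(0, 20, shape),
                       pre_corners=rng.random(shape) > 0.95,
                       post_corners=rng.random(shape) > 0.97,
                       mask=np.ones(shape, dtype=bool))
print("flagged pixels:", int(damage.sum()))
```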
Procedia PDF Downloads 213
9871 Financial Management Skills of Supreme Student Government Officers in the Schools Division of Quezon: Basis for Project Financial Literacy Information Program
Authors: Edmond Jaro Malihan
Abstract:
This study aimed to develop and propose a Project Financial Literacy Information Program (FLIP) for the Schools Division of Quezon to improve the financial management skills of Supreme Student Government (SSG) officers across different school sizes. It employed a descriptive research design covering 424 SSG officers from the SDO-Quezon selected through purposive sampling. Consultations were held with DepEd officials, budget officers, and financial advisors to validate the design of the self-made questionnaires, in which the computed mean was verbally interpreted using a four-point Likert scale. The data gathered were presented and analyzed using the weighted arithmetic mean and the ANOVA test. Based on the findings, SSG officers in the SDO-Quezon generally possess high financial management skills in terms of budget preparation, resource mobilization, and auditing and evaluation. School size makes no significant difference and does not contribute to the financial management skills of SSG officers, which they apply in implementing their mandated programs, projects, and activities (PPAs). The Project Financial Literacy Information Program (FLIP) was developed considering their general level of financial management skills and the PPAs launched by the organization. The project covered the suggested training program vital to conducting the Virtual Division Training on Financial Management Skills for SSG officers.
Keywords: financial management skills, SSG officers, school size, financial literacy information program
Procedia PDF Downloads 73
9870 Impacts of Applying Automated Vehicle Location Systems to Public Bus Transport Management
Authors: Vani Chintapally
Abstract:
The spread of inexpensive and miniaturized Global Positioning System (GPS) receivers has led most Automatic Vehicle Location (AVL) systems today to depend solely on satellite-based positioning, as GPS is the most stable implementation of such systems. This paper presents the characteristics of a proposed framework for tracking and analyzing public transport in a typical medium-sized city and contrasts the qualities of such a framework with those of general-purpose AVL systems. Specific properties of the routes analyzed by the AVL system used in our study of public transport include cyclic vehicle routes, the requirement for specific performance reports, and so on. This paper particularly deals with vehicle movement predictions and the estimation of station arrival times, combined with automatically generated reports on timetable conformance and other performance measures. Another side of the observed problem is the efficient transfer of data from the vehicles to the control centre. The pervasiveness of GSM packet data transfer technologies, combined with reduced data transfer costs, has caused today's AVL systems to rely predominantly on packet data transfer services from mobile operators as the communications channel between vehicles and the control centre. This approach raises many security issues in this potentially sensitive application field.
Keywords: automatic vehicle location (AVL), estimation of arrival times, AVL security, data services, intelligent transport systems (ITS), map matching
Procedia PDF Downloads 383
9869 Determination of Tide Height Using Global Navigation Satellite Systems (GNSS)
Authors: Faisal Alsaaq
Abstract:
Hydrographic surveys have traditionally relied on the availability of tide information for the reduction of sounding observations to a common datum. In most cases, tide information is obtained from tide gauge observations and/or tide predictions over space and time using local, regional or global tide models. While the latter often provide a rather crude approximation, the former rely on tide gauge stations that are spatially restricted and often have a sparse and limited distribution. A more recent method that is increasingly being used is Global Navigation Satellite System (GNSS) positioning, which can be utilised to monitor height variations of a vessel or buoy, thus providing information on sea level variations during a hydrographic survey. However, GNSS heights obtained in the dynamic environment of a survey vessel are affected by "non-tidal" processes such as wave activity and the attitude of the vessel (roll, pitch, heave and dynamic draft). This research examines techniques that separate the tide signal from the other, non-tidal signals that may be contained in GNSS heights. This requires an investigation of the processes involved and of their temporal, spectral and stochastic properties in order to apply suitable recovery techniques for the tide information. In addition, different post-mission and near real-time GNSS positioning techniques will be investigated, with a focus on height estimation at sea. Furthermore, the study will investigate the possibility of transferring chart datums to the locations of tide gauges.
Keywords: hydrography, GNSS, datum, tide gauge
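The paper investigates several recovery techniques; one simple, commonly used option sketched below is a zero-phase low-pass filter that keeps the slowly varying tide and rejects wave and heave motion. The cut-off period, the filter order and the synthetic height series are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_tide(gnss_heights, fs_hz, cutoff_period_s=600.0):
    """Keep only the slowly varying (tidal) part of a GNSS height series.

    A zero-phase Butterworth low-pass with a ~10-minute cut-off is one simple
    way to suppress wave activity and vessel heave; the paper examines more
    elaborate separation techniques.
    """
    b, a = butter(N=4, Wn=(1.0 / cutoff_period_s) / (fs_hz / 2.0), btype="low")
    return filtfilt(b, a, gnss_heights)

# Synthetic example: 12.42-hour tide plus 8-second waves plus noise.
fs = 1.0  # 1 Hz GNSS heights (assumed)
t = np.arange(0, 6 * 3600, 1 / fs)
tide = 0.8 * np.sin(2 * np.pi * t / (12.42 * 3600))
waves = 0.3 * np.sin(2 * np.pi * t / 8.0)
heights = tide + waves + 0.05 * np.random.randn(t.size)
print("residual RMS:", np.sqrt(np.mean((extract_tide(heights, fs) - tide) ** 2)))
```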
Procedia PDF Downloads 265
9868 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are major challenges for all types of media, especially social media. There is a great deal of false information, fake likes, views and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading; it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the best techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information for our work. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
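A compact sketch of the four-step idea is given below: feature similarities are computed, the features are clustered with K-means, one representative per cluster is kept, and an SVM classifies on the reduced set. The representative-selection rule (highest absolute correlation with the label) and the synthetic data are assumptions; the paper defines its own selection step and uses real fake news datasets.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))                     # stand-in for a fake-news feature matrix
y = (X[:, 0] + X[:, 5] - X[:, 9] > 0).astype(int)  # stand-in labels

# Steps 1-2: cluster the features (columns) using their pairwise similarity;
# here each feature is represented by its correlation profile with all others.
corr = np.corrcoef(X, rowvar=False)
clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(corr)

# Step 3: keep one representative feature per cluster -- the one most
# correlated with the class label (an assumed selection rule).
label_corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
selected = [max(np.where(clusters == c)[0], key=lambda j: label_corr[j])
            for c in range(10)]

# Step 4: classify on the reduced feature set with an SVM.
scores = cross_val_score(SVC(kernel="rbf"), X[:, selected], y, cv=5)
print("selected features:", selected)
print("cross-validated accuracy: %.3f" % scores.mean())
```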
Procedia PDF Downloads 176
9867 Perceived Influence of Information Communication Technology on Empowerment Amongst the College of Education Physical and Health Education Students in Oyo State
Authors: I. O. Oladipo, Olusegun Adewale Ajayi, Omoniyi Oladipupo Adigun
Abstract:
Information Communication Technology (ICT) has the potential to contribute to different facets of educational development and effective learning: expanding access, promoting efficiency, improving the quality of learning, enhancing the quality of teaching, and providing an important mechanism during economic crises. Unemployment is prevalent among graduates of higher institutions in this nation, and much seems not to have been achieved in this direction. In view of this, the purpose of this study is to create awareness and enlightenment about ICT for empowerment opportunities after school. A self-developed, modified 4-point Likert scale questionnaire was used for data collection among Physical and Health Education students in Colleges of Education in Oyo State. Inferential statistical analysis using chi-square, set at the 0.05 alpha level, was used to test the stated hypotheses. The study concludes that awareness and enlightenment about ICT significantly influence empowerment opportunities and recommends that college of education students be encouraged to apply ICT for job opportunities after school.
Keywords: employment, empowerment, information communication technology, physical education
Procedia PDF Downloads 390
9866 Understanding Tacit Knowledge and DIKW
Authors: Bahadir Aydin
Abstract:
Today it is difficult to reach accurate knowledge because of the mass of data, which makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted and classified through a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit and tacit knowledge; tacit knowledge can be seen as "only the tip of the iceberg", yet it accounts for much more than we guess throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen that it contains risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information and the processes which can be used to develop it. With the help of this process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution and assessment are the key functions that connect information to knowledge. Moving from the current, traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization, and knowledge could be used more effectively.
Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW
Procedia PDF Downloads 519
9865 Knowledge Management Strategies within a Corporate Environment of Papers
Authors: Daniel J. Glauber
Abstract:
Knowledge transfer between personnel could improve an organization's competitive advantage in the marketplace when approached through a strategic knowledge management effort, whereas a lack of information sharing between personnel can create knowledge transfer gaps and restrict decision-making processes. Knowledge transfer between personnel can potentially improve information sharing based on an implemented knowledge management strategy, and an organization's capacity to gain more knowledge is aligned with its prior or existing captured knowledge. This case study attempted to understand the overall influence of a knowledge management strategy (KMS) within the corporate environment and the knowledge exchange between personnel. The significance of this study was to help understand how organizations can improve the return on investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected for this study. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal, working with management and executive members to develop a protocol in which knowledge transfer becomes a standard practice across multiple tiers of the organization. The knowledge transfer process can be made measurable by focusing on specific elements of the organizational process, including personnel transitions, to help reduce the time required to understand a job. The organization studied in this research acknowledged the need for improved knowledge management activities to help organize, retain, and distribute information throughout the workforce. Data produced from the study indicate three main themes identified by the participants: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization's KMS, its culture, knowledge sharing, and knowledge transfer.
Keywords: knowledge transfer, management, knowledge management strategies, organizational learning, codification
Procedia PDF Downloads 442
9864 Systematic and Simple Guidance for Feed Forward Design in Model Predictive Control
Authors: Shukri Dughman, Anthony Rossiter
Abstract:
This paper builds on earlier work which demonstrated that Model Predictive Control (MPC) may give a poor choice of default feed forward compensator. After first demonstrating the impact of future information about target changes on performance, the paper proposes a pragmatic method for identifying the amount of future target information that can be utilised effectively in both finite and infinite horizon algorithms. Numerical illustrations in MATLAB give evidence of the efficacy of the proposal.
Keywords: model predictive control, tracking control, advance knowledge, feed forward
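To make the role of future target information concrete, the toy simulation below contrasts an unconstrained finite-horizon MPC that sees the full future reference over its horizon with one that only sees the current set-point. The first-order plant, horizon, weights and cost measure are arbitrary illustrative choices, not the authors' formulation, and Python is used here rather than the MATLAB of the paper.

```python
import numpy as np

a, b = 0.9, 0.5            # simple first-order plant: x_{k+1} = a*x_k + b*u_k
N, lam = 10, 0.1           # prediction horizon and input weight (assumed)

# Prediction matrices: x_future = F*x0 + G*u, with u the stacked future inputs.
F = np.array([[a ** (i + 1)] for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

def mpc_step(x0, ref_window):
    """One unconstrained MPC step: minimise ||F*x0 + G*u - ref||^2 + lam*||u||^2."""
    H = G.T @ G + lam * np.eye(N)
    u = np.linalg.solve(H, G.T @ (ref_window - F.flatten() * x0))
    return u[0]            # receding horizon: apply only the first input

# Reference: step change from 0 to 1 at k = 30.
T = 60
ref = np.where(np.arange(T + N) >= 30, 1.0, 0.0)

for preview in (True, False):
    x, cost = 0.0, 0.0
    for k in range(T):
        # With preview, the controller sees the true future reference over the
        # horizon; without it, the current set-point is simply held constant.
        window = ref[k + 1:k + 1 + N] if preview else np.full(N, ref[k])
        u = mpc_step(x, window)
        x = a * x + b * u
        cost += (ref[k + 1] - x) ** 2
    print(f"preview={preview}: tracking cost = {cost:.3f}")
```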
Procedia PDF Downloads 547
9863 An Investigation on Organisation Cyber Resilience
Authors: Arniyati Ahmad, Christopher Johnson, Timothy Storer
Abstract:
Cyber exercises are used to assess the preparedness of a community against cyber crises, technology failures and critical information infrastructure (CII) incidents. A cyber exercise, also called a cyber crisis exercise or cyber drill, involves partnerships or collaboration of public and private agencies from several sectors. This study investigates the organisation cyber resilience (OCR) of sectors that participated in a cyber exercise called X Maya in Malaysia. It uses a principle-based cyber resilience survey, the C-Suite Executive Checklist, developed by the World Economic Forum in 2012. To ensure the suitability of the survey for investigating OCR, a reliability test was conducted on the C-Suite Executive Checklist items. The research further investigates the differences in OCR among the ten Critical National Information Infrastructure (CNII) sectors that participated in the cyber exercise. The one-way ANOVA test result showed a statistically significant difference in OCR among these ten CNII sectors.
Keywords: critical information infrastructure, cyber resilience, organisation cyber resilience, reliability test
Procedia PDF Downloads 359
9862 Library on the Cloud: Universalizing Libraries Based on Virtual Space
Authors: S. Vanaja, P. Panneerselvam, S. Santhanakarthikeyan
Abstract:
Cloud computing is a recent trend in libraries. By adopting cloud services, librarians can keep pace with present-day information handling and satisfy the needs of the knowledge society. Libraries are now on a platform for universalizing all of their information for users, and they are focusing on clouds, which give the easiest access to data and applications. Cloud computing is a highly scalable platform promising quick access to hardware and software over the internet, in addition to easy management and access by non-expert users. In this paper, we discuss the cloud's features and its potential applications in libraries and information centres, illustrate how cloud computing actually works and how it can be implemented, and examine the needs that motivate a move to the cloud and the process of migration. In addition, the paper assesses the practical problems libraries face during migration, the advantages of the migration process, and the measures that libraries should follow when migrating to the cloud. It also highlights the benefits of, and some concerns about, data ownership and data security in cloud computing.
Keywords: cloud computing, cloud service, cloud-based ILS, cloud providers, discovery service, IaaS, PaaS, SaaS, virtualization, web-scale access
Procedia PDF Downloads 662
9861 Using Implicit Data to Improve E-Learning Systems
Authors: Slah Alsaleh
Abstract:
In recent years, with the popularity of the internet and technology, e-learning has become a major part of most education systems. One of the advantages e-learning systems provide is the large amount of information available about students' behavior while they interact with the system. Such information is very rich, and it can be used to improve the capability and efficiency of e-learning systems. This paper discusses how e-learning can benefit from implicit data in different ways, including creating homogeneous groups of students, evaluating students' learning, creating behavior profiles for students, and identifying students through their behaviors.
Keywords: e-learning, implicit data, user behavior, data mining
Procedia PDF Downloads 310
9860 Bias in the Estimation of Covariance Matrices and Optimality Criteria
Authors: Juan M. Rodriguez-Diaz
Abstract:
The precision of parameter estimators in the Gaussian linear model is traditionally described by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Optimal design theory traditionally accounts for this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise, the loss in efficiency of the designs obtained with the traditional approach may be very important.
Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix
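In the standard notation for this setting (assumed here), the point can be sketched as follows: with correlated errors, the exact covariance of the estimator involves the full error covariance, so optimality criteria should be evaluated on the corresponding information matrix rather than on its i.i.d. form.

```latex
% Gaussian linear model with correlated errors (standard notation, assumed):
y = X\beta + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0,\, \Sigma)

% Generalised least-squares estimator and its exact covariance:
\hat{\beta} = (X^{\top}\Sigma^{-1}X)^{-1} X^{\top}\Sigma^{-1} y,
\qquad
\operatorname{Cov}(\hat{\beta}) = (X^{\top}\Sigma^{-1}X)^{-1} = M(\xi, \Sigma)^{-1}

% A D-optimal design then maximises the determinant of the information matrix
% that carries the correlation structure, not of the i.i.d. form X^{\top}X/\sigma^{2}:
\xi^{*} = \arg\max_{\xi} \det M(\xi, \Sigma)
```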
Procedia PDF Downloads 443