Search results for: Information Quality
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6438


4758 Building Information Modeling and Its Application in the State of Kuwait

Authors: Michael Gerges, Ograbe Ahiakwo, Martin Jaeger, Ahmad Asaad

Abstract:

Recent advances in Building Information Modeling (BIM) have spread remarkably across the Middle East. Dubai has taken a lead by making BIM adoption mandatory for all projects involving complex architectural designs. BIM is a dynamic process that assists all stakeholders in monitoring project status throughout the different project phases with great transparency. It focuses on utilizing information technology to improve collaboration among project participants during the entire life cycle of the project, from the initial design to the supply chain, resource allocation, construction and all productivity requirements. In view of this trend, the paper examines the extent to which BIM is applied in the State of Kuwait by exploring practitioners’ perspectives on BIM, especially their views on the main barriers and main advantages. To this end, structured interviews based on questionnaires were carried out with a range of construction professionals. The results revealed that practitioners perceive improved communication and mitigated project risks through the collaboration that BIM encourages between project participants. However, it was also observed that the full implementation of BIM in the State of Kuwait requires concerted efforts to encourage clients to demand BIM, counteract resistance to change among construction professionals, and offer more training for design team members. This paper forms part of an on-going research effort on BIM and its application in the State of Kuwait, and it is on this basis that further research on the topic is proposed.

Keywords: Building Information Modeling, BIM, construction industry, Kuwait.

4757 Effects of Increased Green Surface on a Densely Built Urban Fabric: The Case of Budapest

Authors: Viktória Sugár, Orsolya Frick, Gabriella Horváth, A. Bendegúz Vöröss, Péter Leczovics, Géza Baráth

Abstract:

Urban greenery has multiple positive effects on both the city and its residents. Apart from the visual advantages, it changes the micro-climate by cooling and shading, while also increasing vapor and oxygen levels and reducing dust and carbon-dioxide content. All of the above are critical factors for the livability of an urban fabric. Unfortunately, in a dense, historical district there are restricted possibilities to build green surfaces. The present study collects and systematizes the applicable green solutions for a historical downtown district of Budapest. The study contains a GIS-based measurement of the surfaces eligible for greenery and also calculates the oxygen production, carbon-dioxide reduction and cooling potential of an increased green surface. It can be concluded that increasing the green surface has measurable effects on a densely built urban fabric, including air quality, micro-climate and other environmental factors.

Keywords: Urban greenery, green roof, green wall, green surface potential, sustainable city, oxygen production, carbon-dioxide reduction, geographical information system, GIS.

4756 Development of a Secured Telemedical System Using Biometric Feature

Authors: O. Iyare, A. H. Afolayan, O. T. Oluwadare, B. K. Alese

Abstract:

Access to advanced medical services remains a challenge for our present society, especially in distant geographical locations that may be inaccessible. This gives rise to the need for telemedicine, through which live video of a doctor can be streamed to a patient located anywhere in the world at any time. Patients’ medical records contain very sensitive information which should not be made accessible to unauthorized people in order to protect privacy, integrity and confidentiality. This research work focuses on a more robust security measure, a biometric (fingerprint), as a form of access control to patients' data by the medical specialist/practitioner.

Keywords: Biometrics, telemedicine, privacy, patient information.

4755 A Method of Representing Knowledge of Toolkits in a Pervasive Toolroom Maintenance System

Authors: A. Mohamed Mydeen, Pallapa Venkataram

Abstract:

The learning process needs to be pervasive to impart quality in acquiring knowledge about a subject, making use of advances in the field of information and communication systems. However, the pervasive learning paradigms designed so far are of the system-automation type and lack a truly pervasive character. Providing such a pervasive character requires subtle ways of teaching and learning combined with system intelligence. Augmenting pervasive learning with intelligence necessitates an efficient way of representing knowledge for the system in order to give the right learning material to the learner. This paper presents a method of representing knowledge for a Pervasive Toolroom Maintenance System (PTMS), in which a learner acquires knowledge about the various kinds of tools kept in the toolroom; the representation also supports effective maintenance of the toolroom. First, we explicate the generic model of knowledge representation for PTMS. Second, we expound the knowledge representation for specific cases of toolkits in PTMS. We also present the conceptual view of knowledge representation using ontologies for both generic and specific cases. Third, we devise the relations for pervasive knowledge in PTMS. Finally, events are identified in PTMS and linked with pervasive data of toolkits based on the relations formulated. The experimental environment and case studies show the accuracy and efficiency of the knowledge representation of toolkits in PTMS.

Keywords: Generic knowledge representation, toolkit, toolroom, pervasive computing.

4754 Dual Mode Navigation for Two-Wheeled Robot

Authors: N. M. Abdul Ghani, L. K. Haur, T. P. Yon, F. Naim

Abstract:

This project relates to a two-wheeled self-balancing robot for transferring loads to different locations along a path. The robot operates in two navigation modes so that it can follow a desired path efficiently. First, a plurality of distance sensors mounted on both sides of the body collects information on the tilt angle of the body; second, a plurality of speed sensors mounted at the bottom of the body collects information on the velocity of the body relative to the ground. A microcontroller processes the information collected from the sensors and is configured to set the path and to balance the body automatically, while a processor operatively coupled to the microcontroller computes changes in the tilt and velocity of the body. A direct current motor operatively coupled to the microcontroller controls the wheels, and a remote control operatively coupled to the microcontroller operates the robot in the two navigation modes.

Keywords: Two-Wheeled Balancing Robot, Dual Mode Navigation, Remote Control, Desired Path.

4753 An Analysis of the Optimization Condition of Plasma Generator for Air Conditioner System

Authors: Arunrungrusmi S., Chaokamnerd W., Tanitteerapan T., Mungkung N., Yuji T.

Abstract:

This research aimed to develop a plasma system for use in air conditioners. The developed plasma system can be installed in split-type air conditioners of all kinds, improving air quality to a level equal to that of existing plasma systems. The development process was as follows: 1) study the plasma systems used in air conditioners, 2) design a plasma generator, 3) develop the plasma generator, and 4) test its performance in several types of air conditioners. The plasma system was driven by an AC high voltage of 14 kV at a frequency of 50 kHz, with carbon used as the conductor to generate the arc in the air purifier system. The plasma generator was tested by installing it in wall-type air conditioners in three positions: air flow out, air flow in, and room center. The results for the plasma generator installed in split-type air conditioners revealed that the air-flow-out installation provided the highest average ozone output at 223 mg/h and therefore the highest efficiency of air quality improvement. The air-flow-in installation and the room-center installation provided average ozone outputs of 163 mg/h and 64 mg/h, respectively.

Keywords: Air Conditioner, Plasma generator, High voltage, Optimization, Installation position.

4752 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are big challenges for all types of media, especially social media. As big social networks such as Facebook and Twitter have admitted, there is a lot of false information, fake likes and views, and many duplicated accounts. Much of the information appearing on social media is doubtful and in some cases misleading, and it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to obtain better detection of false information with less computation time and complexity, the dimensions need to be reduced. One of the best techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed a better classification of false information for our work. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
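A minimal sketch of the four-step idea, assuming scikit-learn and a numeric feature matrix: similar features are grouped with K-means, one representative per cluster is kept, and an SVM classifies on the reduced set. The synthetic data, cluster count and correlation-based representative rule are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def cluster_based_feature_selection(X, y, n_clusters=10):
    """Group similar features with K-means, then keep one representative
    feature per cluster (here: the one most correlated with the label)."""
    feature_vectors = X.T  # each feature becomes one clustering sample
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(feature_vectors)
    selected = []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        if len(members) == 0:
            continue
        corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in members]
        selected.append(int(members[int(np.argmax(corrs))]))
    return sorted(selected)

# Toy data standing in for a numeric fake-news feature matrix (hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 60))
y = (X[:, 0] + 0.5 * X[:, 7] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
keep = cluster_based_feature_selection(X_train, y_train, n_clusters=10)

clf = SVC(kernel="rbf", C=1.0).fit(X_train[:, keep], y_train)
print("selected features:", keep)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test[:, keep])))
```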

Keywords: Fake news detection, feature selection, support vector machine, K-means clustering, machine learning, social media.

4751 Amplitude and Phase Analysis of EEG Signal by Complex Demodulation

Authors: Sun K. Yoo, Hee Cheol Kang

Abstract:

Analysis of the amplitude and phase characteristics of the delta, theta, and alpha bands at localized time instants in EEG signals is important for characterizing information processing in the brain. In this paper, the complex demodulation method was used to analyze EEG (electroencephalographic) signals, particularly the auditory evoked potential response signal, with sufficient time resolution and the designated frequency bandwidth resolution required. Complex demodulation decomposes the raw EEG signal into the three designated delta, theta, and alpha bands, giving a complex EEG signal representation at each sampled time instant, which enables the extraction of the amplitude envelope and phase information. Using simulated test data and real EEG signals acquired during an auditory attention task, the method can extract the phase offset, the instants at which phase and frequency change, and the decomposed amplitude envelope for the delta, theta, and alpha bands. The complex demodulation technique can be used efficiently in brain signal analysis whenever phase and amplitude information is required.
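A minimal sketch of complex demodulation for one band, assuming NumPy/SciPy: the signal is frequency-shifted so the band of interest sits at 0 Hz, low-pass filtered, and the amplitude envelope and phase are read off the resulting complex signal. The synthetic 10 Hz test signal and filter settings are illustrative, not the paper's recordings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def complex_demodulate(x, fs, f_center, bandwidth, order=4):
    """Shift the band around f_center to 0 Hz and low-pass filter; abs() of the
    result is the amplitude envelope, angle() the phase relative to f_center."""
    t = np.arange(len(x)) / fs
    shifted = x * np.exp(-2j * np.pi * f_center * t)          # frequency shift
    b, a = butter(order, (bandwidth / 2) / (fs / 2))           # low-pass at half-bandwidth
    low = filtfilt(b, a, shifted.real) + 1j * filtfilt(b, a, shifted.imag)
    return 2 * low                                             # restore amplitude scale

# Synthetic 10 Hz "alpha" burst with a slowly drifting phase (illustrative).
fs = 256.0
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t + 0.5 * np.sin(2 * np.pi * 0.5 * t)) + 0.3 * np.random.randn(len(t))

alpha = complex_demodulate(eeg, fs, f_center=10.0, bandwidth=4.0)   # roughly the 8-12 Hz band
envelope = np.abs(alpha)
phase = np.unwrap(np.angle(alpha))
print(envelope[:5], phase[:5])
```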

Keywords: EEG, Complex Demodulation, Amplitude, Phase.

4750 An Authentication Protocol for Quantum Enabled Mobile Devices

Authors: Natarajan Venkatachalam, Subrahmanya V. R. K. Rao, Vijay Karthikeyan Dhandapani, Swaminathan Saravanavel

Abstract:

Quantum communication technology is an evolving field which connects multiple quantum-enabled devices to the internet for secret communication or sensitive information exchange. In the future, the number of these compact quantum-enabled devices will increase immensely, making them an integral part of communication systems. Therefore, the safety and security of such devices is also a major concern. To ensure that customers' sensitive information cannot be eavesdropped on or deciphered, we need strong authentication and encryption mechanisms. In this paper, we propose a mutual authentication scheme between these smart quantum devices and a server, based on the secure exchange of information through a quantum channel, which gives better solutions for symmetric key exchange issues. An important part of this work is to propose a secure mutual authentication protocol over the quantum channel. We show that our approach offers a robust authentication protocol and, further, that our solution is lightweight, scalable and cost-effective with optimized computational processing overheads.

Keywords: Quantum cryptography, quantum key distribution, wireless quantum communication, authentication protocol, quantum enabled device, trusted third party.

4749 Classifying Bio-Chip Data using an Ant Colony System Algorithm

Authors: Minsoo Lee, Yearn Jeong Kim, Yun-mi Kim, Sujeung Cheong, Sookyung Song

Abstract:

Bio-chips are used for experiments on genes and contain various kinds of information such as genes, samples and so on. Two-dimensional bio-chips, in which one axis represents genes and the other represents samples, are widely used these days. Instead of experimenting with real genes, which costs a lot of money and takes much time to produce results, bio-chips are used for biological experiments, and extracting data from bio-chips with high accuracy and finding patterns or useful information in such data is very important. Bio-chip analysis systems extract data from various kinds of bio-chips and mine the data in order to obtain useful information. One of the commonly used methods to mine the data is classification. The algorithm used to classify the data can vary depending on the data types, numeric characteristics and so on. Considering that bio-chip data are extremely large, an algorithm that imitates natural systems, such as the ant colony algorithm, is suitable for classification. This paper focuses on finding classification rules from bio-chip data using the Ant Colony System algorithm. The developed system takes into consideration the accuracy of the discovered rules when applying them to the bio-chip data in order to predict the classes.
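A deliberately simplified, Ant-Miner-style sketch of pheromone-guided rule discovery (not the authors' exact Ant Colony System): each ant samples candidate (gene, level) terms in proportion to pheromone, rule quality is sensitivity × specificity, and good rules reinforce their terms. The toy discretized matrix is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy discretized "bio-chip" matrix: rows = samples, columns = genes with
# expression levels 0/1/2; the class depends mainly on gene 0 (illustrative).
X = rng.integers(0, 3, size=(200, 5))
y = (X[:, 0] == 2).astype(int)

terms = [(g, v) for g in range(X.shape[1]) for v in range(3)]   # (gene, level) terms
tau = np.ones(len(terms))                                        # pheromone per term

def rule_quality(rule, X, y):
    """Quality of 'IF all terms hold THEN class 1' = sensitivity * specificity."""
    covered = np.all([X[:, g] == v for g, v in rule], axis=0)
    tp = np.sum(covered & (y == 1)); fp = np.sum(covered & (y == 0))
    fn = np.sum(~covered & (y == 1)); tn = np.sum(~covered & (y == 0))
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens * spec

best_rule, best_q = None, -1.0
for ant in range(100):
    probs = tau / tau.sum()
    idx = rng.choice(len(terms), size=2, replace=False, p=probs)  # ant builds a 2-term rule
    rule = [terms[i] for i in idx]
    q = rule_quality(rule, X, y)
    tau[idx] += q                      # reinforce terms of good rules
    tau *= 0.99                        # evaporation
    if q > best_q:
        best_rule, best_q = rule, q

print("best rule terms (gene, level):", best_rule, "quality:", round(best_q, 3))
```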

Keywords: Ant Colony System, DNA chip data, Classification.

4748 Analysis of DNA from Fired Cartridge Casings

Authors: S. Mawlood, L. Dennany, N. Watson, B. Pickard

Abstract:

DNA analysis has been widely accepted as providing valuable evidence concerning the identity of the source of biological traces. Our work has shown that DNA samples can survive on cartridges even after firing. The study also raised the possibility of determining other information, such as the age of the donor. Such information may be invaluable in certain cases where spent cartridges from automatic weapons are left behind at the scene of a crime. In spite of the nature of touch evidence and the exposure to high chamber temperatures during shooting, we were still able to retrieve enough DNA for profile typing. In order to estimate the age of the contributor, DNA methylation levels in the retrieved DNA were analyzed using the EpiTect system. However, the results were not conclusive due to the low amount of input DNA.

Keywords: Age prediction, Fired cartridge, Trace DNA sample.

4747 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain

Authors: M. Pushparani, A. Sagaya

Abstract:

Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare and transportation markets, as the emphasis is on intelligent devices. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. This algorithm will be used to estimate the weight at the site, which will be compared with the actual weight at the plantation, and to raise the necessary alerts when there is a discrepancy in the weight, thus enabling better decision making. In current practice, data are collected from various locations in various forms, and it is a challenge to consolidate the data to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections make it difficult to obtain timely, accurate information. To overcome these challenges, the algorithm is embedded on a portable device that assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, thus enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has a high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate information and uncompromised data using an embedded system and will provide alerts, therefore enabling effective operational management decision-making at the site.
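A minimal sketch of the weight-comparison check at the heart of EWCA-BI: the estimated site weight is compared with the weighbridge reading and an alert is raised when the discrepancy exceeds a tolerance. Field names, the tolerance and the alert format are illustrative assumptions, not the framework's specification.

```python
from dataclasses import dataclass

@dataclass
class WeightRecord:
    trip_id: str
    estimated_kg: float   # weight estimated at the collection site
    actual_kg: float      # weight recorded at the plantation weighbridge

def check_discrepancy(record: WeightRecord, tolerance_pct: float = 2.0):
    """Return an alert dict when estimate and weighbridge reading diverge by
    more than the allowed percentage, otherwise None."""
    diff = abs(record.actual_kg - record.estimated_kg)
    allowed = record.actual_kg * tolerance_pct / 100.0
    if diff > allowed:
        return {
            "trip_id": record.trip_id,
            "difference_kg": round(diff, 2),
            "allowed_kg": round(allowed, 2),
            "message": "Weight discrepancy exceeds tolerance - review required",
        }
    return None

records = [
    WeightRecord("T-001", estimated_kg=1020.0, actual_kg=1015.0),
    WeightRecord("T-002", estimated_kg=980.0, actual_kg=1100.0),
]
alerts = [a for a in (check_discrepancy(r) for r in records) if a]
print(alerts)   # only T-002 triggers an alert
```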

Keywords: Embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems.

4746 IS Flexibility Planning for IT/Business Strategy Alignment via Future Oriented POC Analysis

Authors: Masaru Furukawa, Shigeki Hirobayashi, Tadanobu Misawa

Abstract:

Nowadays, IT/business strategy alignment is still a key topic of concern among managers worldwide. Change has always been considered the primary challenge affecting strategy alignment. Planning for alignment in uncertain and dynamically changing environments is burdened with risk as organizations seek to understand how much flexibility to build into their management information systems so as to maintain high levels of alignment. The literature review showed that there is a tight relationship between IT infrastructure flexibility and strategy alignment, with strategic information systems (SIS) planning serving as a moderator of this relationship, and it emphasized the need for organizations to use SIS planning consistently and to monitor the relationship between IS flexibility and alignment. This paper presents a procedure for SIS planning with IS flexibility renovation via future-oriented analysis of the penalty of change (POC) as a function of cost and time. Using this SIS planning, and monitoring IS flexibility and alignment during periods of increased change in dynamic and uncertain environments, reduces the risk that could transform IT into an inhibitor rather than an enabler of change.

Keywords: IT/Business strategy alignment, strategic information systems (SIS) planning, IS flexibility, penalty of change (POC).

4745 Ranking of Inventory Policies Using Distance Based Approach Method

Authors: Gupta Amit, Kumar Ramesh, Tewari P. C.

Abstract:

Globalization is putting enormous pressure on business organizations, especially manufacturing ones, to rethink their supply chains in innovative ways. Inventory consumes a major portion of total sales revenue, so effective and efficient inventory management plays a vital role in the successful functioning of any organization, and the selection of an inventory policy is one of the important purchasing activities. This paper focuses on the selection and ranking of alternative inventory policies. A deterministic quantitative model based on the Distance Based Approach (DBA) method has been developed for the evaluation and ranking of inventory policies; to our knowledge, this concept is employed for the first time for this type of selection problem. Four inventory policies are considered: economic order quantity (EOQ), just in time (JIT), vendor managed inventory (VMI) and a monthly policy. Improper selection could affect a company’s competitiveness in terms of the productivity of its facilities and the quality of its products. The ranking of inventory policies is a multi-criteria problem: the selection criteria must first be identified, and the information is then processed with reference to the relative importance of the attributes for comparison. Criteria values for each inventory policy can be obtained analytically, by using a simulation technique, or as subjective linguistic judgments defined by fuzzy sets. A methodology is developed and applied to rank the inventory policies.
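A minimal sketch of a distance-based ranking, under illustrative assumptions: criteria values are normalized, cost criteria are inverted, and each policy is ranked by its weighted Euclidean distance from a hypothetical optimal point. The scores and weights below are not the paper's data.

```python
import numpy as np

policies = ["EOQ", "JIT", "VMI", "Monthly"]
criteria = ["holding cost", "stockout risk", "service level"]   # first two: lower is better
benefit = np.array([False, False, True])                        # True = higher is better

scores = np.array([
    [0.40, 0.30, 0.85],   # EOQ
    [0.20, 0.50, 0.90],   # JIT
    [0.25, 0.25, 0.88],   # VMI
    [0.55, 0.35, 0.80],   # Monthly
])
weights = np.array([0.4, 0.35, 0.25])

# Normalize each criterion to [0, 1] and flip cost criteria so 1 is always best.
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

# Distance of each policy from the hypothetical optimal point (all ones).
distance = np.sqrt(((weights * (1.0 - norm)) ** 2).sum(axis=1))
for rank, i in enumerate(np.argsort(distance), start=1):
    print(rank, policies[i], round(distance[i], 3))
```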

Keywords: Inventory Policy, Ranking, DBA, Selection criteria.

4744 General Awareness of Teenagers in Information Security

Authors: Magdalena Naplavova, Tomas Ludik, Petr Hruza, Frantisek Bozek

Abstract:

The use of IT equipment has become part of everyday life. However, each device that is part of cyberspace should be secured against unauthorized use, and it is very important to know the basics of securing these devices as well as the basics of safe conduct for their owners. This information should be part of every computer science curriculum in primary and secondary schools. Therefore, this work focuses on educating pupils in primary and secondary schools about the Internet. The analysis of the current state describes approaches to educating pupils about security issues on the Internet. The paper presents a questionnaire-based survey carried out in the Czech Republic, whose task was to ascertain the opinions of pupils in primary and secondary schools on the issue of communication in social networks. The research showed that awareness of socio-pathological phenomena in the Internet environment is very low. Based on the results, appropriate ways of teaching this issue were proposed, together with a proposal for its inclusion in the curriculum of primary and secondary schools.

Keywords: Cyberspace, educational system, general awareness, information security, questionnaire, socio-pathological phenomena.

4743 GIS-based Non-point Sources of Pollution Simulation in Cameron Highlands, Malaysia

Authors: M. Eisakhani, A. Pauzi, O. Karim, A. Malakahmad, S.R. Mohamed Kutty, M. H. Isa

Abstract:

Cameron Highlands is a mountainous area subjected to torrential tropical showers. It extracts 5.8 million liters of water per day for drinking supply from its rivers at several intake points. The water quality of the rivers in Cameron Highlands, however, has deteriorated significantly due to land clearing for agriculture, excessive use of pesticides and fertilizers, and construction activities in rapidly developing urban areas. These pollution sources, known as non-point pollution sources, are diverse and hard to identify, and they are therefore difficult to estimate. Hence, Geographical Information Systems (GIS) were used to provide an extensive approach to evaluate land use and other mapping characteristics and to explain the spatial distribution of non-point sources of contamination in Cameron Highlands. The method to assess pollution sources was developed using the Cameron Highlands Master Plan (2006-2010), integrating GIS, databases, and pollution loads in the study area. The results show that the highest annual runoff is created by forest, 3.56 × 10⁸ m³/yr, followed by urban development, 1.46 × 10⁸ m³/yr. Furthermore, urban development causes the highest annual BOD load (1.31 × 10⁶ kg BOD/yr), while agricultural activities and forest contribute the highest annual loads of phosphorus (6.91 × 10⁴ kg P/yr) and nitrogen (2.50 × 10⁵ kg N/yr), respectively. Therefore, best management practices (BMPs) are suggested to be applied to reduce the pollution level in the area.
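A minimal sketch of a per-land-use (export coefficient) load estimate of the kind commonly paired with GIS land-use layers; the areas and coefficients are illustrative assumptions, not the values derived for Cameron Highlands.

```python
# Hypothetical land-use areas taken from a GIS layer (km2).
land_use_km2 = {"forest": 450.0, "agriculture": 80.0, "urban": 25.0}

# Hypothetical export coefficients: annual load generated per km2 of each land use.
bod_kg_per_km2 = {"forest": 300.0, "agriculture": 2500.0, "urban": 9000.0}
tn_kg_per_km2 = {"forest": 400.0, "agriculture": 2200.0, "urban": 1200.0}

def annual_load(areas, coefficients):
    """Annual load (kg/yr) per land use = area x export coefficient."""
    return {lu: areas[lu] * coefficients[lu] for lu in areas}

bod = annual_load(land_use_km2, bod_kg_per_km2)
tn = annual_load(land_use_km2, tn_kg_per_km2)
print("BOD kg/yr by land use:", bod, "total:", sum(bod.values()))
print("N   kg/yr by land use:", tn, "total:", sum(tn.values()))
```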

Keywords: Cameron Highlands, Land use, Non-point Sources of Pollution

4742 Theoretical Exploration for the Impact of Accounting for Special Methods in Connectivity-Based Cohesion Measurement

Authors: Jehad Al Dallal

Abstract:

Class cohesion is a key object-oriented software quality attribute that is used to evaluate the degree of relatedness of class attributes and methods. Researchers have proposed several class cohesion measures. However, the effect of considering the special methods (i.e., constructors, destructors, and access and delegation methods) in cohesion calculation is not thoroughly theoretically studied for most of them. In this paper, we address this issue for three popular connectivity-based class cohesion measures. For each of the considered measures we theoretically study the impact of including or excluding special methods on the values that are obtained by applying the measure. This study is based on analyzing the definitions and formulas that are proposed for the measures. The results show that including/excluding special methods has a considerable effect on the obtained cohesion values and that this effect varies from one measure to another. For each of the three connectivity-based measures, the proposed theoretical study recommended excluding the special methods in cohesion measurement.
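A minimal sketch of how including or excluding special methods changes a connectivity-based cohesion value, using a generic TCC-style ratio of directly connected method pairs rather than the three specific measures analyzed in the paper; the example class model is illustrative.

```python
from itertools import combinations

# Each method is mapped to the set of class attributes it references.
methods = {
    "__init__": {"a", "b"},        # constructor (special method)
    "get_a":    {"a"},             # access method (special method)
    "process":  {"a", "c"},
    "report":   {"c"},
    "reset":    {"b"},
}
special = {"__init__", "get_a"}

def connectivity_cohesion(methods, exclude_special=True):
    """Fraction of method pairs that are directly connected (share an attribute)."""
    names = [m for m in methods if not (exclude_special and m in special)]
    pairs = list(combinations(names, 2))
    if not pairs:
        return 0.0
    connected = sum(1 for m1, m2 in pairs if methods[m1] & methods[m2])
    return connected / len(pairs)

print("including special methods:", round(connectivity_cohesion(methods, False), 3))  # 0.5
print("excluding special methods:", round(connectivity_cohesion(methods, True), 3))   # 0.333
```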

Keywords: Object-oriented class, software quality, class cohesion measure, class cohesion, special methods.

4741 Bridging the Green-Value-Gap: A South African Approach

Authors: E.J. Cilliers

Abstract:

Green spaces might be very attractive, but where are the economic benefits? What value do nature and landscape have for us? What difference will they make to jobs, health and the economic strength of areas struggling with deprivation and social problems? [1] There is a need to consider green spaces from a different perspective: green planning is not just about flora and fauna, but also about planning for economic benefits [2]. It is worth trying to quantify the value of green spaces, since nature and landscape are crucially important to our quality of life and to sustainable development. The reality, however, is that urban development often takes place at the expense of green spaces. Urbanization is an ongoing process throughout the world; however, hyper-urbanization without environmental planning is destructive, not constructive [3]. Urban spaces are believed to be more valuable than other land uses, particularly green areas, simply because of the market value connected to urban spaces. However, attractive landscapes can help raise the quality and value of the urban market even further. In order to reach these objectives of integrated planning, the Green-Value-Gap needs to be bridged: economists have to understand the concept of green planning and its spin-offs, and environmentalists have to understand the importance of urban economic development and its benefits to green planning. An interface between environmental management, economic development and sustainable spatial planning is needed to bridge the Green-Value-Gap.

Keywords: Spatial Planning, Environmental Management, Green-Value-Gap, Compensation, Participation.

4740 A Holistic Conceptual Measurement Framework for Assessing the Effectiveness and Viability of an Academic Program

Authors: Munir Majdalawieh, Adam Marks

Abstract:

In today’s very competitive higher education industry, higher education institutions (HEIs) are faced with the primary concern of developing, deploying, and sustaining high-quality academic programs. Today, HEIs have well-established accreditation systems endorsed by a country’s legislation and institutions. The accreditation system is an educational pathway focused on the criteria and processes for evaluating educational programs. Although many aspects of the accreditation process highlight both the past and the present (prove), the “program review” assessment is a forward-looking assessment (improve) and thus transforms the process into a continuing assessment activity rather than a periodic event. The purpose of this study is to propose a conceptual measurement framework for program review to be used by HEIs to undertake a robust and targeted approach to proactively and continuously review their academic programs, in order to evaluate their practicality and effectiveness as well as to improve the education of the students. The proposed framework consists of two main components: program review principles and the program review measurement matrix.

Keywords: Academic program, program review principles, curriculum development, accreditation, evaluation, assessment, review measurement matrix, program review process, information technologies supporting learning, learning/teaching methodologies and assessment.

4739 Adaptive Educational Hypermedia System for High School Students Based on Learning Styles

Authors: Stephen Akuma, Timothy Ndera

Abstract:

Information seekers get “lost in hyperspace” due to the voluminous documents updated daily on the internet. Adaptive Hypermedia Systems (AHS) are used to direct learners to their target goals. One of the most common AHS designed to help information seekers overcome the problem of information overload is the Adaptive Educational Hypermedia System (AEHS). This paper focuses on an AEHS that adopts the learning preferences of high school students and delivers learning content according to these preferences throughout their learning experience. The research developed a prototype system that predicts students’ learning preference using the Visual, Aural, Read-Write and Kinesthetic (VARK) learning style model and adapts the learning content to that preference. The predictive strength of several classifiers was compared, and Support Vector Machine (SVM) was found to be the most accurate in predicting learning style based on users’ preferences.
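A minimal sketch of training an SVM to predict a VARK preference, assuming scikit-learn; the synthetic questionnaire scores and the label rule are illustrative stand-ins for the study's real student data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
labels = ["Visual", "Aural", "Read-Write", "Kinesthetic"]

# Each student: four questionnaire sub-scores; the dominant dimension defines the label.
X = rng.uniform(0, 10, size=(300, 4))
y = X.argmax(axis=1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=2.0).fit(X_train, y_train)

print("accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 3))
print("predicted style for [9, 2, 3, 1]:", labels[int(clf.predict([[9, 2, 3, 1]])[0])])
```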

Keywords: Hypermedia, adaptive education, adaptive hypermedia, learning style, lesson content, user profile, prediction, feedback.

4738 Providing Medical Information in Braille: Research and Development of Automatic Braille Translation Program for Japanese “eBraille”

Authors: Aki Sugano, Mika Ohta, Mineko Ikegami, Kenji Miura, Sayo Tsukamoto, Akihiro Ichinose, Toshiko Ohshima, Eiichi Maeda, Masako Matsuura, Yutaka Takao

Abstract:

Along with advances in medicine, providing medical information to individual patients is becoming more important. In Japan, such information is hardly ever provided in braille to blind and partially sighted people. We are therefore researching and developing a Web-based automatic translation program, “eBraille”, to translate Japanese text into Japanese braille. First, we analyzed the Japanese braille transcription rules in order to implement them in our program. We then added medical words to the dictionary of the program to improve its translation accuracy for medical text. Finally, we examined the efficacy of statistical learning models (SLMs) for a further increase in word segmentation accuracy in braille translation. As a result, eBraille had the highest translation accuracy in comparison with other translation programs, improved its accuracy for medical text, and is used to produce hospital brochures in braille for outpatients and inpatients.

Keywords: Automatic Braille translation, Medical text, Partially sighted people.

4737 A Classification Scheme for Game Input and Output

Authors: P. Prema, B. Ramadoss

Abstract:

The computer game industry has experienced exponential growth in recent years. A game is a recreational activity involving one or more players. Game input is information, such as data and commands, which is passed to the game system at run time from an external source. Conversely, game output is information which is generated by the game system and passed to an external target, but which is not used internally by the game. This paper identifies a new classification scheme for game input and output which is based on the player's input and output. Using this scheme, a relationship table for the game input classifier and output classifier is developed.

Keywords: Game Classification, Game Input, Game Output, Game Testing.

4736 Novel NMR-Technology to Assess Food Quality and Safety

Authors: Markus Link, Manfred Spraul, Hartmut Schaefer, Fang Fang, Birk Schuetz

Abstract:

High-resolution NMR spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis.

The objective is to demonstrate that, due to its extreme reproducibility, NMR can detect the smallest changes in the concentrations of many components in a mixture; this is best monitored by statistical evaluation, but the method also delivers reliable quantification results.

The methodology typically uses a 400 MHz high-resolution instrument under full automation after minimized sample preparation.

For example, one fruit juice analysis in a push-button operation takes at most 15 minutes and delivers a multitude of results, which are automatically summarized in a PDF report.

The method has been proven on fruit juices, where previously unknown frauds could be detected, and conventional targeted parameters are obtained in the same analysis. This technology has the advantage that NMR is completely quantitative and concentration calibration only has to be done once for all compounds. Since NMR is so reproducible, it is also transferable between different instruments (of the same field strength) and laboratories. Based on strict SOPs, statistical models developed once can be used on multiple instruments, and strategies for compound identification and quantification are applicable across labs as well.

Keywords: Automated solution, NMR, non-targeted screening, targeted screening.

4735 Mango (Mangifera indica L.) Lyophilization Using Vacuum-Induced Freezing

Authors: Natalia A. Salazar, Erika K. Méndez, Catalina Álvarez, Carlos E. Orrego

Abstract:

Lyophilization, also called freeze-drying, is an important dehydration technique mainly used for pharmaceuticals. The food industry also uses lyophilization when it is important to retain most of the nutritional quality, taste, shape and size of dried products and to extend their shelf life. Vacuum induction during the freezing cycle (VI) has been used to control ice nucleation and, consequently, to reduce the time of the primary drying cycle of pharmaceuticals while preserving the quality properties of the final product. This procedure has not previously been applied in the freeze-drying of foods. The present work aims to investigate the effect of VI on the lyophilization drying time, final moisture content, density and reconstitutional properties of mango (Mangifera indica L.) slices (MS) and mango pulp-maltodextrin dispersions (MPM) (30% concentration of total solids). Control samples were run at each freezing rate without induced vacuum. The lyophilization endpoint was the same for all treatments (a constant difference between the capacitance and Pirani vacuum gauges). From the experimental results it can be concluded that VI at the high freezing rate (0.4°C/min) reduced the overall process time by up to 30% compared with the process time required for the control and for VI at the lower freezing rate (0.1°C/min), without affecting the quality characteristics of the dried product, which yields a reduction in costs and energy consumption for MS and MPM freeze-drying. Controls and samples treated with VI at a freezing rate of 0.4°C/min showed similar moisture and density values for MS. Furthermore, the results for the MPM dispersion were favorable when VI was applied, because a dried product with low moisture content and low density was obtained in a shorter process time compared with the control. No significant differences were found between the reconstitutional properties (rehydration for MS and solubility for MPM) of freeze-dried mango resulting from the controls and the VI treatments.

Keywords: Drying time, lyophilization, mango, vacuum induced freezing.

4734 Investigating the Dynamic Response of the Ballast

Authors: Osama Brinji, Wing Kong Chiu, Graham Tew

Abstract:

Understanding the stability of rail ballast is one of the most important aspects of railway engineering. An unstable track may cause issues such as unnecessary vibration and, ultimately, loss of track quality. The track foundation plays an important role in the stabilization of the railway. The dynamic response of rail ballast in the vicinity of the sleeper can affect the stability of the rail track, and this has not been studied in detail. A review of the literature showed that most works have focused on the area under the concrete sleeper. Although there are some theories about the shear (longitudinal) effect of the rail ballast, these have not been properly studied and hence are not well understood. The stability of a rail track will depend on the compactness of the ballast in its vicinity. This paper tries to determine the dynamic response of the ballast in order to identify its resonant behaviour. This preliminary research is one of several studies that examine the vibration response of granular materials, with the main aim of using this information in the future design of sleepers to ensure that any dynamic response of the sleeper will not compromise the state of compactness of the ballast. The paper reports on the dependence of the damping and natural frequency of the ballast as a function of depth and distance from the point of excitation, which is introduced through a concrete block. The concrete block is used to simulate a sleeper and the ballast is simulated with gravel. In spite of these approximations, the results presented in the paper show agreement with the theories and assumptions used in studying the mechanical behaviour of rail ballast.

Keywords: Ballast, dynamic response, sleeper, stability.

4733 Hi-Fi Traffic Clearance Technique for Life Saving Vehicles using Differential GPS System

Authors: N. Yuvaraj, V. B. Prakash, D. Venkatraj

Abstract:

This paper may be considered a combination of pervasive computing and Differential GPS (Global Positioning System), and relates to controlling automatic traffic signals in such a way as to pre-empt normal signal operation and permit lifesaving vehicles to pass. Because the signal knows about the arrival of a lifesaving vehicle in advance, there is a chance to clear the traffic before it arrives. The traffic signal preemption system includes a vehicle equipped with an onboard computer system capable of capturing diagnostic information and the estimated location of the lifesaving vehicle, using the information provided by a GPS receiver connected to the onboard computer system, and transmitting this information using a wireless transmitter via a wireless network. A fleet management system connected to a wireless receiver receives the information transmitted by the lifesaving vehicle. A computer located at the intersection uses the corrected vehicle position, speed and direction measurements, in conjunction with previously recorded data defining the approach routes to the intersection, to determine the optimum time to switch a traffic light controller to preemption mode so that lifesaving vehicles can pass safely. For the case when the ambulance needs to take a U-turn in a heavy traffic area, we suggest a solution: a computerized median built from removable linked blocks.
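A minimal sketch of the preemption timing decision: the corrected GPS fix, speed and distance to the intersection give an estimated arrival time, and the controller switches to preemption mode once that falls inside a clearance margin. The coordinates, speed and margin are illustrative assumptions, not values from the paper.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_preempt(vehicle_fix, intersection, speed_mps, clearance_s=25.0):
    """True when the predicted arrival time is within the clearance margin."""
    dist = haversine_m(*vehicle_fix, *intersection)
    eta_s = dist / max(speed_mps, 0.1)
    return eta_s <= clearance_s, round(eta_s, 1)

ambulance = (13.0405, 80.2337)       # corrected DGPS fix (illustrative)
junction = (13.0432, 80.2401)
preempt, eta = should_preempt(ambulance, junction, speed_mps=14.0)
print(f"ETA {eta} s -> switch to preemption mode: {preempt}")
```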

Keywords: Ubiquitous computing, differential GPS, fleet management system, wireless transmitter and receiver, computerized median, i.e., linked blocks (removable).

4732 Perceptual JPEG Compliant Coding by Using DCT-Based Visibility Thresholds of Color Images

Authors: Kuo-Cheng Liu

Abstract:

Effective estimation of the just noticeable distortion (JND) for images helps increase the efficiency of a compression algorithm in which both the statistical redundancy and the perceptual redundancy should be accurately removed. In this paper, we design a DCT-based model for estimating the JND profiles of color images. Based on a mathematical model for measuring the base detection threshold of each DCT coefficient in each color component, the luminance masking adjustment, the contrast masking adjustment, and the cross masking adjustment are utilized for the luminance component, and a variance-based masking adjustment based on the coefficient variation in the block is proposed for the chrominance components. In order to verify the proposed model, the JND estimator is incorporated into the conventional JPEG coder to improve the compression performance. A subjective and fair viewing test is designed to evaluate the visual quality of the coded images under the specified viewing condition. The simulation results show that the JPEG coder integrated with the proposed DCT-based JND model gives better coding bit rates at visually lossless quality for a variety of color images.
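A minimal sketch of applying a JND profile inside a JPEG-like coder, assuming SciPy: DCT coefficients whose magnitude falls below a per-frequency visibility threshold are discarded before quantization. The simple threshold model and the 8×8 test block are illustrative; the paper derives per-coefficient thresholds from luminance, contrast, cross and variance-based masking.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float) - 128.0   # level-shifted 8x8 block

coeffs = dctn(block, norm="ortho")

# Hypothetical JND thresholds: higher frequencies tolerate larger distortion.
u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
jnd = 2.0 + 0.8 * (u + v)

# Discard coefficients the model deems invisible, then reconstruct.
perceptually_pruned = np.where(np.abs(coeffs) < jnd, 0.0, coeffs)
reconstructed = idctn(perceptually_pruned, norm="ortho") + 128.0

kept = int(np.count_nonzero(perceptually_pruned))
err = np.max(np.abs(reconstructed - 128.0 - block))
print(f"coefficients kept: {kept}/64, max pixel error: {err:.2f}")
```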

Keywords: Just-noticeable distortion (JND), discrete cosine transform (DCT), JPEG.

4731 Extended Deductive Databases with Uncertain Information

Authors: Daniel Stamate

Abstract:

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones - true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics as the well-founded semantics, and has a polynomial time data complexity.
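A minimal sketch of propagating degrees of truth through Datalog-style rules to a fixpoint; the [0, 1] truth scale and min/max combination are illustrative assumptions, not the multivalued semantics defined in the paper (which extends the well-founded semantics).

```python
# Base facts with a degree of truth attached (illustrative database).
facts = {"supplier_reliable": 0.8, "parts_in_stock": 0.6, "parts_certified": 0.9}

# Each rule: (head, [body atoms]); body combined with min, alternative rules with max.
rules = [
    ("order_fulfilled", ["supplier_reliable", "parts_in_stock"]),
    ("order_fulfilled", ["parts_certified", "parts_in_stock"]),
    ("customer_satisfied", ["order_fulfilled"]),
]

def fixpoint(facts, rules, rounds=10):
    """Repeatedly apply the rules until no truth degree increases."""
    values = dict(facts)
    for _ in range(rounds):
        changed = False
        for head, body in rules:
            degree = min(values.get(a, 0.0) for a in body)   # conjunction = min
            if degree > values.get(head, 0.0):               # max over alternative derivations
                values[head] = degree
                changed = True
        if not changed:
            break
    return values

print(fixpoint(facts, rules))
# order_fulfilled: max(min(0.8, 0.6), min(0.9, 0.6)) = 0.6; customer_satisfied: 0.6
```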

Keywords: Reasoning under uncertainty, multivalued logics, deductive databases, logic programs, multivalued semantics.

4730 Knowledge Management Strategies within a Corporate Environment

Authors: Daniel J. Glauber

Abstract:

Knowledge transfer between personnel could benefit an organization’s competitive advantage in the marketplace through a strategic approach to knowledge management. A lack of information sharing between personnel can create knowledge transfer gaps while restricting the decision-making processes; knowledge transfer between personnel can potentially improve information sharing based on an implemented knowledge management strategy. An organization’s capacity to gain more knowledge is aligned with the organization’s prior or existing captured knowledge. This case study attempted to understand the overall influence of a knowledge management strategy (KMS) within the corporate environment and the knowledge exchange between personnel. The significance of this study was to help understand how organizations can improve the return on investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected for this study. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal, working with management and executive members to develop a protocol whereby knowledge transfer becomes a standard practice in multiple tiers of the organization. The knowledge transfer process can be made measurable by focusing on specific elements of the organizational process, including personnel transitions, to help reduce the time required to understand a job. The organization studied in this research acknowledged the need for improved knowledge management activities within the organization to help organize, retain, and distribute information throughout the workforce. Data produced from the study indicate three main themes identified by the participants: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization's KMS, the organization's culture, knowledge sharing, and knowledge transfer.

Keywords: Knowledge management strategies, knowledge transfer, knowledge management, knowledge capacity.

4729 EAAC: Energy-Aware Admission Control Scheme for Ad Hoc Networks

Authors: Dilip Kumar S.M, Vijaya Kumar B.P.

Abstract:

The decisions made by admission control algorithms are based on the availability of network resources, viz. bandwidth, energy, memory buffers, etc., without degrading the Quality-of-Service (QoS) requirements of the applications that are already admitted. In this paper, we present an energy-aware admission control (EAAC) scheme which provides admission control for flows in an ad hoc network based on knowledge of the present and future residual energy of the intermediate nodes along the routing path. The aim of EAAC is to quantify the energy that the new flow will consume so that it can be decided whether the future residual energy of the nodes along the routing path can satisfy the energy requirement. In other words, this energy-aware routing admits a new flow if and only if no node in the routing path runs out of energy during the transmission of packets. The future residual energy of a node is predicted using a Multi-layer Neural Network (MNN) model. Simulation results show that the proposed scheme increases the network lifetime. The performance of the MNN model is also presented.
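A minimal sketch of the admission decision: a new flow is admitted only if the predicted future residual energy of every node on the routing path covers the energy the flow will consume there. The linear drain-rate predictor stands in for the paper's MNN model, and the per-packet cost and node values are illustrative assumptions.

```python
def predicted_residual_energy(node, duration_s):
    """Naive stand-in for the MNN predictor: extrapolate the recent drain rate."""
    return node["residual_j"] - node["drain_rate_j_per_s"] * duration_s

def admit_flow(path, packets, energy_per_packet_j, duration_s):
    """Admit the flow only if every node keeps enough energy for its share."""
    flow_cost = packets * energy_per_packet_j
    for node in path:
        if predicted_residual_energy(node, duration_s) < flow_cost:
            return False, node["id"]          # this node would run out of energy
    return True, None

path = [
    {"id": "n1", "residual_j": 40.0, "drain_rate_j_per_s": 0.05},
    {"id": "n2", "residual_j": 12.0, "drain_rate_j_per_s": 0.10},
    {"id": "n3", "residual_j": 55.0, "drain_rate_j_per_s": 0.02},
]
ok, bottleneck = admit_flow(path, packets=2000, energy_per_packet_j=0.004, duration_s=120)
print("admit flow:", ok, "| bottleneck node:", bottleneck)   # rejected at n2
```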

Keywords: Ad hoc networks, admission control, energy-aware routing, Quality-of-Service, future residual energy, neural network.
