Search results for: information and communication technologies.
3608 Automatic Authentication of Handwritten Documents via Low Density Pixel Measurements
Authors: Abhijit Mitra, Pranab Kumar Banerjee, C. Ardil
Abstract:
We introduce an effective approach for automatic offline authentication of handwritten samples in which the forgeries are skillfully done, i.e., the true and forged samples appear almost alike. The subtle temporal details used in online verification are not available offline and are also hard to recover robustly. We therefore consider spatial dynamic information such as pen-tip pressure characteristics, with emphasis on the extraction of low density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful it may be, always lacks. Ten effective features, including these low density points and the density ratio, are proposed to distinguish a true sample from a forgery. An adaptive decision criterion is also derived for better verification judgements.
Keywords: Handwritten document verification, Skilled forgeries, Low density pixels, Adaptive decision boundary.
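The extraction of low density pixels named in the abstract can be made concrete with a small sketch. A minimal illustration, assuming a normalised grayscale scan (0 = dark ink, 1 = paper) and purely illustrative thresholds; the function and parameter names are ours, not the paper's:

```python
import numpy as np

def low_density_features(gray, ink_thresh=0.6, faint_thresh=0.3):
    # gray: 2-D array of intensities in [0, 1]; 0 = dark ink, 1 = paper.
    ink = gray < ink_thresh                  # every stroke pixel
    faint = ink & (gray > faint_thresh)      # light-pressure (low density) pixels
    n_ink, n_faint = int(ink.sum()), int(faint.sum())
    density_ratio = n_faint / n_ink if n_ink else 0.0
    return n_faint, density_ratio

# Synthetic patch with mixed stroke pressures.
rng = np.random.default_rng(0)
img = np.ones((64, 64))
img[20:40, 10:50] = rng.uniform(0.1, 0.5, size=(20, 40))
print(low_density_features(img))
```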
3607 The Visualizer for Real-Time Analysis of Internet Trends
Authors: Radek Malinský, Ivan Jelínek
Abstract:
The current web has become a modern encyclopedia, where people share their thoughts and ideas on various topics around them. This kind of encyclopedia is very useful for other people who are looking for answers to their questions. However, with the growing popularity of social networking and blogging, and with ever expanding network services, there has also been a growing diversity of technologies, along with differing structures of individual web sites. It is therefore difficult for a common Internet user to find a relevant answer directly. This paper presents a web application for the real-time end-to-end analysis of selected Internet trends, where a trend can be whatever people post online. The application integrates fully configurable tools for data collection and analysis using selected webometric algorithms, and for their chronological visualization to the user. It can be assumed that the application helps users evaluate the quality of various products that are mentioned online.
Keywords: Trend, visualizer, web analysis, web 2.0.
3606 The System Architecture of the Open European Nephrology Science Centre
Authors: G. Lindemann, D. Schmidt, T. Schrader, M. Beil, T. Schaaf, H.-D. Burkhard
Abstract:
The amount and heterogeneity of data in biomedical research, notably in interdisciplinary research, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but are spread across distributed resources. The Charité Medical School in Berlin, together with the German Research Foundation (DFG), has established a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). The system is based on a service-oriented architecture (SOA) with main and auxiliary modules arranged in four layers. To improve the reuse and efficient arrangement of the services, the functionalities are described as business processes using the standardised Business Process Execution Language (BPEL).
Keywords: Software development management, Business data processing, Knowledge based systems in medicine
3605 High Resolution Images: Segmenting, Extracting Information and GIS Integration
Authors: Erick López-Ornelas
Abstract:
As the world changes ever more rapidly, the demand for up-to-date information for resource management, environmental monitoring and planning is increasing exponentially. Integrating remote sensing with GIS technology will significantly improve our ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, where a preprocessing step (a watershed operation) is required. A morphological process is then applied using the opening and closing operations, as sketched below. After this segmentation we can extract significant cartographic elements such as urban areas, streets or green areas. The results of this segmentation and extraction are then used to update GIS applications. Some examples are shown using aerial photography.
Keywords: GIS, Remote Sensing, image segmentation, feature extraction.
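A minimal sketch of the pipeline outlined above (watershed preprocessing followed by morphological opening and closing), using scikit-image on a stock test image; the marker count, thresholding rule and footprint sizes are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from skimage import data, filters, morphology, segmentation

# Toy stand-in for a high-resolution aerial scene.
gray = data.camera() / 255.0

# Preprocessing: watershed on the gradient image, seeded automatically.
gradient = filters.sobel(gray)
regions = segmentation.watershed(gradient, markers=250)

# Morphological cleanup of a binary layer (e.g. candidate built-up areas).
mask = gray < filters.threshold_otsu(gray)
mask = morphology.opening(mask, np.ones((3, 3), bool))   # remove speckle
mask = morphology.closing(mask, np.ones((5, 5), bool))   # fill small gaps
print(regions.max(), "regions;", int(mask.sum()), "masked pixels")
```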
3604 Text Summarization for Oil and Gas Drilling Topic
Authors: Y. Y. Chen, O. M. Foong, S. P. Yong, Kurniawan Iwan
Abstract:
Information sharing and gathering are important in this era of rapid technological advancement. The existence of the WWW has caused an explosion of information, and readers overloaded with too many lengthy text documents are more interested in shorter versions. The oil and gas industry cannot escape this predicament. In this paper, we develop an Automated Text Summarization System, known as AutoTextSumm, to extract the salient points of oil and gas drilling articles by incorporating a statistical approach, keyword identification, synonym words and sentence position. In this study, we conducted interviews with Petroleum Engineering experts and English Language experts to identify the most commonly used keywords in the oil and gas drilling domain. The system performance of AutoTextSumm is evaluated using the formulae of precision, recall and F-score. Based on the experimental results, AutoTextSumm has produced satisfactory performance, with an F-score of 0.81.
Keywords: Keyword's probability, synonym sets.
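The precision/recall/F-score evaluation mentioned in the abstract can be stated compactly for extractive summaries. A minimal sketch, where the selected and reference sentence sets are hypothetical:

```python
def prf(selected, reference):
    # selected, reference: sets of sentence identifiers.
    tp = len(selected & reference)                 # sentences both picked
    precision = tp / len(selected) if selected else 0.0
    recall = tp / len(reference) if reference else 0.0
    f = 2 * precision * recall / (precision + recall) if tp else 0.0
    return precision, recall, f

print(prf({1, 3, 5, 8}, {1, 3, 4, 8, 9}))  # -> (0.75, 0.6, 0.666...)
```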
3603 An Experimental Multi-Agent Robot System for Operating in Hazardous Environments
Authors: Y. J. Huang, J. D. Yu, B. W. Hong, C. H. Tai, T. C. Kuo
Abstract:
In this paper, a multi-agent robot system is presented. The system consists of four robots. The developed robots are able to automatically enter and patrol a harmful environment, such as a building infected with a virus or a factory leaking hazardous gas. Furthermore, every robot is able to perform obstacle avoidance and to search for victims. Several operation modes are designed: remote control, obstacle avoidance, automatic searching, and so on.
Keywords: autonomous robot, field programmable gate array, obstacle avoidance, ultrasonic sensor, wireless communication.
3602 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks
Authors: Naghmeh Moradpoor Sheykhkanloo
Abstract:
Thousands of organisations store important and confidential information related to themselves, their customers, and their business partners in databases all across the world. The stored data range from less sensitive (e.g. first name, last name, date of birth) to more sensitive (e.g. password, PIN code, and credit card information). Losing data, disclosing confidential information or even changing the value of data are the severe damages that a Structured Query Language injection (SQLi) attack can cause on a given database. SQLi is a code injection technique in which malicious SQL statements are inserted into a given SQL database by simply using a web browser. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLi attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, to generate thousands of malicious and benign URLs; a URL classifier, to 1) classify each generated URL as either benign or malicious and 2) classify the malicious URLs into different SQLi attack categories; and a neural network (NN) model, to 1) detect whether a given URL is malicious or benign and 2) identify the type of SQLi attack for each malicious URL. The model is first trained and then evaluated by employing thousands of benign and malicious URLs. The results of the experiments are presented to demonstrate the effectiveness of the proposed approach.
Keywords: Neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection.
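As a rough, hedged illustration of such a pipeline (URLs, character-level features, neural classifier), and not the paper's generator, feature set or architecture, the following sketch trains a small multilayer perceptron on character n-grams of a few hypothetical URLs:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical training URLs and labels (not the paper's generated data set).
urls = ["http://shop.example/item?id=42",
        "http://shop.example/item?id=42 OR 1=1--",
        "http://shop.example/login?user=admin'--",
        "http://shop.example/search?q=drill"]
labels = ["benign", "sqli", "sqli", "benign"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # character n-grams
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(urls, labels)
print(model.predict(["http://shop.example/item?id=1 UNION SELECT password"]))
```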
3601 A Case Study on Theme-Based Approach in Health Technology Engineering Education: Customer Oriented Software Applications
Authors: Mikael Soini, Kari Björn
Abstract:
Metropolia University of Applied Sciences (MUAS) Information and Communication Technology (ICT) Degree Programme provides full-time Bachelor-level undergraduate studies. The ICT Degree Programme has seven major options; this paper focuses on Health Technology. In Health Technology, a significant curriculum change in 2014 enabled a transition from a fragmented curriculum of dozens of courses to a new integrated curriculum built around three 30 ECTS themes. This paper focuses especially on the second theme, called Customer Oriented Software Applications. From the students' point of view, the goal of this theme is to get familiar with existing health-related ICT solutions and systems, understand the business around health technology, recognize social and healthcare operating principles and services, and identify customers and users and their special needs and perspectives. This also acts as a background for health-related web application development. The web application built is tested, developed and evaluated with real users, utilizing versatile user-centred development methods. This paper presents experiences from the first implementation of the Customer Oriented Software Applications theme. Student feedback was gathered with two questionnaires, one in the middle of the theme and the other at the end, each with qualitative and quantitative parts. A similar questionnaire was used in the first theme; this paper evaluates how the theme-based integrated curriculum has progressed in the Health Technology major by comparing the results of themes 1 and 2. In general, students were satisfied with the implementation, the timing and synchronization of the courses, and the amount of work. However, there is still room for development. Student feedback and teachers' observations have been, and will continue to be, used to develop the content and operating principles of the themes and of the whole curriculum.
Keywords: Engineering education, integrated and theme-based curriculum, learning experience, student centred learning.
3600 Stochastic Comparisons of Heterogeneous Samples with Homogeneous Exponential Samples
Authors: Nitin Gupta, Rakesh Kumar Bajaj
Abstract:
In the present communication, a stochastic comparison between a series (parallel) system having heterogeneous components with random lifetimes and a series (parallel) system having homogeneous exponential components with random lifetimes is studied. Further, conditions under which such a comparison is possible have been established.
Keywords: Exponential distribution, Order statistics, Star ordering, Stochastic ordering.
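For orientation, the objects being compared can be written down explicitly. A sketch of the standard definitions assumed by the keywords, for component lifetimes $X_1,\dots,X_n$; the paper's precise conditions are not reproduced here:

```latex
% Series and parallel system lifetimes are the extreme order statistics:
T_{\mathrm{series}} = \min_{1 \le i \le n} X_i = X_{1:n}, \qquad
T_{\mathrm{parallel}} = \max_{1 \le i \le n} X_i = X_{n:n}.
% Usual stochastic order between lifetimes X and Y with survival
% functions \bar F and \bar G:
X \le_{\mathrm{st}} Y \iff \bar F(t) \le \bar G(t) \quad \text{for all } t \ge 0.
```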
3599 A Comparative Study of Fine Grained Security Techniques Based on Data Accessibility and Inference
Authors: Azhar Rauf, Sareer Badshah, Shah Khusro
Abstract:
This paper analyzes different techniques for the fine grained security of relational databases with respect to two variables: data accessibility and inference. Data accessibility measures the amount of data available to users after applying a security technique to a table. Inference is the proportion of information leakage after suppressing a cell containing secret data. A row containing a suppressed secret cell can become a security threat if an intruder generates useful information from the related visible information in the same row. This paper measures the data accessibility and inference associated with row, cell, and column level security techniques. Cell level security offers the greatest data accessibility, as it suppresses secret data only; on the other hand, it carries a high probability of inference. Row and column level security techniques have the least data accessibility and inference. This paper introduces the cell plus innocent security technique, which uses the cell level security method but also suppresses some innocent data, so that an intruder cannot assume a suppressed cell necessarily contains secret data. Four variations of the technique, namely cell plus innocent 1/4, 2/4, 3/4, and 4/4, are introduced to suppress innocent data equal to 1/4, 2/4, 3/4, and 4/4 of the true secret data inside the database, respectively. Results show that the new technique offers better control over data accessibility and inference compared with state-of-the-art security techniques. This paper further discusses how the techniques can be combined, and shows that the cell plus innocent 1/4, 2/4, and 3/4 techniques can be used as a replacement for cell level security.
Keywords: Fine Grained Security, Data Accessibility, Inference, Row, Cell, Column Level Security.
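A minimal sketch of the cell-plus-innocent idea under stated assumptions: a table held as a list of rows, a known set of secret cell coordinates, and a ratio (1/4 ... 4/4) controlling how many additional innocent cells are hidden; the helper and parameter names are ours, not the paper's:

```python
import random

def cell_plus_innocent(table, secret_cells, ratio, seed=0):
    # table: list of rows (lists); secret_cells: set of (row, col) pairs.
    rng = random.Random(seed)
    innocent = [(r, c) for r in range(len(table))
                for c in range(len(table[r])) if (r, c) not in secret_cells]
    extra = rng.sample(innocent, min(len(innocent),
                                     round(ratio * len(secret_cells))))
    hidden = secret_cells | set(extra)
    return [[None if (r, c) in hidden else v
             for c, v in enumerate(row)] for r, row in enumerate(table)]

table = [["alice", 42, "nurse"], ["bob", 37, "spy"], ["carol", 55, "clerk"]]
# Hide 2 secret cells plus round(2/4 * 2) = 1 innocent cell.
print(cell_plus_innocent(table, {(1, 2), (0, 1)}, ratio=2/4))
```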
3598 Construction 4.0: The Future of the Construction Industry in South Africa
Authors: Temidayo O. Osunsanmi, Clinton Aigbavboa, Ayodeji Oke
Abstract:
The construction industry is a renowned latecomer to the efficiency offered by the adoption of information technology, whereas the banking, manufacturing and retailing industries have keyed into the future by using digitization and information technology as a new approach to ensuring competitive gain and efficiency. The construction industry has yet to realize similar benefits because the adoption of ICT is still at an infancy stage, with a major concentration on the use of software. This study therefore evaluates the awareness and readiness of construction professionals to embrace full digitalization of the construction industry through construction 4.0. The term 'construction 4.0' is coined from the industry 4.0 concept, the fourth industrial revolution that originated in Germany. A questionnaire was used for sourcing data, distributed to practicing construction professionals through a convenience sampling method. Using SPSS v24, the hypotheses posed were tested with the Mann-Whitney test. The results revealed no differences between consulting and contracting organizations in their readiness to adopt construction 4.0 concepts in the construction industry. Using factor analysis, the study finds that adopting construction 4.0 will improve the performance of the construction industry in terms of cost and time savings, and will also create sustainable buildings. In conclusion, the study determined that construction professionals have low awareness of construction 4.0 concepts. The study recommends increasing awareness of construction 4.0 concepts through seminars, workshops and training, while construction professionals should embrace the benefits of adopting construction 4.0 concepts. The study contributes to the roadmap for the implementation of construction 4.0 concepts in the South African construction industry.
Keywords: Building information technology, Construction 4.0, Industry 4.0, Smart Site.
3597 Parallel Explicit Group Domain Decomposition Methods for the Telegraph Equation
Authors: Kew Lee Ming, Norhashidah Hj. Mohd. Ali
Abstract:
In a previous work, we presented the numerical solution of the two-dimensional second-order telegraph partial differential equation discretized by the centred and rotated five-point finite difference discretizations, namely the explicit group (EG) and explicit decoupled group (EDG) iterative methods, respectively. In this paper, we utilize a domain decomposition algorithm on these group schemes to divide the tasks involved in solving the same equation. The objective of this study is to describe the development of the parallel group iterative schemes under the OpenMP programming environment as a way to reduce the computational costs of the solution process using multicore technologies. A detailed performance analysis of the parallel implementations of the point and group iterative schemes is reported and discussed.
Keywords: Telegraph equation, explicit group iterative scheme, domain decomposition algorithm, parallelization.
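For reference, a commonly studied form of the two-dimensional second-order telegraph equation follows; this is a sketch only, as the paper's exact coefficients and forcing term are not reproduced here:

```latex
% Damped wave (telegraph) equation on a rectangular domain, with damping
% coefficient \alpha > 0, reaction coefficient \beta \ge 0 and source f:
\frac{\partial^2 u}{\partial t^2} + 2\alpha \frac{\partial u}{\partial t}
  + \beta^2 u
  = \frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}
  + f(x, y, t).
```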
3596 Web Usability: A Fuzzy Approach to the Navigation Structure Enhancement in a Website System, Case of Iranian Civil Aviation Organization Website
Authors: Hamed Qahri Saremi, Gholam Ali Montazer
Abstract:
With the proliferation of the World Wide Web, the development of web-based technologies and the growth in web content, the structure of a website becomes more complex, and web navigation becomes a critical issue for both web designers and users. In this paper we identify content and web pages as two important and influential factors in website navigation, and we frame the enhancement of website navigation as making useful changes in the link structure of the website based on these factors. We then suggest a new method for proposing the changes, using a fuzzy approach to optimize the website architecture. Applying the proposed method to the real case of the Iranian Civil Aviation Organization (CAO) website, we discuss the results of the novel approach in the final section.
Keywords: Web content, Web navigation, Website system, Web usage mining.
3595 Land Use/Land Cover Mapping Using Landsat 8 and Sentinel-2 in a Mediterranean Landscape
Authors: M. Vogiatzis, K. Perakis
Abstract:
Spatially explicit and up-to-date land use/land cover information is fundamental for spatial planning, land management, sustainable development, and sound decision-making. In the last decade, many satellite-derived land cover products at different spatial, spectral, and temporal resolutions have been developed, such as the European Copernicus Land Cover product. However, more efficient and detailed land use/land cover information is required at the regional or local scale. A typical Mediterranean basin with a complex landscape comprising various forest types, crops, artificial surfaces, and wetlands was selected to test and develop our approach. In this study, we investigate the improvement of the Copernicus Land Cover product (CLC2018) using Landsat 8 and Sentinel-2 pixel-based classification based on all available existing geospatial data (forest maps, LPIS, Natura 2000 habitats, cadastral parcels, etc.). We examined and compared the performance of the Random Forest classifier for land use/land cover mapping. In total, 10 land use/land cover categories were recognized in Landsat 8 and 11 in Sentinel-2A. A comparison of the overall classification accuracies for 2018 shows that the Landsat 8 classification accuracy was slightly higher than that of Sentinel-2A (82.99% vs. 80.30%). We concluded that the main land use/land cover types of CLC2018, even within a heterogeneous area, can be successfully mapped and updated according to the CLC nomenclature. Future research should be oriented toward integrating spatiotemporal information from seasonal bands and spectral indices into the classification process.
Keywords: land use/land cover, random forest, Landsat-8 OLI, Sentinel-2A MSI, Corine land cover
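A minimal sketch of the pixel-based Random Forest classification step, assuming spectral band values have already been sampled at labelled reference points; the array shapes, class count and forest size are illustrative, not the study's settings:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 10))      # 500 labelled pixels x 10 spectral bands
y = rng.integers(0, 11, 500)   # 11 land use/land cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```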
3594 Data Extraction of XML Files using Searching and Indexing Techniques
Authors: Sushma Satpute, Vaishali Katkar, Nilesh Sahare
Abstract:
XML files contain data in a well-formatted manner, and studying the format, or the semantics of the grammar, helps in the fast retrieval of that data. There are many algorithms that describe searching for data in XML files. A number of approaches either use a data structure or relate to the contents of the document. In the former case, the user must know the structure of the document; in the latter, information retrieval techniques using NLP relate to the content of the document. Hence the results may be irrelevant or unsuccessful, and the search may take more time. This paper presents fast XML retrieval techniques using a new indexing technique and the concept of RXML. When indexing an XML document, the system takes into account both the document content and the document structure, and assigns a value to each tag in the file. To query the system, a user is not constrained to a fixed query format.
Keywords: XML Retrieval, Indexed Search, Information Retrieval.
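A minimal sketch of combined structure-and-content indexing using Python's standard library; the index layout is our illustration, not the paper's RXML format:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def build_index(xml_text):
    index = defaultdict(list)              # term -> list of (path, value)
    def walk(node, path):
        p = f"{path}/{node.tag}"
        value = (node.text or "").strip()
        if value:
            index[node.tag].append((p, value))    # structure term
            for word in value.lower().split():    # content terms
                index[word].append((p, value))
        for child in node:
            walk(child, p)
    walk(ET.fromstring(xml_text), "")
    return index

doc = "<rig><well depth='1200'>Alpha</well><well>Beta</well></rig>"
idx = build_index(doc)
print(idx["well"])     # query by structure (tag name)
print(idx["alpha"])    # query by content (text value)
```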
3593 Use of a Learner's Log for Effective Self-Directed Learning in PBL
Authors: Amudha Kadirvelu, Sivalal Sadasivan
Abstract:
While the problem based learning (PBL) approach promotes unsupervised self-directed learning (SDL), many students experience difficulty juggling the roles of information recipient and information seeker. Logbooks have been used to assess trainee doctors, but not in other areas. This study aimed to determine the effectiveness of a logbook for assessing SDL during PBL sessions in first year medical students. The logbook included a learning checklist and knowledge and skills components. Comparison of the baseline assessment of student performance in PBL with that at semester end, after the logbook intervention, showed significant improvement in student performance (31.5 ± 8 vs. 17.7 ± 4.4; p<0.001), with a large effect size of 3.93. The learner's log for PBL has played an important role in enhancing SDL in first year medical students, and could be a good self-assessment tool for undergraduate medical students.
Keywords: Problem based learning, self-directed learning, logbook, self-assessment.
3592 English Language Learning Strategies Used by University Students: A Case Study of English and Business English Major at Suan Sunandha Rajabhat in Bangkok
Authors: Pranee Pathomchaiwat
Abstract:
The purposes of this research are 1) to study the English language learning strategies used by fourth-year students majoring in English and Business English, 2) to study the English language learning strategies which affect English learning achievement, and 3) to compare the English language learning strategies used by students majoring in English and in Business English. The population and sample comprise 139 university students of Suan Sunandha Rajabhat University. The research instruments are a language learning strategies questionnaire, constructed by the researcher and improved upon by three experts, and the transcripts that show English learning achievement. The questionnaire covers 1) Language Practice Strategy, 2) Memory Strategy, 3) Communication Strategy, 4) Making an Intelligent Guess or Compensation Strategy, 5) Self-discipline in Learning Management Strategy, 6) Affective Strategy, 7) Self-Monitoring Strategy, and 8) Self-study Skill Strategy. Statistics used in the study are the mean, standard deviation, t-test, one-way ANOVA, Pearson product-moment correlation coefficient and regression analysis. The findings reveal that the English language learning strategies most frequently used by the students are the affective strategy, making an intelligent guess or compensation strategy, self-study skill strategy and self-monitoring strategy, respectively. Making an intelligent guess or compensation strategy had the most significant effect on English learning achievement. The English language learning strategies were mostly used by the Business English major students and moderately used by the English major students. Their uses of language practice strategies differed significantly at the 0.05 level, and their uses of communication strategies differed significantly at the 0.01 level. In addition, the poor and fair students most frequently used the affective strategy, while the good ones most frequently used making an intelligent guess or compensation strategy.
Keywords: English language, language learning strategies, English learning achievement, students majoring in English, Business English
3591 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution
Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani S. Alghamdi
Abstract:
Change point detection is an important part of data analysis. The presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
Keywords: Binary segmentation, change point, exponential Lomax distribution, information criterion.
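A minimal sketch of information-criterion-driven binary segmentation, under simplifying assumptions: Gaussian segment costs stand in for the exponential Lomax likelihood, and a Schwarz-style log(n) penalty decides whether a candidate split is kept:

```python
import numpy as np

def seg_cost(x):
    # Negative maximized Gaussian log-likelihood of one segment (up to constants).
    return 0.5 * len(x) * np.log(np.var(x) + 1e-12)

def binary_segmentation(x, offset=0, min_len=10, penalty=None):
    n = len(x)
    if penalty is None:
        penalty = 2 * np.log(n)                 # Schwarz-style penalty
    if n <= 2 * min_len:
        return []
    full = seg_cost(x)
    splits = range(min_len, n - min_len)
    gains = [full - seg_cost(x[:k]) - seg_cost(x[k:]) for k in splits]
    k = min_len + int(np.argmax(gains))
    if max(gains) <= penalty:                   # split not worth the penalty
        return []
    return (binary_segmentation(x[:k], offset, min_len, penalty)
            + [offset + k]                      # recurse on both halves
            + binary_segmentation(x[k:], offset + k, min_len, penalty))

x = np.concatenate([np.random.default_rng(1).normal(0, 1, 100),
                    np.random.default_rng(2).normal(3, 1, 100)])
print(binary_segmentation(x))                   # change point near index 100
```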
3590 Understanding Success Factors of an Information Security Management System Plan Phase Self-Implementation
Authors: Nurazean Maarop, Noorjan Mohd Mustapha, Rasimah Yusoff, Roslina Ibrahim, Norziha Megat Mohd Zainuddin
Abstract:
The goal of this study is to identify success factors that could influence ISMS self-implementation in the government sector, from a qualitative perspective. The study is based on a case study in a Malaysian government agency. Semi-structured interviews involving five key informants were conducted to examine the factors addressed in the conceptual framework. Subsequently, thematic analysis was executed to describe the influence of each factor on the successful implementation of the ISMS. The results indicate that management commitment, implementer commitment and implementer competency are among the success factors for ISMS self-implementation in the Malaysian government sector.
Keywords: ISMS Success Factors, IT Project Management, IS Success, Information Security.
3589 Target and Equalizer Design for Perpendicular Heat-Assisted Magnetic Recording
Authors: P. Tueku, P. Supnithi, R. Wongsathan
Abstract:
Heat-Assisted Magnetic Recording (HAMR) is one of the leading technologies identified to enable areal densities beyond 1 Tb/in2 in magnetic recording systems. Key challenges in HAMR design are the accuracy of positioning, the timing and power of the firing laser, the thermo-magnetic head, the head-disk interface, and the cooling system. We study the effect of HAMR parameters on the transition center and transition width. The HAMR channel is modeled using the Thermal Williams-Comstock (TWC) model and the microtrack model. The target and equalizer are designed by the minimum mean square error (MMSE) method. The results show that the unit energy constraint outperforms the other constraints.
Keywords: Heat-Assisted Magnetic Recording, Thermal Williams-Comstock equation, Microtrack model, Equalizer.
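For orientation, a minimal numpy sketch of generic MMSE (Wiener) equalizer design: given a readback sequence and a desired target response, the tap vector solves the normal equations R w = p. The synthetic channel, noise level and target here are illustrative, not the paper's HAMR model:

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.choice([-1.0, 1.0], 5000)
channel = np.array([0.3, 1.0, 0.3])                  # toy ISI channel
r = np.convolve(bits, channel, mode="same") + 0.1 * rng.standard_normal(5000)

target = np.array([0.5, 1.0, 0.5])                   # desired (PR) target
d = np.convolve(bits, target, mode="same")           # ideal equalized sequence

taps, delay = 11, 5
# Rows of X are sliding windows of the readback signal.
X = np.lib.stride_tricks.sliding_window_view(r, taps)
y = d[delay : delay + len(X)]
R = X.T @ X                                          # autocorrelation matrix
p = X.T @ y                                          # cross-correlation vector
w = np.linalg.solve(R, p)                            # MMSE tap weights
print("residual MSE:", np.mean((X @ w - y) ** 2))
```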
3588 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains
Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe
Abstract:
The increasing digitalization of value chains can help companies to handle rising complexity in their processes and thereby raise performance limits by reducing the steadily increasing planning and control effort. Due to technological advances, companies face the challenge of building smart value chains to improve productivity, handle increasing time and cost pressure, and meet the need for individualized production. Companies therefore need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits inbuilt in current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework to guide the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists, based on factors such as people, technology and organization. The method introduced in this paper for the determination of digitally pervasive value chains takes the factors of people, technology and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps. The first step, 'target definition', describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step, 'analysis of the value chain', verifies the lean-ability of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. The third step, the 'digital evaluation process', ensures the usefulness of digital adaptations regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. The validation and optimization of the proposed method in a German company from the electronics industry shows that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.
Keywords: Digitalization, digital transformation, lean production, Industrie 4.0, value chain.
3587 Social Media and Tacit Knowledge Sharing: Developing a Conceptual Model
Authors: Sirous Panahi, Jason Watson, Helen Partridge
Abstract:
With the advent of social web initiatives, some have argued that these new emerging tools might be useful for tacit knowledge sharing by providing interactive and collaborative technologies. However, there is still a paucity of literature on how, and what, social media might contribute to facilitating tacit knowledge sharing. This paper therefore theoretically investigates and maps social media concepts and characteristics onto the requirements of tacit knowledge creation and sharing. Through a systematic literature review, five major requirements were found that need to be present in an environment that involves tacit knowledge sharing. These requirements were analyzed against social media concepts and characteristics to see how they map together. The results showed that social media can satisfy some of the main requirements of tacit knowledge sharing. The relationships are illustrated in a conceptual framework, and further empirical studies are suggested to confirm the findings of this study.
Keywords: Knowledge sharing, Tacit knowledge, Social media, Web 2.0
3586 A Distributed Cognition Framework to Compare E-Commerce Websites Using Data Envelopment Analysis
Authors: C. lo Storto
Abstract:
This paper presents an approach based on a distributed cognition framework and a non-parametric multicriteria evaluation methodology, Data Envelopment Analysis (DEA), designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework treats a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the required search time determine the size of the effort, and hence the amount of cognitive cost, that he/she has to sustain to perform the task. Conversely, performing the task and achieving the result induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire collecting subjective judgements for the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Keywords: Website, e-commerce, DEA, distributed cognition, evaluation, comparison.
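A minimal sketch of the input-oriented CCR efficiency score that DEA assigns to one decision-making unit (here, a website), using scipy's linear programming; the two-input/one-output toy data are hypothetical, not the paper's 9 variables:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    # X: inputs (m x n), Y: outputs (s x n), j0: index of the evaluated unit.
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    # Inputs:  sum_j lambda_j x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    # Outputs: -sum_j lambda_j y_rj <= -y_r,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun                             # efficiency score in (0, 1]

X = np.array([[3.0, 2.0, 5.0], [4.0, 3.0, 6.0]])   # cognitive-cost inputs
Y = np.array([[6.0, 5.0, 8.0]])                    # satisfaction output
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```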
3585 Core Competence Development while Carrying out Organizational Changes
Authors: Olga A. Shvetsova
Abstract:
The paper addresses various issues of competence management in industrial companies. The theoretical bases of human resources management and the practical issues of innovative enterprises' competitiveness are considered. The research focuses on the change management problems of modern industrial enterprises, and on the effective personnel management of industrial enterprises on the basis of a competence approach. The influence of organizational changes on competence development is discussed. The need to develop new technologies is noted, and the proposal is based on a competence-based approach to personnel management, including under conditions of organizational change; methods for acquiring and developing missing key professional competences are discussed, and the importance of key competencies in forming the competitive advantage of the organization is highlighted.
Keywords: Competence model, development of industrial company, organizational changes, competitiveness, core competencies.
3584 Media Façades in the Wild: Some Lessons
Authors: Hai-Ning Liang, Xiaowei Dai, Nancy Diniz, Charles Fleming, Woon Kian Chong
Abstract:
Media displays in public areas are becoming increasingly pervasive: they are used in many settings, come in different sizes, serve different purposes, and have varied degrees of interactivity. In this paper, we survey how these displays, often named media façades, are used in the wild in a rapidly growing city in China. The survey is intended to raise greater awareness of, and discussion about, the use and effect of these displays in public areas. Through it, we distill some lessons about what is good, bad, and ugly in current examples of media displays used in a city that is transitioning into a modern one and is located in one of the fastest growing areas in Asia. With this research, we hope to provide technology designers and architects with some general principles that can help them integrate these types of technologies into their architectural creations.
Keywords: Large displays, media façades, interaction design, architectural displays.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 37773583 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding information relevant to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page, and they propose noise web data reduction tools that mainly focus on eliminating noise with respect to the content and layout of web data. This paper argues that not all data that forms part of the main web page is of interest to a user, and that not all noise data is actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: Web log data, web user profile, user interest, noise web data learning, machine learning.
3582 Implementation of a Serializer to Represent PHP Objects in the Extensible Markup Language
Authors: Lidia N. Hernández-Piña, Carlos R. Jaimez-González
Abstract:
Interoperability in distributed systems is an important feature that refers to the communication of two applications written in different programming languages. This paper presents a serializer and a de-serializer of PHP objects to and from XML, implemented as an independent library written in the PHP programming language. The XML generated by this serializer is independent of the programming language and can be used by other existing Web Objects in XML (WOX) serializers and de-serializers, which allows interoperability with other object-oriented programming languages.
Keywords: Interoperability, PHP object serialization, PHP to XML, web objects in XML, WOX.
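To illustrate the idea of language-independent object serialization (in Python rather than PHP, and with an element layout of our own devising, not the WOX wire format), a minimal sketch:

```python
import xml.etree.ElementTree as ET

def to_xml(obj, name="object"):
    # Serialize primitives, lists and plain objects to an XML element tree.
    elem = ET.Element(name, {"type": type(obj).__name__})
    if isinstance(obj, (str, int, float, bool)):
        elem.text = str(obj)
    elif isinstance(obj, (list, tuple)):
        for item in obj:
            elem.append(to_xml(item, "item"))
    else:                                 # plain object: one child per field
        for field, value in vars(obj).items():
            elem.append(to_xml(value, field))
    return elem

class Customer:
    def __init__(self):
        self.name, self.orders = "Ada", [3, 5]

print(ET.tostring(to_xml(Customer(), "customer"), encoding="unicode"))
# Prints: <customer type="Customer"><name type="str">Ada</name>
#         <orders type="list"><item type="int">3</item>...</orders></customer>
```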
3581 Verification and Proposal of Information Processing Model Using EEG-Based Brain Activity Monitoring
Authors: Toshitaka Higashino, Naoki Wakamiya
Abstract:
Human beings perform a task by perceiving information from the outside, recognizing it, and responding to it. There have been various attempts to analyze and understand the internal processes behind the reaction to a given stimulus by conducting psychological experiments and analyses from multiple perspectives. Among these, we focused on the Model Human Processor (MHP). However, the MHP was built on psychological experiments, and thus its relation to brain activity has so far been unclear. To verify the validity of the MHP and to propose our own model from the viewpoint of neuroscience, EEG (electroencephalography) measurements were performed during the experiments in this study. More specifically, experiments were first conducted in which Latin alphabet characters were used as visual stimuli. In addition to response time, ERPs (event-related potentials) such as the N100 and P300 were measured using EEG. By comparing the cycle times predicted by the MHP with the latencies of the ERPs, it was found that the N100, related to the perception of stimuli, appeared at the end of the perceptual processor. Furthermore, an additional experiment revealed that the P300, related to decision making, appeared during the response decision process, not at its end. Second, these findings were confirmed by experiments using Japanese Hiragana characters, i.e., Japan's own phonetic symbols. Finally, Japanese Kanji characters were used as more complicated visual stimuli. A Kanji character usually has several readings and several meanings; despite this difference, a reading-related task and a meaning-related task exhibited similar results, meaning that they involve similar information processing in the brain. Based on these results, our model is proposed, reflecting both response time and ERP latency. It consists of three processors: the perceptual processor, from the input of a stimulus to the appearance of the N100; the cognitive processor, from the N100 to the P300; and the decision-action processor, from the P300 to the response. Using our model, application systems that reflect brain activity can be established.
Keywords: Brain activity, EEG, information processing model, model human processor.
3580 A Methodology to Analyze Technology Convergence: Patent-Citation Based Technology Input-Output Analysis
Authors: Jeeeun Kim, Sungjoo Lee
Abstract:
This research proposes a methodology for patent-citation-based technology input-output analysis, applying patent information to the input-output analysis originally developed for the dependencies among different industries. For this analysis, a technology relationship matrix and its components, as well as input and technology inducement coefficients, are constructed using patent information. A technology inducement coefficient is then calculated by normalizing the degree of citation from certain IPCs (International Patent Classification codes) to different IPCs or to the same IPCs. Finally, we construct a Dependency Structure Matrix (DSM) based on the technology inducement coefficients to suggest a useful application for this methodology.
Keywords: Technology spillover effect, technology relationship, IO table, technology inducement coefficients, patent analysis, patent citation.
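A minimal sketch of the normalization step under our reading of the abstract: given a citation-count matrix C whose entry C[i, j] counts citations from IPC class i to IPC class j, row-normalizing yields inducement coefficients, which can then be thresholded into a binary DSM (the data and threshold are illustrative assumptions):

```python
import numpy as np

# Citation counts between three hypothetical IPC classes.
C = np.array([[10, 4, 0],
              [ 2, 8, 6],
              [ 0, 1, 5]], dtype=float)

row_sums = C.sum(axis=1, keepdims=True)
inducement = C / row_sums                # row i: share of i's citations to j
dsm = (inducement >= 0.25).astype(int)   # dependency if share >= 25% (assumed)

print(np.round(inducement, 2))
print(dsm)
```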
3579 The Efficacy of Technology in Enhancing the Development and Learning of Children (0 – 5 Years)
Authors: Adesina, Olusola Joseph
Abstract:
The use of technological tools in the classroom setting has drawn the interest of researchers all over the world in recent times. Technology has been identified as a potential tool to aid learning, especially during the early childhood stage. The main objective is to assist the upcoming younger generations in acquiring the necessary skills for cognitive development, which later enhances an effective teaching and learning process. The integration of technology in early childhood requires a careful selection of devices that will assist both the children and the teachers or caregivers. This paper therefore examines selected literature and highlights the efficacy of various technological tools in enhancing the development and learning of children (0 – 5 years). Conclusions and recommendations are also drawn in this paper.
Keywords: Development, Efficacy, Learning, Technological Device.