Search results for: Information Processing.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5172


3912 Virtualizing Attendance and Reducing Impacts on the Environment with a Mobile Application

Authors: Paulo R. M. de Andrade, Adriano B. Albuquerque, Otávio F. Frota, Robson V. Silveira, Fátima A. da Silva

Abstract:

Information technology has been gaining ever more ground in industry, commerce, and personal use, but its misuse harms the environment and human health. Contributing to the sustainability of the planet means compensating the environment for all or part of what is withdrawn from it. Green computing, in turn, proposes practices for using IT in an environmentally sound way, in support of strategic management and communication. This work shows how a mobile application can help businesses reduce both costs and the environmental impacts caused by their processes, through a case study of a public company in Brazil.

Keywords: E-government, green computing, information technology, mobile computing, sustainable development.

3911 Corporate Governance Practices and Analysts' Forecast Accuracy: Evidence from Romania

Authors: M. Ionascu, L. Olimid

Abstract:

In the last few years, several steps have been taken to improve the quality of corporate governance for Romanian listed companies. Higher standards of corporate governance are documented in the literature to lead to a better information environment and, consequently, to increased analyst forecast accuracy. Accordingly, the purpose of this paper is to investigate the extent to which corporate governance policies affect analysts' forecasts for companies listed on the Bucharest Stock Exchange. The results show that there is indeed a negative correlation between a corporate governance index – used as a proxy for the quality of corporate governance practices – and analysts' forecast errors.

Keywords: corporate governance, analysts' forecasts, information environment

3910 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kr. Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with specific focus on infrared (IR) and visible image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (IR) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal IR camera acquires the thermal source image. In this paper, image fusion algorithms based on a Multi-Scale Transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes an implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are applied to assess the validity of the suggested method. Experiments show that the proposed approach is capable of producing good fusion results. In developing our image fusion approach, we observed several challenges in popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they are hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimal time complexity.
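
As a rough, hypothetical sketch of the MST-based fusion scheme described above (a PyWavelets stand-in for the paper's MATLAB implementation; the 'db2' wavelet, the 3-level decomposition, and the simple max-absolute coefficient rule are all assumptions, and the region-based selection with consistency verification is omitted):

```python
# Hypothetical sketch of MST-based fusion with a max-abs coefficient rule.
# Assumes grayscale float arrays of equal size; not the paper's exact method.
import numpy as np
import pywt

def fuse_mst(visible, infrared, wavelet="db2", levels=3):
    # Decompose both source images into multi-scale coefficients.
    c_vis = pywt.wavedec2(visible, wavelet, level=levels)
    c_ir = pywt.wavedec2(infrared, wavelet, level=levels)

    # Approximation band: average the two (a common low-frequency rule).
    fused = [(c_vis[0] + c_ir[0]) / 2.0]

    # Detail bands: keep the coefficient with the larger magnitude.
    for d_vis, d_ir in zip(c_vis[1:], c_ir[1:]):
        fused.append(tuple(
            np.where(np.abs(dv) >= np.abs(di), dv, di)
            for dv, di in zip(d_vis, d_ir)
        ))

    # Reconstruct the composite image from the fused coefficients.
    return pywt.waverec2(fused, wavelet)
```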

Keywords: Image fusion, IR thermal imager, multi-sensor, Multi-Scale Transform.

3909 64-bit Computer Architectures for Space Applications – A Study

Authors: Niveditha Domse, Kris Kumar, K. N. Balasubramanya Murthy

Abstract:

More recent satellite programs make extensive use of real-time embedded systems. 16-bit processors that meet the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking forward, 32-bit and 64-bit processors are needed in the area of spacecraft computing, so a study and survey of 64-bit architectures for space applications is desirable. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada). There are several basic requirements for such a special-purpose processor: radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include selection of FPGA devices, selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project comprises a brief study of 32- and 64-bit processors readily available in the market and the design/fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using open cores (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools, such as Xilinx ISE for synthesis, are also used when appropriate.

Keywords: RISC MicroProcessor, RPC – RISC Processor Core, PBX – Processor to Block Interface part of the Interconnection Network, BPX – Block to Processor Interface part of the Interconnection Network, FPU – Floating Point Unit, SPU – Signal Processing Unit, WB – Wishbone Interface, CTU – Clock and Test Unit

3908 Mammogram Image Size Reduction Using 16-8 bit Conversion Technique

Authors: Ayman A. AbuBaker, Rami S. Qahwaji, Musbah J. Aqel, Mohmmad H. Saleh

Abstract:

Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts 16-bit images to 8 bits using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
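
The paper's own pixel-depth conversion algorithm is not reproduced here; the following is a minimal sketch of the general idea, assuming a linear rescale of the occupied intensity range down to 8 bits:

```python
# Minimal sketch of 16-bit-to-8-bit pixel-depth conversion (not the paper's
# exact algorithm): linearly rescale the occupied intensity range to 0..255.
import numpy as np

def to_8bit(img16: np.ndarray) -> np.ndarray:
    lo, hi = int(img16.min()), int(img16.max())
    scaled = (img16.astype(np.float64) - lo) / max(hi - lo, 1)
    return np.round(scaled * 255).astype(np.uint8)
```

Going from 2 bytes to 1 byte per pixel is what yields the 50% size reduction reported above.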

Keywords: Breast cancer, Image processing, Image reduction, Mammograms, Image enhancement

3907 International Comparative Study of International Financial Reporting Standards Adoption and Earnings Quality: Effects of Differences in Accounting Standards, Industry Category, and Country Characteristics

Authors: Ichiro Mukai

Abstract:

The purpose of this study is to investigate whether firms applying International Financial Reporting Standards (IFRS) provide earnings information of higher quality and comparability, more useful for the decision making of information users, than firms applying local Generally Accepted Accounting Principles (GAAP). Focus is placed on the earnings quality of listed firms in several developed countries: Australia, Canada, France, Germany, Japan, the United Kingdom (UK), and the United States (US). Except for Japan and the US, the adoption of IFRS is mandatory for listed firms in these countries. In Japan, the application of IFRS is allowed for specific listed firms. In the US, foreign firms listed on the US securities market are permitted to apply IFRS, but listed domestic firms are prohibited from doing so. In this paper, the differences in earnings quality between firms applying local GAAP and those applying IFRS are compared for each country and industry category, and the reasons for the differences in earnings quality are analyzed using various factors. The results show that, although the earnings quality of firms applying IFRS is higher than that of firms applying local GAAP, this varies with country and industry category. Thus, even if a single set of global accounting standards were used for all listed firms worldwide, it would be difficult to establish comparability of financial information among global firms. These findings imply that the various circumstances surrounding firms, industries, and countries influence business operations and give rise to the differences in earnings quality.

Keywords: Accruals, earnings quality, IFRS, information comparability.

3906 Increased Capacity of Information Hiding in LSB's Method for Text and Image

Authors: H. B. Kekre, Archana Athawale, Pallavi N. Halarnkar

Abstract:

Steganography, derived from Greek, literally means "covered writing". It includes a vast array of secret communication methods that conceal the message's very existence, including invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread-spectrum communications. This paper proposes a new, improved version of the Least Significant Bit (LSB) method. The proposed approach is simple to implement compared with the Pixel Value Differencing (PVD) method, yet achieves high embedding capacity and imperceptibility. The proposed method can also be applied to 24-bit color images, achieving an embedding capacity much higher than PVD.
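
For illustration, a minimal sketch of plain one-bit LSB embedding, the baseline that the paper improves upon (this is not the proposed higher-capacity variant):

```python
# Minimal one-bit LSB embedding sketch: hide one message bit in the least
# significant bit of each pixel. Baseline method, not the paper's variant.
import numpy as np

def lsb_embed(cover: np.ndarray, bits: list[int]) -> np.ndarray:
    flat = cover.flatten().astype(np.uint8)
    assert len(bits) <= flat.size, "message longer than cover capacity"
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | (b & 1)   # clear LSB, set message bit
    return flat.reshape(cover.shape)

def lsb_extract(stego: np.ndarray, n: int) -> list[int]:
    return [int(p) & 1 for p in stego.flatten()[:n]]
```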

Keywords: Information Hiding, LSB Matching, PVD Steganography.

3905 Transient Energy and its Impact on Transmission Line Faults

Authors: Mamta Patel, R. N. Patel

Abstract:

Transmission and distribution lines are vital links between generating units and consumers. Because they are exposed to the atmosphere, the chance of a fault occurring on a transmission line is very high, and faults must be dealt with immediately to minimize the damage they cause. In this paper, the discrete wavelet transforms of voltage signals at the two ends of transmission lines are analyzed. The transient energy of the level-five detail information is calculated for different fault conditions. It is observed that the variation in transient energy between healthy and faulted lines provides important information that can be very useful in classifying and locating the fault.
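
A short sketch of the transient-energy computation as described: a five-level DWT of the voltage signal, then the energy of the level-five detail coefficients. The 'db4' mother wavelet is an assumption; the paper does not name one here:

```python
# Sketch: transient energy of the level-5 detail information of a voltage
# signal via multiresolution DWT. The 'db4' mother wavelet is an assumption.
import numpy as np
import pywt

def level5_transient_energy(voltage: np.ndarray) -> float:
    coeffs = pywt.wavedec(voltage, "db4", level=5)
    d5 = coeffs[1]                 # coeffs = [a5, d5, d4, d3, d2, d1]
    return float(np.sum(d5 ** 2))  # energy = sum of squared coefficients
```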

Keywords: Wavelet, Discrete wavelet transform, Multiresolution analysis, Transient energy

3904 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings, resulting in various kinds of ambiguity: lexical, syntactic, semantic, anaphoric, and referential. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part-of-Speech (POS) tokens of words: the lemma adds generality, and the POS adds the properties of the word to the token. We have designed a method to create an affinity matrix that gives the affinity between any pair of lemma_POS tokens (a token in which the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens, using the affinity matrix, under the hierarchy of the POS of the lemma. Furthermore, three different mechanisms are devised to predict the sense of a target word from the affinity/similarity values. Each contextual token contributes to the sense of the target word with some value, and the sense with the highest total value becomes the sense of the target word. Contextual tokens thus play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simpler and easier to explain than contemporary deep learning models, which are intricate, time-intensive, and difficult to interpret. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the simplicity of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) baseline.
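
A toy sketch of the affinity-matrix idea and the voting step (the co-occurrence counting used here as the affinity measure is an assumption, not the paper's exact definition):

```python
# Toy sketch of an affinity matrix between lemma_POS tokens: affinity is
# approximated here by co-occurrence counts within a sentence window.
from collections import defaultdict
from itertools import combinations

def build_affinity(sentences):
    """sentences: iterable of lists of 'lemma_POS' tokens."""
    affinity = defaultdict(float)
    for tokens in sentences:
        for a, b in combinations(set(tokens), 2):
            affinity[(a, b)] += 1.0
            affinity[(b, a)] += 1.0
    return affinity

# Each contextual token "votes" for a sense of the target with its affinity
# value; the sense with the highest total becomes the predicted sense.
def predict_sense(context, sense_clusters, affinity):
    scores = {
        sense: sum(affinity.get((tok, member), 0.0)
                   for tok in context for member in members)
        for sense, members in sense_clusters.items()
    }
    return max(scores, key=scores.get)
```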

Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.

3903 Person Identification by Using AR Model for EEG Signals

Authors: Gelareh Mohammadi, Parisa Shoushtari, Behnam Molaee Ardekani, Mohammad B. Shamsollahi

Abstract:

A direct connection between the electroencephalogram (EEG) and the genetic information of individuals has been investigated by neurophysiologists and psychiatrists since the 1960s, and it opens a new research area in science. This paper focuses on person identification based on features extracted from the EEG, which can show a direct connection between the EEG and the genetic information of subjects. In this work, the full EO EEG signals of healthy individuals are modeled by an autoregressive (AR) model, and the AR parameters are extracted as features. Two methods are proposed for constituting the feature vector: in the first, the extracted parameters of each channel are used as a feature vector in the classification step, which employs a competitive neural network; in the second, a combination of parameters from different channels is used as the feature vector. Correct classification scores in the range of 80% to 100% reveal the potential of our approach for person classification/identification and agree with previous research showing evidence that the EEG signal carries genetic information. The novelty of this work lies in the combination of the AR parameters and the network type (a competitive network). A comparison between the first and the second approach implies a preference for the second one.
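
A sketch of the feature-extraction step: fit an AR model per EEG channel and use the coefficients as features. The Yule-Walker estimation and the model order of 6 are assumptions; the paper does not fix them here:

```python
# Sketch: AR coefficients per EEG channel as identification features.
# Yule-Walker estimation and order 6 are assumptions.
import numpy as np
from scipy.linalg import toeplitz, solve

def ar_features(channel, order=6):
    """Estimate AR coefficients of one EEG channel via Yule-Walker."""
    x = channel - channel.mean()
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Solve the Toeplitz Yule-Walker system R a = r[1:]
    return solve(toeplitz(r[:-1]), r[1:])

def combined_features(channels, order=6):
    # Second feature-vector method: concatenate parameters of all channels.
    return np.concatenate([ar_features(c, order) for c in channels])
```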

Keywords: Person Identification, Autoregressive Model, EEG, Neural Network

3902 An Investigation of Customers’ Perception and Attitude towards Krung Thai Bank in Thailand

Authors: Phatthanan Chaiyabut

Abstract:

The purposes of this research were to identify customers' perception of Krung Thai Bank's image and to understand customer attitudes towards the bank's image in Bangkok, Thailand. This research utilized a quantitative approach and used a questionnaire as the data collection tool. A sample of 420 respondents was selected by simple random sampling. The findings revealed that the majority of respondents received information, news, and feeds concerning the bank most often through television. This information channel significantly influenced the customers and their decisions to use the bank's products and services.

Regarding attitudes towards the overall image of the bank, it was found that the majority of respondents rated the bank's image at the good level. The top three average attitudes concerned the bank's image in supporting the government's monetary policies, being renowned and stable, and contributing to economic reform and development, with mean averages of 4.01, 3.96, and 3.81, respectively. Attitudes toward the bank's image as a business leader in banking, marketing, and competition were also measured: offering prompt services and providing appropriate service times were rated moderate, at 3.36 and 3.30, respectively.

Keywords: Attitude, Image, Krung Thai bank, Perception.

3901 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail

Abstract:

In recent decades, medical imaging was dominated by the use of costly film media for the review and archival of medical investigations. However, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has emerged. Web technologies have been used successfully in telemedicine applications, and here the combination of web technologies with DICOM is used to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, platform-independent, displays and manipulates images efficiently, and is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which a 2-D discrete wavelet transform decomposes the image and the wavelet coefficients are thresholded and transmitted with entropy encoding to decrease transmission time, storage cost, and capacity. The compression performance was estimated using image quality metrics such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
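
A sketch of the compression stage described above: a 2-D DWT with the 'coif3' filter, hard thresholding of the detail coefficients, and the MSE/PSNR metrics. The threshold value, the decomposition level, and the zeroed-coefficient proxy for CR are assumptions:

```python
# Sketch of the wavelet compression step: decompose with 'coif3', zero small
# detail coefficients, reconstruct, and measure quality. Threshold, level,
# and the CR proxy (share of zeroed coefficients) are assumptions.
import numpy as np
import pywt

def compress_and_score(img, wavelet="coif3", level=3, thresh=10.0):
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    kept = [coeffs[0]] + [
        tuple(np.where(np.abs(d) > thresh, d, 0.0) for d in band)
        for band in coeffs[1:]
    ]
    rec = pywt.waverec2(kept, wavelet)[:img.shape[0], :img.shape[1]]

    mse = float(np.mean((img - rec) ** 2))
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    flat = np.concatenate([kept[0].ravel()] +
                          [d.ravel() for band in kept[1:] for d in band])
    cr = 100.0 * np.count_nonzero(flat == 0) / flat.size  # % zeroed coeffs
    return rec, mse, psnr, cr
```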

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.

3900 Analysis of Complex Quadrature Mirror Filter Banks

Authors: Chimin Tsai

Abstract:

This work consists of three parts. First, the alias-free condition for the conventional two-channel quadrature mirror filter bank is analyzed using complex arithmetic. Second, the approach developed in the first part is applied to the complex quadrature mirror filter bank. Accordingly, the structure is simplified and the theory is easier to follow. Finally, a new class of complex quadrature mirror filter banks is proposed. Interesting properties of this new structure are also discussed.
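
For reference, the alias-free condition analyzed in the first part can be stated in the standard textbook form (the classical result, not the paper's complex-arithmetic derivation):

```latex
% Two-channel filter bank: analysis H_0(z), H_1(z); synthesis F_0(z), F_1(z).
\hat{X}(z) = \tfrac{1}{2}\bigl[H_0(z)F_0(z) + H_1(z)F_1(z)\bigr] X(z)
           + \tfrac{1}{2}\bigl[H_0(-z)F_0(z) + H_1(-z)F_1(z)\bigr] X(-z)
% Aliasing (the X(-z) term) is cancelled by choosing
%   F_0(z) = H_1(-z), \qquad F_1(z) = -H_0(-z).
% With the QMF choice H_1(z) = H_0(-z), the distortion transfer function is
%   T(z) = \tfrac{1}{2}\bigl[H_0^2(z) - H_0^2(-z)\bigr].
```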

Keywords: Aliasing cancellation, complex signal processing, polyphase realization, quadrature mirror filter banks.

3899 Dimensional Modeling of HIV Data Using Open Source

Authors: Charles D. Otine, Samuel B. Kucel, Lena Trojer

Abstract:

The data modeling technique selected for an information system is determined by the objective of the resultant data model. Dimensional modeling is the preferred modeling technique for data destined for data warehouses and data mining, presenting data models that ease analysis and queries, in contrast with entity relationship modeling. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently driven the development of dimensional modeling. This development, however, has been reported mostly for commercial database management systems rather than open-source ones, making it less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open-source modeling tools. It aims to take advantage of the fact that the regions most affected by the HIV virus are also heavily resource-constrained (sub-Saharan Africa) while holding large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open-source dimensional modeling tools. The use of open source would reduce the software costs of dimensional modeling and, in turn, make data warehousing and data mining more feasible even for those in resource-constrained settings that have data available.

Keywords: Database, Data Mining, Data Warehouse, Dimensional Modeling, Open Source.

3898 Finding Equilibrium in Transport Networks by Simulation and Investigation of Behaviors

Authors: Gábor Szűcs, Gyula Sallai

Abstract:

The goal of this paper is to find the Wardrop equilibrium in transport networks under uncertainty, where the uncertainty comes from a lack of information. We use a simulation tool to find the equilibrium; this gives only an approximate solution, but it is sufficient for large networks as well. In order to take the uncertainty into account, we have developed an interval-based procedure for finding the paths with minimal cost using the Dempster-Shafer theory. Furthermore, we have investigated users' behaviors using a game theory approach, because their path choices influence the costs of the other users' paths.
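
A toy sketch of finding a Wardrop equilibrium by simulation on a two-path network (illustrative only; the linear cost functions are invented, and the paper's interval-based Dempster-Shafer procedure is not reproduced). At equilibrium, all used paths have equal, minimal cost, so flow is shifted toward the cheaper path until the costs balance:

```python
# Toy search for a Wardrop equilibrium on a two-path network: shift flow
# toward the cheaper path until both used paths have equal, minimal cost.
def wardrop_two_paths(demand=1000.0, steps=500, eta=1.0):
    # Hypothetical linear path costs: cost = free-flow time + slope * flow.
    cost1 = lambda f: 10.0 + 0.02 * f
    cost2 = lambda f: 15.0 + 0.01 * f
    f1 = 0.0                       # start with all demand on path 2
    for _ in range(steps):
        gap = cost1(f1) - cost2(demand - f1)
        f1 = min(max(f1 - eta * gap, 0.0), demand)
    return f1, demand - f1

f1, f2 = wardrop_two_paths()       # converges near f1 = f2 = 500,
                                   # where both path costs equal 20
```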

Keywords: Dempster-Shafer theory, S-O and U-O transportation network, uncertainty of information, Wardrop equilibrium.

3897 Forecasting Stock Price Manipulation in Capital Market

Authors: F. Rahnamay Roodposhti, M. Falah Shams, H. Kordlouie

Abstract:

The aim of this article is to extend and develop econometric and network-structure-based methods able to detect price manipulation on the Tehran Stock Exchange, and its principal goal is to offer a model for forecasting such manipulation. To do so, a sample of 397 companies listed on the Tehran Stock Exchange was selected, and information on their prices and trading volumes during the years 2001 to 2009 was collected; then, by performing a runs test, a skewness test, and a duration correlation test, the selected companies were divided into two sets of manipulated and non-manipulated companies. In the next stage, by investigating the cumulative return process and trading volumes of the manipulated companies, the date on which price manipulation started was identified. Then, using a logit model, an artificial neural network, and multiple discriminant analysis, together with information on company size, information clarity, the P/E ratio, and stock liquidity one year prior to the manipulation, models for forecasting price manipulation of the stocks of companies listed on the Tehran Stock Exchange were designed. Finally, the forecasting power of the models was studied using the test-set data. The forecasting power on the test set was 92.1% for the logit model, 94.1% for the artificial neural network, and 90.2% for the multiple discriminant analysis model; all three models therefore have high power to forecast price manipulation, and there is no considerable difference among their forecasting powers.
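
A sketch of the logit stage with the four firm-level predictors named above; the synthetic data and the scikit-learn pipeline are stand-ins for the authors' estimation on the Tehran Stock Exchange sample:

```python
# Sketch of the logit forecasting stage using the paper's four predictors
# (size, information clarity, P/E, liquidity) one year before manipulation.
# Data here are synthetic placeholders, not the Tehran Stock Exchange sample.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(397, 4))            # columns: size, clarity, P/E, liquidity
y = rng.integers(0, 2, size=397)         # 1 = manipulated, 0 = not

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
logit = LogisticRegression().fit(X_tr, y_tr)
print("test-set forecasting power:", logit.score(X_te, y_te))
```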

Keywords: Price Manipulation, Liquidity, Size of Company, Floating Stock, Information Clarity

3896 Exploring More Productive Ways of Working

Authors: Jenna Ruostela, Antti Lönnqvist

Abstract:

"New ways of working" refers to non-traditional work practices, settings, and locations supported by information and communication technologies (ICT) that supplement or replace traditional ways of working. It questions the contemporary work practices and settings still very much in use in knowledge-intensive organizations today. In this study, new ways of working are seen to consist of two elements: the work environment (physical, virtual, and social) and work practices. This study aims to gather the scattered information together and deepen the understanding of new ways of working. Moreover, the objective is to provide some evidence on the unclear productivity impacts of new ways of working using a case study approach.

Keywords: Knowledge work, new ways of working, productivity, work environment.

3895 Revisiting Distributed Protocols for Mobility at the Application Layer

Authors: N. Nouali, H. Drias, A. Doucet

Abstract:

For more than a decade, many proposals and standards have been designed to deal with mobility issues; however, there are still serious limitations in basing solutions on them. In this paper, we discuss the possibility of handling mobility at the application layer. We do this while revisiting the conventional implementation of the Two-Phase Commit (2PC) protocol, a fundamental asset of transactional technology for ensuring the consistent commitment of distributed transactions. The solution is based on an execution framework providing an efficient extension that is aware of mobility and preserves the 2PC principle.
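
For readers unfamiliar with 2PC, a minimal in-process sketch of the classic protocol the paper extends (a voting phase, then a global commit/abort decision); real implementations add durable logging and timeout handling:

```python
# Minimal sketch of the classic Two-Phase Commit protocol: the coordinator
# collects votes (phase 1) and broadcasts the global decision (phase 2).
class Participant:
    def prepare(self) -> bool:
        return True          # vote YES if local work can be made durable

    def commit(self):
        print("commit")

    def abort(self):
        print("abort")

def two_phase_commit(participants):
    # Phase 1: proceed only if every participant votes YES.
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()       # Phase 2: global commit
        return True
    for p in participants:
        p.abort()            # Phase 2: global abort
    return False

two_phase_commit([Participant(), Participant()])
```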

Keywords: Application layer, distributed mobile protocols, mobility management, mobile transaction processing.

3894 Exploration of Least Significant Bit Based Watermarking and Its Robustness against Salt and Pepper Noise

Authors: Kamaldeep Joshi, Rajkumar Yadav, Sachin Allwadhi

Abstract:

Image steganography is a major aspect of information hiding: the information is hidden within an image, and the image travels openly on the Internet. The Least Significant Bit (LSB) method is one of the most popular methods of image steganography; the information bit is hidden at the LSB of an image pixel. In the one-bit LSB steganography method, the total number of pixels and the total number of message bits are equal. In this paper, the LSB method of image steganography is used for watermarking, which is an application of steganography. The watermark contains 80*88 pixels, and each pixel requires 8 bits for its binary equivalent, so the total number of bits required to hide the watermark is 80*88*8 = 56,320. The experiment was performed on standard 256*256 and 512*512 images. After watermark insertion, histogram analysis was performed. A salt-and-pepper noise factor of 0.02 was added to the stego image in order to evaluate the robustness of the method, and the watermark was successfully retrieved after the insertion of noise. A further experiment assessed the imperceptibility of the stego image and the retrieved watermark. It is clear that the LSB watermarking scheme is robust to salt-and-pepper noise.
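
A sketch of the robustness experiment described: salt-and-pepper noise with density 0.02 is added to the stego image, and imperceptibility is checked via PSNR (the noise density comes from the abstract; the rest is generic):

```python
# Sketch of the robustness test: salt-and-pepper noise (density 0.02) is
# added to the stego image and imperceptibility is measured via PSNR.
import numpy as np

def salt_and_pepper(img: np.ndarray, density: float = 0.02, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < density / 2] = 0              # pepper
    noisy[mask > 1 - density / 2] = 255        # salt
    return noisy

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```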

Keywords: LSB, watermarking, salt and pepper, PSNR.

3893 Algorithm of Measurement of Noise Signal Power in the Presence of Narrowband Interference

Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev

Abstract:

A power measurement algorithm for the components of an input mix of a noise signal and narrowband interference is considered, using functional transformations of the input mix in the post-detection processing channel. The efficiency of the algorithm has been analyzed for different interference-to-signal ratios, and its performance features have been explored through numerical experiments.

Keywords: Noise signal, continuous narrowband interference, signal power, spectrum width, detection.

3892 Comparison of Pore Space Features by Thin Sections and X-Ray Microtomography

Authors: H. Alves, J. T. Assis, M. Geraldes, I. Lima, R. T. Lopes

Abstract:

Microtomographic images and thin section (TS) images were analyzed and compared with respect to parameters of geological interest, such as porosity and its distribution along the samples. The results show that microtomography (CT) analysis, although limited by its resolution, provides interesting information about the distribution of porosity (homogeneous or not) and can also quantify connected and non-connected pores, i.e., total porosity. TS analysis has no such resolution limitation but is limited by the experimental data available (a few glass sheets per analysis) and can only give information about connected pores, i.e., effective porosity. The two methods have their own virtues and flaws, but when paired together they complement one another, making for a more reliable and complete analysis.

Keywords: Microtomography, petrographical microscopy, sediments, thin sections.

3891 Low-complexity Integer Frequency Offset Synchronization for OFDMA System

Authors: Young-Jae Kim, Young-Hwan You

Abstract:

This paper presents an integer frequency offset (IFO) estimation scheme for the 3GPP Long Term Evolution (LTE) downlink system. First, the conventional joint detection method for IFO and sector cell identity (CID) information is introduced. Second, an IFO estimation scheme that does not require explicit sector CID information is proposed, which reduces the time delay in comparison with the conventional joint method. The proposed method is also computationally efficient and shows almost identical performance to the conventional method over the Pedestrian and Vehicular channel models.

Keywords: LTE, OFDMA, primary synchronization signal (PSS), IFO, CID

3890 Study of Syntactic Errors for Deep Parsing at Machine Translation

Authors: Yukiko Sasaki Alam, Shahid Alam

Abstract:

Syntactic parsing is vital for semantic treatment in many applications related to natural language processing (NLP), because form and content coincide in many cases. However, parsing has not yet reached levels of reliable performance. By manually examining and analyzing individual machine translation output errors that involve syntax as well as semantics, this study attempts to discover what is required to improve syntactic and semantic parsing.

Keywords: Machine translation, error analysis, syntactic errors, knowledge required for parsing.

3889 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive manufacturing processes are becoming increasingly established in industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used additive manufacturing technology for metal parts, has been gaining industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead, they rely on the experience of the machine operator, e.g., when levelling the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effects of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to, and correct execution of, safety-relevant operations. The corresponding lean method, 5S, is therefore applied in order to develop approaches, in the form of recommended actions, that standardize the work processes. These approaches are then evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts, as well as standardizing the workflow, are likely to increase reproducibility. Organizing the operational steps and the production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: Additive manufacturing, lean production, reproducibility, work safety.

3888 A New Method of Adaptation in Integrated Learning Environment

Authors: Ildar Galeev, Renat Mustaphin, C. Ardil

Abstract:

A new method of adaptation in a partially integrated learning environment that includes an electronic textbook (ET) and an integrated tutoring system (ITS) is described. The adaptation algorithm is described in detail. It includes: establishing the interconnections between operations and concepts; estimating the mastering level of each concept; estimating, at the current learning step, the student's non-mastering level for the information on each page of the ET; and creating a rank-ordered list of links to the e-manual pages containing information that requires repeated work.
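
A toy sketch of the final step of the algorithm, rank-ordering ET pages by the student's estimated non-mastery of the concepts they contain; the averaging rule used for the score is an assumption:

```python
# Toy sketch: rank electronic-textbook pages by how much non-mastered
# material they contain. The averaging rule is an assumption.
def rank_pages(page_concepts, mastery):
    """page_concepts: {page_id: [concept, ...]};
    mastery: {concept: level in [0, 1]}."""
    def non_mastery(page):
        concepts = page_concepts[page]
        return sum(1.0 - mastery.get(c, 0.0) for c in concepts) / len(concepts)
    return sorted(page_concepts, key=non_mastery, reverse=True)

pages = rank_pages({"p1": ["loops"], "p2": ["loops", "recursion"]},
                   {"loops": 0.9, "recursion": 0.2})
# -> ["p2", "p1"]: p2 needs repeated work first
```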

Keywords: Adaptation, Integrated Learning Environment, Integrated Tutoring System, Electronic Textbook.

3887 Dynamics in Production Processes

Authors: Marco Kennemann, Steffen C. Eickemeyer, Peter Nyhuis

Abstract:

An increasingly dynamic and complex environment poses huge challenges to production enterprises, especially with regard to logistics. The Logistic Operating Curve Theory, developed at the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover, is a recognized approach to describing logistic interactions; nevertheless, it reaches its limits when it comes to dynamic aspects. To facilitate timely and optimal logistic positioning, a method is developed for quickly and reliably identifying dynamic processing states.

Keywords: Dynamics, Logistic Operating Curves, Production Logistics, Production Planning and Control

3886 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis

Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan

Abstract:

Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of big data technology into the tourism industry will allow companies to learn where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centres or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. For experimental validation, we constructed a web review database using a crawler and web scraping techniques to evaluate the effectiveness of our methodology. The text of the sentences was first classified with the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we investigated feature extraction methods such as count vectorization and Term Frequency-Inverse Document Frequency (TF-IDF) vectorization, and implemented a Convolutional Neural Network (CNN) classifier for the sentiment analysis, deciding whether a tourist's attitude towards a destination is positive, negative, or simply neutral based on the review text posted online. The results demonstrated that, with the CNN algorithm, after pre-processing and cleaning the dataset, we achieved an accuracy of 96.12% for positive and negative sentiment analysis.
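
A sketch of the TF-IDF feature-extraction step with scikit-learn (the paper feeds such features to a CNN classifier; this sketch stops at the vectorization, and the review texts are placeholders):

```python
# Sketch of the TF-IDF vectorization step used before classification.
from sklearn.feature_extraction.text import TfidfVectorizer

reviews = ["the beach was beautiful and clean",
           "terrible service and noisy rooms"]   # placeholder review texts

vec = TfidfVectorizer(lowercase=True, stop_words="english")
X = vec.fit_transform(reviews)                   # sparse document-term matrix
print(vec.get_feature_names_out())
print(X.toarray().round(2))
```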

Keywords: Count vectorization, Convolutional Neural Network, Crawler, data technology, Long Short-Term Memory, LSTM, Web Scraping, sentiment analysis.

3885 Block-Based 2D to 3D Image Conversion Method

Authors: S. Sowmyayani, V. Murugan

Abstract:

With the advent of three-dimensional (3D) technology, there has been a lot of research on converting 2D images to 3D images. The main difference between 2D and 3D is the visual illusion of depth in 3D images, and in the recent era many depth estimation techniques have appeared. The objective of this paper is to convert 2D images to 3D images with less computation time. To this end, the input image is divided into blocks, from which the depth information is obtained. With the depth information, a depth map is generated; the 3D image is then warped using the original image and the depth map. The proposed method is tested on the Make3D and NYU-V2 datasets, and the experimental results are compared with other recent methods. The proposed method proved to work with less computation time and good accuracy.
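
A toy sketch of the block-based pipeline: one depth value per block, then a simple horizontal warp of pixels in proportion to depth to synthesize the second view. The brightness-as-depth cue and the disparity scale are assumptions made for illustration; the abstract does not specify the paper's depth cue:

```python
# Toy sketch of block-based 2D-to-3D conversion: one depth per block, then a
# simple horizontal DIBR-style warp. The brightness-as-depth cue is an
# assumption for illustration only.
import numpy as np

def block_depth_map(img: np.ndarray, block: int = 16) -> np.ndarray:
    h, w = img.shape
    depth = np.zeros((h, w), dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = img[y:y+block, x:x+block]
            depth[y:y+block, x:x+block] = tile.mean() / 255.0
    return depth

def warp_view(img: np.ndarray, depth: np.ndarray, max_disp: int = 8):
    h, w = img.shape
    view = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            nx = min(w - 1, x + int(depth[y, x] * max_disp))
            view[y, nx] = img[y, x]   # forward-map pixel by its disparity
    # Occlusion holes are left unfilled here; a real pipeline would inpaint
    # them, e.g., with the bilateral filtering named in the keywords.
    return view
```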

Keywords: Depth map, 3D image warping, image rendering, bilateral filter, minimum spanning tree.

3884 Prevention and Coping Strategies for Cyber Bullying and Cyber Victimization

Authors: Erdinc Ozturk, Gizem Akcan

Abstract:

Although information and communication technologies have several advantages, they cause some problems, such as cyber bullying and cyber victimization, which have many negative effects on people. There are many different strategies to prevent cyber bullying and victimization, and this study was conducted to provide information about them. 120 university students (60 women, 60 men) aged between 18 and 35 participated in the study. According to the findings, men are more prone to cyber bullying than women; moreover, men are also more prone to cyber victimization than women.

Keywords: Cyber bullying, cyber victimization, coping strategies.

3883 Information Transmission between Large and Small Stocks in the Korean Stock Market

Authors: Sang Hoon Kang, Seong-Min Yoon

Abstract:

Little attention has been paid to information transmission between the portfolios of large stocks and small stocks in the Korean stock market. This study investigates the return and volatility transmission mechanisms between large and small stocks on the Korea Exchange (KRX). It also explores whether bad news in the large-stock market leads to volatility in the small-stock market that is larger than the volatility following good news in the large-stock market. By employing the Granger causality test, we found unidirectional return transmission from large stocks to medium and small stocks. This evidence indicates that past information about large stocks has a better ability to predict the returns of medium and small stocks in the Korean stock market. Moreover, using the asymmetric GARCH-BEKK model, we observed a unidirectional relationship of asymmetric volatility transmission from large stocks to medium and small stocks. This finding suggests that volatility in the medium and small stocks following a negative shock in the large stocks is larger than that following a positive shock in the large stocks.
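
A sketch of the return-spillover test using the Granger causality routine from statsmodels (synthetic return series stand in for the KRX size portfolios):

```python
# Sketch: test whether large-stock returns Granger-cause small-stock returns.
# Synthetic data stand in for the KRX size-portfolio return series.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
large = rng.normal(size=500)
small = 0.4 * np.roll(large, 1) + rng.normal(size=500)  # lagged dependence

# Column order: [dependent series, candidate causal series]
data = np.column_stack([small, large])
grangercausalitytests(data, maxlag=2)
```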

Keywords: Asymmetric GARCH-BEKK model, Asymmetric volatility transmission, Causality, Korean stock market, Spillover effect
