Search results for: quantum information
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4061

3281 Evaluating Service Quality of Online Auction by Fuzzy MCDM

Authors: Wei-Hsuan Lee, Chien-Hua Wang, Chin-Tzong Pang

Abstract:

This paper applies fuzzy set theory to evaluate the service quality of online auctions. Service quality is composed of various criteria, many of which are intangible attributes that are difficult to measure. This characteristic makes it hard for respondents to answer a survey precisely, so fuzzy set theory is introduced into the measurement of performance. Using AHP to obtain the criteria and TOPSIS to rank them, we found that the dimension of service quality of greatest concern is the Transaction Safety Mechanism and the least is the Charge Item. The attributes of greatest concern are information security, accuracy, and information.
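
As a hedged illustration of the ranking step described in this abstract (not the authors' implementation), the following Python sketch applies TOPSIS to a small decision matrix with AHP-style criterion weights; the criterion names and scores are hypothetical.

import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix: alternatives x criteria scores, weights: AHP-derived criterion weights,
    benefit: True for benefit criteria, False for cost criteria."""
    m = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights
    v = (m / np.sqrt((m ** 2).sum(axis=0))) * np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)   # closeness coefficient, higher is better

# Hypothetical scores for three auction platforms on three criteria
scores = [[8, 7, 3], [6, 9, 4], [9, 6, 5]]
weights = [0.5, 0.3, 0.2]            # e.g. transaction safety, accuracy, charge item
closeness = topsis(scores, weights, benefit=[True, True, False])
print(closeness.argsort()[::-1])     # platform indices, best first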

Keywords: AHP, Fuzzy set theory, TOPSIS, Online auction, Service quality

PDF Downloads: 1784
3280 Virtualizing Attendance and Reducing Impacts on the Environment with a Mobile Application

Authors: Paulo R. M. de Andrade, Adriano B. Albuquerque, Otávio F. Frota, Robson V. Silveira, Fátima A. da Silva

Abstract:

Information technology has been gaining ground in industry, commerce, and personal use, but its misuse harms the environment and human health. Contributing to the sustainability of the planet means compensating the environment for all or part of what is withdrawn from it. Green computing proposes practices for using IT in an environmentally responsible way, in support of strategic management and communication. This work shows how a mobile application can help organizations reduce costs and the environmental impacts caused by their processes, through a case study of a public company in Brazil.

Keywords: E-government, green computing, information technology, mobile computing, sustainable development.

PDF Downloads: 1641
3279 Corporate Governance Practices and Analysts' Forecast Accuracy: Evidence for Romania

Authors: M. Ionascu, L. Olimid

Abstract:

In the last few years, several steps were taken in order to improve the quality of corporate governance for Romanian listed companies. Higher standards of corporate governance are documented in the literature to lead to a better information environment and, consequently, to increased analysts' forecast accuracy. Accordingly, the purpose of this paper is to investigate the extent to which corporate governance policies affect analysts' forecasts for companies listed on the Bucharest Stock Exchange. The results show that there is indeed a negative correlation between a corporate governance index (used as a proxy for the quality of corporate governance practices) and analysts' forecast errors.

Keywords: corporate governance, analysts' forecasts, information environment

PDF Downloads: 1455
3278 International Comparative Study of International Financial Reporting Standards Adoption and Earnings Quality: Effects of Differences in Accounting Standards, Industry Category, and Country Characteristics

Authors: Ichiro Mukai

Abstract:

The purpose of this study is to investigate whether firms applying International Financial Reporting Standards (IFRS) provide high-quality and comparable earnings information that is useful for the decision making of information users, relative to firms applying local Generally Accepted Accounting Principles (GAAP). Focus is placed on the earnings quality of listed firms in several developed countries: Australia, Canada, France, Germany, Japan, the United Kingdom (UK), and the United States (US). Except for Japan and the US, the adoption of IFRS is mandatory for listed firms in these countries. In Japan, the application of IFRS is allowed for specific listed firms. In the US, foreign firms listed on the US securities market are permitted to apply IFRS, but listed domestic firms are prohibited from doing so. In this paper, the differences in earnings quality are compared between firms applying local GAAP and those applying IFRS in each country and industry category, and the reasons for the differences in earnings quality are analyzed using various factors. The results show that, although the earnings quality of firms applying IFRS is higher than that of firms applying local GAAP, this varies with country and industry category. Thus, even if a single set of global accounting standards is used for all listed firms worldwide, it is difficult to establish comparability of financial information among global firms. These findings imply that various circumstances surrounding firms, industries, and countries influence business operations and affect the differences in earnings quality.

Keywords: Accruals, earnings quality, IFRS, information comparability.

PDF Downloads: 759
3277 Regional Medical Imaging System

Authors: Michal Javornik, Otto Dostal, Karel Slavicek

Abstract:

The purpose of this article is to introduce an advanced system for supporting the processing of medical image information, and the terminology related to this system, which can be an important element in a faster transition to a fully digitalized hospital. The core of the system is a set of DICOM-compliant applications running over a dedicated computer network. The whole integrated system creates a collaborative platform supporting daily routines in the radiology community, developing communication channels, supporting the exchange of information and special consultations among various medical institutions, and supporting medical training for practicing radiologists and medical students. It gives users outside hospitals the tools to work under almost the same conditions as in radiology departments.

Keywords: DICOM, Integration, Medical Education, Medical Imaging

PDF Downloads: 1966
3276 Increased Capacity of Information Hiding in LSB's Method for Text and Image

Authors: H. B. Kekre, Archana Athawale, Pallavi N. Halarnkar

Abstract:

Steganography, derived from Greek, literally means "covered writing". It includes a vast array of secret communication methods that conceal the message's very existence. These methods include invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread spectrum communications. This paper proposes a new, improved version of the Least Significant Bit (LSB) method. The proposed approach is simple to implement compared to the Pixel Value Differencing (PVD) method, yet achieves high embedding capacity and imperceptibility. The proposed method can also be applied to 24-bit color images and achieves an embedding capacity much higher than that of PVD.
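
For illustration only, a minimal one-bit LSB embedding routine in Python/NumPy is sketched below; it is the baseline LSB idea this paper builds on, not the improved method proposed by the authors, and the cover image and message are random placeholders.

import numpy as np

def lsb_embed(cover, bits):
    """Hide a bit sequence in the least significant bit of each pixel."""
    flat = cover.flatten().astype(np.uint8)        # flatten() returns a copy
    if len(bits) > flat.size:
        raise ValueError("message longer than cover capacity")
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return flat.reshape(cover.shape)

def lsb_extract(stego, n_bits):
    """Recover the first n_bits hidden bits."""
    return stego.flatten()[:n_bits] & 1

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # toy grayscale cover
message = np.random.randint(0, 2, 100)                        # 100 secret bits
stego = lsb_embed(cover, message)
assert (lsb_extract(stego, 100) == message).all()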

Keywords: Information Hiding, LSB Matching, PVD Steganography.

PDF Downloads: 3157
3275 Design and Testing of Nanotechnology Based Sequential Circuits Using MX-CQCA Logic in VHDL

Authors: K. Maria Agnes, J. Joshua Bapu

Abstract:

This paper presents the design and testing of nanotechnology-based sequential circuits using multiplexer conservative QCA (MX-CQCA) logic gates, which are easily testable using only two vectors. This method has great potential in the design of sequential circuits based on reversible conservative logic gates and also outperforms sequential circuits implemented with traditional gates in terms of testability. Reversible circuits are similar to usual logic circuits except that they are built from reversible gates. Designs of two-vector-testable double edge triggered (DET) sequential circuits based on multiplexer conservative QCA logic, written in the VHDL language, are also presented here; this also reduces complexity on the testing side. Other types of sequential circuits, such as D, SR, and JK latches, are also designed using the MX-CQCA logic gate. The objective behind the proposed design methodology is to amalgamate arithmetic and logic functional units while optimizing key metrics such as garbage outputs, delay, area, and power. The proposed MX-CQCA gate outperforms other reversible gates in terms of complexity and delay.

Keywords: Conservative logic, Double edge triggered (DET) flip flop, majority voters, MX-CQCA gate, reversible logic, Quantum dot Cellular automata.

PDF Downloads: 2285
3274 Transient Energy and its Impact on Transmission Line Faults

Authors: Mamta Patel, R. N. Patel

Abstract:

Transmission and distribution lines are vital links between the generating units and consumers. They are exposed to the atmosphere, hence the chance of a fault occurring on a transmission line is very high, and faults have to be dealt with immediately in order to minimize the damage they cause. In this paper, the discrete wavelet transform of the voltage signals at the two ends of transmission lines is analyzed. The transient energy of the level-five detail coefficients is calculated for different fault conditions. It is observed that the variation of the transient energy of healthy and faulted lines can give important information that is very useful for classifying and locating the fault.
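
As a hedged sketch of the level-five detail energy described above (assuming the PyWavelets package and a hypothetical mother wavelet choice, not necessarily the one used by the authors):

import numpy as np
import pywt

def level5_transient_energy(voltage, wavelet="db4"):
    """Decompose a voltage signal to 5 levels and return the energy of
    the level-5 detail coefficients (sum of squared coefficients)."""
    coeffs = pywt.wavedec(voltage, wavelet, level=5)   # [cA5, cD5, cD4, cD3, cD2, cD1]
    cD5 = coeffs[1]
    return float(np.sum(cD5 ** 2))

# Toy signal: 50 Hz waveform with a small superimposed disturbance
t = np.linspace(0, 0.2, 2000)
signal = np.sin(2 * np.pi * 50 * t)
signal[1000:1010] += 0.5                               # crude stand-in for a fault transient
print(level5_transient_energy(signal))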

Keywords: Wavelet, Discrete wavelet transform, Multiresolution analysis, Transient energy

PDF Downloads: 2426
3273 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

Recently, the application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has resulted in a significant focus on the security risks facing healthcare data and on the corresponding protection measures, leading to escalated analyses and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, these projects propose AI-powered safeguards and policies/laws to protect the privacy of healthcare data. This project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptography techniques, differential privacy methods, and hybrid methods are discussed, together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
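
As one concrete, hedged example of the differential privacy methods mentioned above (not a prescription from the project), the Laplace mechanism adds calibrated noise to an aggregate query; the epsilon value and the query itself are hypothetical.

import numpy as np

def laplace_count(true_count, sensitivity=1.0, epsilon=0.5):
    """Release a count with epsilon-differential privacy via the Laplace mechanism.
    Noise scale = sensitivity / epsilon; a single patient changes a count by at most 1."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical query: number of patients with a given diagnosis code
print(laplace_count(true_count=128, epsilon=0.5))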

Keywords: Data privacy, artificial intelligence, healthcare AI, data sharing, healthcare organizations.

PDF Downloads: 72
3272 Person Identification by Using AR Model for EEG Signals

Authors: Gelareh Mohammadi, Parisa Shoushtari, Behnam Molaee Ardekani, Mohammad B. Shamsollahi

Abstract:

A direct connection between the ElectroEncephaloGram (EEG) and the genetic information of individuals has been investigated by neurophysiologists and psychiatrists since the 1960s, and it opens a new research area in science. This paper focuses on person identification based on features extracted from the EEG, which can show a direct connection between the EEG and the genetic information of subjects. In this work, the full EO EEG signals of healthy individuals are estimated by an autoregressive (AR) model, and the AR parameters are extracted as features. For constituting the feature vector, two methods are proposed: in the first method, the extracted parameters of each channel are used as a feature vector in the classification step, which employs a competitive neural network; in the second method, a combination of different channel parameters is used as a feature vector. Correct classification scores in the range of 80% to 100% reveal the potential of our approach for person classification/identification and agree with previous research showing evidence that the EEG signal carries genetic information. The novelty of this work is in the combination of AR parameters and the network type (competitive network) that we have used. A comparison between the first and the second approach implies a preference for the second one.
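
To make the feature-extraction step concrete, here is a hedged NumPy/SciPy sketch that estimates AR coefficients for one EEG channel with the Yule-Walker equations; the model order, sampling rate, and data are placeholders, not the settings used in the paper.

import numpy as np
from scipy.linalg import toeplitz, solve

def ar_features(x, order=6):
    """Estimate AR(order) coefficients of one channel via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = toeplitz(r[:order])          # Toeplitz autocorrelation matrix
    return solve(R, r[1:order + 1])  # AR coefficients used as features

# Hypothetical 2-second EEG segment sampled at 256 Hz
channel = np.random.randn(512)
feature_vector = ar_features(channel, order=6)   # one channel's contribution
print(feature_vector)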

Keywords: Person Identification, Autoregressive Model, EEG, Neural Network

PDF Downloads: 1734
3271 An Investigation of Customers’ Perception and Attitude towards Krung Thai Bank in Thailand

Authors: Phatthanan Chaiyabut

Abstract:

The purposes of this research were to identify the perception of customers towards Krung Thai Bank's image and to understand customer attitudes towards Krung Thai Bank's image in Bangkok, Thailand. This research utilized a quantitative approach and used a questionnaire as the data collection tool. A sample size of 420 respondents was selected by simple random sampling. The findings revealed that the majority of respondents received information and news concerning the bank mostly through television. This information channel significantly influenced the customers and their decisions to use the bank's products and services.

Regarding attitudes towards the overall image of the bank, it was found that the majority of respondents rated the bank's image at the good level. The top three rated aspects of the bank's image were supporting the government's monetary policies, being renowned and stable, and contributing to economic reforms and development, with mean scores of 4.01, 3.96, and 3.81 respectively. The bank's image as a business leader in banking, marketing, and competition, and its offering of prompt services and appropriate service times, were rated moderate, with scores of 3.36 and 3.30 respectively.

Keywords: Attitude, Image, Krung Thai bank, Perception.

PDF Downloads: 1623
3270 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Most of the data in the field are well structured and available in numerical or categorical formats, which can be used for experiments directly. At the opposite end of the spectrum, however, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it can be found in the form of discharge summaries, clinical notes, and procedural notes, which are in human-written narrative format and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm that is indexed on curated knowledge sources and is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
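
A hedged, greatly simplified sketch of the kind of dictionary-indexed string matching described above (not the actual Q-Map code) is shown below; the vocabulary, concept identifiers, and note text are invented for illustration.

import re

# Hypothetical curated vocabulary: surface form -> concept identifier
VOCAB = {
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
}

def extract_concepts(note, vocab=VOCAB):
    """Scan a free-text clinical note for vocabulary terms (longest terms first)."""
    hits = []
    text = note.lower()
    for term in sorted(vocab, key=len, reverse=True):
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            hits.append((term, vocab[term], match.start()))
    return hits

note = "Pt with long-standing hypertension, r/o myocardial infarction."
print(extract_concepts(note))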

Keywords: Information retrieval (IR), unified medical language system (UMLS), Syntax Based Analysis, natural language processing (NLP), medical informatics.

PDF Downloads: 769
3269 Dimensional Modeling of HIV Data Using Open Source

Authors: Charles D. Otine, Samuel B. Kucel, Lena Trojer

Abstract:

The choice of data modeling technique for an information system is determined by the objective of the resultant data model. Dimensional modeling is the preferred modeling technique for data destined for data warehouses and data mining, presenting data models that ease analysis and querying, in contrast with entity relationship modeling. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently led to the development of dimensional modeling. This has been significantly more developed and reported for commercial database management systems than for open source ones, thereby making it less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open source modeling tools. It aims to take advantage of the fact that the regions most affected by HIV (sub-Saharan Africa) are also heavily resource-constrained while holding large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open source dimensional modeling tools. The use of open source tools would reduce the software costs of dimensional modeling and in turn make data warehousing and data mining more feasible, even for those in resource-constrained settings that have data available.
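
The sketch below illustrates what a dimensional (star-schema) model of HIV patient data could look like using the open source SQLite engine; the table and column names are hypothetical and are not the actual schemas derived from the two source systems studied in the paper.

import sqlite3

conn = sqlite3.connect(":memory:")      # any open source RDBMS would serve the same purpose
cur = conn.cursor()
# Dimension tables describe the context of each measurement; the fact table holds the measures
cur.executescript("""
CREATE TABLE dim_patient (patient_key INTEGER PRIMARY KEY, sex TEXT, birth_year INTEGER);
CREATE TABLE dim_regimen (regimen_key INTEGER PRIMARY KEY, regimen_name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, visit_date TEXT);
CREATE TABLE fact_visit (
    patient_key INTEGER REFERENCES dim_patient(patient_key),
    regimen_key INTEGER REFERENCES dim_regimen(regimen_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    cd4_count   INTEGER,
    viral_load  INTEGER
);
""")
conn.commit()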

Keywords: Database, Data Mining, Data Warehouse, Dimensional Modeling, Open Source.

PDF Downloads: 1947
3268 Finding Equilibrium in Transport Networks by Simulation and Investigation of Behaviors

Authors: Gábor Szűcs, Gyula Sallai

Abstract:

The goal of this paper is to find the Wardrop equilibrium in transport networks in the case of uncertainty, where the uncertainty comes from a lack of information. We use a simulation tool to find the equilibrium; it gives only an approximate solution, but this is sufficient even for large networks. In order to take the uncertainty into account, we have developed an interval-based procedure for finding the paths with minimal cost using the Dempster-Shafer theory. Furthermore, we have investigated the users' behaviors using a game theory approach, because their path choices influence the costs of the other users' paths.
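
As a hedged toy illustration of simulating toward a Wardrop equilibrium (not the interval-based Dempster-Shafer procedure of the paper), the method of successive averages on a two-route network is sketched below with invented linear cost functions.

def msa_two_routes(demand=1000.0, iters=200):
    """Method of successive averages on two parallel routes.
    Hypothetical cost functions: c1 = 10 + 0.01*f1, c2 = 15 + 0.005*f2."""
    f1 = demand                                  # start with everyone on route 1
    for n in range(1, iters + 1):
        c1 = 10 + 0.01 * f1
        c2 = 15 + 0.005 * (demand - f1)
        target = demand if c1 < c2 else 0.0      # all-or-nothing assignment to the cheaper route
        f1 += (target - f1) / n                  # averaging step
    return f1, demand - f1

f1, f2 = msa_two_routes()
print(f1, f2)                        # at equilibrium both used routes have (nearly) equal cost
print(10 + 0.01 * f1, 15 + 0.005 * f2)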

Keywords: Dempster-Shafer theory, S-O and U-O transportation network, uncertainty of information, Wardrop equilibrium.

PDF Downloads: 1524
3267 Forecasting Stock Price Manipulation in Capital Market

Authors: F. Rahnamay Roodposhti, M. Falah Shams, H. Kordlouie

Abstract:

The aim of this article is to extend and develop econometric and network-structure-based methods that are able to detect price manipulation on the Tehran Stock Exchange. The principal goal of the present study is to offer a model for detecting price manipulation on the Tehran Stock Exchange. To do so, a sample of 397 companies listed on the Tehran Stock Exchange was selected, and information on their prices and trading volumes during the years 2001 to 2009 was collected; then, by performing runs tests, skewness tests, and duration correlation tests, the selected companies were divided into two sets of manipulated and non-manipulated companies. In the next stage, by investigating the cumulative return process and trading volume of the manipulated companies, the starting date of price manipulation was identified. Using a logit model, an artificial neural network, and multiple discriminant analysis, together with information on company size, information clarity, P/E ratio, and stock liquidity one year prior to the manipulation, models for forecasting price manipulation of stocks listed on the Tehran Stock Exchange were designed. Finally, the forecasting power of the models was studied using the test set data. The forecasting accuracy on the test set was 92.1% for the logit model, 94.1% for the artificial neural network, and 90.2% for the multiple discriminant analysis model; therefore, all three models have high power to forecast price manipulation, and there is no considerable difference among their forecasting powers.
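
To illustrate the logit component in isolation (on assumed, synthetic data rather than the Tehran Stock Exchange sample), a minimal scikit-learn sketch follows; the feature names mirror the four predictors mentioned above and the labeling rule is purely hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Synthetic stand-ins for: company size, information clarity, P/E ratio, stock liquidity
X = rng.normal(size=(n, 4))
# Hypothetical rule: small, opaque, illiquid stocks are more often manipulated
y = ((-X[:, 0] - X[:, 1] - X[:, 3] + rng.normal(scale=0.5, size=n)) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))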

Keywords: Price Manipulation, Liquidity, Size of Company, Floating Stock, Information Clarity

PDF Downloads: 2843
3266 Exploring More Productive Ways of Working

Authors: Jenna Ruostela, Antti Lönnqvist

Abstract:

"New ways of working" refers to non-traditional work practices, settings, and locations supported by information and communication technologies (ICT) to supplement or replace traditional ways of working. It questions the contemporary work practices and settings still very much used in knowledge-intensive organizations today. In this study, new ways of working are seen to consist of two elements: the work environment (including its physical, virtual, and social aspects) and work practices. This study aims to gather the scattered information together and deepen the understanding of new ways of working. Moreover, the objective is to provide some evidence of the unclear productivity impacts of new ways of working using a case study approach.

Keywords: Knowledge work, new ways of working, productivity, work environment.

PDF Downloads: 2172
3265 Exploration of Least Significant Bit Based Watermarking and Its Robustness against Salt and Pepper Noise

Authors: Kamaldeep Joshi, Rajkumar Yadav, Sachin Allwadhi

Abstract:

Image steganography is the best aspect of information hiding. In it, the information is hidden within an image, and the image travels openly on the Internet. The Least Significant Bit (LSB) method is one of the most popular methods of image steganography. In this method, the information bit is hidden at the LSB of the image pixel. In the one-bit LSB steganography method, the total number of pixels equals the total number of message bits. In this paper, the LSB method of image steganography is used for watermarking, which is an application of steganography. The watermark contains 80×88 pixels, and each pixel requires 8 bits for its binary representation, so the total number of bits required to hide the watermark is 80×88×8 = 56,320. The experiment was performed on standard 256×256 and 512×512 images. After the watermark insertion, histogram analysis was performed. Salt-and-pepper noise with a density of 0.02 was added to the stego image in order to evaluate the robustness of the method. The watermark was successfully retrieved after the insertion of noise. An experiment was performed to assess the imperceptibility of the stego image and the retrieved watermark. It is clear that the LSB watermarking scheme is robust to salt-and-pepper noise.
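
The robustness check described above can be reproduced in outline with the hedged NumPy sketch below (noise density 0.02 as in the abstract); the stego image here is a random placeholder, not the authors' test data.

import numpy as np

def add_salt_and_pepper(img, density=0.02, seed=0):
    """Flip a fraction `density` of pixels to 0 (pepper) or 255 (salt)."""
    rng = np.random.default_rng(seed)
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < density / 2] = 0
    noisy[(mask >= density / 2) & (mask < density)] = 255
    return noisy

def psnr(original, distorted):
    """Peak signal-to-noise ratio for 8-bit images."""
    mse = np.mean((original.astype(float) - distorted.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

stego = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # placeholder stego image
noisy = add_salt_and_pepper(stego, density=0.02)
print("PSNR after salt-and-pepper noise:", psnr(stego, noisy))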

Keywords: LSB, watermarking, salt and pepper, PSNR.

PDF Downloads: 1044
3264 Comparison of Pore Space Features by Thin Sections and X-Ray Microtomography

Authors: H. Alves, J. T. Assis, M. Geraldes, I. Lima, R. T. Lopes

Abstract:

Microtomographic images and thin section (TS) images were analyzed and compared with respect to parameters of geological interest such as porosity and its distribution along the samples. The results show that microtomography (CT) analysis, although limited by its resolution, provides interesting information about the distribution of porosity (homogeneous or not) and can also quantify the connected and non-connected pores, i.e., the total porosity. TS has no limitation concerning resolution, but it is limited by the available experimental data (only a few glass sheets per sample for analysis) and can give information only about the connected pores, i.e., the effective porosity. The two methods have their own virtues and flaws, but when paired together they complement one another, making for a more reliable and complete analysis.
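
A hedged sketch of how total versus connected porosity might be computed from a segmented (binary) CT volume is given below; it assumes a NumPy array in which pore voxels are True, which is an assumption for illustration and not the authors' workflow.

import numpy as np
from scipy import ndimage

def porosities(pores):
    """pores: 3D boolean array, True where a voxel is pore space.
    Returns (total porosity, connected porosity), where 'connected' means
    belonging to a pore cluster that touches the sample boundary."""
    total = pores.mean()
    labels, n = ndimage.label(pores)          # label pore clusters (face connectivity)
    border_labels = set()
    for axis in range(pores.ndim):
        border_labels |= set(np.unique(np.take(labels, [0, -1], axis=axis)))
    border_labels.discard(0)                  # 0 is the non-pore background
    connected = np.isin(labels, list(border_labels)).mean() if border_labels else 0.0
    return total, connected

volume = np.random.rand(64, 64, 64) < 0.3     # toy segmented volume, ~30% pore voxels
print(porosities(volume))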

Keywords: Microtomography, petrographical microscopy, sediments, thin sections.

PDF Downloads: 2322
3263 Smart and Connected Aircraft Cabin: A Balancing Act between Operational Cabin Management, Airline Business and Passenger Expectations

Authors: Ralf God, Lothar Kerschgens, Leonardo Goratti, Steven Lemaire

Abstract:

Ubiquitous connectivity is a reality and a basic need for users on the ground. Connectivity in the aircraft cabin is also becoming increasingly important for passengers during flight. Wireless sensor networks that provide information to cabin management systems are being used by airlines to optimize cabin crew workload. In networked cabin systems, communications and digitally transmitted data must be managed by airlines in every direction. Security and privacy, information processing, and knowledge management are the current and future requirements for a smart and connected cabin.

Keywords: Smart and connected cabin management, Internet of Things, power management, airline business.

PDF Downloads: 414
3262 Low-complexity Integer Frequency Offset Synchronization for OFDMA System

Authors: Young-Jae Kim, Young-Hwan You

Abstract:

This paper presents an integer frequency offset (IFO) estimation scheme for the 3GPP long term evolution (LTE) downlink system. Firstly, the conventional joint detection method for IFO and sector cell index (CID) information is introduced. Secondly, an IFO estimation scheme that does not require explicit sector CID information is proposed; CID detection can operate jointly with the proposed IFO estimation, and the time delay is reduced in comparison with the conventional joint method. The proposed method is also computationally efficient and has almost the same performance as the conventional method over the Pedestrian and Vehicular channel models.
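
As a hedged, simplified sketch of the general idea of integer frequency offset detection (not the specific scheme proposed in the paper), the received frequency-domain symbol can be correlated with integer-shifted copies of a known reference sequence; the sequence, length, and search range below are placeholders.

import numpy as np

def estimate_ifo(rx_freq, ref_freq, search_range=8):
    """Pick the integer subcarrier shift that maximizes the correlation between the
    received frequency-domain symbol and the known reference sequence."""
    best_ifo, best_metric = 0, -1.0
    for ifo in range(-search_range, search_range + 1):
        shifted = np.roll(ref_freq, ifo)                 # candidate integer shift
        metric = np.abs(np.vdot(shifted, rx_freq))
        if metric > best_metric:
            best_ifo, best_metric = ifo, metric
    return best_ifo

# Toy example: random reference sequence, received copy shifted by +3 subcarriers plus noise
N = 64
ref = np.exp(1j * 2 * np.pi * np.random.rand(N))
rx = np.roll(ref, 3) + 0.1 * (np.random.randn(N) + 1j * np.random.randn(N))
print(estimate_ifo(rx, ref))                             # expected: 3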

Keywords: LTE, OFDMA, primary synchronization signal (PSS), IFO, CID

PDF Downloads: 2305
3261 A New Method of Adaptation in Integrated Learning Environment

Authors: Ildar Galeev, Renat Mustaphin, C. Ardil

Abstract:

A new method of adaptation in a partially integrated learning environment that includes an electronic textbook (ET) and an integrated tutoring system (ITS) is described. The algorithm of adaptation is described in detail. It includes: establishment of the interconnections of operations and concepts; estimation of the concept mastering level (for all concepts); estimation of the student's non-mastering level, at the current learning step, of the information on each page of the ET; and creation of a rank-ordered list of links to the e-manual pages containing information that requires repeated work.

Keywords: Adaptation, Integrated Learning Environment, Integrated Tutoring System, Electronic Textbook.

PDF Downloads: 1461
3260 Detecting Fake News: A Natural Language Processing, Reinforcement Learning, and Blockchain Approach

Authors: Ashly Joseph, Jithu Paulose

Abstract:

In an era where misleading information may quickly circulate on digital news channels, it is crucial to have efficient and trustworthy methods to detect and reduce the impact of misinformation. This research proposes an innovative framework that combines Natural Language Processing (NLP), Reinforcement Learning (RL), and Blockchain technologies to precisely detect and minimize the spread of false information in news articles on social media. The framework starts by gathering a variety of news items from different social media sites and performing preprocessing on the data to ensure its quality and uniformity. NLP methods are utilized to extract complete linguistic and semantic characteristics, effectively capturing the subtleties and contextual aspects of the language used. These features are utilized as input for a RL model. This model acquires the most effective tactics for detecting and mitigating the impact of false material by modeling the intricate dynamics of user engagements and incentives on social media platforms. The integration of blockchain technology establishes a decentralized and transparent method for storing and verifying the accuracy of information. The Blockchain component guarantees the unchangeability and safety of verified news records, while encouraging user engagement for detecting and fighting false information through an incentive system based on tokens. The suggested framework seeks to provide a thorough and resilient solution to the problems presented by misinformation in social media articles.

Keywords: Natural Language Processing, Reinforcement Learning, Blockchain, fake news mitigation, misinformation detection.

PDF Downloads: 49
3259 Block-Based 2D to 3D Image Conversion Method

Authors: S. Sowmyayani, V. Murugan

Abstract:

With the advent of three-dimensional (3D) technology, there has been a great deal of research on converting 2D images to 3D images. The main difference between 2D and 3D is the visual illusion of depth in 3D images. In recent years, many depth estimation techniques have appeared. The objective of this paper is to convert 2D images to 3D images with less computation time. For this, the input image is divided into blocks from which the depth information is obtained. From this depth information, a depth map is generated. Then the 3D image is warped using the original image and the depth map. The proposed method is tested on the Make3D dataset and the NYU-V2 dataset. The experimental results are compared with other recent methods. The proposed method proved to work with less computation time and good accuracy.
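
A hedged, very rough sketch of the block-based pipeline described above follows; the depth cue (block mean intensity) and the pixel-shift warping rule are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def block_depth_map(img, block=16):
    """Assign one depth value per block (here: block mean intensity as a crude cue)."""
    h, w = img.shape
    depth = np.zeros_like(img, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            depth[y:y + block, x:x + block] = img[y:y + block, x:x + block].mean()
    return depth / depth.max()

def warp_right_view(img, depth, max_disp=8):
    """Shift each pixel horizontally in proportion to its depth (DIBR-style warp)."""
    h, w = img.shape
    right = np.zeros_like(img)
    disparity = (depth * max_disp).astype(int)
    for y in range(h):
        for x in range(w):
            xr = x - disparity[y, x]
            if 0 <= xr < w:
                right[y, xr] = img[y, x]
    return right

img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)   # placeholder 2D image
depth = block_depth_map(img)
right_eye = warp_right_view(img, depth)                        # original serves as the left view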

Keywords: Depth map, 3D image warping, image rendering, bilateral filter, minimum spanning tree.

PDF Downloads: 343
3258 Preventing and Coping Strategies for Cyber Bullying and Cyber Victimization

Authors: Erdinc Ozturk, Gizem Akcan

Abstract:

Although information and communication technologies have several advantages, they also cause problems such as cyber bullying and cyber victimization. Cyber bullying and cyber victimization have many negative effects on people. There are many different strategies to prevent cyber bullying and victimization. This study was conducted to provide information about the strategies that are used to prevent cyber bullying and cyber victimization. 120 university students (60 women, 60 men) aged between 18 and 35 participated in this study. According to the findings of this study, men are more prone to cyber bullying than women. Moreover, men are also more prone to cyber victimization than women.

Keywords: Cyber bullying, cyber victimization, coping strategies.

PDF Downloads: 1571
3257 Information Transmission between Large and Small Stocks in the Korean Stock Market

Authors: Sang Hoon Kang, Seong-Min Yoon

Abstract:

Little attention has been paid to information transmission between the portfolios of large stocks and small stocks in the Korean stock market. This study investigates the return and volatility transmission mechanisms between large and small stocks on the Korea Exchange (KRX). It also explores whether bad news in the large stock market leads to volatility in the small stock market that is larger than the volatility following good news in the large stock market. By employing the Granger causality test, we found unidirectional return transmission from the large stocks to the medium and small stocks. This evidence indicates that past information about the large stocks has a better ability to predict the returns of the medium and small stocks in the Korean stock market. Moreover, by using the asymmetric GARCH-BEKK model, we observed a unidirectional relationship of asymmetric volatility transmission from the large stocks to the medium and small stocks. This finding suggests that volatility in the medium and small stocks following a negative shock in the large stocks is larger than that following a positive shock in the large stocks.
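
For readers who want to reproduce the first step, a hedged sketch of a Granger causality test with statsmodels on synthetic return series follows; the data here are simulated, not KRX portfolio returns, and the lag order is arbitrary.

import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 500
large = rng.normal(size=n)            # stand-in for large-cap portfolio returns
small = np.empty(n)
small[0] = rng.normal()
for t in range(1, n):                 # small-cap returns depend on lagged large-cap returns
    small[t] = 0.4 * large[t - 1] + rng.normal(scale=0.5)

# Test whether 'large' Granger-causes 'small': column order is (effect, cause)
data = np.column_stack([small, large])
grangercausalitytests(data, maxlag=2)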

Keywords: Asymmetric GARCH-BEKK model, Asymmetric volatility transmission, Causality, Korean stock market, Spillover effect

PDF Downloads: 1663
3256 MovieReco: A Recommendation System

Authors: Dipankaj G Medhi, Juri Dakua

Abstract:

Recommender Systems act as personalized decision guides, aiding users in decisions on matters related to personal taste. Most previous research on Recommender Systems has focused on the statistical accuracy of the algorithms driving the systems, with no emphasis on the trustworthiness of the user. A Recommender System (RS) depends on information provided by different users to gather its knowledge. We believe that if a large group of users provides wrong information, it will not be possible for the RS to arrive at an accurate conclusion. The system described in this paper introduces the concept of testing the knowledge of the user to filter out these "bad users". This paper emphasizes the mechanism used to provide robust and effective recommendations.

Keywords: Collaborative Filtering, Content Based Filtering, Intelligent Agent, Level of Interest, Recommendation System.

PDF Downloads: 1635
3255 Physical Activity and Cognitive Functioning Relationship in Children

Authors: Comfort Mokgothu

Abstract:

This study investigated the relation between information processing and the fitness level of active (fit) and sedentary (unfit) children drawn from rural and urban areas in Botswana. It was hypothesized that fit children would display faster simple reaction times (SRT), choice reaction times (CRT), and movement times (SMT). Sixty third-grade children (7.0-9.0 years) were initially selected and, based upon fitness testing, 45 participated in the study (15 each of fit urban, unfit urban, and fit rural). All children completed anthropometric measures, skinfold testing, and submaximal cycle ergometer testing. The cognitive testing included SRT, CRT, SMT, Choice Movement Time (CMT), and memory sequence length. Results indicated that the rural fit group exhibited faster SMT than the urban fit and unfit groups. For CRT, both fit groups were faster than the unfit group. Collectively, the study shows that the relationship between physical fitness and cognitive function among the elderly can tentatively be extended to the pediatric population. Physical fitness could be a factor in the speed at which we process information, including decision making, even in children.

Keywords: Decision making, fitness, information processing, reaction time, cognition, movement time.

PDF Downloads: 783
3254 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation, and project scheduling. BIM software defines objects parametrically, and it is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems such as Mechanical, Electrical and Plumbing (MEP); it also identifies possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model that highlights the primary advantages of applying BIM to urban infrastructure transportation projects. It has been observed that about 40% of Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM would be used quite strongly for future construction projects in India. The 3D models developed with Revit 2015 software would reduce coordination problems amongst the architects, structural engineers, contractors, and building service providers (MEP). Integration of risk management along with BIM would provide enhanced coordination and collaboration and a high probability of successful completion of complex infrastructure transportation projects within the stipulated time and cost frame.

Keywords: Building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail.

PDF Downloads: 2119
3253 An Approach for a Bidding Process Knowledge Capitalization

Authors: R. Chalal, A. R. Ghomari

Abstract:

Preparation and negotiation of innovative and future projects can be characterized as a strategic-type decision situation, involving many uncertainties and an unpredictable environment. In this paper, we focus on the bidding process, which includes cooperative and strategic decisions. Our approach to bidding process knowledge capitalization is aimed at information management in project-oriented organizations, based on the MUSIC (Management and Use of Co-operative Information Systems) model. We show how to capitalize the company's strategic knowledge and how to organize the corporate memory. The result of the adopted approach is an improvement in corporate memory quality.

Keywords: Bidding process, corporate memory, Knowledge capitalization, knowledge acquisition, strategic decisions.

PDF Downloads: 1631
3252 Effective Charge Coupling in Low Dimensional Doped Quantum Antiferromagnets

Authors: Suraka Bhattacharjee, Ranjan Chaudhury

Abstract:

The interaction between the charge degrees of freedom in itinerant antiferromagnets is investigated in terms of the generalized charge stiffness constant corresponding to the nearest-neighbour t-J model and the t1-t2-t3-J model. Low-dimensional hole-doped antiferromagnets are well-known systems that can be described by t-J-like models. Accordingly, we have used these models to investigate the fermionic pairing possibilities and the coupling between the itinerant charge degrees of freedom. A detailed comparison between spin and charge couplings highlights that the charge and spin couplings show very similar behaviour in the over-doped region, whereas they show completely different trends in the lower doping regimes. Moreover, a qualitative equivalence between the generalized charge stiffness and the effective Coulomb interaction is also established, based on comparisons with other theoretical and experimental results. Thus, it is evident that the enhanced possibility of fermionic pairing is inherent in the reduction of Coulomb repulsion with increasing doping concentration. However, this increased possibility cannot give rise to pairing without the presence of some other pair-producing mechanism outside the t-J model. Therefore, one can conclude that the t-J-like models by themselves are not capable of producing conventional momentum-space superconducting pairing.
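
For reference, a standard form of the nearest-neighbour t-J Hamiltonian underlying this discussion (written here from the general literature, not reproduced from the paper) is

H = -t \sum_{\langle ij \rangle, \sigma} \left( \tilde{c}^{\dagger}_{i\sigma} \tilde{c}_{j\sigma} + \mathrm{h.c.} \right) + J \sum_{\langle ij \rangle} \left( \mathbf{S}_i \cdot \mathbf{S}_j - \tfrac{1}{4} n_i n_j \right),

where \tilde{c}^{\dagger}_{i\sigma} are electron creation operators projected onto the subspace with no double occupancy, \mathbf{S}_i is the spin operator, and n_i the number operator at site i; the t1-t2-t3-J variant adds second- and third-neighbour hopping terms with amplitudes t2 and t3.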

Keywords: Generalized charge stiffness constant, charge coupling, effective Coulomb interaction, t-J-like models, momentum-space pairing.

PDF Downloads: 607