Search results for: information dissemination
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10730

6890 Virtual Team Performance: A Transactive Memory System Perspective

Authors: Belbaly Nassim

Abstract:

Virtual team (VT) initiatives, in which teams are geographically dispersed and communicate via modern computer-driven technologies, have attracted increasing attention from researchers and professionals. The need to examine how to balance and optimize VTs has become particularly important as globalization and decentralization expose companies to pressures to monitor VT performance. Organizations are regularly limited by misalignment between the behavioral capabilities of the team’s dispersed competences and knowledge capabilities, and by the way trust issues interplay with and influence these VT dimensions and the effects of such exchanges. In fact, the future success of business depends on the extent to which VTs manage their dispersed expertise, skills and knowledge efficiently to stimulate VT creativity. A transactive memory system (TMS) may enhance VT creativity through its three dimensions: knowledge specialization, credibility and knowledge coordination. TMS can be understood as a composition of both a structural component residing in individual knowledge and a set of communication processes among individuals. Individual knowledge is shared as it is retrieved and applied, and learning is coordinated. TMS is driven by the central concept that the system is built on the distinction between internal and external memory encoding. A VT learns something new and catalogs it in memory for future retrieval and use. TMS uses the role of information technology to explain VT behaviors by offering VT members the possibility to encode, store, and retrieve information. TMS considers the members of a team as a processing system in which the location of expertise both enhances knowledge coordination and builds trust among members over time. We build on the TMS dimensions to hypothesize the effects of specialization, coordination, and credibility on VT creativity. In fact, VTs consist of dispersed expertise, skills and knowledge that can positively enhance coordination and collaboration. Ultimately, this team composition may lead to recognition of both who has expertise and where that expertise is located; over time, it may also build trust among VT members, developing their ability to coordinate their knowledge, which can stimulate creativity. We also assess the reciprocal relationship between the TMS dimensions and VT creativity. We wish to use TMS to provide researchers with a theoretically driven model that is empirically validated through survey evidence. We propose that TMS provides a new way to enhance and balance VT creativity. This study also provides researchers with insight into the use of TMS to positively influence VT creativity. In addition to our research contributions, we provide several managerial insights into how TMS components can be used to increase performance within dispersed VTs.

Keywords: virtual team creativity, transactive memory systems, specialization, credibility, coordination

Procedia PDF Downloads 149
6889 Citizen Participation in Smart Cities: Singapore and Tokyo

Authors: Thomas Benson

Abstract:

Smart cities have been heralded as multi-faceted entities which utilise information and communication technologies to enhance citizen participation. The purpose of this paper is to outline authoritative definitions of smart cities and citizen participation and to investigate the citizen-centric rhetoric of smart cities by examining urban governance and citizen participation processes. Drawing on the extant literature and official city government documents and websites, Singapore (Singapore) and Tokyo (Japan) are chosen as comparable smart city case studies. This paper concludes that, for the smart city to be truly realised, smart cities must do more to incorporate genuine citizen participation mechanisms.

Keywords: citizen participation, smart cities, urban governance, Singapore, Tokyo

Procedia PDF Downloads 132
6888 Experiences of Patients Living with Peritoneal Dialysis: A Qualitative Study

Authors: Xuzhen Yang, Yan Shan, Yabo Ding, Keke Diao, Yanjun Zhang, Yijia Huang

Abstract:

Purpose: Our aim is to understand the unique experiences of patients on peritoneal dialysis and how they deal with issues brought on by the disease and dialysis. Patients and Methods: Semi-structured interviews were designed to collect information, and inpatients on peritoneal dialysis in a university-based tertiary hospital in a central province of China were purposively chosen as interviewees. The content analysis method was used to analyze the data. Results: Nine patients participated in the study, and three themes and eight subthemes were generated. Conclusion: Patients using peritoneal dialysis encounter numerous challenges and problems in the course of the disease and dialysis, and they attempt to cope with them in order to adapt to living with peritoneal dialysis.

Keywords: peritoneal dialysis, experience, patient, coping strategy

Procedia PDF Downloads 79
6887 Analysis, Design, and Implementation of Quality Management System for KSA Software Company

Authors: Omar Said Almushyt

Abstract:

Quality management has recently become necessary for companies all over the world to face competitive challenges. Software companies in KSA suffer from two problems, namely low customer satisfaction and low product quality. Implementing quality management in a software company can solve these problems by improving the quality of products and enhancing customer satisfaction, which will make the company competitive. These goals can be achieved by introducing a quality management system through system analysis, followed by system design and, finally, implementation. Results of the present work showed that the proposed method can increase product quality by 10% and customer satisfaction by 20%.

Keywords: quality, management, software, information engineering

Procedia PDF Downloads 421
6886 The Regulation of Alternative Dispute Resolution Institutions in Consumer Redress and Enforcement: A South African Perspective

Authors: Jacolien Barnard, Corlia Van Heerden

Abstract:

Effective and accessible consensual dispute resolution, and in particular alternative dispute resolution, are central to consumer protection legislation. In this regard, the Consumer Protection Act 68 of 2008 (CPA) of South Africa is no exception. Due to the nature of consumer disputes, alternative dispute resolution is (in theory) an effective vehicle for the adjudication of disputes in a timely manner, avoiding overburdening of the courts. The CPA sets down as one of its core purposes the provision of ‘an accessible, consistent, harmonized, effective and efficient system of redress for consumers’ (section 3(1)(h) of the CPA). Section 69 of the Act provides for the enforcement of consumer rights and establishes the National Consumer Commission as the central authority which streamlines, adjudicates and channels disputes to the appropriate forums, which include alternative dispute resolution agents (ADR agents). The purpose of this paper is to analyze the regulation of these enforcement and redress mechanisms with particular focus on the central authority as well as the ADR agents and their crucial role in the successful and efficient adjudication of disputes in South Africa. The South African position will be discussed comparatively with the European Union (EU) position. In this regard, the EU Directive on Alternative Dispute Resolution for Consumer Disputes (2013/11/EU) (the ADR Directive) will be discussed. The aim of the ADR Directive is to resolve contractual disputes between consumers and traders (suppliers or businesses) regardless of whether the agreement was concluded offline or online, and whether or not the trader is situated in another member state (Recitals 4-6). The ADR Directive provides a set of quality requirements that an ADR body or entity tasked with resolving consumer disputes should adhere to in member states, including regulatory mechanisms for control. Transparency, effectiveness, fairness, liberty and legality are all requirements for a successful ADR body and are discussed within Chapter II of the Directive. Chapters III and IV govern the importance of information and co-operation. This includes information between ADR bodies and the European Commission (EC), but also between ADR bodies or entities and the national authorities enforcing legal acts on consumer protection, and traders (in South Africa, the National Consumer Tribunal, provincial consumer protectors and industry ombuds come to mind), all of which have a responsibility to keep consumers informed. Ultimately, the paper aims to provide recommendations on the success of the current South African position in light of the comparative position in Europe and to highlight the importance of proper regulation of these redress and enforcement institutions.

Keywords: alternative dispute resolution, consumer protection law, enforcement, redress

Procedia PDF Downloads 209
6885 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)

Authors: Azimollah Aleshzadeh, Enver Vural Yavuz

Abstract:

The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), for Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models, while the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences that were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUC success rates are 0.7055, 0.7221, and 0.7368, while the prediction rates are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively. Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the proportion of construction and verification landslide incidences in the high and very high landslide susceptibility classes in each map was determined. The results showed that the EWM, EBF, and ICM models produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and planning purposes for environmental protection.
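
As a concrete illustration of the validation step described above, the following is a minimal Python sketch (not the authors' code) of computing the AUC of a susceptibility map; the array names and random values are placeholders standing in for a model's susceptibility index and the verification landslide labels.

```python
# Minimal sketch: ROC/AUC validation of a susceptibility map (placeholder data).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
susceptibility = rng.random(10_000)   # stand-in for an EWM/EBF/ICM susceptibility index per cell
landslide = (susceptibility + rng.normal(0, 0.4, 10_000) > 0.9).astype(int)  # stand-in 0/1 labels

auc = roc_auc_score(landslide, susceptibility)      # area under the ROC curve
fpr, tpr, _ = roc_curve(landslide, susceptibility)  # points of the success/prediction rate curve
classes = np.digitize(susceptibility, np.quantile(susceptibility, [0.2, 0.4, 0.6, 0.8]))  # five classes
print(f"AUC = {auc:.4f}")
```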

Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping

Procedia PDF Downloads 124
6884 A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation

Authors: Adriano Bessa Albuquerque, Francisco Jose Barreto Nunes

Abstract:

Software vulnerabilities are increasing; they not only impact the availability of services and processes, as well as the confidentiality, integrity and privacy of information, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, combined with the lack of real case studies applying tests across the software development life cycle, compromises its effective use. This paper offers an overview of how a Systematic Mapping Study (MS) about security verification, validation and test (VVT) was performed, besides presenting general results of this study.

Keywords: software test, software security verification validation and test, security test institutionalization, systematic mapping study

Procedia PDF Downloads 385
6883 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application to classify gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network showed a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on the feature extraction methods, followed by classification. The features extracted from the spectrum profiles aim to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. Feature extraction by a neural network is nonlinear and involves a variety of transformations and mathematical optimization, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4. The readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, by combining the votes of many models, managed to improve the classification accuracy of the neural networks. The ability to discriminate gamma and neutron events in a single prediction approach using deep machine learning has shown high accuracy. The paper's findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization of the neural network models enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
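
As an illustration of the spectrogram preprocessing stage, here is a minimal Python sketch assuming a synthetic detector trace: a windowed Fourier transform (Hann window) turns the time signal into log-compressed time-frequency features of the kind that could feed a CNN. The sampling rate, signal and window settings are illustrative choices, not those of the study.

```python
# Minimal sketch: windowed spectrogram features from a synthetic pulse trace.
import numpy as np
from scipy.signal import spectrogram

fs = 1_000_000                                   # assumed sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
trace = np.random.default_rng(1).normal(0, 0.05, t.size)   # stand-in for readout noise
trace += np.exp(-((t - 0.005) * 2e4) ** 2)                  # stand-in for one event pulse

f, frames, Sxx = spectrogram(trace, fs=fs, window="hann", nperseg=256, noverlap=128)
features = np.log1p(Sxx)       # log-compressed time-frequency features for training
print(features.shape)          # (frequency bins, time frames)
```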

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 135
6882 Micro-CT Imaging of Hard Tissues

Authors: Amir Davood Elmi

Abstract:

From the earliest light microscopes to the most innovative X-ray imaging techniques, imaging methods have refined and improved our knowledge about the organization and composition of living tissues. The older techniques are time consuming and ultimately destructive to the tissues under examination. In recent decades, thanks to advances in technology, non-destructive visualization techniques, such as X-ray computed tomography (CT), magnetic resonance imaging (MRI), selective plane illumination microscopy (SPIM), and optical projection tomography (OPT), have come to the forefront. Among these techniques, CT is excellent for mineralized tissues such as bone or dentine. In addition, CT is faster than the other aforementioned techniques, and the sample remains intact. In this article, the applications, advantages, and limitations of micro-CT are discussed, in addition to some information about micro-CT of soft tissue.
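
For context, the contrast that makes CT well suited to mineralized tissue rests on X-ray attenuation, conventionally described by the Beer-Lambert law (quoted here as background, not as a result of the article):

```latex
I = I_{0}\, e^{-\mu x}
```

where I₀ is the incident intensity, μ the linear attenuation coefficient of the tissue, and x the path length through it; dense mineralized tissue has a higher μ and therefore stronger contrast.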

Keywords: Micro-CT, hard tissue, bone, attenuation coefficient, rapid prototyping

Procedia PDF Downloads 128
6881 The Student's Satisfaction toward Web Based Instruction on Puppet Show

Authors: Piyanut Suchit

Abstract:

The purpose of this study was to investigate students' satisfaction with learning through web-based instruction on the puppet show. The population of this study comprised 53 students in the Program of Library and Information Sciences who registered for the subject Puppet for Assisting Learning Development in semester 2/2011 at Suansunandha Rajabhat University, Bangkok, Thailand. The research instruments consisted of the web-based instruction on the puppet show and questionnaires on students' satisfaction. The research statistics included the arithmetic mean and standard deviation. The results revealed that the students reported very high satisfaction with the web-based instruction (mean = 4.63, SD = 0.52).

Keywords: puppet show, web based instruction, satisfaction, Suansunandha Rajabhat University

Procedia PDF Downloads 376
6880 Teachers' Perceptions of Physical Education and Sports Calendar and Conducted in the Light of the Objective of the Lesson Approach Competencies

Authors: Chelali Mohammed

Abstract:

In the context of the application of the competency-based approach in the Algerian educational system, the physical education and sport lesson must privilege the acquisition of learning approaches, and especially the scientific approach, which, starting from problem situations, seeks and develops information processing and the application of knowledge and know-how in new situations; in the words of John Dewey, 'learning by practice'. To achieve these goals and make physical education teaching more motivating, consistent and concrete, it is appropriate to adopt a pedagogical approach freed from constraints, open to creativity and centred on the student, in light of the competency-based approach adopted in the formal curriculum. This approach is not unusual, but we think it is highly professional in nature and requires competence from the teacher.

Keywords: approach competencies, physical, education, teachers

Procedia PDF Downloads 592
6879 Remote Wireless Communications Lab in Real Time

Authors: El Miloudi Djelloul

Abstract:

Technology nowadays enables remote access to laboratory equipment and instruments via the Internet. This is especially useful in engineering education, where students can conduct laboratory experiments remotely. Such remote laboratory access can enable students to use expensive laboratory equipment that is not usually available to them. In this paper, we present a method of creating web-based remote laboratory experimentation in the master's degree course “Wireless Communications Systems”, which is part of the “ICS (Information and Communication Systems)” and “Investment Management in Telecommunications” curriculums. This is done within the RIPLECS Project and the NI2011 FF005 Research Project “Implementation of Project-Based Learning in an Interdisciplinary Master Program”.

Keywords: remote access, remote laboratory, wireless telecommunications, external antenna-switching controller board (EASCB)

Procedia PDF Downloads 499
6878 Road Traffic Noise Mapping for Riyadh City Using GIS and Lima

Authors: Khalid A. Alsaif, Mosaad A. Foda

Abstract:

The primary objective of this study is to develop the first round of road traffic noise maps for Riyadh City using geographical information systems (GIS) and the LimA 7810 predictor software. The road traffic data were measured or estimated as accurately as possible in order to obtain reliable noise maps, while the attributes of the roads and buildings were automatically exported from GIS. The simulation results at selected locations were validated by actual field measurements, which were obtained by a system consisting of a sound level meter, a GPS receiver and a database to manage the measured data. The results show that the average error between the predicted and measured noise levels is below 3.0 dB.
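
The validation step amounts to a simple comparison of predicted and measured levels at the receiver points; a minimal sketch with placeholder values (not the study's data) is:

```python
# Minimal sketch: average error between predicted and measured noise levels.
import numpy as np

predicted_dB = np.array([68.2, 71.5, 64.8, 70.1])   # hypothetical LimA outputs at receiver points
measured_dB  = np.array([66.9, 73.0, 66.2, 69.4])   # hypothetical sound-level-meter readings

mean_abs_error = np.mean(np.abs(predicted_dB - measured_dB))
print(f"average error = {mean_abs_error:.1f} dB")   # the study reports values below 3.0 dB
```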

Keywords: noise pollution, road traffic noise, LimA predictor, GIS

Procedia PDF Downloads 387
6877 Ground State Structure of Even ¹⁰⁴-¹⁰⁶Ru Isotopes

Authors: I. Hossain, Huda H. Kassim, Fadhil I. Sharrad, Said A. Mansour

Abstract:

In this work, we apply the interacting boson model-1 (IBM-1) formula for U(5) symmetry in order to calculate the energy levels and reduced transition probabilities for a few yrast transitions in Ru isotopes with neutron numbers N = 60, 62. The neutron-rich even-even isotopes of Ru are very interesting to investigate using IBM-1, because the even ¹⁰⁴,¹⁰⁶Ru isotopes are of great consequence, lying close to the magic number 50. The ground-state band and B(E2) values for Z = 44 had not been calculated with IBM-1 to describe the valuable nuclear structure information provided by the U(5) limit. The parameters in the formula are deduced from the experimental energy levels and the value of B(E2; 2⁺→0⁺). The yrast states and the transition strengths B(E2) from the first 4⁺ to the first 2⁺, the first 6⁺ to the first 4⁺, and the first 8⁺ to the first 6⁺ states of Ru for even N = 60, 62 were calculated. The quadrupole moments, deformation parameters and the U(5) limit are discussed for these nuclei.
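
For orientation, a common textbook parametrization of the U(5) (vibrational) limit of IBM-1, which underlies fits of this kind, writes the level energies and yrast E2 strengths as (quoted here as background, not necessarily the exact form used in this work):

```latex
E(n_d, v, L) = \varepsilon\, n_d + \alpha\, n_d(n_d + 4) + \beta\, v(v + 3) + \gamma\, L(L + 1),
\qquad
B(E2;\, L+2 \to L)_{\mathrm{yrast}} = e_B^{2}\,\frac{(L+2)(2N - L)}{4},
```

where n_d is the d-boson number, v the seniority, N the total boson number, and e_B the boson effective charge fixed from the experimental B(E2; 2₁⁺→0₁⁺).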

Keywords: B(E2), energy level, ¹⁰⁴Ru, ¹⁰⁶Ru

Procedia PDF Downloads 330
6876 Activity-Based Costing in the Hospitality Industry: A Case Study in a Hotel

Authors: Bita Mashayekhi, Mohammad Ara

Abstract:

The purpose of this study is to provide some empirical evidence about implementing activity-based costing (ABC) in the hospitality industry in Iran. For this purpose, we consider the Tabriz International Hotel as our sample hotel and gather the relevant data from its cost accounting system for 2012. We then use ABC as our costing method and compare the cost of each service unit with the cost obtained under the traditional costing method. The results show a different cost per unit for the two methods. Also, because of the more precise and detailed information it provides, an ABC system facilitates the decision-making process for managers on decisions related to profitability analysis, budgeting, pricing, and so on.
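
A minimal sketch of the ABC mechanics referred to above, with hypothetical activity pools, drivers and rates (not the hotel's data), compared against a single-volume traditional allocation:

```python
# Minimal sketch: ABC vs. traditional cost per room-night (hypothetical figures).
activity_pools = {"housekeeping": 120_000, "front_desk": 80_000, "laundry": 40_000}   # annual cost pools
driver_totals  = {"housekeeping": 6_000,  "front_desk": 10_000, "laundry": 8_000}     # room-nights, check-ins, kg washed
driver_rates   = {a: activity_pools[a] / driver_totals[a] for a in activity_pools}    # cost per driver unit

usage_per_room_night = {"housekeeping": 1, "front_desk": 0.8, "laundry": 1.5}         # driver usage per room-night
abc_cost = sum(driver_rates[a] * usage_per_room_night[a] for a in driver_rates)

traditional_cost = sum(activity_pools.values()) / 6_000    # everything allocated on one volume base
print(round(abc_cost, 2), round(traditional_cost, 2))      # the two methods give different unit costs
```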

Keywords: Activity-Based Costing (ABC), activity, cost driver, hospitality industry

Procedia PDF Downloads 283
6875 ICTs Knowledge as a Way of Enhancing Literacy and Lifelong Learning in Nigeria

Authors: Jame O. Ezema, Odenigbo Veronica

Abstract:

The study covers information and communication technology (ICT) knowledge as a way of enhancing literacy and lifelong learning in Nigeria. This work delves into the definition of ICTs; types of ICTs and media technologies are also mentioned. It further explains how ICTs can be strengthened, and the uses of ICTs in education are duly emphasized. The paper also enumerates some side effects of ICTs on learners, while the role of ICTs in enhancing literacy is explained. The study sets out strategies for using ICTs meaningfully in literacy programs and also emphasizes lifelong learning in Nigeria. Some recommendations are made towards acquiring ICT knowledge, so as to enhance literacy and lifelong learning in Nigeria.

Keywords: literacy, distance-learning, life-long learning for sustainable development, e-learning

Procedia PDF Downloads 485
6874 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies

Authors: Stefano Fantin

Abstract:

The present analysis results from the research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, the research presents a study of the policy approaches of Japan, the EU and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. The research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The result of this analysis is based on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report from the Center for EU Policy Study on software vulnerability. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems and the Proposal for a Cybersecurity Act), with no clear focus on this subject, makes it difficult for both national governments and end-users (software owners, researchers and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the member state level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia. This research will explain what policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004. To date, two amendments to the framework can be registered (2014 and 2017). The framework is furthermore complemented by a series of instruments allowing researchers to responsibly disclose any new discovery. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research conducted reveals two asymmetric policy approaches, time-wise and content-wise. The analysis will, therefore, conclude with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries and citizens.

Keywords: cybersecurity, vulnerability, European Union, Japan

Procedia PDF Downloads 140
6873 Effects of Global Validity of Predictive Cues upon L2 Discourse Comprehension: Evidence from Self-paced Reading

Authors: Binger Lu

Abstract:

It remains unclear whether second language (L2) speakers could use discourse context cues to predict upcoming information as native speakers do during online comprehension. Some researchers propose that L2 learners may have a reduced ability to generate predictions during discourse processing. At the same time, there is evidence that discourse-level cues are weighed more heavily in L2 processing than in L1. Previous studies showed that L1 prediction is sensitive to the global validity of predictive cues. The current study aims to explore whether and to what extent L2 learners can dynamically and strategically adjust their prediction in accord with the global validity of predictive cues in L2 discourse comprehension as native speakers do. In a self-paced reading experiment, Chinese native speakers (N=128), C-E bilinguals (N=128), and English native speakers (N=128) read high-predictable (e.g., Jimmy felt thirsty after running. He wanted to get some water from the refrigerator.) and low-predictable (e.g., Jimmy felt sick this morning. He wanted to get some water from the refrigerator.) discourses in two-sentence frames. The global validity of predictive cues was manipulated by varying the ratio of predictable (e.g., Bill stood at the door. He opened it with the key.) and unpredictable fillers (e.g., Bill stood at the door. He opened it with the card.), such that across conditions, the predictability of the final word of the fillers ranged from 100% to 0%. The dependent variable was reading time on the critical region (the target word and the following word), analyzed with linear mixed-effects models in R. C-E bilinguals showed reliable prediction across all validity conditions (β = -35.6 ms, SE = 7.74, t = -4.601, p< .001), and Chinese native speakers showed significant effect (β = -93.5 ms, SE = 7.82, t = -11.956, p< .001) in two of the four validity conditions (namely, the High-validity and MedLow conditions, where fillers ended with predictable words in 100% and 25% cases respectively), whereas English native speakers didn’t predict at all (β = -2.78 ms, SE = 7.60, t = -.365, p = .715). There was neither main effect (χ^²(3) = .256, p = .968) nor interaction (Predictability: Background: Validity, χ^²(3) = 1.229, p = .746; Predictability: Validity, χ^²(3) = 2.520, p = .472; Background: Validity, χ^²(3) = 1.281, p = .734) of Validity with speaker groups. The results suggest that prediction occurs in L2 discourse processing but to a much less extent in L1, witha significant effect in some conditions of L1 Chinese and anull effect in L1 English processing, consistent with the view that L2 speakers are more sensitive to discourse cues compared with L1 speakers. Additionally, the pattern of L1 and L2 predictive processing was not affected by the global validity of predictive cues. C-E bilinguals’ predictive processing could be partly transferred from their L1, as prior research showed that discourse information played a more significant role in L1 Chinese processing.
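
A minimal Python sketch of the kind of linear mixed-effects analysis described above (the study used R; statsmodels is shown here only as an illustration, and the column names and synthetic data are assumptions, not the study's materials):

```python
# Minimal sketch: reading time modelled by predictability, group and cue validity,
# with random intercepts by participant (synthetic data, illustrative columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "rt": rng.normal(350, 60, n),                                       # RT on the critical region, ms
    "predictability": rng.choice(["high", "low"], n),
    "group": rng.choice(["L1_English", "L1_Chinese", "CE_bilingual"], n),
    "validity": rng.choice(["high", "medhigh", "medlow", "low"], n),
    "subject": rng.integers(0, 40, n),
})
model = smf.mixedlm("rt ~ predictability * group * validity", df, groups=df["subject"])
print(model.fit().summary())
```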

Keywords: bilingualism, discourse processing, global validity, prediction, self-paced reading

Procedia PDF Downloads 125
6872 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

The subject covered in this paper aims at assisting users in their data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
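
An illustrative sketch of the column-categorisation idea: assign each column a semantic category from simple value patterns. The regexes, categories and threshold are examples, not the authors' rule set.

```python
# Minimal sketch: assigning a semantic category to each column from value patterns.
import re
import pandas as pd

PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "date":  re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone": re.compile(r"^\+?\d[\d\s-]{6,}$"),
}

def categorise(series: pd.Series, threshold: float = 0.8) -> str:
    """Return the category whose pattern matches most values, if above threshold."""
    values = series.dropna().astype(str)
    for name, pattern in PATTERNS.items():
        if values.map(lambda v: bool(pattern.match(v))).mean() >= threshold:
            return name
    return "unknown"

df = pd.DataFrame({"contact": ["a@b.com", "c@d.org"], "joined": ["2020-01-01", "2021-06-30"]})
print({col: categorise(df[col]) for col in df.columns})
```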

Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns

Procedia PDF Downloads 405
6871 Fair Value Accounting and Evolution of the Ohlson Model

Authors: Mohamed Zaher Bouaziz

Abstract:

Our study examines the Ohlson Model, which links a company's market value to its equity and net earnings, in the context of the evolution of the Canadian accounting model, characterized by more extensive use of fair value and a broader measure of performance after IFRS adoption. Our hypothesis is that if equity is reported at its fair value, this valuation is closely linked to market capitalization, so the weight of earnings weakens or even disappears in the Ohlson Model. Drawing on Canada's adoption of the International Financial Reporting Standards (IFRS), our results support our hypothesis that equity appears to include most of the relevant information for investors, while earnings have become less important. However, the predictive power of earnings does not disappear.
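
In its common empirical price-levels form, the Ohlson specification examined in studies of this kind can be written as:

```latex
P_{it} = \beta_0 + \beta_1\, BV_{it} + \beta_2\, E_{it} + \varepsilon_{it}
```

where P is market value per share, BV the book value of equity per share and E net earnings per share; the hypothesis above corresponds to β₂ weakening toward zero once equity is reported at fair value.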

Keywords: fair value accounting, Ohlson model, IFRS adoption, value-relevance of equity and earnings

Procedia PDF Downloads 169
6870 Comparative Dielectric Properties of 1,2-Dichloroethane with N-Methylformamide and N,N-Dimethylformamide Using Time Domain Reflectometry Technique in Microwave Frequency

Authors: Shagufta Tabassum, V. P. Pawar, jr., G. N. Shinde

Abstract:

The dielectric relaxation properties of polar liquids in a binary mixture have been studied at temperatures of 10, 15, 20 and 25 °C for 11 different concentrations using the time domain reflectometry technique. The dielectric properties of a solute-solvent mixture of polar liquids in the frequency range of 10 MHz to 30 GHz give information regarding the formation of monomers and multimers, as well as the interactions between the molecules of the liquid mixture under study. The dielectric parameters have been obtained by the least squares fit method using the Debye equation, characterized by a single relaxation time without a distribution of relaxation times.
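
The single-relaxation-time Debye equation referred to above has the standard form:

```latex
\varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\varepsilon_{0} - \varepsilon_{\infty}}{1 + j\omega\tau}
```

where ε₀ is the static dielectric constant, ε∞ the permittivity at high frequency, ω the angular frequency and τ the relaxation time; the fitted parameters are ε₀, ε∞ and τ.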

Keywords: excess properties, relaxation time, static dielectric constant, and time domain reflectometry technique

Procedia PDF Downloads 138
6869 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features

Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova

Abstract:

Emotion recognition is a challenging problem that remains open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison is made between the classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expression and voice data is argued.
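
A minimal sketch of the comparison described above: train SVM classifiers on voice-only, face-only and combined feature sets and compare their cross-validated accuracy. The feature arrays and label set are placeholders, not the features extracted in the study.

```python
# Minimal sketch: SVM accuracy on voice, face and combined features (placeholder data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
voice = rng.normal(size=(n, 20))      # stand-in voice features
face = rng.normal(size=(n, 30))       # stand-in facial features
labels = rng.integers(0, 6, size=n)   # stand-in emotion labels

for name, X in [("voice", voice), ("face", face), ("combined", np.hstack([voice, face]))]:
    acc = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
    print(name, round(acc, 3))
```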

Keywords: emotion recognition, facial recognition, signal processing, machine learning

Procedia PDF Downloads 303
6868 Health Transformation Program and Effects on Health Expenditures

Authors: Zeynep Karacor, Rahime Hulya Ozturk

Abstract:

In recent years, rising population density and the problem of an aging population have drawn attention to health expenditures. In Turkey, a number of regulations and infrastructure changes in the health sector have occurred; these changes are called the Health Transformation Program. The program seeks to improve the productivity of health services, patient satisfaction and the quality of services, and some radical changes have been applied in the Turkish economy in this context. The aim of this paper is to present the effects of the Health Transformation Program on health expenditures. In the first part of the paper, some information about the health system and its applications in Turkey is discussed. In the second part, the aims of the Health Transformation Program are explained, and in the third part the effects of the Health Transformation Program on health expenditures are examined.

Keywords: health transformation program, Turkey, health services, health expenditures

Procedia PDF Downloads 372
6867 RAPD Analysis of Genetic Diversity of Castor Bean

Authors: M. Vivodík, Ž. Balážová, Z. Gálová

Abstract:

The aim of this work was to detect genetic variability among a set of 40 castor genotypes using 8 RAPD markers. Amplification of the genomic DNA of the 40 genotypes using RAPD analysis yielded 66 fragments, with an average of 8.25 polymorphic fragments per primer. The number of amplified fragments ranged from 3 to 13, with the size of the amplicons ranging from 100 to 1200 bp. Values of the polymorphic information content (PIC) ranged from 0.556 to 0.895, with an average of 0.784, and the diversity index (DI) ranged from 0.621 to 0.896, with an average of 0.798. A dendrogram based on hierarchical cluster analysis using the UPGMA algorithm was prepared; the analyzed genotypes were grouped into two main clusters, and only two genotypes could not be distinguished. Knowledge of the genetic diversity of castor can be used in future breeding programs for increased oil production for industrial uses.
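
A minimal sketch of the clustering step described above: a binary band-presence matrix (genotypes × RAPD fragments) turned into a UPGMA dendrogram. The matrix here is random filler, not the study's scoring data.

```python
# Minimal sketch: UPGMA dendrogram from a binary band-scoring matrix.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

bands = np.random.default_rng(2).integers(0, 2, size=(40, 66))   # 40 genotypes, 66 fragments (0/1)
dist = pdist(bands, metric="jaccard")     # pairwise dissimilarity between genotypes
tree = linkage(dist, method="average")    # UPGMA = average linkage
dendrogram(tree, no_plot=True)            # set no_plot=False with matplotlib to draw the tree
```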

Keywords: dendrogram, polymorphism, RAPD technique, Ricinus communis L.

Procedia PDF Downloads 454
6866 Representations of Childcare Robots as a Controversial Issue

Authors: Raya A. Jones

Abstract:

This paper interrogates online representations of robot companions for children, including promotional material by manufacturers, media articles and technology blogs. The significance of the study lies in its contribution to understanding attitudes to robots. The prospect of childcare robots is particularly controversial ethically, and is associated with emotive arguments. The sampled material is restricted to relatively recent posts (the past three years) though the analysis identifies both continuous and changing themes across the past decade. The method extrapolates social representations theory towards examining the ways in which information about robotic products is provided for the general public. Implications for social acceptance of robot companions for the home and robot ethics are considered.

Keywords: acceptance of robots, childcare robots, ethics, social representations

Procedia PDF Downloads 233
6865 Anomaly Detection Based on System Log Data

Authors: M. Kamel, A. Hoayek, M. Batton-Hubert

Abstract:

With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information on network performance. We introduce an algorithm used as a pipeline to help with the preprocessing of such data, group it into patterns, and dynamically label each pattern as anomalous or not. Such tools will provide users and experts with a continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
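
An illustrative sketch of the pipeline idea: normalise raw log lines into templates (patterns), then score each line by the rarity of its template. The masking rules, sample logs and scoring are assumptions for illustration, not the paper's method.

```python
# Minimal sketch: group log lines into templates and flag rare templates as candidate anomalies.
import re
from collections import Counter

def template(line: str) -> str:
    line = re.sub(r"\b\d+\.\d+\.\d+\.\d+\b", "<IP>", line)   # mask IP addresses
    line = re.sub(r"\b\d+\b", "<NUM>", line)                 # mask numbers
    return line.strip()

logs = ["connect from 10.0.0.1 port 22", "connect from 10.0.0.2 port 22",
        "kernel panic at address 0x7f3a"]
counts = Counter(template(l) for l in logs)
total = sum(counts.values())
for line in logs:
    score = counts[template(line)] / total   # low frequency -> more anomalous
    print(f"{score:.2f}  {line}")
```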

Keywords: logs, anomaly detection, ML, scoring, NLP

Procedia PDF Downloads 76
6864 Unsupervised Learning of Spatiotemporally Coherent Metrics

Authors: Ross Goroshin, Joan Bruna, Jonathan Tompson, David Eigen, Yann LeCun

Abstract:

Current state-of-the-art classification and detection algorithms rely on supervised training. In this work, we study unsupervised feature learning in the context of temporally coherent video data. We focus on feature learning from unlabeled video data, using the assumption that adjacent video frames contain semantically similar information. This assumption is exploited to train a convolutional pooling auto-encoder regularized by slowness and sparsity. We establish a connection between slow feature learning and metric learning and show that the trained encoder can be used to define a more temporally and semantically coherent metric.
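
A minimal sketch, assuming PyTorch and a toy convolutional auto-encoder, of the regularised objective described above: reconstruction plus a slowness penalty on the codes of adjacent frames and an L1 sparsity penalty. It is an illustration of the idea, not the authors' architecture or weights.

```python
# Minimal sketch: reconstruction + slowness + sparsity objective for adjacent frames.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PoolingAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU())
        self.dec = nn.ConvTranspose2d(16, 1, 5, stride=2, padding=2, output_padding=1)

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

model = PoolingAE()
x_t, x_t1 = torch.randn(8, 1, 32, 32), torch.randn(8, 1, 32, 32)   # stand-ins for adjacent frames
recon_t, z_t = model(x_t)
recon_t1, z_t1 = model(x_t1)

loss = (F.mse_loss(recon_t, x_t) + F.mse_loss(recon_t1, x_t1)      # reconstruction
        + 0.1 * (z_t - z_t1).abs().mean()                           # slowness: adjacent codes stay close
        + 0.01 * (z_t.abs().mean() + z_t1.abs().mean()))            # sparsity
loss.backward()
```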

Keywords: machine learning, pattern clustering, pooling, classification

Procedia PDF Downloads 438
6863 Escalation of Commitment and Turnover in Top Management Teams

Authors: Dmitriy V. Chulkov

Abstract:

Escalation of commitment is defined as the continuation of a project after receiving negative information about it. While the literature in management and psychology has identified various factors contributing to escalation behavior, this phenomenon has received little analysis in economics, potentially due to the apparent irrationality of escalation. In this study, we present an economic model of escalation with asymmetric information in a principal-agent setup where the agents are responsible for a project selection decision and discover the outcome of the project before the principal. Our theoretical model complements the existing literature on several accounts. First, we link the incentive to escalate commitment to a project with the turnover decision by the manager. When a manager learns the outcome of the project and stops it, that reveals that a mistake was made. There is an incentive to continue failing projects and avoid admitting the mistake. This incentive is enhanced when the agent may voluntarily resign from the firm before the outcome of the failing project is revealed, and thus not bear the full extent of the reputation damage due to project failure. As long as some successful managers leave the firm for extraneous reasons, outside firms find it difficult to link failing projects with certainty to managers who left a firm. Second, we demonstrate that non-CEO managers have reputation concerns separate from those of the CEO, and thus may escalate commitment to projects they oversee when such escalation can attenuate damage to their reputation from impending project failure. Such an incentive for escalation will be present for non-CEO managers if the CEO delegates responsibility for a project to a non-CEO executive. If reputation matters for promotion to the CEO position, the incentive for a rising executive to escalate in order to protect reputation is distinct from that of a CEO. Third, our theoretical model is supported by empirical analysis of changes in the firm's operations, measured by the presence of discontinued operations at the time of turnover among the top four members of the top management team. Discontinued operations are indicative of the termination of failing projects at a firm. The empirical results demonstrate that, in a large dataset of over three thousand publicly traded U.S. firms for the period from 1993 to 2014, turnover by top executives significantly increases the likelihood that the firm discontinues operations. Furthermore, the type of turnover matters, as this effect is strongest when at least one non-CEO member of the top management team leaves the firm and when the CEO departure is due to a voluntary resignation and not to retirement or illness. The empirical results are consistent with the predictions of the theoretical model and suggest that escalation of commitment is primarily observed in decisions by non-CEO members of the top management team.
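
A minimal sketch (not the paper's specification) of the kind of empirical test described above, as a logistic regression of discontinued operations on turnover indicators; the column names, coefficients and synthetic data are assumptions for illustration only.

```python
# Minimal sketch: logit of discontinued operations on CEO and non-CEO turnover (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000
panel = pd.DataFrame({
    "ceo_turnover": rng.integers(0, 2, n),
    "nonceo_turnover": rng.integers(0, 2, n),
})
# hypothetical outcome generated so discontinuations are more likely after non-CEO turnover
xb = -2 + 0.3 * panel["ceo_turnover"] + 0.6 * panel["nonceo_turnover"]
panel["discontinued"] = rng.binomial(1, 1 / (1 + np.exp(-xb)))

result = smf.logit("discontinued ~ ceo_turnover + nonceo_turnover", data=panel).fit()
print(result.summary())
```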

Keywords: discontinued operations, escalation of commitment, executive turnover, top management teams

Procedia PDF Downloads 352
6862 Assessment and Analysis of Literary Criticism and Consumer Research

Authors: Mohammad Mirzaei

Abstract:

This article proposes literary criticism as a source of insight into consumer behavior, provides an extensive overview of literary criticism, offers a concrete illustrative analysis, and suggests directions for further research. To do so, a literary analysis of advertising copy identifies elements that provide additional information to consumer researchers, and the contribution of literary criticism to consumer research is discussed. Important post-war critical schools of thought are reviewed, and relevant theoretical concepts are summarized. Ivory Flakes advertisements are analyzed using a variety of concepts drawn from literary schools, primarily sociocultural and reader-response criticism. Suggestions for further research on content analysis, image analysis, and consumption history are presented.

Keywords: consumer behaviour, consumer research, consumption history, criticism

Procedia PDF Downloads 82
6861 Heart Murmurs and Heart Sounds Extraction Using an Algorithm Process Separation

Authors: Fatima Mokeddem

Abstract:

The phonocardiogram (PCG) signal is a physiological signal that reflects the mechanical activity of the heart and a promising tool for researchers in this field, because it is full of indications and useful information for medical diagnosis. PCG segmentation is a basic step in benefiting from this signal. Therefore, this paper presents an algorithm that separates heart sounds and heart murmurs, where they exist, so that they can be used in several applications and in heart sound analysis. The separation process presented here is founded on three essential steps: filtering, envelope detection, and heart sound segmentation. The algorithm separates the PCG signal into S1 and S2 and extracts cardiac murmurs.
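
A minimal sketch of the filtering and envelope-detection steps described above, applied to a synthetic PCG-like signal; the band edges, sampling rate and toy signal are illustrative choices, not the paper's exact settings.

```python
# Minimal sketch: band-pass filtering and Hilbert-envelope detection of a synthetic PCG-like signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 2000                                       # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
pcg = np.sin(2 * np.pi * 50 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.95)   # toy S1/S2-like bursts

b, a = butter(4, [25 / (fs / 2), 400 / (fs / 2)], btype="band")   # filtering step
filtered = filtfilt(b, a, pcg)
envelope = np.abs(hilbert(filtered))            # envelope detection step
# segmentation would then threshold `envelope` to locate S1/S2 and any residual murmurs
```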

Keywords: phonocardiogram signal, filtering, envelope detection, murmurs, heart sounds

Procedia PDF Downloads 125