Search results for: data acquisition performance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 34095

33525 Bridging the Data Gap for Sexism Detection in Twitter: A Semi-Supervised Approach

Authors: Adeep Hande, Shubham Agarwal

Abstract:

This paper presents a study on identifying sexism in online texts using state-of-the-art deep learning models based on BERT. We experimented with different feature sets and model architectures and evaluated their performance using precision, recall, F1 score, and accuracy. We also explored the use of pseudo-labeling to improve model performance. Our experiments show that the best-performing models were BERT-based, with the multilingual model achieving an F1 score of 0.83. Furthermore, pseudo-labeling significantly improved the performance of the BERT-based models, which achieved the best overall results with that technique. Our findings suggest that BERT-based models with pseudo-labeling hold great promise for identifying sexism in online texts with high accuracy.
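
Below is a minimal sketch of the pseudo-labeling loop the abstract describes, using a TF-IDF plus logistic regression stand-in for BERT; the texts, labels, and 0.8 confidence threshold are illustrative assumptions, not the authors' setup.

```python
import numpy as np
import scipy.sparse as sp
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_texts = ["you throw like a girl", "great match yesterday"]
labels = np.array([1, 0])                      # 1 = sexist, 0 = not sexist (toy data)
unlabeled_texts = ["women can't drive", "the weather is nice today"]

vec = TfidfVectorizer()
X_lab = vec.fit_transform(labeled_texts)
X_unl = vec.transform(unlabeled_texts)

model = LogisticRegression().fit(X_lab, labels)

# Pseudo-label only the unlabeled examples the model is confident about.
probs = model.predict_proba(X_unl)
confident = probs.max(axis=1) >= 0.8           # confidence threshold (assumption)
pseudo_labels = probs.argmax(axis=1)[confident]

# Retrain on the union of gold-labeled and pseudo-labeled data.
X_aug = sp.vstack([X_lab, X_unl[confident]])
y_aug = np.concatenate([labels, pseudo_labels])
model = LogisticRegression().fit(X_aug, y_aug)
```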

Keywords: large language models, semi-supervised learning, sexism detection, data sparsity

Procedia PDF Downloads 66
33524 Improving the Performance of Requisition Document Online System for Royal Thai Army by Using Time Series Model

Authors: D. Prangchumpol

Abstract:

This research presents a method for forecasting requisition document demand for military units, using exponential smoothing to analyze the data. The data used in the forecast are actual requisition documents of The Adjutant General Department. The results show that the Holt–Winters trend-and-seasonality method with α=0.1, β=0, γ=0 is appropriate and fits the requisition document data. In addition, the researcher has developed an online requisition system to improve the handling of requisition documents in The Adjutant General Department and to ensure that operations can be audited.
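
For illustration, here is a sketch of fitting Holt–Winters with the reported smoothing parameters (α=0.1, β=0, γ=0) using statsmodels on synthetic monthly demand, since the actual requisition data is not public.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
idx = pd.date_range("2010-01-01", periods=48, freq="MS")
# Synthetic monthly demand with trend and yearly seasonality (placeholder data).
demand = pd.Series(100 + np.arange(48)
                   + 10 * np.sin(np.arange(48) * 2 * np.pi / 12)
                   + rng.normal(0, 3, 48), index=idx)

fit = ExponentialSmoothing(demand, trend="add", seasonal="add",
                           seasonal_periods=12,
                           initialization_method="heuristic").fit(
    smoothing_level=0.1,      # alpha, as reported in the abstract
    smoothing_trend=0.0,      # beta
    smoothing_seasonal=0.0,   # gamma
    optimized=False)

print(fit.forecast(12))       # demand forecast for the next 12 months
```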

Keywords: requisition, holt–winters, time series, royal thai army

Procedia PDF Downloads 301
33523 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data-assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of locations. The third generation of reanalysis data emerged in 2010; among these products is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° × 0.5°. To overcome these difficulties, this study evaluates the performance of solar radiation estimation from alternative databases, such as reanalysis data and meteorological satellite data, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the CFSR reanalysis data performed well against the observed data, with a coefficient of determination around 0.90. It is therefore concluded that these data have the potential to be used as an alternative source in locations lacking stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
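
As a minimal illustration of the reported agreement check, the sketch below computes the coefficient of determination between an observed and a reanalysis series; the values are placeholders, not CFSR data.

```python
import numpy as np

observed = np.array([18.2, 21.5, 16.9, 24.1, 19.8, 22.3])    # e.g., MJ/m^2/day
reanalysis = np.array([17.8, 22.0, 16.1, 23.5, 20.4, 21.7])  # placeholder values

# Coefficient of determination of the linear relation between the two series.
r = np.corrcoef(observed, reanalysis)[0, 1]
print(f"R^2 = {r**2:.2f}")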

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 207
33522 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With this increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that researchers can consider modeling the folding of a protein or even simulating an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data, and illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century. It shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. The paper also indicates solutions to optimization problems and the benefits of Big Data for computational biology, and surveys the current state of the art and future generations of HPC with Big Data in biology.

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 360
33521 Development of an Information System Based on the Establishment and Evaluation of Performance Rating by Application Part/Type of Remodeling Element Technologies

Authors: Sungwon Jung

Abstract:

Apartment houses 20 years old or older account for approximately 20% of housing in South Korea (1.55 million units), and an explosive increase in aged houses is expected around the first planned new towns. Accordingly, we should prepare for social issues such as difficulties in housing lease and degradation of housing performance. Improving the performance of aged houses is essential for achieving the national energy and carbon reduction goals, and techniques must be developed to respond to the changing construction environment. Furthermore, a performance evaluation system should be developed that is appropriate to the demands of residents, such as improving remodeling floor plans in line with the residence types of housing-vulnerable groups such as low-income households and elderly people living alone. For this purpose, remodeling techniques and business models optimized for the target complexes must be disseminated through the development of various business models. In addition, it is necessary to improve the remodeling business by improving the related laws and systems governing residential performance improvement, and to prepare techniques to respond to the increasing business demand. In other words, performance improvement, performance evaluation, and knowledge systems need to be researched as new issues in remodeling that have not been addressed in existing research.

Keywords: remodelling, performance evaluation, web-based system, big data

Procedia PDF Downloads 222
33520 Cash Flow Position and Corporate Performance: A Study of Selected Manufacturing Companies in Nigeria

Authors: Uzoma Emmanuel Igboji

Abstract:

The study investigates the effects of cash flow position on corporate performance in the manufacturing sector of Nigeria, using multiple regression techniques. It involved a survey of five (5) manufacturing companies quoted on the Nigerian Stock Exchange, with data obtained from the annual reports of the selected companies. The results show that operating and financing cash flow have a significant positive relationship with corporate performance, while investing cash flow has a significant negative relationship. The researcher recommends that the regulatory authorities encourage external auditors of these quoted companies to use cash flow ratios in evaluating a company's performance before expressing an independent opinion on the financial statements. This will give detailed financial information to existing and potential investors to make informed economic decisions.
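
A sketch of this kind of multiple regression in statsmodels, run on synthetic stand-ins for the annual-report figures; the variable names and coefficients are illustrative, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50  # firm-year observations (synthetic)
df = pd.DataFrame({
    "operating_cf": rng.normal(10, 3, n),
    "investing_cf": rng.normal(-4, 2, n),
    "financing_cf": rng.normal(2, 2, n),
})
# Performance constructed with positive operating/financing and negative
# investing effects, mirroring the abstract's reported signs.
df["performance"] = (0.6 * df.operating_cf - 0.3 * df.investing_cf
                     + 0.4 * df.financing_cf + rng.normal(0, 1, n))

X = sm.add_constant(df[["operating_cf", "investing_cf", "financing_cf"]])
print(sm.OLS(df["performance"], X).fit().summary())
```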

Keywords: cash flow, financing, performance, operating

Procedia PDF Downloads 312
33519 Avoidance and Selectivity in the Acquisition of Arabic as a Second/Foreign Language

Authors: Abeer Heider

Abstract:

This paper explores and classifies the different kinds of avoidance that students commonly exhibit in the acquisition of Arabic as a second/foreign language, and suggests specific strategies to help students lessen their avoidance tendencies in hopes of streamlining the learning process. Students most commonly use avoidance strategies in grammar and word choice. These different types of strategies have different implications and naturally require different approaches, so the question remains as to the most effective way to help students improve their Arabic and how teachers can efficiently utilize these techniques. It is hoped that this research will contribute to understanding the role of avoidance in the field of second language acquisition in general, and as a type of input. Some researchers note that similarity between L1 and L2 may be problematic as well, since the learner may doubt that such similarity indeed exists and consequently avoid the identical constructions or elements (Jordens, 1977; Kellermann, 1977, 1978, 1986). In an effort to resolve this issue, a case study is being conducted. The present case study attempts to provide a broader analysis of what is acquired than is usually the case, analyzing the learners' accomplishments in terms of the three-part framework of the components of communicative competence suggested by Michael Canale: grammatical competence, sociolinguistic competence, and discourse competence. The subjects of this study are 15 advanced-level students who came to study Arabic at Qatar University of Cairo; all had completed the intermediate level in Arabic when they first arrived in Qatar. The study uses discourse-analytic methods to examine how the first language affects students' production and output in the second language, and how and when students use avoidance in their learning. It will be conducted through Fall 2015 by analyzing audio recordings made throughout the entire semester, around 30 clips in total. The students use supplementary listening and speaking materials, and the group will be tested at the end of the term to assess any measurable difference between the techniques. Questionnaires will be administered to teachers and students before and after the semester to assess any change in attitude toward avoidance and selectivity; responses are analyzed and discussed to assess the relative merits of the aforementioned strategies. Implications and recommendations for teacher training are proposed.

Keywords: the second language acquisition, learning languages, selectivity, avoidance

Procedia PDF Downloads 275
33518 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of their data. The concept was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity about responsibilities. With data mesh, domain experts are responsible for managing their own data, which provides clarity in roles and responsibilities and improves data quality. Additionally, data mesh can contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by enabling better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization, leading to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts above are illustrated by feedback from the experience of AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 129
33517 “To Err Is Human…” Revisiting Oral Error Correction in Class

Authors: David Steven Rosenstein

Abstract:

The widely accepted “Input Theory” of language acquisition proposes that language is basically acquired unconsciously through extensive exposure to all kinds of natural oral and written sources, especially those where the level of the input is slightly above the learner’s competence. As such, it implies that oral error correction by teachers in a classroom is unnecessary, a waste of time, and maybe even counterproductive. And yet, oral error correction by teachers in the classroom continues to be a very common phenomenon. While input theory advocates claim that such correction doesn’t work, interrupts a student’s train of thought, harms fluency, and may cause students embarrassment and fear, many teachers would disagree. They would claim that students know they make mistakes and want to be corrected in order to know they are improving, thereby encouraging students’ desire to keep studying. Moreover, good teachers can create a positive atmosphere where students will not be embarrassed or fearful. Perhaps now is the time to revisit oral error correction in the classroom and consider the results of research carried out long ago by the present speaker. The research indicates that oral error correction may be beneficial in many cases.

Keywords: input theory, language acquisition, teachers' corrections, recurrent errors

Procedia PDF Downloads 27
33516 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model

Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon

Abstract:

The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, evaluated with three performance indicators: mean absolute error (MAE), root mean square error (RMSE), and estimated bias (EB). In this study, we applied both methods to impute missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
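
A sketch of EM-style iterative imputation together with the three performance indicators, with scikit-learn's IterativeImputer standing in for the EM algorithm; this is an illustration of the evaluation idea, not the authors' exact implementation.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
x = rng.normal(10, 2, 200)
y = 1.5 * x + rng.normal(0, 1, 200)          # linear relationship between x and y
data = np.column_stack([x, y])

mask = rng.random(200) < 0.1                 # delete ~10% of y at random
truth = data[mask, 1].copy()
data_missing = data.copy()
data_missing[mask, 1] = np.nan

# EM-like round-robin imputation of the missing y values.
imputed = IterativeImputer(max_iter=50).fit_transform(data_missing)[mask, 1]

mae = np.mean(np.abs(imputed - truth))
rmse = np.sqrt(np.mean((imputed - truth) ** 2))
eb = np.mean(imputed - truth)                # estimated bias
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  EB={eb:.3f}")
```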

Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators

Procedia PDF Downloads 396
33515 Investigation of Delivery of Triple Play Data in GE-PON Fiber to the Home Network

Authors: Ashima Anurag Sharma

Abstract:

Optical fiber based networks can deliver performance that supports the increasing demand for high-speed connections. One of the technologies that has emerged in recent years is the passive optical network. This paper demonstrates the simultaneous delivery of triple play services (data, voice, and video) and presents a comparison between various data rates. It is shown that as the data rate increases, the number of users that can be supported decreases due to the increase in bit error rate.

Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT

Procedia PDF Downloads 523
33514 Aesthetic Preference and Consciousness in African Theatre: A Performance Appraisal of Tyrone Terrence's A Husband's Wife

Authors: Oluwatayo Isijola

Abstract:

The destructive influence of Europe on Africa has also taken a toll on the aesthetic essence of African art, which centres on morality and value for human life. In a parallel vein, the adverse turn of this influence in the dramaturgy of some contemporary African plays impedes audience consciousness in performance engagements. Through the spectrum of African aesthetics, this study attempts a performance appraisal of A Husband's Wife, an unpublished play written by Tyrone Terence for the African audience. The researcher proffers two variant textual interpretations of the play to evaluate performance engagement: one in its default realistic mode, which holds an unresolved 'Medean impulse', and another wherein the resolution is treated to a paradigm shift for aesthetic preference. The investigation employs the mixed method, combining quantitative and qualitative methodologies. Keen observation of the reactions and responses of audience members engaged in both performances, and on-the-spot interviews with selected audience members, were the primary sources of qualitative data; quantitative data were captured in an on-the-spot survey using a questionnaire served to a sample of the audience. The study observes that the preference for African aesthetics, as exemplified in the second performance with its paradigm shift, did enhance audience consciousness. Drawing on performance aesthetic theory, the paper recommends that African plays bearing such aesthetic shortcomings be appropriately treated to paradigm shifts for performance engagement, in the interest of enhancing audience consciousness in the Nigerian theatre.

Keywords: African aesthetics, audience consciousness, paradigm shift, Medean-impulse

Procedia PDF Downloads 328
33513 Corporate Performance and Balance Sheet Indicators: Evidence from Indian Manufacturing Companies

Authors: Hussain Bohra, Pradyuman Sharma

Abstract:

This study highlights the significance of balance sheet indicators for corporate performance in the case of Indian manufacturing companies. Balance sheet indicators show the actual financial health of a company; they help external investors choose the right company for their investment and help external financing agencies extend finance to manufacturing companies with confidence. The period of study is 2000 to 2014, covering 813 manufacturing companies for which continuous data are available throughout the study period. The data are collected from the PROWESS database maintained by the Centre for Monitoring Indian Economy Pvt. Ltd. Panel data methods, namely fixed effect and random effect models, are used for the analysis. The likelihood ratio test, Lagrange multiplier test, and Hausman test results support the suitability of the fixed effect model for the estimation. Return on assets (ROA) is used as the proxy for corporate performance; it is the most widely used proxy in the corporate performance literature and reflects the return on firms' long-term investment projects. Ratios such as the current ratio, debt-equity ratio, receivables turnover ratio, and solvency ratio are used as proxies for the balance sheet indicators, with firm-specific variables such as firm size and sales as control variables. The empirical analysis finds that all selected financial ratios have a significant and positive impact on corporate performance, as do firm sales and firm size. To check the robustness of the results, the sample was divided into subsamples: firms with high versus low debt-equity ratios, high versus low current ratios, high versus low receivables turnover, and high versus low solvency ratios. The results are robust across all of these subsamples, and the results for the other variables are in line with those for the whole sample. These findings confirm that balance sheet indicators play a significant role in corporate performance in India. They imply that corporate managers should monitor these ratios to maintain the minimum expected level of performance, and should also maintain adequate sales and total assets to improve corporate performance.
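
A sketch of the fixed-effects estimation described above, run on synthetic firm-year data (the PROWESS data is proprietary) using the linearmodels package as one possible tool; the variable names and coefficients are illustrative.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(2)
firms, years = 20, 10
idx = pd.MultiIndex.from_product([range(firms), range(2000, 2000 + years)],
                                 names=["firm", "year"])
df = pd.DataFrame({
    "current_ratio": rng.normal(1.5, 0.3, firms * years),
    "debt_equity": rng.normal(1.0, 0.4, firms * years),
    "size": rng.normal(6.0, 1.0, firms * years),
}, index=idx)
df["roa"] = (0.05 * df.current_ratio - 0.02 * df.debt_equity
             + 0.01 * df.size + rng.normal(0, 0.02, firms * years))

# Entity (firm) fixed effects, the specification favoured by the Hausman test.
res = PanelOLS.from_formula(
    "roa ~ current_ratio + debt_equity + size + EntityEffects", data=df).fit()
print(res)
```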

Keywords: balance sheet, corporate performance, current ratio, panel data method

Procedia PDF Downloads 261
33512 Investigating the Use of English-Arabic Codeswitching in EFL Classroom Oral Discourse. Case Study: Middle School Pupils of Ain Fekroun, Wilaya of Oum El Bouaghi, Algeria

Authors: Fadila Hadjeris

Abstract:

The study aims at investigating the functions of English-Arabic code switching in English as a foreign language (EFL) classroom oral discourse and the extent to which they can contribute to the flow of classroom interaction. It also seeks to understand the views, beliefs, and perceptions of teachers and learners towards this practice. We hypothesized that code switching is a communicative strategy which facilitates classroom interaction and that, for this reason, both teachers and learners support its use. The study draws on a key body of literature in bilingualism, second language acquisition, and classroom discourse in an attempt to provide a framework for considering the research questions, and employs a combination of qualitative and quantitative research methods, including classroom observations and questionnaires. The analysis of the recordings shows that teachers' code switching to Arabic is not used only for academic and classroom management reasons; rather, the data display instances in which code switching is used for social reasons. The analysis of the questionnaires indicates that teachers and pupils have different attitudes towards this phenomenon. Teachers reported their deliberate switching during EFL teaching, yet the majority were against the practice: according to them, the use of the mother tongue has detrimental effects on the acquisition and practice of the target language. In contrast, pupils showed a preference for their teachers' code switching because it enhances and facilitates their understanding. These findings support the view that the shift to pupils' mother tongue is a strategy which aids and facilitates the teaching and learning of the target language. Recommendations are accordingly offered to teachers and course designers.

Keywords: bilingualism, codeswitching, classroom interaction, classroom discourse, EFL learning/ teaching, SLA

Procedia PDF Downloads 471
33511 Effect of Recruitment and Selection on Employee Performance in Hospitality Industries

Authors: Yusuf A. Bako, Olubunmi O. Kolawole

Abstract:

This study sought to establish the effect of recruitment and selection on employee performance in hospitality industries. The success of any organization in the modern business environment depends on the caliber of the manpower that steers its affairs. History has shown that recruitment and selection, as a human resources management practice, play a pivotal role in determining the level of employee performance in an organization. The hospitality industries have faced performance challenges due to unconventional selection and placement practices: poor candidate-selection policy, inconsistent selection processes, sidestepping of employment tests and interviews, godfatherism, and regional bias in selection. The overall objective of the study was to determine how recruitment and selection affect employee performance in the hospitality industry in Ogun State, Nigeria. The study adopts a descriptive and inferential research design, with the population drawn from leading hotels in Ogun State, Nigeria. The sample size was 100 employees; a questionnaire was used to collect data, and Cronbach's alpha was used to test the instrument. The results reveal that the correlation between employee performance and recruitment and selection was highly significant.

Keywords: employee performance, human resources management, practices, recruitment, selection

Procedia PDF Downloads 370
33510 The Effect of the Internal Organization Communications' Effectiveness through Employee's Performance of Faculty of Management Science, Suan Sunandha Rajabhat University

Authors: Malaiphan Pansap, Surasit Vithayarat

Abstract:

The purpose of this study was to examine the relationship between the effectiveness of internal organizational communication and employee performance in the Faculty of Management Science, Suan Sunandha Rajabhat University, and to study solutions to communication problems within the organization. A questionnaire was used to collect information from 136 staff members and instructors, and the data were analyzed using frequency, percentage, mean, and standard deviation in a statistical processing program. The results show that the communication factors affecting employee performance are senders who lack the speaking and writing skills needed to convince audiences and prepare messages, and messages of which the organization is not always informed. Employees believe that good organizational communication has a positive impact on the development of the organization because employees feel involved and part of the organization; through cooperation in working towards a common goal, employees can work in the same direction and meet goals quickly.

Keywords: employee’s performance, faculty of management science, internal organization communications’ effectiveness, management accounting, Suan Sunandha Rajabhat University

Procedia PDF Downloads 235
33509 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN), and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers against the ANN and AIS. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK), and atrial fibrillation (AF) data for each of the RR intervals were found. These data were then combined into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK, and NSR-AF), a discrete wavelet transform was applied to each of the two groups of data in every pair, and, after data reduction, two different data sets with 9 and 27 features were obtained. Afterwards, the data were first shuffled within themselves, and then 4-fold cross validation was applied to create the training and testing data. The training and testing accuracy rates and training times were compared. As a result, the performance of the hybrid classification systems, AIS-ANN and PSO-ANN, was seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN, and AIS, respectively. The features extracted from the data also affected the classification results significantly.
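
A sketch of the described pipeline: discrete wavelet features from RR-interval segments followed by a neural-network classifier under 4-fold cross-validation. The data, wavelet choice, and network size are illustrative assumptions, not the study's settings.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def dwt_features(segment, wavelet="db4", level=3):
    # Concatenate approximation and detail coefficients as the feature vector.
    coeffs = pywt.wavedec(segment, wavelet, level=level)
    return np.concatenate(coeffs)

# Synthetic RR-interval segments for a two-class pair (e.g., NSR vs. APC):
# APC segments are given higher variability for illustration only.
nsr = [rng.normal(0.8, 0.02, 64) for _ in range(40)]
apc = [rng.normal(0.8, 0.10, 64) for _ in range(40)]
X = np.array([dwt_features(s) for s in nsr + apc])
y = np.array([0] * 40 + [1] * 40)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
print(cross_val_score(clf, X, y, cv=4).mean())   # 4-fold CV accuracy
```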

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 439
33508 Technical Aspects of Closing the Loop in Depth-of-Anesthesia Control

Authors: Gorazd Karer

Abstract:

When performing a diagnostic procedure or surgery in general anesthesia (GA), the proper introduction and dosing of anesthetic agents is one of the main tasks of the anesthesiologist. However, depth of anesthesia (DoA) also seems to be a suitable process for closed-loop control implementation. To implement such a system, one must be able to acquire the relevant signals online and in real-time, as well as stream the calculated control signal to the infusion pump. However, during a procedure, patient monitors and infusion pumps are purposely unable to connect to an external (possibly medically unapproved) device for safety reasons, thus preventing closed-loop control. The paper proposes a conceptual solution to this problem. First, it presents some important aspects of contemporary clinical practice. Next, it introduces the closed-loop control system structure and the relevant information flow. Focusing on transferring data from the patient to the computer, it presents a non-invasive image-based system for signal acquisition from a patient monitor for online depth-of-anesthesia assessment. Furthermore, it introduces a UDP-based communication method that can be used for transmitting the calculated anesthetic inflow to the infusion pump. The proposed system is independent of any medical device manufacturer and is implemented in Matlab-Simulink, which can conveniently be used for DoA control implementation. The proposed scheme has been tested in a simulated GA setting and is ready to be evaluated in an operating theatre. However, the proposed system is only a step towards a proper closed-loop control system for DoA that could routinely be used in clinical practice.
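
A minimal sketch of such a UDP link on the controller side, sending the computed anesthetic inflow towards the pump; the address, port, and message format are assumptions for illustration, not the paper's protocol.

```python
import socket

PUMP_ADDR = ("192.168.1.50", 5005)   # hypothetical pump-side endpoint

def send_inflow(rate_ml_per_h: float) -> None:
    """Transmit the calculated infusion rate as a plain-text UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(f"INFUSION_RATE {rate_ml_per_h:.2f}".encode(), PUMP_ADDR)

send_inflow(42.5)  # e.g., the controller's current output in ml/h
```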

Keywords: closed-loop control, depth of anesthesia (DoA), modeling, optical signal acquisition, patient state index (PSi), UDP communication protocol

Procedia PDF Downloads 213
33507 Employee Job Performance and Supervisor Workplace Gossip: Employee Job Engagement's Mediation Effect

Authors: Pphakamani Irvine Dlamini

Abstract:

The impact of supervisor gossip on subordinate work performance was investigated in this paper. The paper postulated that supervisor gossip, both negative and positive, has an impact on employee job engagement, which in turn has an impact on employee job performance. Data were collected from 238 employees and supervisors of the Mpumalanga Government Municipality in South Africa using a dyadic study approach: employees responded to questions on supervisor gossip and job engagement, while supervisors responded to questions about employee work performance. Three waves of data gathering were carried out. Favourable supervisor gossip had a positive and substantial effect on employee job engagement, which increased employee job performance, but negative supervisor gossip had a positive yet insignificant effect on employee job engagement. The study was limited by the multicultural character of the municipality, as well as by causation concerns and common method biases associated with the research design. After disentangling the supervisor-subordinate reciprocal communication web using Social Exchange Theory (SET), the study suggests that managers should instil effective ways of handling both positive and negative gossip in the workplace to achieve favourable employee outcomes: positive gossip creates workplace rivalry and competition, while negative gossip creates tension, stress, and mistrust among employees. This study assessed the implications of supervisor gossip for employee job engagement and performance in the public service sector, whose employees are characterised by high job security compared to their peers in the private sector.

Keywords: workplace gossip, supervisor, employee engagement, LMX

Procedia PDF Downloads 119
33506 Increasing a Computer Performance by Overclocking Central Processing Unit (CPU)

Authors: Witthaya Mekhum, Wutthikorn Malikong

Abstract:

The objective of this study is to investigate the increase in desktop computer performance after overclocking the central processing unit (CPU), i.e., running the component at a higher clock rate (more clock cycles per second) than it was designed for, in steps of 0.1 GHz (100 MHz) from 4.0 GHz to 4.5 GHz. Performance was tested at each level with four programs: Hyper PI ver. 0.99b, Cinebench R15, LinX ver. 0.6.4, and WinRAR. After the CPU overclock, computer performance increased. With the CPU overclocked by 29%, performance as measured by Hyper PI ver. 0.99b increased by 10.03%, by Cinebench R15 by 20.05%, and by LinX by 16.61%; however, it increased by only 8.14% when tested with WinRAR. Computer performance did not increase in proportion to the overclock rate because the computer consists of many other components, such as random access memory (RAM), hard disk drive, motherboard, and display card.

Keywords: overclock, performance, central processing unit, computer

Procedia PDF Downloads 279
33505 Biosignal Recognition for Personal Identification

Authors: Hadri Hussain, M.Nasir Ibrahim, Chee-Ming Ting, Mariani Idroas, Fuad Numan, Alias Mohd Noor

Abstract:

A biometric security system has become an important application in client identification and verification systems. A conventional biometric system is normally based on unimodal biometrics, depending on either behavioural or physiological information for authentication purposes. The biometrics considered here comprise the human speech signal and biosignals such as the electrocardiogram (ECG) and the phonocardiogram or heart sound (HS). The speech signal is commonly used in biometric recognition systems, while the ECG and the HS have been used to identify a person's diseases, which are uniquely related to their cluster. However, a conventional biometric system is liable to spoof attacks that affect the performance of the system. Therefore, a multimodal biometric security system was developed, based on the biometric signals of ECG, HS, and speech. The biosignal data involved in the biometric system are initially segmented, and the Mel-frequency cepstral coefficients (MFCC) method is used to extract features from each segment. A hidden Markov model (HMM) is used to model each client and to classify unknown inputs with respect to the models. The recognition system involves training and testing sessions, known as client identification (CID). In this project, twenty clients were tested with the developed system. At 44 kHz, the best overall performance was 93.92% for ECG, and the worst overall performance was 88.47%, also for ECG. When the number of clients was increased, the best overall performance at 44 kHz was 90.00% for HS, and the worst fell to 79.91% for ECG. It can be concluded that the choice of modality has a substantial effect on the performance of the biometric system and that, with more data, even at a higher sampling frequency, performance still decreased slightly, as predicted.
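
A sketch of the per-client modeling step: MFCC features scored against a client's Gaussian HMM, using librosa and hmmlearn as stand-ins for the study's toolchain. The file names, segment format, and model sizes are hypothetical.

```python
import librosa
from hmmlearn.hmm import GaussianHMM

def mfcc_features(path, sr=44100, n_mfcc=13):
    # Load a biosignal segment stored as audio and return (frames, n_mfcc).
    signal, _ = librosa.load(path, sr=sr)
    return librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc).T

# Enrollment: one HMM per client, trained on that client's segments.
train = mfcc_features("client01_ecg_enroll.wav")          # hypothetical file
client_model = GaussianHMM(n_components=5, covariance_type="diag",
                           n_iter=20).fit(train)

# Identification: score an unknown segment against each client's model and
# pick the highest log-likelihood (shown here for a single model).
test = mfcc_features("unknown_segment.wav")                # hypothetical file
print(client_model.score(test))
```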

Keywords: electrocardiogram, phonocardiogram, hidden markov model, mel frequency cepstral coeffiecients, client identification

Procedia PDF Downloads 275
33504 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization, and planning despite a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology developed with the objective of enabling efficient, temporarily deployable data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization in SMEs. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE) transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for the interpretation of relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition, as well as methods for visualizing relative distance data. Because the database is already categorized by process type, classification methods from the field of supervised learning (e.g. support vector machines) are used. Achieving the necessary data quality requires the selection of suitable methods and filters for smoothing the signal variations of the RSSI, the integration of methods for determining correction factors depending on possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based also have a significant influence on the result quality of the classification methods, correction models, and position-profile visualization methods. Studies have already shown that the accuracy of classification algorithms can be improved by up to 30% through selected parameter variation, and similar potential can be observed when varying the parameters of signal smoothing methods and filters. There is thus increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized in a modular software architecture consisting of independent modules for data acquisition, data preparation, and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including the signal smoothing methods, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting via a graphical user interface (GUI).
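
A sketch of the two preprocessing steps named above: exponential smoothing of raw RSSI readings and conversion to distance via a log-distance path-loss model. The calibration constants (RSSI at 1 m, path-loss exponent, smoothing factor) are assumptions, not measured values from the study.

```python
import numpy as np

def smooth_rssi(rssi, alpha=0.25):
    """Exponential moving average to damp RSSI fluctuations."""
    out = [rssi[0]]
    for r in rssi[1:]:
        out.append(alpha * r + (1 - alpha) * out[-1])
    return np.array(out)

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path loss: tx_power is the RSSI at 1 m (dBm),
    n the path-loss exponent (both assumed calibration values)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

raw = np.array([-62, -70, -65, -80, -66, -64])      # raw beacon readings (dBm)
print(rssi_to_distance(smooth_rssi(raw)))           # estimated distances in metres
```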

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 125
33503 The Challenges of Digital Crime Nowadays

Authors: Bendes Ákos

Abstract:

Digital evidence will be the most widely used type of evidence in the future. With the development of the modern world, more and more new types of crime have evolved and transformed. For this reason, it is extremely important to examine these types of crime in order to get a comprehensive picture of them, with which we can help the authorities in their work. As early as 1865, people using early technologies were able to forge a picture of a quality that is not recognizable even today. With the help of today's technology, authorities receive a great deal of false evidence. Officials are not able to process such a large amount of data, nor do they have the necessary technical knowledge to form a true picture of the authenticity of given evidence. The digital world has many dangers. Unfortunately, we live in an age where we must protect everything digitally: our phones, our computers, our cars, and all the smart devices present in our personal lives. This is a burden not only on us, since companies, state institutions, and public utilities are also forced to do the same. The training of specialists and experts is essential so that the authorities can manage incoming digital evidence at some level. When analyzing evidence, it is important to be able to examine it from the moment it is created, and establishing authenticity is a very important issue during official procedures. After proper acquisition of the evidence, it is essential to store it safely and use it professionally; otherwise, it will not have sufficient probative value, and in case of doubt, the court will always decide in favor of the defendant. One of the most common problems in the world of digital data and evidence is doubt, which is why it is extremely important to examine the above-mentioned problems. The most effective way to combat digital crime is to prevent it, for which proper education and knowledge are essential. The aim is to present the dangers inherent in the digital world and the new types of digital crime. After a comparison of Hungarian investigative techniques with international practice, modernizing proposals will be given. Sufficiently stable yet flexible legislation is needed that can keep up with the rapid changes in the world and provide an appropriate framework rather than regulating after the fact. It is also important to be able to distinguish between digital and digitalized evidence, as their degrees of probative force differ greatly. The aim of the research is to promote effective international cooperation and uniform legal regulation in the world of digital crime.

Keywords: digital crime, digital law, cyber crime, international cooperation, new crimes, skepticism

Procedia PDF Downloads 60
33502 Controversies Connected with the Admission of Illegally Gained Evidences in Polish Civil Proceedings

Authors: Aleksandra Czubak

Abstract:

The need to present evidence in civil proceedings is essential for reaching the right result, and it is for this reason that it is particularly important for the parties to present the most relevant and convincing evidence to the court. Parties therefore often try to gain evidence even when its acquisition is in breach of the law. The paper first discusses how evidence is applied in the Polish civil process and the Polish regulation of evidence proceedings, with specific reference to evidence of major importance in the developing world. It then discusses the controversies connected with the admission of illegally gained evidence in civil proceedings. The credibility of the various measures is circumstantial and can only be determined by factors related to the recognized problem; for that reason, it is not the amount of evidence but its value and relevance that should be considered in determining the right result. The paper also considers whether the end justifies the means: how far should parties go in order to achieve a favorable sentence or to create stronger evidence? Methods of persuading the court, as well as of acquiring evidence, are not always fair and moral, and it is on this area of controversy that the essay focuses. The paper concludes by considering the value of evidence and the possibility of using it to achieve a just sentence. Examples are based on Polish law; nevertheless, they encompass ideas common to most civil jurisdictions.

Keywords: civil proceedings, Europe (Poland), evidence, law

Procedia PDF Downloads 246
33501 Performance Evaluation and Planning for Road Safety Measures Using Data Envelopment Analysis and Fuzzy Decision Making

Authors: Hamid Reza Behnood, Esmaeel Ayati, Tom Brijs, Mohammadali Pirayesh Neghab

Abstract:

Investment projects in road safety planning can benefit from an effectiveness evaluation regarding their expected safety outcomes. The objective of this study is to develop a decision support system (DSS) to help policymakers make the right choices in road safety planning, based on the efficiency of previously implemented safety measures in a set of regions in Iran. The measures considered for each region include performance indicators on (1) police operations, (2) treated black spots, (3) freeway and highway facility supplies, (4) speed control cameras, (5) emergency medical services, and (6) road lighting projects. To this end, an inefficiency measure is calculated, defined as the ratio of fatality rates to the combined measure of road safety performance indicators (i.e., road safety measures), which should be minimized. The relative inefficiency of each region is modeled with the Data Envelopment Analysis (DEA) technique. In a next step, a fuzzy decision-making system is constructed to convert the information obtained from the DEA analysis into a rule-based system that policymakers can use to evaluate the expected outcomes of alternative investment strategies in road safety.
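
A sketch of the DEA step: an input-oriented CCR efficiency score for each region, solved as a linear program with SciPy. The indicator and outcome values are placeholders for the six safety measures and fatality outcomes, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.5, 3.0],      # inputs (rows = indicators, e.g. police
              [1.0, 2.0, 1.5]])     # operations; columns = regions)
Y = np.array([[0.8, 0.9, 0.7]])     # output (e.g. inverse fatality rate)

def ccr_efficiency(j0):
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lambda_j * x_ij <= theta * x_i,j0
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # Output constraints: sum_j lambda_j * y_rj >= y_r,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                 # efficiency score in (0, 1]

print([round(ccr_efficiency(j), 3) for j in range(X.shape[1])])
```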

Keywords: performance indicators, road safety, decision support system, data envelopment analysis, fuzzy reasoning

Procedia PDF Downloads 348
33500 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter

Authors: Jisun Lee, Jay Hyoun Kwon

Abstract:

As an alternative way to compensate for INS (inertial navigation system) error in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined to complement the weaknesses of single geophysical data sources and to improve positioning stability. The main process for compensating the INS error using the geophysical database was constructed on the basis of the Extended Kalman Filter (EKF). In detail, two types of combination method, centralized and decentralized filters, were applied to examine the pros and cons of each algorithm and to find more robust results. The performance of each navigation algorithm was evaluated in simulations assuming an aircraft flying along nine different trajectories with a precise geophysical DB and sensors. In particular, the results were compared with those of navigation referenced to a single geophysical database to check the improvement due to combining heterogeneous geophysical databases. It was found that overall navigation performance improved, but not all trajectories produced better navigation results from the combination of gravity gradient and terrain data. It was also found that the centralized filter generally showed more stable results; this is because the weight allocation for the decentralized filter could not be optimized due to local inconsistency in the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
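
A generic EKF measurement-update sketch of the kind used here to fuse gravity-gradient and terrain measurements into the INS error estimate; the state, measurement models, and noise values below are illustrative, not the study's.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF correction: state x, covariance P, measurement z,
    measurement function h with Jacobian H, measurement noise covariance R."""
    y = z - h(x)                                  # innovation
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example: a 2-D position state corrected by a terrain-height measurement.
# A centralized filter would stack gravity-gradient rows into z, h, H, and R.
x = np.array([100.0, 200.0])
P = np.eye(2) * 25.0
h = lambda s: np.array([0.01 * s[0] + 0.02 * s[1]])   # linearized terrain model
H = np.array([[0.01, 0.02]])                          # its Jacobian
z = np.array([5.3])                                   # illustrative measurement
x, P = ekf_update(x, P, z, h, H, np.array([[0.5]]))
print(x)
```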

Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain

Procedia PDF Downloads 343
33499 Evaluation of Machine Learning Algorithms and Ensemble Methods for Prediction of Students’ Graduation

Authors: Soha A. Bahanshal, Vaibhav Verdhan, Bayong Kim

Abstract:

Graduation rates at six-year colleges are becoming an increasingly essential indicator for incoming students and for university rankings. Predicting student graduation is extremely beneficial to schools and has huge potential for targeted intervention. It is important for educational institutions because it enables the development of strategic plans to assist or improve students' performance in achieving their degrees on time (graduating on time, GOT). Machine learning techniques offer a first step and a helping hand in extracting useful information from these data and gaining insights into the prediction of students' progress and performance. Data analysis and visualization techniques are applied to understand and interpret the data. The data used for the analysis cover science majors who graduated within six years in the academic year 2017-2018, and the analysis can be used to predict the graduation of students in the next academic year. Predictive models such as logistic regression, decision trees, support vector machines, random forest, naïve Bayes, and k-nearest neighbors are applied to predict whether a student will graduate. These classifiers were evaluated with 5-fold cross-validation, and their performance was compared on accuracy. The results indicate that an ensemble classifier achieves the best accuracy, about 91.12%. This GOT prediction model should be useful to university administration and academics in developing measures for assisting and boosting students' academic performance and ensuring they graduate on time.
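
A sketch of the model comparison described above, run on synthetic data (the study used institutional records): each classifier and a soft-voting ensemble are evaluated with 5-fold cross-validation on accuracy.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the student records (graduated on time or not).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(probability=True, random_state=0),
    "forest": RandomForestClassifier(random_state=0),
    "nb": GaussianNB(),
    "knn": KNeighborsClassifier(),
}
for name, clf in models.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean())

# Soft-voting ensemble over the individual classifiers.
ensemble = VotingClassifier(list(models.items()), voting="soft")
print("ensemble", cross_val_score(ensemble, X, y, cv=5).mean())
```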

Keywords: prediction, decision trees, machine learning, support vector machine, ensemble model, student graduation, GOT graduate on time

Procedia PDF Downloads 69
33498 The Impact of Study Abroad Experience on Interpreting Performance

Authors: Ruiyuan Wang, Jing Han, Bruno Di Biase, Mark Antoniou

Abstract:

The purpose of this study is to explore the relationship between working memory (WM) capacity and Chinese-English consecutive interpreting (CI) performance in interpreting learners with different study abroad experience (SAE); this relationship is not well understood. The study also examines whether Chinese interpreting learners with SAE in English-speaking countries demonstrate better performance in inflectional morphology and agreement, notoriously unstable in Chinese speakers of L2 English, in their interpreting output than learners without SAE. Fifty Chinese university students majoring in Chinese-English interpreting were recruited in Australia (n=25) and China (n=25); the two groups were matched in age, language proficiency, and interpreting training period. The study abroad (SA) group had been studying in an English-speaking country (Australia) for over 12 months, while none of the students recruited in China (the no study abroad, NSA, group) had ever studied or lived in an English-speaking country. Data on language proficiency and training background were collected via a questionnaire, lexical retrieval performance and WM capacity data were collected experimentally, and interpreting data were elicited via a direct CI task. The main results show that WM correlated significantly with participants' CI performance independently of learning context. Moreover, SA learners outperformed NSA learners in subject-verb number agreement, and WM capacity was also found to correlate significantly with morphosyntactic accuracy. This paper sheds some light on the relationship between study abroad, WM capacity, and CI performance; exploring the effect of study abroad on interpreting trainees and how these important factors correlate may help interpreting educators develop more targeted teaching paradigms for participants with different learning experiences.

Keywords: study abroad experience, consecutive interpreting, working memory, inflectional agreement

Procedia PDF Downloads 98
33497 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data are believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that address the uncertain data condition by minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 423
33496 Decision Quality as an Antecedent to Export Performance: Empirical Evidence under a Contingency Theory Lens

Authors: Evagelos Korobilis-Magas, Adekunle Oke

Abstract:

The constantly increasing tendency towards a global economy, and the subsequent increase in exporting, has inevitably led to growing interest in the topic of export success. Numerous studies, particularly in the past three decades, have examined a plethora of determinants of export performance. However, to the authors' best knowledge, no study to date has considered decision quality as a potential antecedent to export success by attempting to test the relationship between decision quality and export performance. This is a surprising deficiency, given that the export marketing literature has long suggested that quality decisions are the crucial intervening variable between sound decision-making and export performance. This study integrates the different definitions of decision quality proposed in the literature, along with the key themes incorporated therein, and adapts them to an export context. Apart from laying the conceptual foundations for the delineation of this elusive but very important construct, the study is the first to test the relationship between decision quality and export performance. Based on survey data from a sample of 189 British export decision-makers and within a contingency theory framework, the results reveal a direct, positive link between decision quality and export performance. This finding opens significant future research avenues and has important implications for both theory and practice.

Keywords: export performance, decision quality, mixed methods, contingency theory

Procedia PDF Downloads 88