Search results for: disseminating information
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10560


9180 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System

Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu

Abstract:

Uninterrupted and continuous satellite communication throughout the whole orbit is becoming more indispensable every day. Data relay systems, such as the TDRSS of the USA and the EDRSS of Europe, are developed and built for various high/low data rate information exchanges; in these missions, a couple of task-dedicated communication satellites exist. In this regard, a data relay system is defined for Turkey that exchanges low data rate information (i.e., TTC) between Earth-observing LEO satellites and commercial GEO communication satellites appointed all over the world. First, a justification of this attempt is given, demonstrating the enhancement of link duration. The preference for RF communication over laser communication is also discussed. Then, the preferred GEO communication satellites, including TURKSAT4A, which already belongs to Turkey, are given, together with the coverage enhancements obtained through STK simulations and the corresponding link budget. A block diagram of the communication system on the LEO satellite is also given.
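
As a rough illustration of the link-budget arithmetic referred to above, the following Python sketch computes free-space path loss and received carrier power for an inter-satellite link. All numeric values (distance, frequency, gains, losses) are assumptions chosen for illustration, not figures from the paper; the real analysis would come from the STK simulation mentioned in the abstract.

```python
import math

def fspl_db(distance_km, freq_ghz):
    # Free-space path loss: 20*log10(d_km) + 20*log10(f_GHz) + 92.45 dB.
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Illustrative LEO-to-GEO inter-satellite link at Ku-band (all assumptions).
distance_km = 40000.0   # rough LEO-to-GEO slant range
freq_ghz = 14.0         # carrier frequency
eirp_dbw = 10.0         # LEO terminal EIRP
rx_gain_dbi = 35.0      # GEO receive antenna gain
misc_losses_db = 3.0    # pointing/polarization margin

received_dbw = eirp_dbw + rx_gain_dbi - fspl_db(distance_km, freq_ghz) - misc_losses_db
print(f"FSPL = {fspl_db(distance_km, freq_ghz):.1f} dB")
print(f"received power = {received_dbw:.1f} dBW")
```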

Keywords: communication, GEO satellite, data relay system, coverage

Procedia PDF Downloads 418
9179 Seismic Microzoning and Resonant Map for Urban Planning

Authors: F. Tahiri, F. Grajçevci

Abstract:

Cities are coping with permanent demands to extend their residential and economic capacity, and new urban zones are sometimes developed in more vulnerable environments. This study aims to identify and mitigate seismic hazards at the urban-planning stage for new settlements, including existing urban environments that did not initially consider the seismic hazard. Seismic microzoning studies the amplification/attenuation of seismic excitations from the bedrock to the ground surface. The modification of the seismic excitation is governed by site-specific ground conditions and is expressed as the mean ratio of the maximum acceleration at the surface to the acceleration of the subsoil media, presented as dynamic amplification factors (DAF). These values are used to create maps with DAF isolines and then seismic microzoning maps of the expected maximum mean surface acceleration, obtained as the product of the DAF and the maximum bedrock acceleration. The resonant map combines the information obtained from seismic microzoning on the expected predominant periods of the seismic excitation with the vibration periods of designed/built structures. This information is an indispensable tool in the early stages of urban planning for determining the most suitable zones for construction, the construction materials, the structural systems, the range of building heights, etc., so that resonance between the soil media and the built structures is avoided. The information can also be used to assess the seismic risk and the vulnerability/damageability of existing urban environments.
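
The core relation in the abstract, surface acceleration as the product of a dynamic amplification factor and the bedrock acceleration, can be sketched in a few lines of Python. The DAF values and the bedrock acceleration below are invented placeholders for illustration; real values come from site-response studies, not from this table.

```python
# Illustrative dynamic amplification factors (DAF) by simplified site class;
# real values come from site-response analysis, not from this table.
DAF_BY_SITE_CLASS = {"rock": 1.0, "stiff_soil": 1.4, "soft_soil": 2.2}

def surface_acceleration(bedrock_pga_g, site_class):
    """Expected maximum mean surface acceleration = DAF * bedrock acceleration."""
    return DAF_BY_SITE_CLASS[site_class] * bedrock_pga_g

bedrock_pga = 0.25  # assumed design bedrock acceleration, in g
for site in DAF_BY_SITE_CLASS:
    print(f"{site}: {surface_acceleration(bedrock_pga, site):.2f} g")
```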

Keywords: vulnerable environment, mitigation, seismic microzoning, resonant map, urban planning

Procedia PDF Downloads 499
9178 Passing the Charity Walking Tours as a Poverty Reduction Establishment in Denpasar City, Bali

Authors: I. Wayan Wiwin

Abstract:

Poverty is one of the big problems faced by large cities around the world. Urbanization is one cause: many rural people move to the city hoping to improve their economic situation, but arrive without adequate skills and so become an urban demographic problem. Denpasar, the capital of the province of Bali, is one such city; its urban area contains many slum dwellings inhabited by the poor, even though Bali is known as one of the best tourist destinations in the world. This condition stands in stark contrast to the progress of tourism in Bali. It is therefore necessary to attempt to overcome poverty in the city of Denpasar, for example through the development of city tours in the form of charity walking tours, in which tourists are invited to walk through the city, see the conditions of the poor directly, and provide assistance in the form of housing, educational scholarships, health care, and skills and business-capital support. This research is exploratory and qualitative: it explores the potential of charity walking tours to overcome poverty in Denpasar City and reports the findings qualitatively. Based on the potential data and information, the analysis concludes whether such development is feasible. The study therefore only requires respondents or informants who can provide answers or qualitative information about the potential development of charity walking tours. Thus, the informants in this study are tourism stakeholders, such as municipal government officials, businesspeople, community leaders and tourism actors, who are considered able to provide information relating to the development of urban tourism.

Keywords: tourism, city tours, charity walking tours, poverty

Procedia PDF Downloads 147
9177 Signaling Using Phase Shifting in Wi-Fi Backscatter System

Authors: Chang-Bin Ha, Young-Min Ko, Seongjoo Lee, Hyoung-Kyu Song

Abstract:

In this paper, a signaling scheme using phase shifting is proposed to improve the performance of the Wi-Fi backscatter system. Because communication in the Wi-Fi backscatter system is based on on-off modulation and impedance modulation at the packet level, the data rate is very low compared to conventional wireless systems. Moreover, because the Wi-Fi backscatter system relies on RF-powered devices, high reliability is difficult to achieve. In order to increase the low data rate, the proposed scheme transmits multiple bits of information during one packet period. In order to increase reliability, the proposed scheme shifts the phase of the signal according to the transmitted information. The simulation results show that the proposed scheme improves throughput performance.
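
The abstract does not specify the constellation, so as a generic illustration of signaling several bits per packet through phase shifts, the sketch below maps 2-bit groups onto four phases and recovers them by nearest-phase detection. It is a toy baseband model, not the paper's simulation.

```python
import numpy as np

# Map 2-bit groups onto four phase shifts (a QPSK-like Gray mapping).
PHASES = {(0, 0): 0.0, (0, 1): np.pi / 2, (1, 1): np.pi, (1, 0): 3 * np.pi / 2}

def modulate(bits):
    pairs = list(zip(bits[0::2], bits[1::2]))
    return np.array([np.exp(1j * PHASES[p]) for p in pairs])

def demodulate(symbols):
    inv = {v: k for k, v in PHASES.items()}
    bits = []
    for s in symbols:
        # Decide on the nearest of the four candidate phases.
        phase = min(PHASES.values(),
                    key=lambda ph: abs(np.angle(s * np.exp(-1j * ph))))
        bits.extend(inv[phase])
    return bits

rng = np.random.default_rng(0)
tx_bits = rng.integers(0, 2, 8).tolist()       # 8 bits -> 4 symbols per packet
symbols = modulate(tx_bits)
noisy = symbols + 0.1 * (rng.normal(size=4) + 1j * rng.normal(size=4))
print(tx_bits, "->", demodulate(noisy))
```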

Keywords: phase shifting, RF-powered device, Wi-Fi backscatter system, IoT

Procedia PDF Downloads 419
9176 Risk Assessment of Building Information Modelling Adoption in Construction Projects

Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad

Abstract:

Building information modelling (BIM) is a new technology for enhancing the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for assessing the risks of BIM adoption in construction projects. Various risk factors of applying BIM during the different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions and the related literature. Afterward, Shannon's entropy and fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive the priorities of the identified risk factors. The results indicate that a lack of knowledge among professional engineers about BIM workflows and conflicts of opinion between different stakeholders are the risk factors with the highest priority.
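
A minimal sketch of the two ranking ingredients named above, entropy-based criteria weights feeding a TOPSIS closeness ranking, is given below. It uses a crisp decision matrix with invented scores; the paper's fuzzy TOPSIS additionally handles fuzzy ratings, which is omitted here for brevity.

```python
import numpy as np

def entropy_weights(X):
    # Normalize each criterion column to a probability distribution.
    P = X / X.sum(axis=0)
    # Shannon entropy per criterion; k scales it into [0, 1].
    k = 1.0 / np.log(X.shape[0])
    E = -k * np.sum(P * np.log(P + 1e-12), axis=0)
    d = 1.0 - E                      # degree of divergence
    return d / d.sum()               # criteria weights

def topsis(X, w):
    # Vector-normalize the decision matrix, then apply the weights.
    V = w * X / np.sqrt((X ** 2).sum(axis=0))
    ideal, anti = V.max(axis=0), V.min(axis=0)   # all criteria treated as "benefit"
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)   # closeness coefficient per alternative

# Hypothetical scores: 4 risk factors rated on 3 criteria (probability,
# impact, difficulty of detection); a higher score means higher risk.
X = np.array([[7., 8., 6.],
              [5., 6., 4.],
              [8., 7., 7.],
              [4., 5., 3.]])
w = entropy_weights(X)
cc = topsis(X, w)
print("weights:", w.round(3))
print("priority ranking (riskiest first):", np.argsort(-cc))
```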

Keywords: risk, BIM, fuzzy TOPSIS, construction projects

Procedia PDF Downloads 201
9175 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs used to treat TB. Current strategies for identifying drug-resistant TB bacteria are laboratory-based and take a long time to identify the drug-resistant bacteria so that the patient can be treated accordingly. Machine learning (ML) and data science approaches can offer new solutions to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes, so that the right treatment can be given to the patient immediately. The study used whole-genome sequences (WGS) of TB isolates extracted from the NCBI repository as training data. Samples from different countries were included in order to generalize over the large group of TB isolates from different regions of the world; this helps the model learn the different behaviors of the TB bacteria and makes it robust. Model training considered three types of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and the resistance-associated gene information for the particular drug. Two major datasets were constructed from these three types of information: F1 and F2 were treated as two independent datasets, and the third type of information was used as the class label for both. Five machine learning algorithms were considered for training: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, the latter being the F1 and F2 datasets merged. Additionally, an ensemble approach was used: the F1 and F2 datasets were run through the Gradient Boosting algorithm, the outputs were combined into a single dataset called the F1F2 ensemble dataset, and models were trained on this dataset with the five algorithms. As the experiments show, the ensemble-approach model trained with Gradient Boosting outperformed the other models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient Boosting model, for predicting the antibiotic resistance phenotypes of TB isolates, as it outperformed the rest of the models.
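
The two-stage ensemble described above can be sketched as a standard stacking pipeline: first-stage gradient-boosting models are fitted separately on the F1 and F2 feature sets, and their out-of-fold predicted probabilities form a new dataset for a second-stage learner. The sketch below uses random synthetic features purely to show the pipeline shape; real inputs would be variant matrices derived from the WGS data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500
X_f1 = rng.integers(0, 2, size=(n, 40)).astype(float)  # stand-in for F1 variants
X_f2 = rng.integers(0, 2, size=(n, 15)).astype(float)  # stand-in for F2 variants
y = rng.integers(0, 2, size=n)                         # resistant / susceptible

# Stage 1: fit gradient boosting on each feature set separately and use
# out-of-fold predicted probabilities as new features (avoids leakage).
p_f1 = cross_val_predict(GradientBoostingClassifier(), X_f1, y,
                         cv=5, method="predict_proba")
p_f2 = cross_val_predict(GradientBoostingClassifier(), X_f2, y,
                         cv=5, method="predict_proba")
X_ens = np.hstack([p_f1, p_f2])  # the "F1F2 ensemble dataset"

# Stage 2: train a second-level model on the ensemble dataset.
X_tr, X_te, y_tr, y_te = train_test_split(X_ens, y, test_size=0.2, random_state=0)
meta = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, meta.predict(X_te)))
```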

Keywords: machine learning, MTB, WGS, drug-resistant TB

Procedia PDF Downloads 30
9174 Females’ Usage Patterns of Information and Communication Technologies (ICTs) in the Vhembe District, South Africa

Authors: Fulufhelo Oscar Maphiri-Makananise

Abstract:

The main purpose of this paper is to explore and provide substantiated evidence on the usage patterns of information and communication technologies (ICTs) by females in the Vhembe District in Limpopo Province, South Africa. The study presents a broad picture and understanding of the usage of ICTs from a female perspective. Its significance stems from the need to discover the role, relevance and usage patterns of ICTs such as smartphones, computers, laptops, iPods, the internet and social networking sites among females, following the trends of new media technologies in society. The main objective of the study was to investigate the usability and accessibility of ICTs as a means of empowering females in the Vhembe District. The study used a quantitative research method together with elements of qualitative research to determine the main ideas, perceptions and usage patterns of ICTs by females in the district. Data collection involved a structured, self-administered questionnaire with both closed-ended and open-ended questions. Two groups of respondents participated in this study: female Media Studies students (n=50) at the University of Venda provided their ideas and perceptions about the usefulness and usage patterns of ICTs such as smartphones, the internet and computers at the university level, while Makhado comprehensive school learners (n=50) provided their perceptions and ideas about the use of ICTs at the high school level. The study provides balanced, accurate and rational results on the pertinent issues concerning the use of ICTs by females in the Vhembe District, and the researcher believes that the findings are useful as a guideline and model for ICT interventions that empower women in South Africa. The study showed that the main purposes for which females used ICTs were searching for information for assignments, conducting research, dating, exchanging ideas, networking with friends and relatives on social networking sites, and maintaining existing friendships in real life. It further revealed that most females used ICTs for social purposes and accessing the internet rather than for entertainment. The findings also indicated that a high proportion of females used ICTs for e-learning (62%) and for social purposes (85%). Overall, the study provides strong, insightful information on females' usage patterns and perceptions of ICTs in the Vhembe District of Limpopo Province.

Keywords: female users, information and communication technologies, internet, usage patterns

Procedia PDF Downloads 196
9173 A Model of Teacher Leadership in History Instruction

Authors: Poramatdha Chutimant

Abstract:

The objective of the research was to propose a model of teacher leadership in history instruction for practical utilization. Everett M. Rogers' Diffusion of Innovations theory is applied as the theoretical framework. A qualitative method is used, with an interview protocol as the instrument for collecting primary data from best-practice teachers recognized by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as the secondary data supporting the summarizing process (content analysis). A dendrogram is the key to interpreting and synthesizing the primary data, while the secondary data support the explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The focal point is, finally, to validate a draft model in terms of its future utilization.

Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership

Procedia PDF Downloads 262
9172 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

National statistical institutes hold a large volume of data, generally in a format that constrains how the information they contain can be published. Each household or business data collection project includes a dissemination platform for its implementation. The dissemination methods used so far do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with other external data sources. The approach is applied to data from major national surveys, such as those on employment, poverty and child labor, and to the general census of the population of Senegal.
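
A minimal sketch of what such a publication step could look like with rdflib and the W3C RDF Data Cube vocabulary is shown below: one statistical observation is modeled as a qb:Observation and serialized as Turtle. The base URI, region, indicator name and figure are hypothetical placeholders, not actual Senegalese survey data.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

QB   = Namespace("http://purl.org/linked-data/cube#")
EX   = Namespace("http://stats.example.sn/")        # hypothetical base URI
SDMX = Namespace("http://purl.org/linked-data/sdmx/2009/dimension#")

g = Graph()
g.bind("qb", QB)
g.bind("ex", EX)

dataset = EX["dataset/employment-2023"]
g.add((dataset, RDF.type, QB.DataSet))

# One statistical observation: an employment rate for a region and year.
obs = EX["obs/employment-2023-dakar"]
g.add((obs, RDF.type, QB.Observation))
g.add((obs, QB.dataSet, dataset))
g.add((obs, SDMX.refArea, EX["region/dakar"]))
g.add((obs, SDMX.refPeriod, Literal("2023", datatype=XSD.gYear)))
g.add((obs, EX.employmentRate, Literal(42.7, datatype=XSD.decimal)))

print(g.serialize(format="turtle"))
```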

Keywords: Semantic Web, linked open data, database, statistics

Procedia PDF Downloads 162
9171 Teachers’ Experiences regarding Use of Information and Communication Technology for Visually Impaired Students

Authors: Zikra Faiz, Zaheer Asghar, Nisar Abid

Abstract:

Information and communication technologies (ICTs) include computers, the internet, and electronic delivery systems such as televisions, radios, multimedia and overhead projectors. In the modern world, ICTs are considered an essential element of the teaching-learning process. This study aimed to discover the usage of ICTs in special education institutions for visually impaired students in Lahore, Pakistan. The objective of the study was to explore the problems faced by teachers while using ICTs in the classroom. The study was phenomenological in nature; a qualitative survey method was used through a semi-structured interview protocol developed by the researchers. The sample comprised eighty faculty members selected through a purposive sampling technique. Data were analyzed through a thematic analysis technique with the help of open coding. The findings revealed that multimedia, projectors, computers, laptops and LEDs are used in special education institutes to enhance the teaching-learning process. Teachers believed that ICTs could enhance the knowledge of visually impaired students and that every student should use these technologies in the classroom. It was concluded that multimedia, projectors and laptops are used in the classroom by teachers and students, and that ICTs can be promoted effectively through the training of teachers and students. It was suggested that the government should take steps to strengthen ICTs in teacher-training and other institutions through the pre-service and in-service training of teachers.

Keywords: information and communication technologies, in-services teachers, special education institutions

Procedia PDF Downloads 105
9170 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data in the public domain to increase transparency, as such data serve as evidence of performance. However, inefficient handling, lack of capacity and the dynamics of transfers (especially fund transfers) are important issues that need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often swung public opinion for or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in the functioning of local government, the networks within and outside the locality, and the synergy between the various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain, and when they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 414
9169 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia

Authors: Kaaryn M. Cater

Abstract:

Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning-related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia; baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as highly sensitive people (HSP), and the measure of HSP is a 27-item self-test known as the Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established baseline data for HSP students at a tertiary institution in New Zealand; all participating HSP students considered the knowledge of SPS life-changing and useful in managing life and study, and believed that all tutors and incoming students should be given information on SPS. MIS is a visual processing and perception disorder found in approximately 10% of the population, with symptoms including visual fatigue, headaches and nausea; one way to ease some of these symptoms is through the use of coloured lenses or overlays. Dyslexia is a complex, phonologically based information-processing variation present in approximately 10% of the population, and an estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first-year students enrolled in degree programmes across all faculties and schools within the institution; an estimated 900 students will be eligible to participate. Participants will be asked to complete a battery of online questionnaires: the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment, and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated in many populations. All participants whose scores on any (or several) of the three questionnaires suggest a minority method of information processing will be invited to meet with a learning advisor and given access to counselling services if they choose; meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected on the QuestionPro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors among SPS, MIS and dyslexia. This study forms part of a larger three-year longitudinal study: participants will complete questionnaires at annual intervals until completion of (or withdrawal from) their degree, and at these data-collection points they will be asked about any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.
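
The planned correlation and regression analysis can be sketched as follows. The sketch generates random stand-in scores for the three questionnaires simply to show the analysis shape (pairwise Pearson correlations plus a simple regression); the variable names and distributions are assumptions, not study data.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
n = 300  # stand-in for the expected first-year cohort

# Hypothetical questionnaire scores for each participant.
df = pd.DataFrame({
    "hsps":     rng.normal(4.0, 1.0, n),   # Highly Sensitive Person Scale mean
    "dyslexia": rng.normal(15.0, 5.0, n),  # IDA adult self-assessment total
    "irlen":    rng.normal(10.0, 4.0, n),  # adapted Irlen indicator total
})

# Pairwise Pearson correlations with p-values.
for a, b in [("hsps", "dyslexia"), ("hsps", "irlen"), ("dyslexia", "irlen")]:
    r, p = stats.pearsonr(df[a], df[b])
    print(f"{a} vs {b}: r={r:.3f}, p={p:.4f}")

# Simple regression: does the HSPS score predict the Irlen score?
slope, intercept, r, p, se = stats.linregress(df["hsps"], df["irlen"])
print(f"irlen ~ hsps: slope={slope:.3f}, p={p:.4f}")
```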

Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)

Procedia PDF Downloads 213
9168 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (the connection of two adjacent chords). In this study, it was found that 53 of the 160 musical excerpts were evaluated by the participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression that affect the listener's feeling of complexity and acceptability. We evaluated the same data twice more, with new participants in 2018 and with the same participants for a third time in 2019. These three evaluations showed that the same 53 musical excerpts found to be difficult and complex in the 2017 study again produced a strong feeling of complexity. It was proposed that the content of these musical excerpts, defined as "irregular," does not meet the listener's expectancy or the basic perceptual principles, creating a greater feeling of difficulty and complexity. As the "irregularities" in these 53 musical excerpts seem to be perceived by the participants without their being aware of it, affecting pleasantness and the feeling of complexity, they were defined as "subliminal irregularities" and the 53 musical excerpts as "irregular." In our recent study (2019) of the same data (used in the previous research), we proposed a new measure of the complexity of harmony, "regularity," based on the irregularities in the harmonic progression and other plausible particularities of the musical structure found in previous studies. In that study we also proposed a list of 10 particularities which we assumed affect the participant's perception of complexity in harmony. These ten particularities are tested in this paper by extending the analysis of the 53 irregular musical excerpts from harmony to melody. In examining the melody, we used the computational model Information Dynamics of Music (IDyOM) and two information-theoretic measures: entropy (the uncertainty of the prediction before the next event is heard) and information content (the unexpectedness of an event in a sequence). To describe the features of the melody in these musical examples, we used four viewpoints: pitch, interval, duration and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., huge interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts affect the participant's perception of complexity. High information-content values were found in compound melodies, in which the implied harmonies seem to have suggested additional harmonies, affecting the participant's perception of the chord progression in the harmony by creating a sense of an ambiguous musical structure.
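
The two measures named above can be illustrated on a toy chord sequence. The sketch below computes Shannon entropy over unigrams and bigrams and a per-event information content; unlike IDyOM, which uses variable-order Markov models, this zeroth-order version conditions on nothing and is meant only to make the definitions concrete.

```python
import math
from collections import Counter

def shannon_entropy(events):
    # H = -sum p(x) * log2 p(x) over the distribution of events.
    counts = Counter(events)
    total = len(events)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_content(events):
    # IC(x) = -log2 p(x): the unexpectedness of each event under the
    # empirical (zeroth-order) distribution of the sequence itself.
    counts = Counter(events)
    total = len(events)
    return [-math.log2(counts[e] / total) for e in events]

# Hypothetical harmonic progression encoded as Roman-numeral chords.
progression = ["I", "IV", "V", "I", "vi", "IV", "V", "I"]
unigrams = progression
bigrams = list(zip(progression, progression[1:]))

print("unigram entropy:", round(shannon_entropy(unigrams), 3))
print("bigram entropy: ", round(shannon_entropy(bigrams), 3))
print("per-chord IC:   ", [round(ic, 2) for ic in information_content(unigrams)])
```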

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 113
9167 Digitalize or Die: Responsible Innovations in Healthcare and Welfare Sectors

Authors: T. Iakovleva

Abstract:

This paper suggests a theoretical model that describes the process of developing responsible innovations at the firm level in the health and welfare sectors, where there is a need for new firm strategies. It proposes taking the concept of responsible innovation, originally developed at the societal level, and applying it to the new area of firm strategy. The rapid global diffusion of information and communication technologies has greatly improved access to knowledge; at the same time, communication is cheap, information is a commodity, and global trade increases technological diffusion. As a result, firms and users, including those outside industrialized nations, get early exposure to the latest technologies and information. General-purpose technologies such as mobile phones and 3D printers enable individuals to solve local needs and customize products. The combined effect of these changes is having a profound impact on the innovation landscape. Meanwhile, the healthcare sector is facing unprecedented challenges, magnified by budgetary constraints, an aging population and the desire to provide care for all. Patients themselves are also changing: they are savvier about their diseases, they expect their relationship with healthcare professionals to be open and interactive, and above all they want to be part of the decision process. All of this reflects what is already happening in other industries, where customers have access to large amounts of information and have become educated buyers. This article addresses the question of how ICT research and innovation may contribute to developing solutions to grand societal challenges in a responsible way. A broad definition of the concept of responsibility in the context of innovation is adopted: responsibility is seen as a collective, uncertain and future-oriented activity. This opens the questions of how responsibilities are perceived and distributed, and how innovation and science can be governed and stewarded towards socially desirable and acceptable ends. The article thereby addresses a central question confronting politicians, business leaders and regional planners.

Keywords: responsible innovation, ICT, healthcare, welfare sector

Procedia PDF Downloads 181
9166 Formation of Convergence Culture in the Framework of Conventional Media and New Media

Authors: Berkay Buluş, Aytekin İşman, Kübra Yüzüncüyıl

Abstract:

Developments in media and communication technologies have changed the way we use media, and the importance of convergence culture has been increasing day by day within the framework of these developments. Among new media, social networks are arguably the most powerful platforms integrated into this digitalization process. Although social networks may seem to be places where people merely socialize, they can also be utilized as places of production. At the same time, the audience has become users within the framework of the transformation from national to global broadcasting. User-generated content makes conventional media and new media collide. In this study, these communication platforms are examined not as platforms that replace one another but as mediums that unify each other. In this light, the information produced by users on new media platforms and all new media use practices are called convergence culture; in other words, convergence culture denotes the intersection of conventional and new media. In this study, examples of convergence culture are analyzed in detail.

Keywords: new media, convergence culture, convergence, use of new media, user generated content

Procedia PDF Downloads 246
9165 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate

Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe

Abstract:

This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model with a SETAR model. The study uses monthly adjusted data on South African exchange rates with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. The mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecast capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy and distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seems to outperform ARIMA.
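
The three error metrics and the flavor of the DM comparison are easy to make concrete. The sketch below evaluates two hypothetical forecast series against a synthetic exchange-rate path; the DM statistic shown is a simplified lag-0 version without the small-sample correction used in practice.

```python
import numpy as np

def mae(actual, forecast):
    return np.mean(np.abs(actual - forecast))

def rmse(actual, forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

def mape(actual, forecast):
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical monthly exchange-rate series and two competing forecasts.
rng = np.random.default_rng(42)
actual = 15.0 + np.cumsum(rng.normal(0, 0.1, 60))   # synthetic rate path
f_arima = actual + rng.normal(0, 0.12, 60)
f_setar = actual + rng.normal(0, 0.10, 60)

for name, f in [("ARIMA", f_arima), ("SETAR", f_setar)]:
    print(f"{name}: MAE={mae(actual, f):.4f}  RMSE={rmse(actual, f):.4f}  "
          f"MAPE={mape(actual, f):.2f}%")

# Diebold-Mariano statistic on the squared-error loss differential
# (simplified: lag-0 variance only, no small-sample correction).
d = (actual - f_arima) ** 2 - (actual - f_setar) ** 2
dm = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
print("DM statistic:", round(dm, 3))
```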

Keywords: ARIMA, error metrices, model selection, SETAR

Procedia PDF Downloads 222
9164 Internet Versus Muslim Communities: Challenges, Problems and Solutions

Authors: Bashir Muhammad

Abstract:

The present research contains a definition of the internet, the interrelationship between the internet and globalization, and the divergent views of scholars on the internet. Additionally, both the positive and the negative impacts of the internet on Muslim communities are elucidated. On the positive side, for example, the internet constitutes a vital source of vast information and data in the various academic sciences in general and in Islamic studies in particular: the most recent facts and scientific discoveries by specialists in various fields can be found as quickly as possible, and many other positive points are also cited. On the negative side, among many other points, the internet circulates uncontrolled promiscuous pictures and sometimes misleading information about Islam, which could gradually and easily erode the sound moral upbringing of the young Muslim generation and pollute their positive thinking and reasoning. Another problem is that Muslims, in most matters pertaining to internet services, are passive consumers with no power to control the internet or direct it towards their welfare and well-being; because of this, they have to pay the price, directly or indirectly.

Keywords: internet, muslim, challenges, communities

Procedia PDF Downloads 104
9163 How Validated Nursing Workload and Patient Acuity Data Can Promote Sustained Change and Improvements within District Health Boards: The New Zealand Experience

Authors: Rebecca Oakes

Abstract:

In the New Zealand public health system, work has been taking place to use electronic systems to convey data from the ‘floor to the board’ that make patient needs, and therefore nursing work, visible. For nurses, these developments in health information technology create a very new and exciting position: being able to articulate the work of nursing in a language understood at all levels of an organisation, the language of acuity. Nurses increasingly hold a considerable stake in patient acuity data. Patient acuity systems, when used well, can greatly assist in demonstrating how much work is required, the type of work, and when it will be required. The New Zealand Safe Staffing Unit is supporting New Zealand nurses to create a culture of shared governance in which nursing data inform policies, staffing methodologies and forecasting within their organisations. Assisting organisations to understand their acuity data, strengthening user confidence in electronic patient acuity systems, and ensuring that nursing and midwifery workload is accurately reflected are critical to the success of the safe staffing programme. Nurses and midwives have the capacity, via an acuity tool, to become key informers of organisational planning. Quality patient care, the best use of health resources and a quality work environment are essential components of a safe, resilient and well-resourced organisation, and nurses are the key providers of this information. In New Zealand, a national-level approach is paving the way for significant changes to the understanding and use of patient acuity and nursing workload information.

Keywords: nursing workload, patient acuity, safe staffing, New Zealand

Procedia PDF Downloads 363
9162 Human-Automation Interaction in Law: Mapping Legal Decisions and Judgments, Cognitive Processes, and Automation Levels

Authors: Dovile Petkeviciute-Barysiene

Abstract:

Legal technologies not only create new ways of accessing and providing legal services but also transform the role of legal practitioners. Both lawyers and users of legal services expect automated solutions to outperform people in objectivity and impartiality. Although the fairness of automated decisions is crucial, research on how the various characteristics of automated processes relate to perceived fairness has only begun. One of the major obstacles to this research is the lack of a comprehensive understanding of which legal actions are automated, which could be meaningfully automated, and to what extent. Neither the public nor legal practitioners can readily envision technological input, due to the lack of illustrative examples. The aim of this study is to map the decision-making stages and the automation levels which are and/or could be achieved in legal actions related to pre-trial and trial processes. The major legal decisions and judgments were identified during consultations with legal practitioners. The dual-process model of information processing is used to describe the cognitive processes taking place while making legal decisions and judgments during pre-trial and trial actions. Some of the existing legal technologies are incorporated into the analysis as well. Several published automation-level taxonomies were considered, because none of them fits the legal context well, as they were all created for avionics, teleoperation, unmanned aerial vehicles, etc. From the information-processing perspective, the analysis of legal decisions and judgments exposes the situations that are most sensitive to cognitive bias and, among other things, helps to identify the areas that would benefit most from automation. Automation-level analysis, in turn, provides a systematic approach to the interaction and cooperation between humans and algorithms. Moreover, an integrated map of legal decisions and judgments, information-processing characteristics, and automation levels provides groundwork for research on the perceived fairness and acceptance of legal technology. Acknowledgment: This project has received funding from the European Social Fund (project No. 09.3.3-LMT-K-712-19-0116) under a grant agreement with the Research Council of Lithuania (LMTLT).

Keywords: automation levels, information processing, legal judgment and decision making, legal technology

Procedia PDF Downloads 117
9161 Investigating the Impact of Individual Risk-Willingness and Group-Interaction Effects on Business Model Innovation Decisions

Authors: Sarah Müller-Sägebrecht

Abstract:

Today’s volatile environment challenges executives to make the right strategic decisions to gain sustainable success. Entrepreneurship scholars postulate mainly positive effects of environmental changes on entrepreneurial behavior, such as the development of new business opportunities, the promotion of ingenuity, and the filling of resource voids. A strategic approach to overcoming threatening environmental changes and catching new business opportunities is business model innovation (BMI). Although this research stream has gained importance in the last decade, BMI research is still insufficient; in particular, BMI barriers, such as inefficient strategic decision-making processes, need to be identified. Strategic decisions strongly affect an organization's future and are therefore usually made in groups. Although groups draw on a more extensive information base than single individuals, group-interaction effects can influence the decision-making process, in favorable but also unfavorable ways. Decisions are characterized by uncertainty and risk, whose intensity is perceived differently by each individual, and individual risk-willingness influences which option a person chooses. The special nature of strategic decisions, such as those in BMI processes, is that they are not made individually but in groups, because of their broad organizational scope. These groups consist of different personalities whose individual risk-willingness can vary considerably, and it is known from group decision theory that these individuals influence each other, which is observable in different group-interaction effects. The following research questions arise: i) What impact does individual risk-willingness have on BMI decisions? And ii) how do group-interaction effects impact BMI decisions? After 26 in-depth interviews with executives from the manufacturing industry, the applied Gioia methodology reveals the following results: i) Risk-averse decision-makers have an increased need to be guided by facts. The more information available to them, the lower they perceive the uncertainty to be and the more willing they are to pursue a specific decision option. However, the results also show that social interaction does not change individual risk-willingness in the decision-making process. ii) In general, during BMI decisions, group interaction is primarily beneficial for increasing the group's information base for making good decisions, rather than for social interaction as such. Furthermore, decision-makers mainly focus on information available to all decision-makers in the team and less on personal knowledge. This work contributes to the strategic decision-making literature in two ways. First, it gives insights into how group-interaction effects influence an organization's strategic BMI decision-making. Second, it enriches risk-management research by highlighting how individual risk-willingness affects organizational strategic decision-making. To date, BMI research has held that risk aversion is an internal BMI barrier. This study makes clear that it is not risk aversion as such that inhibits BMI; rather, a lack of information prevents risk-averse decision-makers from choosing a riskier option. At the same time, the results show that risk-averse decision-makers are not easily carried away by the higher risk-willingness of their team members; instead, they use social interaction to gather the missing information. Executives therefore need to provide sufficient information to all decision-makers in order to catch promising business opportunities.

Keywords: business model innovation, decision-making, group biases, group decisions, group-interaction effects, risk-willingness

Procedia PDF Downloads 76
9160 Independent Audit in Brazilian Companies Listed on B3: An Analysis of Companies That Received Qualified Opinion and Disclaimer of Opinion

Authors: Diego Saldo Alves, Marcelo Paveck Ayub

Abstract:

The quality of accounting information is very important for the decision-making of managers, investors, government and other information users, and the opinion of the independent audit has a significant influence on decision-making, especially by investors. The aim of this study is therefore to analyze the reasons why companies listed on the Brazilian stock exchange B3 received qualified opinions and disclaimers of opinion from their independent auditors. We analyzed the independent auditors' reports of 23 Brazilian companies listed on B3 that received qualified opinions or disclaimers of opinion between 2012 and 2017. The findings show that these companies did not comply with the International Financial Reporting Standards (IFRS), did not provide documentation to substantiate the operations performed, did not account for expenses, and had problems in corporate governance and internal controls.

Keywords: audit, disclaimer of opinion, independent auditors, qualified opinion

Procedia PDF Downloads 175
9159 Searching Linguistic Synonyms through Parts of Speech Tagging

Authors: Faiza Hussain, Usman Qamar

Abstract:

Synonym-based searching is recognized to be a complicated problem, as text mining from the unstructured data of the web is challenging. Finding useful information that matches user needs among the bulk of web pages is a cumbersome task. In this paper, a novel and practical synonym-retrieval technique is proposed to address this problem. For the replacement of semantics, user intent is taken into consideration. Parts-of-speech tagging is applied for pattern generation from the query, and a thesaurus was built and used for this experiment. Compared with non-context-based searching, context-based searching proved to be the more efficient approach when dealing with linguistic semantics, and it is very beneficial for intent-based searching. Finally, results and future dimensions are presented.
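
A toy version of the tag-aware synonym lookup described above can be written with NLTK's POS tagger. The thesaurus here is a tiny hypothetical dictionary keyed by word and coarse tag, standing in for the thesaurus built for the experiment; the point is only that the same word retrieves different synonym sets depending on its part of speech.

```python
import nltk
from nltk import pos_tag, word_tokenize

# Resource names vary slightly across NLTK versions; quiet downloads are
# no-ops when the data is already installed.
for pkg in ("punkt", "punkt_tab", "averaged_perceptron_tagger",
            "averaged_perceptron_tagger_eng"):
    nltk.download(pkg, quiet=True)

# A tiny hypothetical thesaurus keyed by (word, coarse POS tag).
THESAURUS = {
    ("fast", "JJ"): ["quick", "rapid", "speedy"],
    ("fast", "RB"): ["quickly", "rapidly"],
    ("search", "NN"): ["lookup", "query", "retrieval"],
}

def synonyms_for_query(query):
    """POS-tag the query, then propose only synonyms matching the tag."""
    out = {}
    for word, tag in pos_tag(word_tokenize(query)):
        # Collapse fine-grained tags (JJR, NNS, ...) to a 2-char prefix.
        candidates = THESAURUS.get((word.lower(), tag[:2]))
        if candidates:
            out[word] = candidates
    return out

print(synonyms_for_query("a fast search"))
```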

Keywords: natural language processing, text mining, information retrieval, parts-of-speech tagging, grammar, semantics

Procedia PDF Downloads 291
9158 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores

Authors: A. Ashraff

Abstract:

The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make purchase decisions about a product or service. They can also predict whether a particular user would rate a product or service, based on the user's behavioral profile. At present, recommender systems are used extensively in every domain known to us; they are said to be ubiquitous. In the field of recruitment, however, they are not being utilized to the full. Recent statistics show an increase in staff turnover, which has negatively impacted organizations as well as employees; the reasons include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help a job seeker find the company that suits them best: although information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach that studies the available review data on IT companies (scoring the reviews based on user review sentiments), gathers information on job seekers, including their psychometric evaluations, and then presents the job seeker with useful output on which company is most suitable for them. The theoretical approach, the algorithmic approach and the importance of such a system are discussed in this paper.
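
One way the company sentiment scores could be derived is sketched below with NLTK's VADER sentiment analyzer: each company's score is the mean compound polarity over its reviews, then weighted by a stand-in psychometric preference. The reviews, companies, weighting and matching rule are all invented for illustration; the paper's actual algorithmic approach is not specified in the abstract.

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

# Hypothetical employee reviews grouped by company.
reviews = {
    "AlphaSoft": ["Great culture and flexible hours.",
                  "Learning opportunities are excellent."],
    "BetaWorks": ["Poor management and no work-from-home option.",
                  "Pay is decent but growth is slow."],
}

# Company sentiment score = mean VADER compound score over its reviews.
scores = {c: sum(sia.polarity_scores(r)["compound"] for r in rs) / len(rs)
          for c, rs in reviews.items()}

# Toy matching: weight sentiment by how much the job seeker's psychometric
# profile values workplace climate (a stand-in for the real evaluation).
climate_weight = 0.8  # hypothetical psychometric-derived preference
ranked = sorted(scores, key=lambda c: climate_weight * scores[c], reverse=True)
print(scores, "->", ranked)
```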

Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems

Procedia PDF Downloads 89
9157 Methodologies for Deriving Semantic Technical Information Using an Unstructured Patent Text Data

Authors: Jaehyung An, Sungjoo Lee

Abstract:

Patent documents constitute an up-to-date and reliable source of knowledge reflecting technological advances, so patent analysis has been widely used for the identification of technological trends and the formulation of technology strategies. However, identifying technological information from patent data entails limitations such as high cost, complexity and inconsistency, because it relies on expert knowledge. To overcome these limitations, researchers have applied quantitative analysis based on keyword techniques. With this method, one can capture a technological implication in patent documents or extract keywords that indicate the important contents. However, it only uses simple counting of keyword frequency, so it cannot take into account the semantic relationships among keywords or semantic information such as how technologies are used within their technology area and how they affect other technologies. To automatically analyze the unstructured technological information in patents and extract semantic information, the text should be transformed into an abstracted form that includes the technological key concepts. The specific sentence structure 'SAO' (subject, action, object) represents such key concepts and can be extracted by an NLP (natural language processing) tool. An SAO structure can be organized in a problem-solution format if the action-object (AO) pair states the problem and the subject (S) forms the solution. In this paper, we propose a new methodology that extracts SAO structures through technical-element extraction rules. Although sentence structures in patent texts have a unique format, prior studies have depended on general NLP tools built for common documents such as newspapers, research papers and Twitter mentions, so they cannot take into account the specific sentence-structure types of patent documents. To overcome this limitation, we identified the unique forms of patent sentences and defined the SAO structures in patent text data. There are four types of technical elements: technology adoption purpose, application area, tool for technology, and technical components. Each of these four sentence-structure types has its own specific word structure, determined by the location or sequence of the parts of speech in the sentence. Finally, we developed algorithms for extracting SAOs; this result offers insight into the technology innovation process by providing different perspectives on technology.
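
A generic dependency-parse version of SAO extraction can be sketched with spaCy (assuming the en_core_web_sm model is installed): for each verb, pair its nominal subject with its direct object. This is the general-NLP baseline the paper argues is insufficient for patent sentences; the paper's own patent-specific technical-element extraction rules are not reproduced here.

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_sao(text):
    """Return (subject, action, object) triples found in each sentence."""
    triples = []
    doc = nlp(text)
    for sent in doc.sents:
        for token in sent:
            if token.pos_ != "VERB":
                continue
            subjects = [c for c in token.children
                        if c.dep_ in ("nsubj", "nsubjpass")]
            objects = [c for c in token.children
                       if c.dep_ in ("dobj", "obj")]
            for s in subjects:
                for o in objects:
                    triples.append((s.text, token.lemma_, o.text))
    return triples

# Hypothetical claim-like sentences, invented for illustration.
claim = ("The sensor module measures ambient temperature. "
         "The controller adjusts the fan speed.")
print(extract_sao(claim))
```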

Keywords: NLP, patent analysis, SAO, semantic-analysis

Procedia PDF Downloads 249
9156 Clustering Based and Centralized Routing Table Topology of Control Protocol in Mobile Wireless Sensor Networks

Authors: Mbida Mohamed, Ezzati Abdellah

Abstract:

A strong challenge in wireless sensor networks (WSN) is to save energy and achieve a long network lifetime without a high rate of information loss. Topology control (TC) protocols are designed in such a way that the network is divided and has a standard system for exchanging packets between nodes. In this article, we propose a clustering-based and centralized routing-table protocol of TC (CBCRT), which delegates a leader node that encapsulates a single routing table for all the nodes in its cluster. Hence, if a node wants to send packets to the sink, it requests the current cluster's routing-table information from the leader node in order to route the packet.
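
The division of labor the protocol describes, a leader node holding the cluster's single routing table with members querying it per transmission, can be sketched as follows. Class and field names are invented for illustration; the actual CBCRT message formats are not given in the abstract.

```python
from dataclasses import dataclass, field

@dataclass
class ClusterLeader:
    """Leader node holding the single routing table for its cluster."""
    routing_table: dict = field(default_factory=dict)  # destination -> next hop

    def update_route(self, dest, next_hop):
        self.routing_table[dest] = next_hop

    def next_hop_for(self, dest):
        return self.routing_table.get(dest)

@dataclass
class MemberNode:
    name: str
    leader: ClusterLeader

    def send(self, dest, payload):
        # Member nodes keep no routing state of their own; they query
        # the cluster leader before each transmission.
        hop = self.leader.next_hop_for(dest)
        if hop is None:
            return f"{self.name}: no route to {dest}"
        return f"{self.name} -> {hop} -> ... -> {dest}: {payload}"

leader = ClusterLeader()
leader.update_route("sink", "gateway-7")
node = MemberNode("sensor-42", leader)
print(node.send("sink", {"temp": 21.5}))
```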

Keywords: mobile wireless sensor networks, routing, topology of control, protocols

Procedia PDF Downloads 246
9155 Global Emission Inventories of Air Pollutants from Combustion Sources

Authors: Shu Tao

Abstract:

Based on a recently compiled global fuel-consumption data product (PKU-FUEL-2007) and a series of databases of emission factors for various sources, global emission inventories of a number of greenhouse gases and air pollutants from combustion sources have been developed, covering CO2, CO, SO2, NOx, primary particulate matter (total, PM10 and PM2.5), black carbon, organic carbon, mercury, volatile organic compounds, and polycyclic aromatic hydrocarbons. The inventories feature high spatial and sectoral resolution. Their spatial resolution is 0.1 by 0.1 degree, based on a sub-national disaggregation approach that reduces the spatial bias due to the uneven distribution of per-person fuel consumption within countries. The finely resolved inventories provide critical information for chemical transport modeling and exposure modeling. Emissions from more than 60 sources in the energy, industry, agriculture, residential, transportation and wildfire sectors were quantified in this study. With the detailed sectoral information, the inventories become an important tool for policy makers. For the residential sector, a set of models was developed to simulate the temporal variation of fuel consumption and, consequently, of pollutant emissions. These models can be used to characterize seasonal as well as inter-annual variations in historical emissions and to predict future changes; they can even be used to quantify the net change in fuel consumption and pollutant emissions due to climate change. The inventories have been used to model ambient air quality, population exposure and even health effects. A few examples of these applications are discussed.
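
The core bookkeeping of such an inventory, emission per grid cell as fuel consumption times an emission factor summed over fuels, can be sketched in a few lines. The grids and the black-carbon emission factors below are illustrative placeholders, not values from PKU-FUEL-2007.

```python
import numpy as np

# Hypothetical fuel-consumption grids (kt of fuel per 0.1-degree cell)
# for two fuel types, and illustrative emission factors (g BC / kg fuel).
fuel_use = {
    "coal":     np.array([[1.2, 0.4], [0.0, 2.1]]),
    "fuelwood": np.array([[0.3, 0.9], [1.5, 0.2]]),
}
ef_bc = {"coal": 0.28, "fuelwood": 1.1}  # illustrative only

# Emission(cell) = sum over fuels of consumption(cell, fuel) * EF(fuel).
# Units: kt fuel * (g/kg fuel) = tonnes of pollutant per cell.
emission_bc = sum(fuel_use[f] * ef_bc[f] for f in fuel_use)
print("BC emissions per grid cell (t):")
print(emission_bc.round(3))
print("domain total (t):", emission_bc.sum().round(3))
```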

Keywords: air pollutants, combustion, emission inventory, sectoral information

Procedia PDF Downloads 356
9154 Impact of Digitized Monitoring & Evaluation System in Technical Vocational Education and Training

Authors: Abdul Ghani Rajput

Abstract:

The monitoring and evaluation (M&E) concept is adopted by technical vocational education and training (TVET) organizations to track progress over continuous intervals of time, based on planned interventions, and subsequently to evaluate impact, quality assurance and sustainability. In the digital world, TVET providers prefer to have real-time information for monitoring training activities. Identifying the benefits and challenges of digitized real-time M&E information systems has not been sufficiently tackled to date. This research paper looks at the impact of digitized M&E in the TVET sector by analyzing two case studies and describes the benefits and challenges of using a digitized M&E system. Finally, digitized M&E is identified as a carrier of high potential for the TVET sector.

Keywords: digitized M&E, innovation, quality assurance, TVET

Procedia PDF Downloads 199
9153 Potentials and Influencing Factors of Dynamic Pricing in Business: Empirical Insights of European Experts

Authors: Christopher Reichstein, Ralf-Christian Härting, Martina Häußler

Abstract:

With the continuously increasing speed of information exchange on the World Wide Web, retailers in the e-commerce sector are faced with immense possibilities regarding different online purchase processes, such as dynamic price setting. By using dynamic pricing, retailers are able to make short-term price changes in order to optimize producer surplus. This empirical research illustrates the basics of dynamic pricing and identifies six influencing factors of dynamic pricing. The results of a structural equation modeling approach show five main drivers that increase the potential of dynamic price setting in e-commerce: knowledge of customers' individual willingness to pay, rising sales, the possibility of customization, the data volume, and information about competitors' pricing strategies.

Keywords: e-commerce, empirical research, experts, dynamic pricing (DP), influencing factors, potentials

Procedia PDF Downloads 238
9152 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics and indicator results have made the collection, analysis, traceability and dissemination of project information essential for project managers. In this sense, there are current trends toward facilitating efficient decision-making in projects through emerging technologies such as machine learning, data analytics, data mining and Big Data; the latter is the focus of this project. This research is part of the thematic line of construction methods and project management. Many authors note the relevance that emerging technologies such as Big Data have gained in recent years for project management in the construction sector, with the main focus on optimizing time, scope and budget and, in general, on mitigating risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low, and also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data-management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 110
9151 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The huge weight of knowledge in this world and within the repositories of organizations has already reached immense capacity and is constantly increasing as time goes by. To accommodate these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides the transformational changes that deliver tangible benefits to those implementing these practices. Today, various organizations derive knowledge from observations and intuitions, and this information or data is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread use of Big Data, the main intention is to provide information that has been cleaned and analyzed in order to nurture tangible insights that an organization can apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge generates value for an organization, enabling decisive decisions about the transition to best practices. Without a strong foundation of knowledge and Big Data, businesses are not able to grow and advance within the competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 91