Search results for: information recognition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12101

9641 “Laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology

Authors: Amarendar Reddy Addula

Abstract:

Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is a new medium for digital business, according to a report by Gartner. The last ten years represent a period of advancement in AI’s development, spurred by the confluence of several factors, including the rise of big data, advancements in computing infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. Extending AI to a broader set of use cases and users is gaining popularity because it improves AI’s versatility, efficiency, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and things, but it can also be used for code generation, creating synthetic data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. Frequently, the two are confused and conflated despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behavior, but they are different in scope and nature. The juridical analysis is grounded in a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step toward the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or diverse nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.

Keywords: artificial intelligence, ethics & human rights issues, laws, international laws

Procedia PDF Downloads 95
9640 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data in the public domain to increase transparency, which otherwise acts as evidence of performance. However, inefficient handling, lack of capacity and the dynamics of transfers (especially fund transfers) are important issues which need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often triggered public opinion in favour of or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in local government functioning, the networks within and outside the locality, and synergy with the various layers of government are crucial in understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain. If they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 436
9639 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia

Authors: Kaaryn M. Cater

Abstract:

Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning-related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia. Baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as Highly Sensitive People (HSP), and the measure of HSP is a 27-item self-test known as the Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established baseline data for HSP students in a tertiary institution in New Zealand. The results of the study showed that all participating HSP students believed the knowledge of SPS to be life-changing and useful in managing life and study; in addition, they believed that all tutors and incoming students should be given information on SPS. MIS is a visual processing and perception disorder that is found in approximately 10% of the population and has a variety of symptoms including visual fatigue, headaches and nausea. One way to ease some of these symptoms is through the use of colored lenses or overlays. Dyslexia is a complex phonological-based information processing variation present in approximately 10% of the population. An estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first-year students enrolled in degree programmes across all faculties and schools within the institution. An estimated 900 students will be eligible to participate in the study. Participants will be asked to complete a battery of online questionnaires including the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated among many populations. All participants whose scores on any (or some) of the three questionnaires suggest a minority method of information processing will receive an invitation to meet with a learning advisor and will be given access to counselling services if they choose. Meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected using the Question Pro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors between SPS, MIS and dyslexia. This study forms part of a larger three-year longitudinal study, and participants will be required to complete questionnaires at annual intervals in subsequent years of the study until completion of (or withdrawal from) their degree. At these data collection points, participants will be questioned on any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.
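
A minimal sketch of the planned baseline correlation and regression analysis is given below; the column names (hsps_score, ida_dyslexia_score, irlen_score) and the CSV export are hypothetical stand-ins for the Question Pro data described above.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("baseline_scores.csv")   # hypothetical export from the Question Pro platform

# Pairwise correlations between the three minority-processing measures
print(df[["hsps_score", "ida_dyslexia_score", "irlen_score"]].corr(method="pearson"))

# Simple regression: do the SPS and MIS scores predict the dyslexia self-assessment score?
X = sm.add_constant(df[["hsps_score", "irlen_score"]])
model = sm.OLS(df["ida_dyslexia_score"], X).fit()
print(model.summary())
```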

Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)

Procedia PDF Downloads 245
9638 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (the connection of two adjacent chords). In this study, it was found that 53 musical excerpts out of 160 were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We have explained this by particularities of the chord progression, which impact the listener's feeling of complexity and acceptability. We evaluated the same data twice with new participants in 2018 and with the same participants for the third time in 2019. These three evaluations have shown that the same 53 musical excerpts, found to be difficult and complex in the study conducted in 2017, again exhibit a high feeling of complexity. It was proposed that the content of these musical excerpts, defined as “irregular,” does not meet the listener's expectancy and the basic perceptual principles, creating a higher feeling of difficulty and complexity. As the “irregularities” in these 53 musical excerpts seem to be perceived by the participants without their being aware of it, affecting the pleasantness and the feeling of complexity, they have been defined as “subliminal irregularities” and the 53 musical excerpts as “irregular.” In our recent study (2019) of the same data (used in previous research works), we proposed a new measure of the complexity of harmony, “regularity,” based on the irregularities in the harmonic progression and other plausible particularities in the musical structure found in previous studies. In that study we also proposed a list of 10 different particularities which we assumed impact the participant’s perception of complexity in harmony. These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. In examining the melody, we used the computational model “Information Dynamics of Music” (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. In order to describe the features of melody in these musical examples, we used four different viewpoints: pitch, interval, duration, and scale degree. The results have shown that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., huge interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts impact the participant’s perception of complexity. High information content values were found in compound melodies in which implied harmonies seem to have suggested additional harmonies, affecting the participant’s perception of the chord progression in harmony by creating a sense of an ambiguous musical structure.
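
The unigram/bigram entropy and per-event information content described above can be illustrated with a short sketch; the chord progression below is a hypothetical example, not one of the 160 excerpts.

```python
import math
from collections import Counter

progression = ["I", "IV", "V", "I", "vi", "IV", "V", "I"]   # hypothetical chord progression

def entropy(events):
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

unigrams = progression
bigrams = list(zip(progression, progression[1:]))

print("unigram entropy:", entropy(unigrams))   # uncertainty over single chords
print("bigram entropy:", entropy(bigrams))     # uncertainty over adjacent chord pairs

# Information content (unexpectedness) of each chord under the unigram distribution
counts, total = Counter(unigrams), len(unigrams)
print({chord: -math.log2(counts[chord] / total) for chord in counts})
```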

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 131
9637 Digitalize or Die: Responsible Innovations in Healthcare and Welfare Sectors

Authors: T. Iakovleva

Abstract:

The present paper suggests a theoretical model that describes the process of developing responsible innovations at the firm level in the health and welfare sectors. There is a need to develop new firm strategies in these sectors. This paper suggests looking at the concept of responsible innovation, which was originally developed at the societal level, and applying this concept to the new area of firm strategy. The rapid global diffusion of information and communication technologies has greatly improved access to knowledge. At the same time, communication is cheap, information is a commodity, and global trade increases technological diffusion. As a result, firms and users, including those outside of industrialized nations, get early exposure to the latest technologies and information. General-purpose technologies such as mobile phones and 3D printers enable individuals to solve local needs and customize products. The combined effect of these changes is having a profound impact on the innovation landscape. Meanwhile, the healthcare sector is facing unprecedented challenges, which are magnified by budgetary constraints, an aging population and the desire to provide care for all. On the other hand, patients themselves are changing. They are savvier about their diseases, they expect their relationship with healthcare professionals to be open and interactive, but above all they want to be part of the decision process. All of this is a reflection of what is already happening in other industries, where customers have access to large amounts of information and have become educated buyers. This article addresses the question of how ICT research and innovation may contribute to developing solutions to grand societal challenges in a responsible way. A broad definition of the concept of responsibility in the context of innovation is adopted in this paper. Responsibility is thus seen as a collective, uncertain and future-oriented activity. This opens up the questions of how responsibilities are perceived and distributed and how innovation and science can be governed and stewarded towards socially desirable and acceptable ends. This article addresses a central question confronting politicians, business leaders, and regional planners.

Keywords: responsible innovation, ICT, healthcare, welfare sector

Procedia PDF Downloads 198
9636 First Record of Eotragus noyei from the Middle Siwalik Dhok Pathan Formation of Pakistan

Authors: Abdul M. Khan, Hafiza I. Naz, Ayesha Iqbal, Muhammad Akhtar

Abstract:

The fossil remains described in this study were recovered during fieldwork by the authors from the Dhok Pathan Formation of the Middle Siwaliks, Pakistan, in December 2015. The sample comprises maxillary and mandibular fragments along with isolated upper and lower teeth. The morphometric analysis of the specimens led us to recognize the sample as belonging to Eotragus noyei, which has been considered the smallest and the oldest bovid in the Siwaliks. Eotragus noyei is characterized by brachydont teeth, finely rugose enamel, more inclined buccal walls of the molars and small lingual cingula. The inclination of the metaconal area has caused rotation of the metastyle in relation to the antero-posterior tooth axis, so that it is situated more lingually. The protocone in the second upper premolar is well developed, situated posteriorly, and has an anterior lingual constriction. The metaconule in the third upper molar is smaller than the protocone. The dentition of Eotragus noyei is smaller in size compared to Eotragus sansaniensis and Eotragus lampangensis. In Eotragus noyei the buccal walls of the molars are more inclined, while in Eotragus sansaniensis they are less inclined. The genus Eotragus has been reported previously in the Lower and Middle Siwaliks of Pakistan; however, the recognition of the present sample as Eotragus noyei extends the range of this species from the Lower to the Middle Siwaliks of Pakistan.

Keywords: Boselaphini, Chakwal, Dhok Pathan, late miocene

Procedia PDF Downloads 293
9635 Formation of Convergence Culture in the Framework of Conventional Media and New Media

Authors: Berkay Buluş, Aytekin İşman, Kübra Yüzüncüyıl

Abstract:

Developments in media and communication technologies have changed the way we use media. The importance of convergence culture has been increasing day by day within the framework of these developments. With new media, it is possible to say that social networks are the most powerful platforms integrated into this digitalization process. Although social networks seem like places where people can socialize, they can also be utilized as places of production. On the other hand, the audience has become users within the framework of the transformation from national to global broadcasting. User-generated content makes conventional media and new media collide. In this study, these communication platforms will be examined not as platforms that replace one another but as mediums that unify each other. In light of this, the content produced by users on new media platforms and all new media use practices are called convergence culture. In other words, convergence culture means the intersection of conventional and new media. In this study, examples of convergence culture will be analyzed in detail.

Keywords: new media, convergence culture, convergence, use of new media, user generated content

Procedia PDF Downloads 271
9634 Spatiotemporal Analysis of Visual Evoked Responses Using Dense EEG

Authors: Rima Hleiss, Elie Bitar, Mahmoud Hassan, Mohamad Khalil

Abstract:

A comprehensive study of object recognition in the human brain requires combining both spatial and temporal analysis of brain activity. Here, we are mainly interested in three issues: the time perception of visual objects, the ability to discriminate between two particular categories (objects vs. animals), and the possibility of identifying a particular spatial representation of visual objects. Our experiment consisted of acquiring dense electroencephalographic (EEG) signals during a picture-naming task comprising a set of images of objects and animals. These EEG responses were recorded from nine participants. In order to determine the time perception of the presented visual stimulus, we analyzed the event-related potentials (ERPs) derived from the recorded EEG signals. The analysis of these signals showed that the brain perceives animals and objects at different time instants. Concerning the discrimination of the two categories, a support vector machine (SVM) was applied on the instantaneous EEG (excellent temporal resolution: on the order of milliseconds) to categorize the visual stimuli into two different classes. The spatial differences between the evoked responses of the two categories were also investigated. The results showed a variation of the neural activity with the properties of the visual input. The results also showed the existence of a spatial pattern of electrodes over particular regions of the scalp in correspondence to their responses to the visual inputs.
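
A minimal sketch of the SVM categorization step (objects vs. animals) on instantaneous EEG samples; the feature matrix and labels below are random stand-ins, not the recorded data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 256))      # 200 trials x 256 dense-EEG channels (stand-in data)
y = rng.integers(0, 2, size=200)     # 0 = object, 1 = animal

clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)
print("mean cross-validated accuracy:", scores.mean())
```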

Keywords: brain activity, categorization, dense EEG, evoked responses, spatio-temporal analysis, SVM, time perception

Procedia PDF Downloads 422
9633 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate

Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe

Abstract:

This paper assesses the in-sample forecasting of South African exchange rates by comparing a linear ARIMA model and a SETAR model. The study uses monthly adjusted data on South African exchange rates comprising 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecast capability of the models. The Diebold-Mariano (DM) test is employed in the study to check forecast accuracy in order to distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seemed to outperform ARIMA.
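
A minimal sketch of the three forecast-error metrics used to compare the models; the actual and forecast values below are illustrative, not the South African exchange-rate series.

```python
import numpy as np

def mae(actual, forecast):
    return np.mean(np.abs(actual - forecast))

def rmse(actual, forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

def mape(actual, forecast):
    return np.mean(np.abs((actual - forecast) / actual)) * 100

actual = np.array([14.2, 14.5, 14.9, 15.1])
arima_fc = np.array([14.0, 14.6, 14.7, 15.3])
setar_fc = np.array([14.1, 14.5, 14.8, 15.2])

for name, fc in [("ARIMA", arima_fc), ("SETAR", setar_fc)]:
    print(name, "MAE:", mae(actual, fc), "RMSE:", rmse(actual, fc), "MAPE:", mape(actual, fc))
```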

Keywords: ARIMA, error metrics, model selection, SETAR

Procedia PDF Downloads 244
9632 Internet Versus Muslim Communities: Challenges, Problems and Solutions

Authors: Bashir Muhammad

Abstract:

The present research contains the definition of the internet, the inter-relationship between the internet and globalization, as well as the divergent views of scholars on the internet network. Additionally, both the positive and the negative impacts of the internet on Muslim communities are elucidated. As an example of the positive effects, the internet constitutes a vital source of vast information and data acquisition in various academic sciences in general and Islamic Studies in particular. The most recent and current facts and scientific discoveries by specialists of various ramifications can be found as quickly as possible. Many other exciting points are also cited. On the negative side of the internet, among many other points, it releases uncontrolled promiscuous pictures and sometimes misguiding information about Islam, which could gradually and easily destroy the sound moral upbringing of our young Muslim generation and pollute their positive thinking and reasoning. Another problem is that Muslims, in most cases pertaining to internet services, are passive consumers, having no power to control it or manipulate it for their welfare and well-being. As a result, they have to pay the price for this, directly or indirectly.

Keywords: internet, Muslim, challenges, communities

Procedia PDF Downloads 121
9631 Identification and Characterization of Genes Expressed in Diseased Condition Silkworms (Bombyx mori): A Systematic Investigation

Authors: Siddharth Soni, Gourav Kumar Pandey, Sneha Kumari, Dev Mani Pandey, Koel Mukherjee

Abstract:

The silkworm Bombyx mori is a commercially important insect, but a major roadblock in silk production is silkworm disease. Flacherie is one of the diseases of the silkworm that affects the midgut of the 4th and 5th instar larvae and eventually makes them lethargic, stop feeding and finally results in their death. The disease is caused by bacterial and viral infection and, in some instances, a combination of both. The present study aims to identify and study the expression levels of genes in the flacherie condition. For this work, total RNA was isolated from the infected larvae at their most probable infectious instar, and cDNA was synthesized using reverse transcriptase PCR (RT-PCR). This cDNA was then used to amplify disease-related genes whose expression levels were checked with quantitative PCR (qPCR) using the double delta Ct method. Cry toxin receptors such as APN and BtR-175 and the ROS mediator Dual Oxidase are a few proteins whose genes were overexpressed. Interestingly, the genes of the pattern recognition receptors (PRRs) C-type lectins were found to be downregulated. The results describe the strong expression of genes that can distinguish the concerned proteins in the midgut of diseased silkworms, thereby aiding knowledge in the field of inhibitor design research.
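
A minimal sketch of the double delta Ct (2^-ΔΔCt) relative-expression calculation mentioned above; the Ct values are illustrative, and the reference-gene choice is a hypothetical assumption.

```python
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    delta_ct_treated = ct_target_treated - ct_ref_treated      # normalize to reference gene
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_treated - delta_ct_control       # infected vs. healthy midgut
    return 2 ** (-delta_delta_ct)

# Hypothetical Ct values for a receptor gene (e.g. APN) in infected vs. healthy larvae
print(fold_change(ct_target_treated=22.1, ct_ref_treated=18.0,
                  ct_target_control=24.3, ct_ref_control=18.2))  # > 1 means up-regulated
```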

Keywords: Bombyx mori, flacherie disease, inhibitor designing, up and down regulation

Procedia PDF Downloads 285
9630 How Validated Nursing Workload and Patient Acuity Data Can Promote Sustained Change and Improvements within District Health Boards: The New Zealand Experience

Authors: Rebecca Oakes

Abstract:

In the New Zealand public health system, work has been taking place to use electronic systems to convey data from the ‘floor to the board’ that make patient needs, and therefore nursing work, visible. For nurses, these developments in health information technology put us in a very new and exciting position of being able to articulate the work of nursing through a language understood at all levels of an organisation: the language of acuity. Nurses increasingly have a considerable stake-hold in patient acuity data. Patient acuity systems, when used well, can assist greatly in demonstrating how much work is required, the type of work, and when it will be required. The New Zealand Safe Staffing Unit is supporting New Zealand nurses to create a culture of shared governance, where nursing data informs policies, staffing methodologies and forecasting within their organisations. Assisting organisations to understand their acuity data, strengthening user confidence in using electronic patient acuity systems, and ensuring nursing and midwifery workload is accurately reflected are critical to the success of the safe staffing programme. Nurses and midwives have the capacity, via an acuity tool, to become key informers of organisational planning. Quality patient care, best use of health resources and a quality work environment are essential components of a safe, resilient and well-resourced organisation, and nurses are the key providers of this information. In New Zealand, a national-level approach is paving the way for significant changes to the understanding and use of patient acuity and nursing workload information.

Keywords: nursing workload, patient acuity, safe staffing, New Zealand

Procedia PDF Downloads 382
9629 Human-Automation Interaction in Law: Mapping Legal Decisions and Judgments, Cognitive Processes, and Automation Levels

Authors: Dovile Petkeviciute-Barysiene

Abstract:

Legal technologies not only create new ways of accessing and providing legal services but also transform the role of legal practitioners. Both lawyers and users of legal services expect automated solutions to outperform people in objectivity and impartiality. Although the fairness of automated decisions is crucial, research on assessing various characteristics of automated processes related to perceived fairness has only begun. One of the major obstacles to this research is the lack of a comprehensive understanding of which legal actions are automated or could be meaningfully automated, and to what extent. Oftentimes, neither the public nor legal practitioners can envision technological input, due to a lack of general understanding without illustrative examples. The aim of this study is to map the decision-making stages and automation levels which are and/or could be achieved in legal actions related to pre-trial and trial processes. Major legal decisions and judgments were identified during consultations with legal practitioners. The dual-process model of information processing is used to describe the cognitive processes taking place while making legal decisions and judgments during pre-trial and trial actions. Some of the existing legal technologies are incorporated into the analysis as well. Several published automation level taxonomies are considered because none of them fits well into the legal context, as they were all created for avionics, teleoperation, unmanned aerial vehicles, etc. From the information processing perspective, the analysis of legal decisions and judgments exposes situations that are most sensitive to cognitive bias and, among other things, helps to identify areas that would benefit from automation the most. Automation level analysis, in turn, provides a systematic approach to interaction and cooperation between humans and algorithms. Moreover, an integrated map of legal decisions and judgments, information processing characteristics, and automation levels all together provides some groundwork for research on the perceived fairness and acceptance of legal technology. Acknowledgment: This project has received funding from the European Social Fund (project No 09.3.3-LMT-K-712-19-0116) under a grant agreement with the Research Council of Lithuania (LMTLT).

Keywords: automation levels, information processing, legal judgment and decision making, legal technology

Procedia PDF Downloads 142
9628 Correlation between Microalbuminuria and Hypertension in Type 2 Diabetic Patients

Authors: Alia Ali, Azeem Taj, Muhammed Joher Amin, Farrukh Iqbal, Zafar Iqbal

Abstract:

Background: Hypertension is commonly found in patients with diabetic kidney disease (DKD). Microalbuminuria is the first clinical sign of kidney involvement in patients with type 2 diabetes. Uncontrolled hypertension induces a higher risk of cardiovascular events, including death, increasing proteinuria and progression to kidney disease. Objectives: To determine the correlation between microalbuminuria and hypertension and their association with other risk factors in type 2 diabetic patients. Methods: One hundred and thirteen type 2 diabetic patients attending the diabetic clinic of Shaikh Zayed Hospital, Lahore, Pakistan, were screened for microalbuminuria and raised blood pressure. The study was conducted from November 2012 to June 2013. Results: Patients were divided into two groups: Group 1, those with normoalbuminuria (n=63), and Group 2, those having microalbuminuria (n=50). Group 2 patients showed higher blood pressure values as compared to Group 1. The results were statistically significant and showed poor glycemic control as a contributing risk factor. Conclusion: The study concluded that there is a high frequency of hypertension among type 2 diabetics, but it is still much higher among those having microalbuminuria. Therefore, early recognition of renal dysfunction through detection of microalbuminuria and starting treatment without any delay will confer future protection from end-stage renal disease as well as hypertension and its complications in type 2 diabetic patients.

Keywords: hypertension, microalbuminuria, diabetic kidney disease, type 2 Diabetes mellitus

Procedia PDF Downloads 396
9627 Natural Honey and Its Effect on the Activity of Cells

Authors: Abujnah Dukali

Abstract:

Natural honey was assessed in a cell culture system for its anticancer activity. The human leukemic cell line HL-60 was treated with honey and cultured for 5 days, and cytotoxicity was calculated by MTT assay. Honey showed cytotoxicity with a CC50 value of 174.20 µg/ml. Radical modulation activity was assessed by a lipid peroxidation assay using egg lecithin. Honey showed antioxidant activity with an EC50 value of 159.73 µg/ml. In addition, treatment of HL-60 cells also resulted in nuclear DNA fragmentation, as seen in agarose gel electrophoresis. This is a hallmark of cells undergoing apoptosis. Confirmation of apoptosis was performed by staining the cells with Annexin V and FACS analysis. Apoptosis is an active, genetically regulated disassembly of the cell from within. Disassembly creates changes in the phospholipid content of the outer leaflet of the cytoplasmic membrane. Phosphatidylserine (PS) is translocated from the inner to the outer surface of the cell for phagocytic cell recognition. The human anticoagulant annexin V is a Ca2+-dependent phospholipid-binding protein with a high affinity for PS. Annexin V labeled with fluorescein can identify apoptotic cells in a population; it is a confirmatory test for apoptosis. Annexin V-positive cells were defined as apoptotic cells. Since honey shows both antioxidant activity and cytotoxicity at almost the same concentration, it can prevent free radical-induced cancer as a prophylactic agent and kill cancer cells by the apoptotic process as a chemotherapeutic agent. Everyday intake of honey can prevent cancer induction.

Keywords: anticancer, cells, DNA, honey

Procedia PDF Downloads 206
9626 Investigating the Impact of Individual Risk-Willingness and Group-Interaction Effects on Business Model Innovation Decisions

Authors: Sarah Müller-Sägebrecht

Abstract:

Today’s volatile environment challenges executives to make the right strategic decisions to gain sustainable success. Entrepreneurship scholars postulate mainly positive effects of environmental changes on entrepreneurship behavior, such as developing new business opportunities, promoting ingenuity, and the satisfaction of resource voids. A strategic solution approach to overcome threatening environmental changes and catch new business opportunities is business model innovation (BMI). Although this research stream has gained further importance in the last decade, BMI research is still insufficient. In particular, BMI barriers, such as inefficient strategic decision-making processes, need to be identified. Strategic decisions strongly impact an organization's future and are, therefore, usually made in groups. Although groups draw on a more extensive information base than single individuals, group-interaction effects can influence the decision-making process in favorable but also unfavorable ways. Decisions are characterized by uncertainty and risk, whose intensity is perceived differently by each individual. Individual risk-willingness influences which option people choose. The special nature of strategic decisions, such as those in BMI processes, is that they are not made individually but in groups due to their high organizational scope. These groups consist of different personalities whose individual risk-willingness can vary considerably. It is known from group decision theory that these individuals influence each other, observable in different group-interaction effects. The following research questions arise: i) What impact does individual risk-willingness have on BMI decisions? And ii) how do group-interaction effects impact BMI decisions? After conducting 26 in-depth interviews with executives from the manufacturing industry, the applied Gioia methodology reveals the following results: i) Risk-averse decision-makers have an increased need to be guided by facts. The more information available to them, the lower they perceive uncertainty and the more willing they are to pursue a specific decision option. However, the results also show that social interaction does not change the individual risk-willingness in the decision-making process. ii) Generally, it could be observed that during BMI decisions, group interaction is primarily beneficial for increasing the group's information base for making good decisions, rather than for social interaction. Further, decision-makers mainly focus on information available to all decision-makers in the team and less on personal knowledge. This work contributes to the strategic decision-making literature in two ways. First, it gives insights into how group-interaction effects influence an organization's strategic BMI decision-making. Second, it enriches risk-management research by highlighting how individual risk-willingness impacts organizational strategic decision-making. To date, it was known in BMI research that risk aversion would be an internal BMI barrier. However, with this study, it becomes clear that it is not risk aversion that inhibits BMI. Instead, the lack of information prevents risk-averse decision-makers from choosing a riskier option. Simultaneously, the results show that risk-averse decision-makers are not easily carried away by the higher risk-willingness of their team members. Instead, they use social interaction to gather missing information. Therefore, executives need to provide sufficient information to all decision-makers to catch promising business opportunities.

Keywords: business model innovation, decision-making, group biases, group decisions, group-interaction effects, risk-willingness

Procedia PDF Downloads 96
9625 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition for overcoming the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRFDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRFDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
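
A heavily simplified sketch of a binary shuffled frog leaping search over feature subsets; the fitness function here is a placeholder where the fuzzy-rough dependency degree (FRFDD) would be used in the actual method, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_frogs, n_memeplexes, n_iter = 20, 30, 3, 50

def evaluate(mask):
    # Placeholder fitness; the actual method would return the FRFDD of the subset
    return -int(mask.sum()) if mask.any() else -n_features

frogs = rng.integers(0, 2, size=(n_frogs, n_features))   # each frog is a binary feature mask

for _ in range(n_iter):
    fitness = np.array([evaluate(f) for f in frogs])
    frogs = frogs[np.argsort(-fitness)]                   # sort frogs, best first
    for m in range(n_memeplexes):                         # partition into memeplexes
        idx = np.arange(m, n_frogs, n_memeplexes)
        best, worst = frogs[idx[0]], frogs[idx[-1]]
        flip = (best != worst) & (rng.random(n_features) < 0.5)
        candidate = worst.copy()
        candidate[flip] = best[flip]                      # move worst frog toward the best
        if evaluate(candidate) > evaluate(worst):
            frogs[idx[-1]] = candidate
        else:                                             # censor: replace with a random frog
            frogs[idx[-1]] = rng.integers(0, 2, size=n_features)

fitness = np.array([evaluate(f) for f in frogs])
print("selected features:", np.flatnonzero(frogs[np.argmax(fitness)]))
```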

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 225
9624 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) has been used for the classification of a diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of the classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes, comprising normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most used clustering methods in data mining and machine learning applications. In this study, as the first stage, the fuzzy C-means clustering process has been used for finding the centers of the attributes in the Pima Indians diabetes dataset, and the dataset was then weighted according to the ratios of the means of the attributes to their centers. Secondly, after the weighting process, classifier algorithms including the support vector machine (SVM) and k-NN (k-nearest neighbor) classifiers have been used for classifying the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) has obtained very promising results in the classification of the Pima Indians diabetes dataset.
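
A minimal sketch of one reading of the FCMAW idea: fuzzy C-means centers are computed per attribute, each attribute is scaled by the ratio of its mean to its centers, and a classifier is trained on the weighted data; the data file name and parameter choices are hypothetical assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def fuzzy_cmeans_centers(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy C-means on a single attribute (1-D array x); returns cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)             # weighted means of the attribute
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / d ** (2 / (m - 1))                      # update fuzzy memberships
        u /= u.sum(axis=1, keepdims=True)
    return centers

data = np.loadtxt("pima_diabetes.csv", delimiter=",")     # hypothetical file: 8 attributes + label
X, y = data[:, :-1], data[:, -1]

# Weight each attribute by the ratio of its mean to (the mean of) its FCM centers
X_weighted = X.copy()
for j in range(X.shape[1]):
    centers = fuzzy_cmeans_centers(X[:, j])
    X_weighted[:, j] *= X[:, j].mean() / centers.mean()

print(cross_val_score(SVC(), X_weighted, y, cv=10).mean())
```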

Keywords: fuzzy C-means clustering, fuzzy C-means clustering based attribute weighting, Pima Indians diabetes, SVM

Procedia PDF Downloads 414
9623 Independent Audit in Brazilian Companies Listed on B3: An Analysis of Companies That Received Qualified Opinion and Disclaimer of Opinion

Authors: Diego Saldo Alves, Marcelo Paveck Ayub

Abstract:

The quality of accounting information is very important for the decision-making of managers, investors, government and other information users. The opinion of the independent auditor has a significant influence on decision-making, especially for investors. Therefore, the aim of this study is to analyze the reasons why companies listed on the Brazilian stock exchange B3 received a qualified opinion or a disclaimer of opinion from the independent auditors. We analyzed the reports of the independent auditors of 23 Brazilian companies listed on B3 that received a qualified opinion or a disclaimer of opinion between the years 2012 and 2017. The findings show that the companies did not comply with the International Financial Reporting Standards (IFRS), did not provide documentation to support the operations performed, did not account for expenses, and had problems in corporate governance and internal controls.

Keywords: audit, disclaimer of opinion, independent auditors, qualified opinion

Procedia PDF Downloads 193
9622 Searching Linguistic Synonyms through Parts of Speech Tagging

Authors: Faiza Hussain, Usman Qamar

Abstract:

Synonym-based searching is recognized to be a complicated problem, as text mining from unstructured web data is challenging. Finding useful information which matches user needs from a bulk of web pages is a cumbersome task. In this paper, a novel and practical synonym retrieval technique is proposed for addressing this problem. For the replacement of semantics, user intent is taken into consideration to realize the technique. Parts-of-speech tagging is applied for pattern generation of the query, and a thesaurus was formed and used for this experiment. In comparison with non-context-based searching, context-based searching proved to be a more efficient approach when dealing with linguistic semantics. This approach is very beneficial for intent-based searching. Finally, results and future dimensions are presented.
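
A minimal sketch of the parts-of-speech tagging step used for query pattern generation, with a tiny hypothetical thesaurus; NLTK is assumed here as the tagger, and resource names can vary across NLTK versions.

```python
import nltk

# The tagger model must be downloaded once; the resource name may differ across NLTK versions
nltk.download("averaged_perceptron_tagger", quiet=True)

query = "find cheap flights quickly"
tagged = nltk.pos_tag(query.split())       # pattern generation via POS tags
print(tagged)                              # e.g. [('find', 'VB'), ('cheap', 'JJ'), ...]

# Hypothetical thesaurus keyed by (word, POS tag); only matching content words are expanded
thesaurus = {("cheap", "JJ"): ["inexpensive", "affordable"],
             ("flights", "NNS"): ["airfares"]}
expanded = [thesaurus.get((w.lower(), t), [w]) for w, t in tagged]
print(expanded)
```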

Keywords: natural language processing, text mining, information retrieval, parts-of-speech tagging, grammar, semantics

Procedia PDF Downloads 308
9621 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores

Authors: A. Ashraff

Abstract:

The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make purchase decisions about a product or service. They also have the ability to predict whether a particular user would rate a product or service based on the user's behavioral profile pattern. At present, recommender systems are being used extensively in every domain known to us; they are said to be ubiquitous. However, in the field of recruitment, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted organizations as well as employees. The reasons include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed that there is a lack of guidance or support to help a job seeker find the company that will suit him or her best, and though information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach that studies the available review data on IT companies (scoring their reviews based on user review sentiment) and gathers information on job seekers, including their psychometric evaluations, and then presents the job seeker with useful outputs on which company is most suitable for them. The theoretical approach, the algorithmic approach and the importance of such a system are discussed in this paper.
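
A minimal sketch of how per-company review sentiment scores could be combined with a job seeker's psychometric preference profile into a single ranking; the companies, factors, and weightings are illustrative assumptions, not the system proposed in the paper.

```python
# Hypothetical per-company factor scores (0-1) and an aggregated review sentiment score
companies = {
    "CompanyA": {"culture": 0.8, "flexibility": 0.6, "learning": 0.7, "sentiment": 0.72},
    "CompanyB": {"culture": 0.5, "flexibility": 0.9, "learning": 0.4, "sentiment": 0.61},
}
# Psychometric profile expressed as importance weights for each factor
seeker = {"culture": 0.5, "flexibility": 0.3, "learning": 0.2}

def score(company):
    preference_match = sum(seeker[k] * company[k] for k in seeker)
    return 0.6 * preference_match + 0.4 * company["sentiment"]   # hybrid weighting (assumed)

ranked = sorted(companies, key=lambda c: score(companies[c]), reverse=True)
print(ranked)
```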

Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems

Procedia PDF Downloads 107
9620 Methodologies for Deriving Semantic Technical Information Using an Unstructured Patent Text Data

Authors: Jaehyung An, Sungjoo Lee

Abstract:

Patent documents constitute an up-to-date and reliable source of knowledge reflecting technological advances, so patent analysis has been widely used for the identification of technological trends and the formulation of technology strategies. However, identifying technological information from patent data entails some limitations, such as high cost, complexity, and inconsistency, because it relies on expert knowledge. To overcome these limitations, researchers have applied quantitative analysis based on the keyword technique. Using this method, one can capture the technological implications of documents, particularly patents, or extract keywords that indicate the important contents. However, it only uses a simple counting of keyword frequency, so it cannot take into account the semantic relationships between keywords or semantic information such as how the technologies are used in their technology area and how the technologies affect other technologies. To automatically analyze unstructured technological information in patents and extract the semantic information, it should be transformed into an abstracted form that includes the technological key concepts. The specific sentence structure ‘SAO’ (subject, action, object) has newly emerged as a representation of ‘key concepts’ and can be extracted by natural language processing (NLP). An SAO structure can be organized in a problem-solution format if the action-object (AO) states the problem and the subject (S) forms the solution. In this paper, we propose a new methodology that can extract the SAO structure through technical-element extraction rules. Although sentence structures in patent texts have a unique format, prior studies have depended on general NLP applied to common documents such as newspapers, research papers, and Twitter mentions, so they cannot take into account the specific sentence structure types of patent documents. To overcome this limitation, we identified a unique form of patent sentences and defined the SAO structures in the patent text data. There are four types of technical elements, consisting of technology adoption purpose, application area, tool for technology, and technical components. Each of these four types of sentence structures from patents has its own specific word structure by location or sequence of the parts of speech in each sentence. Finally, we developed algorithms for extracting SAOs, and this result offers insight into the technology innovation process by providing different perspectives on technology.
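
A minimal sketch of subject-action-object extraction from a patent-style sentence using spaCy's dependency parse; these rules are a simplified stand-in for the technical-element extraction rules proposed in the paper, and the example sentence is hypothetical.

```python
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes the small English model is installed
doc = nlp("The coating layer protects the battery electrode from corrosion.")

for token in doc:
    if token.pos_ == "VERB":
        subjects = [w for w in token.lefts if w.dep_ in ("nsubj", "nsubjpass")]
        objects = [w for w in token.rights if w.dep_ in ("dobj", "obj")]
        for s in subjects:
            for o in objects:
                # SAO triple: (subject, action, object)
                print((s.text, token.lemma_, o.text))   # e.g. ('layer', 'protect', 'electrode')
```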

Keywords: NLP, patent analysis, SAO, semantic-analysis

Procedia PDF Downloads 262
9619 Clustering Based and Centralized Routing Table Topology of Control Protocol in Mobile Wireless Sensor Networks

Authors: Mbida Mohamed, Ezzati Abdellah

Abstract:

A major challenge in wireless sensor networks (WSN) is to save energy and achieve a long network lifetime without a high rate of information loss. However, topology control (TC) protocols are designed in such a way that the network is divided and has a standard system of packet exchange between nodes. In this article, we propose a clustering-based and centralized routing table protocol of TC (CBCRT), which delegates a leader node that encapsulates a single routing table in every cluster of nodes. Hence, if a node wants to send packets to the sink, it requests the routing table information of the current cluster from the leader node in order to route the packet.
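
A minimal sketch of the cluster-leader lookup described above: the leader holds the routing table, and a member node asks it for the next hop toward the sink before forwarding; node identifiers and the table contents are illustrative assumptions.

```python
class ClusterLeader:
    def __init__(self, routing_table):
        self.routing_table = routing_table    # destination -> next hop, held only by the leader

    def next_hop(self, destination):
        return self.routing_table.get(destination)

class SensorNode:
    def __init__(self, node_id, leader):
        self.node_id = node_id
        self.leader = leader

    def send(self, packet, destination="sink"):
        hop = self.leader.next_hop(destination)   # ask the leader instead of storing a table
        print(f"{self.node_id} forwards {packet} to {hop} toward {destination}")

leader = ClusterLeader({"sink": "node_7"})
SensorNode("node_3", leader).send({"reading": 21.4})
```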

Keywords: mobile wireless sensor networks, routing, topology of control, protocols

Procedia PDF Downloads 274
9618 Global Emission Inventories of Air Pollutants from Combustion Sources

Authors: Shu Tao

Abstract:

Based on a global fuel consumption data product (PKU-FUEL-2007) compiled recently and a series of databases of emission factors for various sources, global emission inventories of a number of greenhouse gases and air pollutants from combustion sources have been developed, including CO2, CO, SO2, NOx, primary particulate matter (total, PM10, and PM2.5), black carbon, organic carbon, mercury, volatile organic carbons, and polycyclic aromatic hydrocarbons. The inventories feature high spatial and sectorial resolutions. The spatial resolution of the inventories is 0.1 by 0.1 degree, based on a sub-national disaggregation approach to reduce spatial bias due to the uneven distribution of per-person fuel consumption within countries. The finely resolved inventories provide critical information for chemical transport modeling and exposure modeling. Emissions from more than 60 sources in the energy, industry, agriculture, residential, transportation, and wildfire sectors were quantified in this study. With the detailed sectorial information, the inventories become an important tool for policy makers. For the residential sector, a set of models was developed to simulate the temporal variation of fuel consumption and, consequently, pollutant emissions. The models can be used to characterize seasonal as well as inter-annual variations in historical emissions and to predict future changes. The models can even be used to quantify the net change in fuel consumption and pollutant emissions due to climate change. The inventories have been used to model ambient air quality, population exposure, and even health effects. A few examples of these applications are discussed.

Keywords: air pollutants, combustion, emission inventory, sectorial information

Procedia PDF Downloads 369
9617 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments

Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein

Abstract:

Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impacts on human engagement, ‘immersion’ and related emotional or ‘affective’ states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows, with 28 different timing lengths (e.g. 2, 3, 5, etc. seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were subsequently employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined based on a 3-dimensional space – valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
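
A minimal sketch of windowed feature extraction over a physiological signal followed by KNN cross-validation; the signal, window lengths, features, and labels are random stand-ins for the 174-feature database described above.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
signal = rng.normal(size=6000)          # e.g. GSR sampled during gameplay (stand-in data)
labels = rng.integers(0, 4, size=59)    # one affect-cluster label per window (stand-in)

window, step = 200, 100                 # e.g. 2 s windows at 100 Hz with 50% overlap
features = []
for start in range(0, len(signal) - window + 1, step):
    seg = signal[start:start + window]
    features.append([seg.mean(), seg.std(), seg.max() - seg.min()])
X = np.array(features)                  # 59 windows x 3 simple features

print(cross_val_score(KNeighborsClassifier(n_neighbors=5), X, labels, cv=5).mean())
```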

Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database

Procedia PDF Downloads 233
9616 Impact of Digitized Monitoring & Evaluation System in Technical Vocational Education and Training

Authors: Abdul Ghani Rajput

Abstract:

The monitoring and evaluation (M&E) concept has been adopted by Technical Vocational Education and Training (TVET) organizations to track progress over continuous intervals of time based on planned interventions and, subsequently, to evaluate it for impact, quality assurance and sustainability. In the digital world, TVET providers prefer to have real-time information for monitoring training activities. Identifying the benefits and challenges of a digitized monitoring and evaluation real-time information system has not been sufficiently tackled to date. This research paper looks at the impact of digitized M&E in the TVET sector by analyzing two case studies and describes the benefits and challenges of using a digitized M&E system. Finally, digitized M&E has been identified as carrying high potential for the TVET sector.

Keywords: digitized M&E, innovation, quality assurance, TVET

Procedia PDF Downloads 230
9615 Potentials and Influencing Factors of Dynamic Pricing in Business: Empirical Insights of European Experts

Authors: Christopher Reichstein, Ralf-Christian Härting, Martina Häußler

Abstract:

With the continuously increasing speed of information exchange on the World Wide Web, retailers in the e-commerce sector are faced with immense possibilities regarding different online purchase processes such as dynamic price setting. By using dynamic pricing, retailers are able to set short-term price changes in order to optimize producer surplus. The empirical research illustrates the basics of dynamic pricing and identifies six factors influencing it. The results of a structural equation modeling approach show five main drivers increasing the potential of dynamic price setting in e-commerce. These influencing factors are the knowledge of customers’ individual willingness to pay, rising sales, the possibility of customization, the data volume and the information about competitors’ pricing strategy.

Keywords: e-commerce, empirical research, experts, dynamic pricing (DP), influencing factors, potentials

Procedia PDF Downloads 262
9614 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics and indicator results have made their collection, analysis, traceability, and dissemination by project managers essential. In this sense, there are current trends to facilitate efficient decision-making in projects through emerging technologies, such as Machine Learning, Data Analytics, Data Mining, and Big Data. The latter is the most interesting for this project. This research is part of the thematic line Construction Methods and Project Management. Many authors present the relevance that the use of emerging technologies, such as Big Data, has gained in recent years in project management in the construction sector. The main focus is the optimization of time, scope and budget, and, in general, the mitigation of risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used, and a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low and also that there is interest in modernizing project management. There is evidence of a correlation between the interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 128
9613 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The sheer weight of knowledge in the world and within the repositories of organizations has already reached immense capacity and is constantly increasing as time goes by. To accommodate these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides the transformational changes which will deliver tangible benefits to the individuals implementing these practices. Today, various organizations derive knowledge from observations and intuitions, and this information or data is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed to nurture tangible insights for an organization to apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge will generate value for an organization, enabling it to make decisive decisions and proceed with the transition to best practices. Without a strong foundation of knowledge and Big Data, businesses are not able to grow and be enhanced within the competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 116
9612 Behavioral Assessment of the Role of Brain 5-HT4 Receptors on the Memory and Cognitive Performance in a Rat Model of Alzheimer Disease

Authors: Siamak Shahidi, Nasrin Hashemi-Firouzi, Sara Soleimani-Asl, Alireza Komaki

Abstract:

Introduction: Alzheimer's disease (AD) is a neurodegenerative disorder characterized by progressive decline in memory and cognitive performance. Recently, an involvement of the serotonergic system and its receptors has been suspected in AD progression. In the present behavioral study, the effects of BIMU (a selective 5-HT4 receptor agonist) on cognition and memory in a rat model of AD were investigated. Material and Methods: The animal model of AD was induced by intracerebroventricular (icv) injection of amyloid beta (Aβ) in adult male Wistar rats. Animals were divided into experimental groups, including control, sham, Aβ, and Aβ+BIMU groups. The treatment substances were injected icv (1 μg/μL) for thirty consecutive days. Then, novel object recognition (NOR) and passive avoidance learning (PAL) tests were applied to investigate memory and cognitive performance. Results: Aβ decreased the discrimination index in the NOR test. It also increased the time spent in the dark compartment during the PAL test, as compared with the sham and control groups. In addition, compared to the Aβ group, BIMU significantly increased the discrimination index of the NOR test and decreased the time spent in the dark compartment of the PAL test. Conclusion: These findings suggest that 5-HT4 receptor activation prevents the progression of memory and cognitive impairment in a rat model of AD.
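
A minimal sketch of the novel object recognition (NOR) discrimination index calculation; the exploration times are illustrative values, not data from the study.

```python
def discrimination_index(time_novel, time_familiar):
    # (novel - familiar) / total exploration time; positive values indicate a
    # preference for the novel object, i.e. intact recognition memory
    return (time_novel - time_familiar) / (time_novel + time_familiar)

print(discrimination_index(time_novel=32.0, time_familiar=18.0))   # 0.28
```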

Keywords: Alzheimer disease, cognition, memory, serotonin receptors

Procedia PDF Downloads 132