Search results for: cognitive complexity metric
3093 Understanding Chronic Pain: Missing the Mark
Authors: Rachid El Khoury
Abstract:
Chronic pain is perhaps the most burdensome health issue facing the planet. Our understanding of the pathophysiology of chronic pain has increased substantially over the past 25 years, including but not limited to changes in the brain. However, we still do not know why chronic pain develops in some people and not in others. Most of the recent developments in pain science that have direct relevance to clinical management relate to our understanding of the role of the brain, the role of the immune system, or the role of cognitive and behavioral factors. Although the biopsychosocial model of pain management was presented decades ago, the bio-reductionist model unfortunately remains at the heart of many practices across professional and geographic boundaries. A large body of evidence shows that nociception is neither sufficient nor necessary for pain. Pain is a conscious experience that can certainly be, and often is, associated with nociception; however, it is always modulated by countless neurobiological, environmental, and cognitive factors. This study will clarify the current misconceptions of chronic pain concepts and their misperceptions by clinicians. It will also attempt to bridge the considerable gap between what we already know about pain but have somehow disregarded, the developments in pain science, and clinical practice.
Keywords: chronic pain, nociception, biopsychosocial, neuroplasticity
Procedia PDF Downloads 63
3092 The Material-Process Perspective: Design and Engineering
Authors: Lars Andersen
Abstract:
The development of design and engineering in large construction projects is characterized by an increasing flattening of formal structures, extended use of parallel and integrated processes ('Integrated Concurrent Engineering'), and a growing number of expert disciplines. The integration process is based on ongoing collaboration, dialogue, intercommunication, and comments on each other's work (iterations). This process, based on reciprocal communication between actors and disciplines, triggers value creation. However, communication between equals is not in itself sufficient to create effective decision making. The complexity of the process and time pressure contribute to an increased risk of a deficit of decisions and loss of process control. The paper refers to a study that aims at developing a resilient decision-making system that does not conflict with communication processes based on equality between the disciplines in the process. The study follows the construction of a hospital through the phases of design, engineering, and physical building. The research method is a combination of formative process research, process tracking, and phenomenological analyses. The study traced challenges and problems in the building process back to the projection substrates (drawings and models) and further to the organization of the engineering and design phase. A comparative analysis of traditional and new ways of organizing the projecting made it possible to uncover an implicit material order, or structure, in the process. This uncovering implied the development of a material-process perspective. According to this perspective, the complexity of the process is rooted in material-functional differentiation. This differentiation presupposes a structuring material (the skeleton of the building) that coordinates the other types of material. Each expert discipline's competence is related to one or a set of materials. The architect, the consulting structural engineer, and so on have their competencies related to the structuring material and, inherent in this, coordination competence. When dialogues between the disciplines concerning the coordination between them do not result in agreement, the disciplines with responsibility for the structuring material decide the interface issues. Based on these premises, this paper develops a self-organized, expert-driven interdisciplinary decision-making system.
Keywords: collaboration, complexity, design, engineering, materiality
Procedia PDF Downloads 221
3091 Continuous Catalytic Hydrogenation and Purification for Synthesis of Non-Phthalate
Authors: Chia-Ling Li
Abstract:
The scope of this article includes the production of 10,000 metric tons of non-phthalate per annum. The production process will include hydrogenation, separation, purification, and recycling of unprocessed feedstock. Based on experimental data, conversion and selectivity were chosen as reaction model parameters. The synthesis and separation processes of non-phthalate and phthalate were established using Aspen Plus software. The article is divided into six parts: estimation of physical properties, integration of production processes, a purification case study, utility consumption, an economic feasibility study, and identification of bottlenecks. The purities of the products were higher than 99.9 wt. %. The process parameters provide important guidance for the commercialization of the hydrogenation of phthalate.
Keywords: economic analysis, hydrogenation, non-phthalate, process simulation
Procedia PDF Downloads 277
3090 Presuppositions and Implicatures in Four Selected Speeches of Osama Bin Laden's Legitimisation of 'Jihad'
Authors: Sawsan Al-Saaidi, Ghayth K. Shaker Al-Shaibani
Abstract:
This paper investigates certain linguistic properties of four selected speeches by Al-Qaeda's former leader Osama bin Laden, who, when he was alive, legitimated the use of jihad by Muslims in various countries. The researchers adopt van Dijk's (2009; 1998) Socio-Cognitive approach and Ideological Square theory, respectively. The Socio-Cognitive approach revolves around various cognitive, socio-political, and discursive aspects that can be found in political discourse such as Osama bin Laden's. Political discourse can be defined in terms of textual properties and contextual models. The ideological square refers to positive self-presentation and negative other-presentation, which help to enhance the textual and contextual analyses. Among the most significant properties in Osama bin Laden's discourse are presuppositions and implicatures, which are based on background knowledge and contextual models. The paper concludes that Osama bin Laden used a number of manipulative strategies that augmented and embellished the use of 'jihad' in order to develop a more effective discourse for his audience. In addition, the findings reveal that bin Laden used different implicit and embedded interpretations of different topics, accepted as taken-for-granted truths, to legitimate jihad against his enemies. Many presuppositions in the speeches analysed result in particular common-sense assumptions and a world-view about the selected speeches. More importantly, the assumptions in the analysed speeches help consolidate the ideological analysis in terms of in-group and out-group members.
Keywords: Al-Qaeda, cognition, critical discourse analysis, Osama Bin Laden, jihad, implicature, legitimisation, presupposition, political discourse
Procedia PDF Downloads 240
3089 Enhancing Disaster Response Capabilities in Asia-Pacific: An Explorative Study Applied to Decision Support Tools for Logistics Network Design
Authors: Giuseppe Timperio, Robert de Souza
Abstract:
Logistics operations in the context of disaster response are characterized by a high degree of complexity due to the combined effect of the large number of stakeholders involved, time pressure, uncertainties at various levels, massive deployment of goods and personnel, and the gigantic financial flows to be managed. Disaster response also requires several autonomous parties, such as government agencies, militaries, NGOs, UN agencies, and the private sector, to name a few, to adopt a highly collaborative approach, especially in the critical phase of the immediate response. This is particularly true in the context of L3 emergencies, the most severe, large-scale humanitarian crises. Decision-making processes in disaster management are thus extremely difficult due to the presence of multiple decision-makers and the complexity of the tasks being tackled. Hence, in this paper, we look at applying ICT-based solutions to enable speedy and effective decision making in the golden window of humanitarian operations. A high-level view of ICT-based solutions in the context of logistics operations for humanitarian response in Southeast Asia is presented, and their viability is explored in a real-life case about logistics network design.
Keywords: decision support, disaster preparedness, humanitarian logistics, network design
Procedia PDF Downloads 169
3088 Limits of the Dot Counting Test: A Culturally Responsive Approach to Neuropsychological Evaluations and Treatment
Authors: Erin Curtis, Avraham Schwiger
Abstract:
Neuropsychological testing and evaluation is a crucial step in providing patients with effective diagnoses and treatment while in clinical care. The variety of batteries used in these evaluations can help clinicians better understand the nuanced deficits in a patient's cognitive, behavioral, or emotional functioning, consequently equipping clinicians with the insights to make intentional choices about a patient's care. Despite the knowledge these batteries can yield, some aspects of neuropsychological testing remain largely inaccessible to certain patient groups as a result of fundamental cultural, educational, or social differences. One such battery is the Dot Counting Test (DCT), during which patients are required to count a series of dots on a page as rapidly and accurately as possible. As the battery progresses, the dots appear in clusters that are designed to be easily multiplied. This task evaluates a patient's cognitive functioning, attention, and the level of effort exerted on the evaluation as a whole. However, there is evidence to suggest that certain social groups, particularly Latinx groups, may perform worse on this task as a result of cultural or educational differences, not reduced cognitive functioning or effort. As such, this battery fails to account for baseline differences among patient groups, creating questions about the accuracy, generalizability, and value of its results. Accessibility and cultural sensitivity are critical considerations in the testing and treatment of marginalized groups, yet they have been largely ignored in the literature and in clinical settings to date. Implications and improvements to applications are discussed.
Keywords: culture, Latino, neuropsychological assessment, neuropsychology, accessibility
Procedia PDF Downloads 114
3087 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department
Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee
Abstract:
In recent years, demand for healthcare services has dramatically increased. As demand increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demand necessitates the use of optimization techniques to improve overall service efficiency in healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and relationships between activities. The sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results serve as input for the next phase, which consists of the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicated that a carefully designed layout can significantly decrease the distances that patients must travel.
Keywords: mixed integer programming, facility layout problem, process mining, healthcare operation management
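To make the optimization step concrete, the layout task above can be reduced to a toy assignment problem: place medical centers at candidate locations so that the patient-flow-weighted travel distance is minimal. The sketch below is not the authors' mixed integer program; it brute-forces a tiny hypothetical instance (the flow and distance matrices are invented for illustration) just to show the objective being minimized.

```python
from itertools import permutations

def total_cost(assignment, flow, dist):
    """Sum over center pairs of (patient flow) * (distance between the
    locations the two centers are assigned to)."""
    n = len(assignment)
    return sum(flow[i][j] * dist[assignment[i]][assignment[j]]
               for i in range(n) for j in range(n))

def best_layout(flow, dist):
    """Exhaustively try every assignment of n centers to n locations."""
    n = len(flow)
    return min(permutations(range(n)), key=lambda a: total_cost(a, flow, dist))

# Hypothetical data: flow[i][j] = patients moving from center i to center j,
# dist[a][b] = travel distance between locations a and b.
flow = [[0, 10, 1],
        [10, 0, 2],
        [1, 2, 0]]
dist = [[0, 1, 4],
        [1, 0, 2],
        [4, 2, 0]]
```

A real emergency department is far too large for exhaustive search, which is why the paper formulates the problem as a mixed integer program and feeds it the process-mining results instead.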
Procedia PDF Downloads 340
3086 The Effects of Three Levels of Contextual Interference among Adult Athletes
Authors: Abdulaziz Almustafa
Abstract:
Considering the critical role permanence has in predictions related to the contextual interference effect in laboratory and field research, this study sought to determine whether the paradigm of the effect depends on the complexity of the skill during the acquisition and transfer phases. The purpose of the present study was to investigate the effects of contextual interference (CI) by extending previous laboratory and field research with adult athletes through the acquisition and transfer phases. Male athletes (n = 60), aged 18-22 years, were chosen randomly from Eastern Province clubs. They were assigned to complete blocked, random, or serial practice. Multivariate analysis of variance (MANOVA) with repeated measures indicated that the results did not support the notion of CI. There were no significant differences in the acquisition phase between the blocked, serial, and random practice groups. During the transfer phase, there were no major differences between the practice groups. Apparently, due to the task complexity, participants were probably confused and not able to use the advantages of contextual interference. This is another result contradicting contextual interference effects in the acquisition and transfer phases in sport settings. One major factor that can influence the effect of contextual interference is task characteristics, such as the level of difficulty of the sport-related skill.
Keywords: contextual interference, acquisition, transfer, task difficulty
Procedia PDF Downloads 467
3085 Characterising Stable Model by Extended Labelled Dependency Graph
Authors: Asraful Islam
Abstract:
The extended dependency graph (EDG) is a state-of-the-art isomorphic graph to represent normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity. Results show that the ELDG uses less memory to store nodes, arcs, and cycles compared to the EDG. To exhibit the desirability of the ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation is established between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined, which enables transferring analytical results from the graph to the program straightforwardly.
Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring
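For readers unfamiliar with the graph encodings being compared, here is a minimal construction of an ordinary (traditional) dependency graph from a normal logic program: one node per atom, and one signed arc from each body atom to the rule head. This is the textbook TDG, not the ELDG proposed in the paper, and the three-rule program is an invented example.

```python
def tdg(rules):
    """Build a traditional dependency graph from a normal logic program.
    rules: list of (head, positive_body_atoms, negative_body_atoms).
    Returns (nodes, arcs), where each arc is (body_atom, head, sign)."""
    nodes, arcs = set(), set()
    for head, pos, neg in rules:
        nodes.add(head)
        for atom in pos:          # positive dependency: head relies on atom
            nodes.add(atom)
            arcs.add((atom, head, "+"))
        for atom in neg:          # negative dependency: head relies on "not atom"
            nodes.add(atom)
            arcs.add((atom, head, "-"))
    return nodes, arcs

# Toy program:  p :- q, not r.    q :- not r.    r :- not q.
program = [("p", ["q"], ["r"]),
           ("q", [], ["r"]),
           ("r", [], ["q"])]
```

The negative arcs between `q` and `r` form the kind of odd/even cycle structure whose coloring the paper relates to the stable models of the program.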
Procedia PDF Downloads 215
3084 Poisson Type Spherically Symmetric Spacetimes
Authors: Gonzalo García-Reyes
Abstract:
Conformastat spherically symmetric exact solutions of Einstein's field equations, representing matter distributions made of both perfect and anisotropic fluids and obtained from given solutions of Poisson's equation of Newtonian gravity, are investigated. The approach is used in the construction of new relativistic models of thick spherical shells and three-component models of galaxies (bulge, disk, and dark matter halo), writing, in this case, the metric in cylindrical coordinates. In addition, the circular motion of test particles (rotation curves) along geodesics on the equatorial plane of the matter configurations and the stability of the orbits against radial perturbations are studied. The models constructed satisfy all the energy conditions.
Keywords: general relativity, exact solutions, spherical symmetry, galaxy, kinematics and dynamics, dark matter
Procedia PDF Downloads 88
3083 Modifying Hawking Radiation in 2D-Approximated Schwarzschild Black Holes near the Event Horizon
Authors: Richard Pincak
Abstract:
Starting from a 4D spacetime model using a partially negative dimensional product manifold (PNDP-manifold), which emerges as a 2D spacetime, we develop an analysis of tidal forces and Hawking radiation near the event horizon of a Schwarzschild black hole. The modified 2D metric, incorporating the effects of the four-dimensional Weyl tensor, together with the dilatonic field and the newly derived time relation \(2\alpha t = \ln \epsilon\), can enable a deeper understanding of quantum gravity. The analysis shows how the modified Hawking temperature and the distribution of emitted particles are affected by additional fields, providing potential observables for future experiments.
Keywords: black holes, Hawking radiation, Weyl tensor, information paradox
Procedia PDF Downloads 22
3082 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI
Authors: Genady Grabarnik, Serge Yaskolko
Abstract:
Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology with promise to automate some areas that were very difficult to automate before. This paper describes the introduction of generative AI into Introductory Computer and Data Science courses and an analysis of the effect of that introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create templates of prompts for the generation of tasks and for grading students' work, including feedback on submitted assignments. For students, we introduce basic prompt engineering, which is in turn used to generate test cases from problem descriptions, to generate code snippets for single-block programming tasks, and to partition average-complexity programming tasks into such blocks. The classes are run using Large Language Models, and feedback from instructors and students, as well as course outcomes, is collected. The analysis shows a statistically significant positive effect and preference among both groups of stakeholders.
Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education
Procedia PDF Downloads 58
3081 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory
Authors: Hiba El Assibi
Abstract:
This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized Minimax algorithm with Alpha-Beta Pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project explores additional enhancements, such as symmetry checking and code optimizations, to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus Artificial Intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights into performance and computational efficiency. We also discuss the scalability of our approach, considering different board sizes (numbers of pits and stones) and rules (different variations) and studying how these affect performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhancing our understanding of human cognitive processes in game settings, and offering insights into creating balanced and engaging game experiences.
Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory
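The core search procedure named above can be sketched in a few lines. This is a generic minimax with alpha-beta pruning over an abstract game interface, not the authors' Kalah-specific implementation; the `moves`/`play`/`is_terminal`/`score` method names are hypothetical, and a tiny Nim-like game stands in for Kalah so the skeleton is runnable.

```python
import math

def minimax(state, depth, alpha, beta, maximizing, game):
    """Alpha-beta pruned minimax; `game` supplies the rules."""
    if depth == 0 or game.is_terminal(state):
        return game.score(state, maximizing)
    if maximizing:
        best = -math.inf
        for move in game.moves(state):
            best = max(best, minimax(game.play(state, move), depth - 1,
                                     alpha, beta, False, game))
            alpha = max(alpha, best)
            if beta <= alpha:   # prune: the minimizer will never allow this line
                break
        return best
    best = math.inf
    for move in game.moves(state):
        best = min(best, minimax(game.play(state, move), depth - 1,
                                 alpha, beta, True, game))
        beta = min(beta, best)
        if beta <= alpha:       # prune: the maximizer already has better
            break
    return best

class Nim:
    """Toy stand-in for Kalah: take 1-3 stones; taking the last stone wins."""
    def moves(self, state): return [m for m in (1, 2, 3) if m <= state]
    def play(self, state, move): return state - move
    def is_terminal(self, state): return state == 0
    def score(self, state, maximizing):
        # At a terminal state the player to move has no stones left,
        # so the previous player took the last stone and won.
        # (A heuristic evaluation for depth cutoffs is omitted for brevity.)
        return -1 if maximizing else 1
```

Transposition tables and symmetry checks (mentioned in the keywords) would be layered on top of this skeleton to cache and reuse repeated positions.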
Procedia PDF Downloads 55
3080 Managing Psychogenic Non-Epileptic Seizure Disorder: The Benefits of Collaboration between Psychiatry and Neurology
Authors: Donald Kushon, Jyoti Pillai
Abstract:
Psychogenic non-epileptic seizure disorder (PNES) is a challenging clinical problem for the neurologist. This study explores the benefits of on-site collaboration between psychiatry and neurology in the management of PNES. A three-month period at a university hospital seizure clinic is described, detailing specific management approaches taken as a result of this collaboration. The study describes four areas of interest: (1) the presentation of the diagnosis of PNES to the patient after the video EEG results confirm it; (2) the identification of co-morbid psychiatric illness; (3) treatment with specific psychotherapeutic interventions (including cognitive behavioral therapy) and psychopharmacologic interventions (primarily SSRIs); and (4) preliminary treatment outcomes.
Keywords: cognitive behavioral therapy (CBT), psychogenic non-epileptic seizure disorder (PNES), selective serotonin reuptake inhibitors (SSRIs), video electroencephalogram (VEEG)
Procedia PDF Downloads 315
3079 Executive Function Assessment with Aboriginal Australians
Authors: T. Keiller, E. Hindman, P. Hassmen, K. Radford, L. Lavrencic
Abstract:
Background: Psychosocial disadvantage is associated with impaired cognitive abilities, with executive functioning (EF) abilities particularly vulnerable. EF abilities strongly predict general daily functioning, educational and career prospects, and health choices. A reliable and valid assessment of EF is important to support appropriate care and intervention strategies. However, evidence-based EF assessment tools for use with Aboriginal Australians are limited. Aim and Method: This research aims to develop and validate a culturally appropriate EF tool for use with Indigenous Australians. To this end, Study One aims to review the current literature examining the benefits and disadvantages of current EF assessment tools for use with Indigenous Australians. Study Two aims to collate expert opinion on the strengths and weaknesses of various current EF assessment tools for use with Indigenous Australians using Delphi methodology with experienced psychologists (n = 10). The initial two studies will inform the development of a culturally appropriate assessment tool. Study Three aims to evaluate the psychometric properties of the tool with an Indigenous sample living on the New South Wales Mid-North Coast. The study aims to quantify the predictive validity of this tool via comparison to functionality predictors and neuropsychological assessment scores. Study Four aims to collect qualitative data on the feasibility and acceptability of the tool among Indigenous Australians and health professionals. Expected Results: Findings from this research are likely to inform cognitive assessment practices and tool selection for health professionals conducting cognitive assessments with Indigenous Australians. Improved assessment of EF will inform appropriate care and intervention strategies for individuals with EF deficits.
Keywords: Aboriginal Australians, assessment tool, cognition, executive functioning
Procedia PDF Downloads 281
3078 Chronic Cognitive Impacts of Mild Traumatic Brain Injury during Aging
Authors: Camille Charlebois-Plante, Marie-Ève Bourassa, Gaelle Dumel, Meriem Sabir, Louis De Beaumont
Abstract:
To the best of our knowledge, there has been little interest in the chronic effects of mild traumatic brain injury (mTBI) on cognition during normal aging. This is rather surprising considering the impacts on daily and social functioning. In addition, sustaining a mTBI during late adulthood may amplify the effects of normal biological aging in individuals who consider themselves normal and healthy. The objective of this study was to characterize the persistent neuropsychological repercussions of mTBI sustained during late adulthood, on average 12 months prior to testing. To this end, 35 mTBI patients and 42 controls between the ages of 50 and 69 completed an exhaustive neuropsychological assessment lasting three hours. All mTBI patients were asymptomatic, and all participants had a score ≥ 27 on the MoCA. The evaluation consisted of 20 standardized neuropsychological tests measuring memory, attention, executive and language functions, as well as information processing speed. Performance on tests of visual memory (Brief Visuospatial Memory Test Revised), verbal memory (Rey Auditory Verbal Learning Test and WMS-IV Logical Memory subtest), lexical access (Boston Naming Test), and response inhibition (Stroop) was significantly lower in the mTBI group. These findings suggest that a mTBI sustained during late adulthood induces lasting effects on cognitive function. Episodic memory and executive functions seem to be particularly vulnerable to enduring mTBI effects.
Keywords: cognitive function, late adulthood, mild traumatic brain injury, neuropsychology
Procedia PDF Downloads 169
3077 Cognitive Emotion Regulation Strategies in 9–14-Year-Old Hungarian Children with Neurotypical Development in the Light of the Hungarian Version of Cognitive Emotion Regulation Questionnaire for Children
Authors: Dorottya Horváth, Andras Lang, Diana Varro-Horvath
Abstract:
This study is part of a major research effort to gain an integrative, neuropsychological, and personality-psychological understanding of Attention Deficit Hyperactivity Disorder (ADHD) and thus improve the specification of diagnostic and therapeutic care. In the past, the neuropsychology section of this effort has investigated working memory, executive function, attention, and behavioural manifestations in children. Currently, we are looking for personality-psychological protective factors for ADHD and its symptomatic exacerbation. We hypothesise that secure attachment, adaptive emotion regulation, and high resilience are protective factors. The aim of this study is to administer the Cognitive Emotion Regulation Questionnaire for Children (CERQ-k) to a Hungarian sample and report the results, because before studying groups with different developmental differences, it is essential to know the average scores of groups with neurotypical development. Until now, there was no Hungarian version of the above test, so we used our own translation. This questionnaire was developed to assess children's thoughts after experiencing negative life events. It consists of four items per subscale, for a total of 36 items. The response categories for each item range from 1 (almost never) to 5 (almost always). The subscales are self-blame, blaming others, acceptance, planning, positive refocusing, rumination or thought-focusing, positive reappraisal, putting into perspective, and catastrophizing. The data for this study were collected from 120 children aged 9-14 years and analysed using descriptive statistics, with means and standard deviations calculated for each age group; Cronbach's alpha was used to test the reliability of the questionnaire. The results showed that the questionnaire is a reliable and valid measuring instrument in a Hungarian sample as well. These developments and results will allow the use of a Hungarian version of the Cognitive Emotion Regulation Questionnaire for Children and pave the way for the study of different developmental groups, such as children with learning disabilities and/or ADHD.
Keywords: neurotypical development, emotion regulation, negative life events, CERQ-k, Hungarian average scores
Procedia PDF Downloads 77
3076 An Improved Data Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multiple-Input Multiple-Output
Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin
Abstract:
With the increasing number of wireless devices and high-bandwidth operations, wireless networks and communications are becoming overcrowded. To cope with this congestion, massive MIMO is designed to work with hundreds of low-cost serving antennas at a time while also improving spectral efficiency. TDD is used to enable beamforming, a major component of massive MIMO, by transmitting and receiving pilot sequences. All of these benefits are only possible if the channel state information, i.e., the channel estimate, is obtained properly. The common methods used so far to estimate the channel matrix are LS, MMSE, and a linear version of MMSE, also proposed in many research works. We have optimized these methods using a genetic algorithm (GA) to minimize the mean squared error and find the best channel matrix among the existing algorithms with less computational complexity. Our simulation results show that the GA performs well on the existing algorithms in a Rayleigh slow-fading channel in the presence of additive white Gaussian noise. We found that the GA-optimized LS is better than the existing algorithms, as the GA provides an optimal result within a few iterations in terms of MSE with respect to SNR and computational complexity.
Keywords: channel estimation, LMMSE, LS, MIMO, MMSE
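The GA-refinement idea can be illustrated on a deliberately tiny scalar case (the paper works with full channel matrices): start from an initial estimate such as the least-squares solution, then let an elitist genetic algorithm mutate candidates and keep those with the lowest pilot-sequence MSE. Every name and parameter below is an illustrative assumption, not the authors' implementation.

```python
import random

def ga_refine(xs, ys, h0, generations=60, pop_size=20, sigma=0.05, seed=0):
    """Refine an initial channel-gain estimate h0 (e.g., from LS) with a toy
    elitist GA: mutate candidates with Gaussian noise and keep the pop_size
    candidates whose predictions h * x best fit the received pilots y."""
    rng = random.Random(seed)

    def mse(h):
        # fitness: mean squared error over the pilot sequence
        return sum((y - h * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

    pop = [h0 + rng.gauss(0, sigma) for _ in range(pop_size)]
    for _ in range(generations):
        pop = pop + [h + rng.gauss(0, sigma) for h in pop]  # mutated offspring
        pop = sorted(pop, key=mse)[:pop_size]               # keep the fittest
    return pop[0]
```

With noiseless pilots `xs = [1, 2, 3, 4]`, `ys = [0.8 * x for x in xs]`, and a deliberately poor start `h0 = 0.5`, the returned estimate converges close to the true gain 0.8; in the real setting the fitness is evaluated against noisy received symbols, and the GA's few-iteration convergence is what keeps the complexity below a full search.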
Procedia PDF Downloads 192
3075 Preliminary Results on a Maximum Mean Discrepancy Approach for Seizure Detection
Authors: Boumediene Hamzi, Turky N. AlOtaiby, Saleh AlShebeili, Arwa AlAnqary
Abstract:
We introduce a data-driven method for seizure detection, drawing on recent progress in machine learning. The method is based on embedding probability measures in a high- (or infinite-) dimensional reproducing kernel Hilbert space (RKHS), where the Maximum Mean Discrepancy (MMD) is computed. The MMD is a metric between probability measures, computed as the difference between the means of the probability measures after they are embedded in an RKHS. Working in an RKHS provides a convenient, general functional-analytic framework for the theoretical understanding of data. We apply this approach to the problem of seizure detection.
Keywords: kernel methods, maximum mean discrepancy, seizure detection, machine learning
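The statistic itself is simple to compute from samples. Below is a minimal scalar sketch of the (biased) MMD² estimator with a Gaussian kernel: mean k(x, x') + mean k(y, y') - 2 · mean k(x, y). The kernel choice and bandwidth are illustrative assumptions; the actual detector would operate on EEG feature vectors rather than scalars.

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel k(x, y) = exp(-gamma * |x - y|^2) for scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def mmd_squared(xs, ys, kernel=rbf):
    """Biased sample estimate of MMD^2 between samples xs and ys."""
    kxx = sum(kernel(a, b) for a in xs for b in xs) / (len(xs) ** 2)
    kyy = sum(kernel(a, b) for a in ys for b in ys) / (len(ys) ** 2)
    kxy = sum(kernel(a, b) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2 * kxy
```

Identical samples give an MMD² of zero, while samples from well-separated distributions give a clearly positive value; that contrast between, say, baseline and ictal signal windows is what makes the statistic usable as a detection score.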
Procedia PDF Downloads 238
3074 Bug Localization on Single-Line Bugs of Apache Commons Math Library
Authors: Cherry Oo, Hnin Min Oo
Abstract:
Software bug localization is one of the most costly tasks in program repair. There is therefore high demand for automated bug localization techniques that can guide programmers to the locations of bugs with minimal human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of program traces to produce a ranked list of the most probable buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our outcomes show that a specific similarity coefficient, used to inspect the program spectra, performs best at localizing single-line bugs.
Keywords: software testing, bug localization, program spectra, bug
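To show the mechanics of ranking from spectra, here is one commonly used similarity coefficient, Ochiai, applied to a toy spectrum. The abstract does not name the coefficient it evaluates, so Ochiai is an assumption chosen for illustration, and the counts below are invented.

```python
import math

def ochiai(ef, nf, ep):
    """Ochiai suspiciousness: ef / sqrt((ef + nf) * (ef + ep)), where
    ef / ep = failing / passing tests that execute the element, and
    nf = failing tests that do not execute it."""
    denom = math.sqrt((ef + nf) * (ef + ep))
    return ef / denom if denom else 0.0

def rank(spectra, total_failing):
    """spectra: {element: (executed_by_failing, executed_by_passing)}.
    Returns elements sorted by descending suspiciousness."""
    scores = {elem: ochiai(ef, total_failing - ef, ep)
              for elem, (ef, ep) in spectra.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy spectrum: lineA is covered by both failing tests and no passing test,
# so it should top the ranking.
spectra = {"lineA": (2, 0), "lineB": (1, 3), "lineC": (0, 4)}
```

A developer would then inspect the program elements from the top of the list downward, which is how spectrum-based localization reduces manual search effort for single-line bugs.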
Procedia PDF Downloads 143
3073 On Virtual Coordination Protocol towards 5G Interference Mitigation: Modelling and Performance Analysis
Authors: Bohli Afef
Abstract:
Fifth-generation (5G) wireless systems feature extreme densities of cell stations to meet higher future demand. Interference management is therefore a crucial challenge in 5G ultra-dense cellular networks. In contrast to the classical inter-cell interference coordination approach, which is no longer suited to the high density of cell tiers, this paper proposes a novel virtual coordination scheme based on a dynamic common cognitive monitor channel protocol to deal with the inter-cell interference issue. A tractable and flexible model for the coverage probability of a typical user is developed using a stochastic geometry model. The performance of the suggested protocol is analyzed both analytically and numerically in terms of coverage probability.
Keywords: ultra dense heterogeneous networks, dynamic common channel protocol, cognitive radio, stochastic geometry, coverage probability
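The standard stochastic-geometry setup behind such coverage analyses (a homogeneous Poisson point process of base stations, a typical user at the origin, Rayleigh fading, nearest-cell association) can be cross-checked by Monte Carlo simulation. A generic interference-limited sketch, not the paper's specific protocol model:

```python
import numpy as np

def coverage_probability(lam, T_db, alpha=4.0, n_trials=2000, half_width=10.0, seed=0):
    """Monte Carlo estimate of P[SIR > T] for a typical user at the origin.
    Base stations: homogeneous PPP of density lam on a square window;
    Rayleigh fading (unit-mean exponential power), path loss r^-alpha,
    nearest-station association, noise neglected (interference-limited)."""
    rng = np.random.default_rng(seed)
    T = 10.0 ** (T_db / 10.0)
    covered = 0
    for _ in range(n_trials):
        n = rng.poisson(lam * (2 * half_width) ** 2)
        if n == 0:
            continue  # no station in the window: count as not covered
        xy = rng.uniform(-half_width, half_width, size=(n, 2))
        r = np.hypot(xy[:, 0], xy[:, 1])
        h = rng.exponential(1.0, size=n)           # fading powers
        p = h * np.maximum(r, 1e-9) ** (-alpha)    # received powers
        k = int(np.argmin(r))                      # serve from the nearest station
        interference = p.sum() - p[k]
        if interference == 0 or p[k] / interference > T:
            covered += 1
    return covered / n_trials
```

As expected, the estimated coverage probability decreases monotonically in the SIR threshold T.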
Procedia PDF Downloads 326
3072 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models
Authors: Benbiao Song, Yan Gao, Zhuo Liu
Abstract:
Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have quantified the factors affecting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different degrees of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models using different methodologies, including SIS, object-based, and MPFS algorithms with different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling conditions and parameter association. In total, 5760 simulations were run to quantify the relative contribution of each factor to simulation accuracy, and the results can serve as a strategy guide for facies modeling under similar conditions. It is found that data density, geological trend, and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when channel sand width is at least 1.5 times the well spacing, under any condition, for the SIS and MPFS methods. When well density is low, a proper geological trend may increase the modeling accuracy from 40% to 70%, while a proper variogram makes only a very limited contribution for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model; when geobodies are complex and data are insufficient, it is better to construct a robust geological trend than to rely on a variogram function alone. For the object-based method, modeling accuracy does not increase with data density as clearly as for the SIS method, but the models retain a geologically plausible appearance when data density is low.
MPFS methods show a trend similar to the SIS method, but a method using a proper geological trend together with a rational variogram may achieve better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case should be optimized by evaluating the dataset, the geological complexity, the geological constraint information, and the modeling objective.
Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram
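The abstract does not define its accuracy ratio; a natural reading (and an assumption here) is the cell-wise facies match rate between a realization and its prototype, averaged over the realizations:

```python
import numpy as np

def accuracy_ratio(prototype, realizations):
    """Cell-wise facies match rate between each realization and the prototype
    model, averaged over the set of realizations (the study averages ten per
    parameter setting)."""
    prototype = np.asarray(prototype)
    matches = [np.mean(np.asarray(r) == prototype) for r in realizations]
    return float(np.mean(matches))
```

On a 2x2 grid, a perfect realization plus one with a single mismatched cell average to (1.0 + 0.75) / 2 = 0.875.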
Procedia PDF Downloads 264
3071 Analysis of Computer Science Papers Conducted by Board of Intermediate and Secondary Education at Secondary Level
Authors: Ameema Mahroof, Muhammad Saeed
Abstract:
The purpose of this study was to analyze the computer science papers conducted by the Board of Intermediate and Secondary Education with reference to Bloom's taxonomy. The study has two parts. First, the papers conducted by the Board are analyzed against basic rules of item construction, especially Bloom's (1956) taxonomy. Second, item analysis is performed to improve the psychometric properties of the tests. The sample included the computer science question papers of the higher secondary classes (XI-XII) for the years 2011 and 2012. For the item analysis, data were collected from 60 students through convenient sampling. Findings revealed that the Board's papers focused mostly on the knowledge and understanding levels, with very little focus on application, analysis, and synthesis. Furthermore, the item analysis revealed that item difficulties did not reflect a balanced paper: some items were very difficult, while most were too easy (measuring knowledge and understanding abilities). Likewise, most items did not truly discriminate between high and low achievers; four items were even negatively discriminating. The researchers also analyzed the items using the ConQuest software. These results show that the papers conducted by the Board of Intermediate and Secondary Education were not well constructed. It is recommended that paper setters be trained to develop question papers that measure the various cognitive abilities of students, so that a good computer science paper assesses all of them.
Keywords: Bloom's taxonomy, question paper, item analysis, cognitive domain, computer science
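The classical item statistics referred to here (difficulty as proportion correct; discrimination as the gap in proportion correct between upper and lower scoring groups) take only a few lines to compute. A sketch under the common upper/lower-27% convention, which the study itself may not have used:

```python
def item_statistics(responses, group_frac=0.27):
    """responses: (total_test_score, item_correct) pairs, one per student,
    with item_correct in {0, 1}. Returns (difficulty, discrimination):
    difficulty     = proportion of all students answering the item correctly;
    discrimination = p(correct | upper group) - p(correct | lower group),
    where groups are the top/bottom group_frac of students by total score.
    A negative discrimination flags items low achievers answer more often."""
    n = len(responses)
    difficulty = sum(c for _, c in responses) / n
    ranked = sorted(responses, key=lambda pair: pair[0], reverse=True)
    k = max(1, round(group_frac * n))
    upper = sum(c for _, c in ranked[:k]) / k
    lower = sum(c for _, c in ranked[-k:]) / k
    return difficulty, upper - lower
```

An item answered correctly only by the top half of a class has difficulty 0.5 and discrimination 1.0; the "negatively discriminating" items the study reports would return a discrimination below zero.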
Procedia PDF Downloads 152
3070 Connectomic Correlates of Cerebral Microhemorrhages in Mild Traumatic Brain Injury Victims with Neural and Cognitive Deficits
Authors: Kenneth A. Rostowsky, Alexander S. Maher, Nahian F. Chowdhury, Andrei Irimia
Abstract:
The clinical significance of cerebral microbleeds (CMBs) due to mild traumatic brain injury (mTBI) remains unclear. Here we use magnetic resonance imaging (MRI), diffusion tensor imaging (DTI) and connectomic analysis to investigate the statistical association between mTBI-related CMBs, post-TBI changes to the human connectome and neurological/cognitive deficits. This study was undertaken in agreement with US federal law (45 CFR 46) and was approved by the Institutional Review Board (IRB) of the University of Southern California (USC). Two groups, one consisting of 26 (13 females) mTBI victims and another comprising 26 (13 females) healthy control (HC) volunteers were recruited through IRB-approved procedures. The acute Glasgow Coma Scale (GCS) score was available for each mTBI victim (mean µ = 13.2; standard deviation σ = 0.4). Each HC volunteer was assigned a GCS of 15 to indicate the absence of head trauma at the time of enrollment in our study. Volunteers in the HC and mTBI groups were matched according to their sex and age (HC: µ = 67.2 years, σ = 5.62 years; mTBI: µ = 66.8 years, σ = 5.93 years). MRI [including T1- and T2-weighted volumes, gradient recalled echo (GRE)/susceptibility weighted imaging (SWI)] and gradient echo (GE) DWI volumes were acquired using the same MRI scanner type (Trio TIM, Siemens Corp.). Skull-stripping and eddy current correction were implemented. DWI volumes were processed in TrackVis (http://trackvis.org) and 3D Slicer (http://www.slicer.org). Tensors were fit to DWI data to perform DTI, and tractography streamlines were then reconstructed using deterministic tractography. A voxel classifier was used to identify image features as CMB candidates using Microbleed Anatomic Rating Scale (MARS) guidelines. 
For each peri-lesional DTI streamline bundle, the null hypothesis was formulated as the statement that there was no neurological or cognitive deficit associated with between-scan differences in the mean FA of DTI streamlines within each bundle. The statistical significance of each hypothesis test was calculated at the α = 0.05 level, subject to the family-wise error rate (FWER) correction for multiple comparisons. Results: In HC volunteers, the along-track analysis failed to identify statistically significant differences in the mean FA of DTI streamline bundles. In the mTBI group, significant differences in the mean FA of peri-lesional streamline bundles were found in 21 out of 26 volunteers. In those volunteers where significant differences had been found, these differences were associated with an average of ~47% of all identified CMBs (σ = 21%). In 12 out of the 21 volunteers exhibiting significant FA changes, cognitive functions (memory acquisition and retrieval, top-down control of attention, planning, judgment, cognitive aspects of decision-making) were found to have deteriorated over the six months following injury (r = -0.32, p < 0.001). Our preliminary results suggest that acute post-TBI CMBs may be associated with cognitive decline in some mTBI patients. Future research should attempt to identify mTBI patients at high risk for cognitive sequelae.
Keywords: traumatic brain injury, magnetic resonance imaging, diffusion tensor imaging, connectomics
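The per-bundle hypothesis testing with FWER control described above can be illustrated generically. A sketch using a paired t-test per bundle and a Bonferroni correction (an assumption for illustration; the paper does not state its exact test), with SciPy:

```python
from scipy.stats import ttest_rel

def significant_bundles(fa_before, fa_after, alpha=0.05):
    """fa_before/fa_after: dicts mapping a bundle name to its per-streamline
    mean FA values at the two scans. A paired t-test is run per bundle, and
    the Bonferroni correction controls the family-wise error rate across the
    family of bundles. Returns the set of bundles whose corrected p-value
    falls below alpha."""
    n_tests = len(fa_before)
    significant = set()
    for name, before in fa_before.items():
        _, p = ttest_rel(before, fa_after[name])
        if p * n_tests < alpha:  # Bonferroni FWER correction
            significant.add(name)
    return significant
```

A bundle with a large, consistent FA drop between scans survives the correction, while a bundle whose streamline FAs merely fluctuate does not.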
Procedia PDF Downloads 171
3069 Arginase Enzyme Activity in Human Serum as a Marker of Cognitive Function: The Role of Inositol in Combination with Arginine Silicate
Authors: Katie Emerson, Sara Perez-Ojalvo, Jim Komorowski, Danielle Greenberg
Abstract:
The purpose of this study was to evaluate arginase activity levels in response to combinations of an inositol-stabilized arginine silicate (ASI; Nitrosigine®), L-arginine, and inositol. Arginine acts as a vasodilator that promotes increased blood flow, resulting in enhanced delivery of oxygen and nutrients to the brain and other tissues. ASI alone has been shown to improve performance on cognitive tasks. Arginase, found in human serum, catalyzes the conversion of arginine to ornithine and urea, completing the last step of the urea cycle. Decreasing arginase levels preserves arginine and results in increased nitric oxide production. This study aimed to determine the most effective combination of ASI, L-arginine, and inositol for minimizing arginase levels and therefore maximizing ASI's effect on cognition. Serum was obtained from untreated healthy donors by separation from clotted factors. Arginase activity of the serum in the presence or absence of the test products was determined (QuantiChrom™ DARG-100, BioAssay Systems, Hayward, CA). The remaining ultra-filtrated serum units were harvested and used as the source of the arginase enzyme. ASI alone or combined with varied levels of inositol was tested as follows: ASI + inositol at 0.25 g, 0.5 g, 0.75 g, or 1.00 g. L-arginine was also tested as a positive control. All tests elicited changes in arginase activity, demonstrating the efficacy of the method used. Adding L-arginine to serum from untreated subjects, with or without inositol, had only a mild effect. Adding inositol at all levels reduced arginase activity. Adding 0.5 g of inositol to the standardized amount of ASI produced the lowest arginase activity, compared to the 0.25 g, 0.75 g, or 1.00 g doses of inositol or to L-arginine alone. The outcome of this study demonstrates an interaction between inositol and ASI on the activity of the arginase enzyme.
We found that neither the maximum nor the minimum amount of inositol tested in this study led to maximal arginase inhibition. Since inhibition of arginase activity is desirable for product formulations seeking to maintain arginine levels, the moderate amount of inositol was deemed most effective. Subsequent studies suggest that this moderate level of inositol in combination with ASI leads to cognitive improvements, including in reaction time, executive function, and concentration.
Keywords: arginine, inositol, arginase, cognitive benefits
Procedia PDF Downloads 113
3068 How Different Perceived Affordances of Game Elements Shape Motivation and Performance in Gamified Learning: A Cognitive Evaluation Theory Perspective
Authors: Kibbeum Na
Abstract:
Previous gamification research has produced mixed results regarding the effectiveness of gamified learning. One possible explanation is that individuals perceive game elements differently. Cognitive Evaluation Theory posits that external rewards can boost or undermine intrinsic motivation, depending on whether the rewards are perceived as informational or controlling. This research tested the hypothesis that game elements can be perceived as either informational feedback or external reward, and that their motivational impact differs accordingly. An experiment was conducted using an educational math puzzle to compare motivation and performance under different perceived affordances of game elements. Participants were primed to perceive the game elements as either informational feedback or external reward. The duration of an attempt to solve an unsolvable puzzle (a motivation indicator) and the puzzle score (a performance indicator) were measured first with the game elements incorporated and then without them. Badges and points were deployed as the main game elements. Results showed that, regardless of priming, a significant decrease in performance occurred when the game elements were removed, whereas a control group who solved non-gamified math puzzles maintained their performance. The undermined performance upon removal of gamification indicates that learners may perceive some game elements as controlling factors irrespective of how they are presented. The results also imply that some game elements are better left unimplemented in order to preserve long-term performance. Further research into the extrinsic-reward-like nature of game elements and its impact on learning motivation is called for.
Keywords: Cognitive Evaluation Theory, game elements, gamification, motivation, motivational affordance, performance
Procedia PDF Downloads 108
3067 Managing Information Technology: An Overview of Information Technology Governance
Authors: Mehdi Asgarkhani
Abstract:
Today, investment in Information Technology (IT) solutions is, in most organizations, the largest component of capital expenditure. As capital investment in IT continues to grow, IT managers and strategists are expected to develop and put into practice effective decision-making models (frameworks) that improve decision-making processes for the use of IT in organizations and optimize investment in IT solutions. That is, organizations are expected not only to maximize the benefits of adopting IT solutions but also to avoid the many pitfalls associated with the rapid introduction of technological change. Different organizations, depending on their size, the complexity of the solutions required, and the processes used for financial management and budgeting, may use different techniques for managing strategic investment in IT solutions. Decision-making processes for the strategic use of IT within organizations are often referred to as IT governance (or corporate IT governance). This paper examines IT governance as a tool for best practice in decision making about IT strategies. The discussion represents phase I of a project initiated to investigate trends in strategic decision making on IT strategies. Phase I is concerned mainly with a review of the literature and a number of case studies, establishing that the practice of IT governance varies significantly with the complexity of IT solutions, organization size, and stage of maturity, from informal approaches to sophisticated formal frameworks.
Keywords: IT governance, corporate governance, IT governance frameworks, IT governance components, aligning IT with business strategies
Procedia PDF Downloads 407
3066 The Dark Triad’s Moral Labyrinth: Differentiating Cognitive Processes Involved in Machiavellianism and Psychopathy
Authors: Megan E. Davies
Abstract:
With the intention of identifying cognitive processes uniquely involved in the dark triad personality traits of psychopathy, Machiavellianism, and narcissism, this study aimed to further delineate the differences and parameters of the individual traits by explaining a statistically significant amount of variance in the constructs of manipulativeness, impulsiveness, grit, and need for cognition within the dark triad. Using a cross-sectional design, N = 96 participants self-reported on the MACH-IV, SRP-III, NFC-S, and the Grit Scale for Perseverance and Passion for Long-Term Goals. Hierarchical regression analyses showed that only manipulativeness predicted Machiavellianism, whereas both manipulativeness and impulsiveness predicted psychopathy. Overall, these results identify areas of discrepancy and overlap between manipulation and impulsivity with regard to psychopathy and Machiavellianism. Additionally, this study serves to preliminarily eliminate need for cognition and grit as predictive variables for Machiavellianism and psychopathy.
Keywords: Machiavellianism, psychopathy, manipulation, impulsiveness, need for cognition, grit, dark triad
Procedia PDF Downloads 113
3065 Translating Empathy in a Senior Community
Authors: Denver E. Severt, Cynthia Mejia
Abstract:
With a grey wave sweeping across the world and people living longer than ever, individuals will reside in retirement communities in unprecedented numbers. Enhancing residents' stay within these communities is imperative to reduce the stigmas historically associated with senior communities. This exploratory qualitative investigation examined interview content from employees and residents to see whether empathy was observed. The results showed that employees across all ranges had a much better grasp of affective empathy, yet with greater experience and age it was clear that cognitive empathy had to be used alongside affective empathy to gain better trust across the community of residents. The outcomes suggest that future training programs for employees should be operationalized to include both affective and cognitive empathy practices. This study is unique in that two empathy scales were transformed into qualitative questions, and in-depth employee and resident interviews were conducted. The study answers many calls for research by providing a more specific study of senior living communities.
Keywords: senior living community, transformational service research, qualitative research
Procedia PDF Downloads 143
3064 Responsibility Attitude and Interpretation in Obsessive-Compulsive Disorder
Authors: Ryotaro Ishikawa
Abstract:
Obsessive-Compulsive Disorder (OCD) is a common, chronic, and long-lasting disorder in which a person has uncontrollable, recurring thoughts (obsessions) and behaviors (compulsions) that he or she feels the urge to repeat over and over. An inflated responsibility attitude and responsibility interpretations are central beliefs in the cognitive model of OCD. This study aimed to develop Japanese versions of the Responsibility Attitude Scale (RAS-J) and the Responsibility Interpretation Questionnaire (RIQ-J). 98 participants (OCD group = 37; anxiety control group = 24; healthy control group = 37) completed the RAS-J, RIQ-J, and other measures to assess the validity of the RAS-J and RIQ-J. Analysis showed that both scales had adequate concurrent validity, demonstrated by significant correlations with other measures of OCD, anxiety, and depression. Group comparisons using ANOVA with the Bonferroni method indicated that RAS-J and RIQ-J scores for the OCD group differed not only from the nonclinical group but also from the clinically anxious comparison group. In conclusion, this study indicated that the developed RAS-J and RIQ-J effectively measure responsibility attitude and responsibility interpretation in the Japanese population.
Keywords: obsessive-compulsive disorder, responsibility, cognitive theory, anxiety disorder
Procedia PDF Downloads 275