Search results for: semantic memory
359 Computational Team Dynamics in Student New Product Development Teams
Authors: Shankaran Sitarama
Abstract:
Teamwork is an extremely effective pedagogical tool in engineering education. New Product Development (NPD) has been an effective strategy for companies to streamline and bring innovative products and solutions to customers. Thus, the engineering curricula of many schools, some in collaboration with business schools, have brought NPD into the curriculum at the graduate level. Teamwork is invariably used during instruction, where students work in teams to come up with new products and solutions. A significant share of the grade is placed on the semester-long teamwork so that students take it seriously. As the students work in teams and go through this process to develop new product prototypes, their effectiveness and learning depend to a great extent on how they function as a team, go through the creative process, come together, and work towards the common goal. A core attribute of a successful NPD team is its creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that solve or address the problem they are targeting and meet the user's needs. They also need to be very efficient in their teamwork as they work through the various stages of developing these ideas, resulting in a proof-of-concept (POC) implementation or a prototype of the product. The simultaneous requirement that teams be creative and at the same time converge and work together imposes different types of tension on their team interactions. These ideational, and sometimes relational, tensions and conflicts are inevitable. Effective teams have to deal with these team dynamics and manage them so as to remain resilient and yet creative. This research paper provides a computational analysis of the teams' communication that is reflective of the team dynamics and, through a superimposition of latent semantic analysis with social network analysis, provides a computational methodology for arriving at visual patterns of team interaction. These team interaction patterns have clear correlations to the team dynamics and provide insights into the functioning, and thus the effectiveness, of the teams. Twenty-three student NPD teams over two years of a course on managing NPD, with a blend of engineering and business school students, are considered, and the results are presented. The results are also correlated with the teams' detailed and tailored individual and group feedback and self-reflection and evaluation questionnaires.
Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams
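As a rough illustration of the superimposition described above, the sketch below (in Python) computes a latent semantic similarity between team members' aggregated messages and overlays it on a communication graph. The member names, message snippets, number of latent dimensions, and the 0.3 similarity threshold are illustrative assumptions, not the study's actual data or implementation.

```python
# Illustrative sketch: superimposing latent semantic analysis (LSA) on a
# social network of team communication. Member names, messages, and the
# 0.3 similarity threshold are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity
import networkx as nx

messages = {                       # one aggregated document per team member
    "member_a": "ideation brainstorm prototype user needs sketches concept",
    "member_b": "schedule converge prototype testing budget milestones",
    "member_c": "concept sketch user interview feedback ideation",
}

names = list(messages)
tfidf = TfidfVectorizer(stop_words="english").fit_transform(messages.values())
lsa = TruncatedSVD(n_components=2).fit_transform(tfidf)   # latent semantic space
sim = cosine_similarity(lsa)                              # member-to-member similarity

g = nx.Graph()
g.add_nodes_from(names)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if sim[i, j] > 0.3:                               # assumed threshold
            g.add_edge(names[i], names[j], weight=float(sim[i, j]))

print("density:", nx.density(g))
print("degree centrality:", nx.degree_centrality(g))
```

In a setting like the one described in the abstract, graph measures such as density and centrality would then be compared against the teams' feedback and self-evaluation data.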
Procedia PDF Downloads 117
358 Hippocampus Proteomic of Major Depression and Antidepressant Treatment: Involvement of Cell Proliferation, Differentiation, and Connectivity
Authors: Dhruv J. Limaye, Hanga Galfalvy, Cheick A. Sissoko, Yung-yu Huang, Chunanning Tang, Ying Liu, Shu-Chi Hsiung, Andrew J. Dwork, Gorazd B. Rosoklija, Victoria Arango, Lewis Brown, J. John Mann, Maura Boldrini
Abstract:
Memory and emotion require hippocampal cell viability and connectivity and are disrupted in major depressive disorder (MDD). Applying shotgun proteomics and stereological quantification of neural progenitor cells (NPCs), intermediate neural progenitors (INPs), and mature granule neurons (GNs) to postmortem human hippocampus identified differentially expressed proteins (DEPs), and fewer NPCs, INPs, and GNs, in untreated MDD (uMDD) compared with non-psychiatric controls (CTRL) and antidepressant-treated MDD (MDDT). DEPs lower in uMDD vs. CTRL promote mitosis and differentiation and prevent apoptosis. DEPs higher in uMDD vs. CTRL inhibit the cell cycle and regulate cell adhesion, neurite outgrowth, and DNA repair. DEPs lower in MDDT vs. uMDD block cell proliferation. We observe group-specific correlations between the numbers of NPCs, INPs, and GNs and the abundance of proteins regulating mitosis, differentiation, and apoptosis. Altered protein expression underlies hippocampal cellular and volume loss in uMDD, supports a trophic effect of antidepressants, and offers new treatment targets.
Keywords: proteomics, hippocampus, depression, mitosis, migration, differentiation, mitochondria, apoptosis, antidepressants, human brain
Procedia PDF Downloads 101
357 The Amount of Conformity of Persian Subject Headlines with Users' Social Tagging
Authors: Amir Reza Asnafi, Masoumeh Kazemizadeh, Najmeh Salemi
Abstract:
Due to the diversity of information resources in the Web 2.0 environment, which keeps increasing in number, social tagging systems should be used to describe Internet resources. Studying the relevance of social tags to subject headings can help enrich resources and make them more accessible to users. The present research is of an applied-theoretical type, with content analysis as the research method. In this study, using the listing method and content analysis, the levels of exact, approximate, relative, and non-conformity between the social tags of books in the field of information science and bibliography available on the Kitabrah website and the Persian subject headings were determined. Exact matches between subject headings and social tags averaged 22 items, approximate matches averaged 36 items, relative matches averaged 36 items, and non-matches averaged 116 items. According to the findings, the exact matching of subject headings with social tags is the lowest, and non-conformity is the most frequent. This study showed that the average non-conformity of subject headings with social tags is even higher than the sum of the three types of exact, relative, and approximate matching. As a result, the relevance of subject headings to social tags is low. This is because subject headings are static text and users are not allowed to interact and insert newly selected words and topics, whereas websites based on Web 2.0 and on social classification systems make this possible for users. An important point of the present study, and of the studies that have compared the syntactic and semantic matching of social tags with subject headings, is that the degree of conformity of subject headings with social tags is low. Therefore, these two methods can complement each other and create a hybrid cataloging that includes subject headings and social tags. The low level of conformity of subject headings with social tags confirms the results of earlier studies that compared the social tags of books with the subject headings of the Library of Congress. Matching social tags with subject headings alone is not enough; it can be said that these two methods can be complementary.
Keywords: Web 2.0, social tags, subject headings, hybrid cataloging
Procedia PDF Downloads 162
356 Corpus-Based Neural Machine Translation: Empirical Study Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform
Authors: Khadija Refouh
Abstract:
Culture-bound expressions have been a bottleneck for Natural Language Processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT) has recently achieved considerable improvement in translation quality, outperforming previous traditional translation systems in many language pairs. NMT applies Artificial Intelligence (AI) and deep neural networks to language processing. Despite this development, there remain some serious challenges facing NMT when translating culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, the Google Translate application translated the sentence “What a bad weather! It rains cats and dogs.” to “يا له من طقس سيء! تمطر القطط والكلاب” in the target language Arabic, which is an inaccurate literal translation. The translation of the same sentence into the target language French was “Quel mauvais temps! Il pleut des cordes.”, where the Google Translate application used the accurate corresponding French idiom. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus. This corpus will be collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model will be compared to Google NMT using the Bilingual Evaluation Understudy score (BLEU). BLEU is an algorithm for evaluating the quality of text that has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will examine syntactical, lexical, and semantic features using Halliday's functional theory.
Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms
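As a sketch of the automatic evaluation step described above (in Python, using NLTK's BLEU implementation), the reference and the two hypothetical system outputs below are illustrative, not the study's data.

```python
# Illustrative sketch of comparing two systems with BLEU; the example
# sentences and candidate outputs are hypothetical.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Reference translations (one list of references per source sentence), tokenized.
references = [
    [["il", "pleut", "des", "cordes"]],
]
# Hypotheses from the two systems to be compared.
custom_model_output = [["il", "pleut", "des", "cordes"]]
baseline_output = [["il", "pleut", "des", "chats", "et", "des", "chiens"]]

smooth = SmoothingFunction().method1   # avoids zero scores on short segments
print("custom model BLEU:", corpus_bleu(references, custom_model_output, smoothing_function=smooth))
print("baseline BLEU:    ", corpus_bleu(references, baseline_output, smoothing_function=smooth))
```

A human-evaluation pass, as mentioned in the abstract, would then be used to check whether the BLEU ranking agrees with translator judgments on idiom adequacy.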
Procedia PDF Downloads 151
355 Portable Cardiac Monitoring System Based on Real-Time Microcontroller and Multiple Communication Interfaces
Authors: Ionel Zagan, Vasile Gheorghita Gaitan, Adrian Brezulianu
Abstract:
This paper presents the contributions made in designing a mobile system named Tele-ECG, implemented for remote monitoring of cardiac patients. For better flexibility of this application, the authors chose to implement local memory and multiple communication interfaces. The project described in this presentation is based on the ARM Cortex-M0+ microcontroller and the dedicated ADAS1000 chip necessary for the collection and transmission of electrocardiogram (ECG) signals from the patient to the microcontroller, without altering the performance and stability of the system. The novelty brought by this paper is the implementation of a remote monitoring system for cardiac patients that has real-time behavior and multiple interfaces. The microcontroller is responsible for processing the digital signals corresponding to the ECG and also for implementing the communication interface with the main server, using a GSM/Bluetooth SIMCOM SIM800C module. This paper presents all the characteristics of the Tele-ECG project, representing a feasible implementation in the biomedical field. Acknowledgment: This paper was supported by the project 'Development and integration of a mobile tele-electrocardiograph in the GreenCARDIO© system for patients monitoring and diagnosis - m-GreenCARDIO', Contract no. BG58/30.09.2016, PNCDI III, Bridge Grant 2016, using the infrastructure from the project 'Integrated Center for research, development and innovation in Advanced Materials, Nanotechnologies, and Distributed Systems for fabrication and control', Contract No. 671/09.04.2015, Sectoral Operational Program for Increase of the Economic Competitiveness co-funded from the European Regional Development Fund.
Keywords: Tele-ECG, real-time cardiac monitoring, electrocardiogram, microcontroller
Procedia PDF Downloads 272
354 Task Scheduling and Resource Allocation in Cloud-based on AHP Method
Authors: Zahra Ahmadi, Fazlollah Adibnia
Abstract:
Scheduling of tasks and the optimal allocation of resources in the cloud are based on the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used applications in this field and are characterized by high demands on processing power and storage capacity. In order to increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in scheduling tasks and selecting resources, and they depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy for clients. Resource prioritization is done with the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, this system uses the Linear Max-Min and Linear Max normalization methods, which are the best choice for the mentioned algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (basic) methods.
Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow
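A minimal numeric sketch (in Python) of the resource-prioritization step described above, combining Linear Max-Min normalization of the three criteria with a weighted ranking; the VM values and the weight vector are illustrative assumptions rather than the paper's actual data.

```python
# Illustrative sketch of ranking virtual machines with Linear Max-Min
# normalization and AHP-style criterion weights; the VM values and the
# weight vector are hypothetical.
import numpy as np

# Rows: candidate VMs; columns: main memory (GB), processor speed (MIPS), bandwidth (Mbps).
vms = np.array([
    [4.0, 2000.0, 100.0],
    [8.0, 1500.0, 250.0],
    [2.0, 2500.0,  50.0],
])
weights = np.array([0.4, 0.4, 0.2])        # assumed AHP-derived priorities

# Linear Max-Min normalization: (x - min) / (max - min), per benefit criterion.
mins, maxs = vms.min(axis=0), vms.max(axis=0)
normalized = (vms - mins) / (maxs - mins)

scores = normalized @ weights              # weighted sum gives the ranking score
ranking = np.argsort(scores)[::-1]
print("scores:", scores)
print("VM ranking (best first):", ranking)
```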
Procedia PDF Downloads 146
353 The Documentation of Modernisation Processes in Spain Based on the Residential Architecture of the 1960s. A Patrimonial Perspective on El Plantinar Neighbourhood in Seville
Authors: Julia Rey-Pérez, Julia Díaz Borrego
Abstract:
The modernisation process of the city of Sevilla in Spain and the transformation of the city took place through national and local government initiatives from the 1960s onwards. Part of these actions was the execution of numerous residential neighbourhoods that prepared Sevilla for the change of era. This process was possible thanks to the implementation of public policies that showed the imminent need for new architectural programmes, as well as for high-rise architecture built in reinforced concrete. However, very little is known to this day about the modernisation process in Sevilla and the development of these neighbourhoods, which were designed to house a large number of people and are today a key reference point in the Historic Urban Landscape of the city of Seville. Therefore, the present research aims to learn and reflect upon the urban transformation of the city at this time and to deepen the heritage uniqueness of these neighbourhoods, as is the case of the El Plantinar neighbourhood. The methodology proposed for this research is structured in three phases. In the first stage, a general study of the El Plantinar neighbourhood was carried out on three scales: urban, object-typological, and perceptive. In the second stage, the cultural attributes and values of the urban complex in question were identified in order to determine whether the case study is truly representative of the beginnings of modernity in Spain and whether it needs a heritage approach. Finally, a third phase is proposed in which criteria will be defined on how to intervene in this neighbourhood to guarantee its presence in the urban landscape of the city of Seville. The expected results will help to understand the process of modernisation that the city has undergone, as well as the heritage value of this architecture in the construction of the collective memory.
Keywords: modern heritage, urban obsolescence, methodology, develop
Procedia PDF Downloads 150
352 Examining the Effects of Increasing Lexical Retrieval Attempts in Tablet-Based Naming Therapy for Aphasia
Authors: Jeanne Gallee, Sofia Vallila-Rohter
Abstract:
Technology-based applications are increasingly being utilized in aphasia rehabilitation as a means of increasing intensity of treatment and improving accessibility to treatment. These interactive therapies, often available on tablets, lead individuals to complete language and cognitive rehabilitation tasks that draw upon skills such as the ability to name items, recognize semantic features, count syllables, rhyme, and categorize objects. Tasks involve visual and auditory stimulus cues and provide feedback about the accuracy of a person’s response. Research has begun to examine the efficacy of tablet-based therapies for aphasia, yet much remains unknown about how individuals interact with these therapy applications. Thus, the current study aims to examine the efficacy of a tablet-based therapy program for anomia, further examining how strategy training might influence the way that individuals with aphasia engage with and benefit from therapy. Individuals with aphasia are enrolled in one of two treatment paradigms: traditional therapy or strategy therapy. For ten weeks, all participants receive 2 hours of weekly in-house therapy using Constant Therapy, a tablet-based therapy application. Participants are provided with iPads and are additionally encouraged to work on therapy tasks for one hour a day at home (home logins). For those enrolled in traditional therapy, in-house sessions involve completing therapy tasks while a clinician researcher is present. For those enrolled in the strategy training group, in-house sessions focus on limiting cue use in order to maximize lexical retrieval attempts and naming opportunities. The strategy paradigm is based on the principle that retrieval attempts may foster long-term naming gains. Data have been collected from 7 participants with aphasia (3 in the traditional therapy group, 4 in the strategy training group). We examine cue use, latency of responses and accuracy through the course of therapy, comparing results across group and setting (in-house sessions vs. home logins).
Keywords: aphasia, speech-language pathology, traumatic brain injury, language
Procedia PDF Downloads 204
351 Effectiveness of Visual Auditory Kinesthetic Tactile Technique on Reading Level among Dyslexic Children in Helikx Open School and Learning Centre, Salem
Authors: J. Mano Ranjini
Abstract:
Each and every child is special, born with a unique talent to explore this world. The word dyslexia is derived from the Greek language, in which “dys” means poor or inadequate and “lexis” means words or language. Dyslexia describes a different kind of mind, often gifted and productive, that learns concepts differently. The main aim of the study is to bring about a positive outcome in reading level by examining the effectiveness of the Visual Auditory Kinesthetic Tactile technique on reading level among dyslexic children at Helikx Open School and Learning Centre. A quasi-experimental one-group pretest-posttest design was adopted for this study. The reading level was assessed using the Schonell Graded Word Reading Test. Thirty subjects were drawn using a purposive sampling technique, and the Visual Auditory Kinesthetic Tactile intervention was implemented with the dyslexic children for 30 consecutive days; the subsequent post-assessment of reading level revealed an improvement in the mean reading level score of 12%. Multi-sensory (VAKT) teaching uses all learning pathways in the brain (visual, auditory, kinesthetic-tactile) in order to enhance memory and learning and to uplift emotional, physical, and societal dimensions. VAKT is an effective method to improve the reading skills of dyslexic children, one that underscores the enormous significance of learning and thereby influences the whole of the child’s life.
Keywords: visual auditory kinesthetic tactile technique, reading level, dyslexic children, Helikx Open School
Procedia PDF Downloads 601
350 Learning Physics Concepts through Language Syntagmatic Paradigmatic Relations
Authors: C. E. Laburu, M. A. Barros, A. F. Zompero, O. H. M. Silva
Abstract:
The work presents a teaching strategy that employs syntagmatic and paradigmatic linguistic relations in order to monitor physics students' understanding of concepts. Syntagmatic and paradigmatic relations are theoretical elements of semiotic studies, and our research situates and justifies them within the research program of multi-modal representations. Among the multi-modal representations for learning scientific knowledge, the scope of action of syntagmatic and paradigmatic relations belongs to the discursive written form. The purpose of using such relations is to seek innovative didactic work with discourse representation in the written form before translating it into a different representational form. The research was conducted with a sample of first-year high school students. The students were asked to produce syntagmatic and paradigmatic relations of the statement of Newton's first law. The statement was delivered on paper to each student, who was to write the relations individually. The students' records were collected for analysis. In one student, used here as an example, it was possible to observe that the moneme replacements and rearrangements produced by, respectively, syntagmatic and paradigmatic relations kept the original meaning of the law. In the paradigmatic production, the student specified relevant significant units of the linguistic signs, the monemes, which constitute the first articulation, and each substituted word kept its equivalence to the meaning of the original moneme. It was also noted that many diverse monemes were chosen, with a balanced combination of grammatical monemes (a grammatical moneme is one that changes the meaning of a word in certain positions of the syntagma, along with a relatively small number of other monemes; it is the smallest linguistic unit that has grammatical meaning) and lexical monemes (a lexical moneme is one that belongs to unlimited inventories and is endowed with lexical meaning). In the syntagmatic production, the orderings of monemes were syntactically coherent, being linked with semantic conservation and preserved number. In general, the results showed that the written representation mode based on paradigmatic and syntagmatic linguistic relations qualifies to be used in the classroom as a potential identifier and tracker of the meanings acquired by students in the process of scientific inquiry.
Keywords: semiotics, language, high school, physics teaching
Procedia PDF Downloads 132
349 Network Pharmacological Evaluation of Holy Basil Bioactive Phytochemicals for Identifying Novel Potential Inhibitors Against Neurodegenerative Disorder
Authors: Bhuvanesh Baniya
Abstract:
Alzheimer's disease is an illness responsible for neuronal cell death that results in lifelong cognitive problems. Because its mechanism is unclear, there are no effective drugs available for treatment. For a long time, herbal drugs have served as a role model in the field of drug discovery. In the Indian medicinal system (Ayurveda), holy basil has been used for decades for several neuronal disorders such as insomnia and memory loss. This study aims to identify active components of holy basil as potential inhibitors for the treatment of Alzheimer's disease. To fulfill this objective, a network pharmacology approach, gene ontology, pharmacokinetics analysis, molecular docking, and molecular dynamics simulation (MDS) studies were performed. A total of 7 active components in holy basil, 12 predicted neurodegenerative targets of holy basil, and 8063 Alzheimer-related targets were identified from different databases. The network analysis showed that the top ten targets, APP, EGFR, MAPK1, ESR1, HSPA4, PRKCD, MAPK3, ABL1, JUN, and GSK3B, were significant targets related to Alzheimer's disease. On the basis of the gene ontology and topology analysis results, APP was found to be a significant target related to Alzheimer's disease pathways. Further, the molecular docking results showed that various compounds had the best binding affinities. The top MDS results suggested compounds that could be used as potential inhibitors against the APP protein and could be useful for the treatment of Alzheimer's disease.
Keywords: holy basil, network pharmacology, neurodegeneration, active phytochemicals, molecular docking and simulation
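A rough sketch (in Python) of the kind of network analysis named above: ranking targets by degree centrality in a compound-target graph. The target symbols are those listed in the abstract, but the compound names and the edges are hypothetical placeholders, not the study's interaction data.

```python
# Illustrative sketch of the network analysis step: building a compound-target
# network and ranking targets by degree centrality. The edges below are
# hypothetical placeholders, not the study's interaction data.
import networkx as nx

targets = ["APP", "EGFR", "MAPK1", "ESR1", "HSPA4",
           "PRKCD", "MAPK3", "ABL1", "JUN", "GSK3B"]
compounds = ["compound_1", "compound_2", "compound_3"]   # stand-ins for holy basil phytochemicals

g = nx.Graph()
g.add_nodes_from(targets, kind="target")
g.add_nodes_from(compounds, kind="compound")
# Hypothetical compound-target associations for illustration only.
g.add_edges_from([("compound_1", "APP"), ("compound_1", "GSK3B"),
                  ("compound_2", "APP"), ("compound_2", "MAPK1"),
                  ("compound_3", "EGFR"), ("compound_3", "APP")])

centrality = nx.degree_centrality(g)
hubs = sorted(targets, key=lambda t: centrality[t], reverse=True)
print("targets ranked by degree centrality:", hubs[:5])
```

Hub targets identified this way would then be carried forward to docking and MDS, as the abstract describes.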
Procedia PDF Downloads 102
348 Anton Bruckner’s Requiem in Dm: The Reinterpretation of a Liturgical Genre in the Viennese Romantic Context
Authors: Sara Ramos Contioso
Abstract:
The premiere of Anton Bruckner's Requiem in Dm, in September 1849, represents a turning point in the composer's creative evolution. This Mass for the Dead, which was dedicated to the memory of his esteemed friend and mentor Franz Sailer, marks the beginning of a new creative aesthetic in the composer's production and links its liturgical development, contextualized in the monastery of St. Florian, to the use of a range of musical possibilities that Bruckner projects onto an orchestral texture with choir and organ. Set on a strict Tridentine ritual model, this requiem exemplifies the religious aesthetics of a composer committed to the Catholic faith, and it links to its structure the reinterpretation of a religious model that, despite being romantic, shows a strong influence derived from the Baroque and the language of Viennese Classicism. Consequently, the study responds to the need to show the survival of the Requiem Mass within the romantic context of Vienna. It therefore draws on a detailed analysis of the score and the creative context of the composer, with the intention of linking the work to the tradition of the genre and also specifying the stylistic particularities of its musical model within a variety of possibilities, such as the contrasting precedents of the requiems of Mozart, Haydn, Cherubini, and Berlioz. Tradition or modernity, liturgy or concert hall, are aesthetic references that condition the development of the Requiem Mass in the middle of the nineteenth century. In this context, this paper tries to recover Bruckner's Requiem in Dm as a musical model of the romantic ritual for the deceased and as a stylistic reference for a creative composition that would condition the development of later liturgical works, such as those of Liszt or DeLange (1868).
Keywords: liturgy, religious symbolism, requiem, romanticism
Procedia PDF Downloads 339
347 Using the Minnesota Multiphasic Personality Inventory-2 and Mini Mental State Examination-2 in Cognitive Behavioral Therapy: Case Studies
Authors: Cornelia-Eugenia Munteanu
Abstract:
From a psychological perspective, psychopathology is the area of clinical psychology that has at its core psychological assessment and psychotherapy. In day-to-day clinical practice, psychodiagnosis and psychotherapy are used independently, according to their intended purpose and their specific methods of application. The paper explores how the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and Mini Mental State Examination-2 (MMSE-2) psychological tools contribute to enhancing the effectiveness of cognitive behavioral psychotherapy (CBT). This combined approach, psychotherapy in conjunction with assessment of personality and cognitive functions, is illustrated by two cases, a severe depressive episode with psychotic symptoms and a mixed anxiety-depressive disorder. The order in which CBT, MMPI-2, and MMSE-2 were used in the diagnostic and therapeutic process was determined by the particularities of each case. In the first case, the sequence started with psychotherapy, followed by the administration of blue form MMSE-2, MMPI-2, and red form MMSE-2. In the second case, the cognitive screening with blue form MMSE-2 led to a personality assessment using MMPI-2, followed by red form MMSE-2; reapplication of the MMPI-2 due to the invalidation of the first profile, and finally, psychotherapy. The MMPI-2 protocols gathered useful information that directed the steps of therapeutic intervention: a detailed symptom picture of potentially self-destructive thoughts and behaviors otherwise undetected during the interview. The memory loss and poor concentration were confirmed by MMSE-2 cognitive screening. This combined approach, psychotherapy with psychological assessment, aligns with the trend of adaptation of the psychological services to the everyday life of contemporary man and paves the way for deepening and developing the field.
Keywords: assessment, cognitive behavioral psychotherapy, MMPI-2, MMSE-2, psychopathology
Procedia PDF Downloads 327
346 Cultural Works Interacting with the Generational Aesthetic Gap between Gen X and Gen Z in China: A Qualitative Study
Authors: Qianyu Zhang
Abstract:
The spread of digital technology in China has worsened the generation gap and intergenerational competition for cultural and aesthetic discourse. Meanwhile, the increased accessibility of cultural works has encouraged the sharing and inheritance of collective cultural memories between generations. However, not every cultural work can engage positively with efforts to bridge intergenerational aesthetic differences. This study argues that in contemporary China, where new media and the Internet are widely available, featured cultural works have greater potential to help enhance the cultural-aesthetic consensus among different generations, thus becoming an effective countermeasure to narrow the intergenerational aesthetic rift and cultural discontinuity. Specifically, the generational aesthetic gap is expected to be bridged or improved through the shared appreciation or consumption, by several generations, of cultural works that meet certain conditions. In-depth interviews with Gen X and Gen Z participants in China (N=15, respectively) uncovered their preferences and commonalities regarding cultural works and their shared experiences in appreciating them. Results demonstrate that both generations' shared appreciation of a cultural work is a necessary but insufficient condition for its effective response to the generational aesthetic gap. Coding analysis rendered six dimensions that cultural works with the potential to bridge the intergenerational aesthetic divide should satisfy simultaneously: genre, theme, content, elements, quality, and accessibility. Cultural works that engage multiple senses, compound realistic, domestic, and contemporary cultural memories, contain narratives of family life and nationalism, include more elements familiar to the previous generation, are superbly produced and unaffected, and are more accessible better promote intergenerational aesthetic exchange and value recognition. Moreover, compared to the dilemma of the previous generation facing the aesthetic gap, the later generation plays a crucial role in bridging the generational aesthetic divide.
Keywords: cultural works, generation gap, generation X, generation Z, cultural memory
Procedia PDF Downloads 155
345 Designing Garments Ergonomically to Improve Life Quality of Elderly People
Authors: Nagda Ibrahim Mady, Shimaa Mohamed Atiha
Abstract:
The actual needs of elderly people, and the changes that accompany age in eyesight, hearing, dexterity, mobility, and memory, can make aged people unable to carry out the simplest activities of daily living, especially those related to clothing. These needs are almost neglected in the current clothing market, obliging aged people to wear the available choices without any consideration of their actual desires and needs. Fashion designers have gained experience that can bring together ergonomics and the stages of the fashion design process. The fashion designer can determine the actual needs of aged people and respond to these needs with designs that improve the quality of life of aged people while maintaining a good appearance. Thus, fashion designers can help elderly people avoid the negative impacts that age leaves on them, whether psychological, kinetic, or related to dementia. Ergonomics in clothing comprises the tools and mechanisms used to meet aged people's needs, supporting them to improve their daily living with the least time and effort, providing the elderly with comfort as well as a good appearance that builds self-confidence and independence. From this point of view, the research looks forward to improving the life of aged people through functional clothes that make the elderly independent in the dressing process, providing in these designs comfort, quality, practicality, and economical cost, and suggesting suitable fabrics and materials and applying them to the designs to help the elderly perform their daily living customs. The goal is to reach successful designs that are acceptable to specialists and to consumers, who confirm that the garments supply their clothing needs and provide aesthetic and functional performance, and therefore give them a better life.
Keywords: ergonomic, design garments, elderly people, life quality
Procedia PDF Downloads 568
344 Designing Space through Narratives: The Role of the Tour Description in the Architectural Design Process
Authors: A. Papadopoulou
Abstract:
When people are asked to provide an oral description of a space, they usually provide a Tour description, which is a dynamic type of spatial narrative centered on the narrator's body, rather than a Map description, which is a static type of spatial narrative focused on the organization of the space as seen from above. Also, subjects with training in the architecture discipline tend to adopt a Tour perspective of space when the narrative refers to a space they have actually experienced but tend to adopt a Map perspective when the narrative refers to a space they have merely imagined. This pilot study aims to investigate whether the Tour description, which is the most common mode in oral descriptions of experienced space, is a cognitive perspective taken in the process of designing a space. The study investigates whether a spatial description provided by a subject with architecture training in the form of a Tour description would be accurately translated into a spatial layout by other subjects with architecture training. The subjects were given the Tour description in written form and were asked to make a plan drawing of the described space. The results demonstrate that when we conceive and design space, we do not adopt the same rules and cognitive patterns that we adopt when we reconstruct space from our memory. As shown by the results of this pilot study, the rules that underlie the Tour description were not detected in the translation from narratives to drawings. In a different phase, the study also investigates how subjects with architecture training describe space when forced to take a Tour perspective in their oral description of a space. The results of this second phase demonstrate that, if intentionally taken, the Tour perspective leads to descriptions of space that are more detailed and focused on experiential aspects.
Keywords: architecture, design process, embodied cognition, map description, oral narratives, tour description
Procedia PDF Downloads 159
343 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning
Authors: Joseph George, Anne Kotteswara Roa
Abstract:
Skin disease is one of the most common kinds of health issues faced by people nowadays. Skin cancer (SC) is one among them, and its detection relies on skin biopsy outputs and the expertise of doctors, but this consumes more time and can yield inaccurate results. At an early stage, skin cancer detection is a challenging task, and the cancer easily spreads to the whole body, leading to an increase in the mortality rate. Skin cancer is curable when it is detected at an early stage. In order to classify skin cancer correctly and accurately, the critical task is skin cancer identification and classification, which is largely based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics; hence it is a challenging issue to select important features from skin cancer dataset images. Skin cancer diagnostic accuracy can therefore be improved by an automated skin cancer detection and classification framework, which also addresses the scarcity of human experts. Recently, deep learning techniques like the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. By using these DL techniques, classification accuracy increases along with the mitigation of computational complexity and time consumption.
Keywords: skin cancer, deep learning, performance measures, accuracy, datasets
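As a minimal illustration of the CNN-based classifiers surveyed above (in Python, using Keras), the sketch below builds a small binary lesion classifier and tracks the precision and recall metrics mentioned in the abstract; the input size, layer sizes, and two-class setup are assumptions, not a model from any of the reviewed papers.

```python
# Minimal sketch of a CNN lesion classifier of the kind surveyed above; the
# input size, architecture, and class count are illustrative assumptions.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),          # dermoscopic image
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # benign vs. malignant
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.Precision(),      # metrics named in the survey
                       tf.keras.metrics.Recall(),
                       "accuracy"])
model.summary()
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=10)
```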
Procedia PDF Downloads 132
342 Memorizing Music and Learning Strategies
Authors: Elisabeth Eder
Abstract:
Memorizing music plays an important role for instrumentalists and has been researched very little so far. Almost every musician is confronted with memorizing music in the course of their musical career. For numerous competitions, examinations (e.g., at universities, music schools), solo performances, and the like, memorization is a requirement. Learners are often required to learn a piece by heart but are rarely given guidance on how to proceed. This was also confirmed by Eder's preliminary study to examine the topicality and relevance of the topic, in which 111 instrumentalists took part. The preliminary study revealed a great desire for more knowledge and information about learning strategies, as well as a greater sense of security when performing by heart on stage among those musicians who use learning strategies. Eder’s research focuses on learning strategies for memorizing music. As part of a large-scale empirical study – an online questionnaire translated into 10 languages was used to conduct the study – 1091 musicians from 64 different countries described how they memorize. The participants in the study also evaluated their learning strategies and justified their choice in terms of their degree of effectiveness. Based on the study and pedagogical literature, 100 learning strategies were identified and categorized; the strategies were examined with regard to their effectiveness, and instrument-specific, age-specific, country-specific, gender-specific, and education-related differences and similarities concerning the choice of learning strategies were investigated. Her research also deals with forms and models of memory and how music-related information can be stored and retrieved and also forgotten again. A further part is devoted to the possibilities that teachers and learners have to support the process of memorization independently of learning strategies. The findings resulting from Elisabeth Eder's research should enable musicians and instrumental students to memorize faster and more confidently.
Keywords: memorizing music, learning strategies, empirical study, effectiveness of strategies
Procedia PDF Downloads 42
341 Cognitive Impairment in Chronic Renal Patients on Hemodialysis
Authors: Fabiana Souza Orlandi, Juliana Gomes Duarte, Gabriela Dutra Gesualdo
Abstract:
Chronic renal disease (CKD), together with hemodialysis, leads in a number of situations to chronic renal failure that compromises not only physical, personal, and environmental aspects but also psychological, social, and family aspects. Objective: To verify the level of cognitive impairment of chronic renal patients on hemodialysis. Methodology: This is a descriptive, cross-sectional study performed in a Dialysis Center of a city in the interior of the State of São Paulo. The inclusion criteria were: being 18 years or older; having a medical diagnosis of CKD; being on hemodialysis treatment in this unit; and agreeing to participate in the research, with signature of the Informed Consent Form (TCLE). A total of 115 participants were evaluated through the Participant Characterization Instrument and the Addenbrooke's Cognitive Examination - Revised Version (ACE-R), scored from 0 to 100, with a cut-off score for the complete battery of <78 and subdivided into five domains: attention and orientation; memory; fluency; language; and visuospatial ability. Most of the participants (66.9%) were … and Caucasian (54.7%), 53.7 (±14.8) years old. Most of the participants were retired (74.7%), with incomplete elementary schooling (36.5%), and the average time of treatment was 46 months. Most of the participants (61.3%) presented impairment in the area of attention and orientation, and 80.4% in the visuospatial domain. Regarding the total ACE-R score, 75.7% of the participants presented scores below the established cut-off. Conclusion: There was a high percentage (75.7%) below the cut-off score established for the ACE-R, suggesting that there may be some cognitive impairment among these participants, since the instrument only performs a screening of cognitive health. The results of the study are extremely important so that possible interventions can be planned in order to minimize impairment, thus improving the quality of life of chronic renal patients.
Keywords: cognition, chronic renal insufficiency, adult health, dialysis
Procedia PDF Downloads 366
340 Enhancing Cultural Heritage Data Retrieval by Mapping COURAGE to CIDOC Conceptual Reference Model
Authors: Ghazal Faraj, Andras Micsik
Abstract:
The CIDOC Conceptual Reference Model (CRM) is an extensible ontology that provides integrated access to heterogeneous and digital datasets. The CIDOC-CRM offers a “semantic glue” intended to promote accessibility to several diverse and dispersed sources of cultural heritage data. That is achieved by providing a formal structure for the implicit and explicit concepts and their relationships in the cultural heritage field. The COURAGE (“Cultural Opposition – Understanding the CultuRal HeritAGE of Dissent in the Former Socialist Countries”) project aimed to explore methods of socialist-era cultural resistance during 1950-1990 and was planned to serve as a basis for further narratives and digital humanities (DH) research. This project highlights the diversity of the alternative cultural scenes that flourished in Eastern Europe before 1989. Moreover, the COURAGE dataset is an online RDF-based registry that consists of historical people, organizations, collections, and featured items. To increase the inter-links between different datasets and retrieve more relevant data from various data silos, a shared federated ontology for reconciled data is needed. As a first step towards these goals, a full understanding of the CIDOC CRM ontology (the target ontology), as well as the COURAGE dataset, was required to start the work. Subsequently, the queries toward the ontology were determined, and a table of equivalent properties from COURAGE and CIDOC CRM was created. The structural diagrams that clarify the mapping process and construct queries are in progress for mapping person, organization, and collection entities to the ontology. Through mapping the COURAGE dataset to the CIDOC-CRM ontology, the dataset will have a common ontological foundation with several other datasets. Therefore, the expected results are: 1) retrieving more detailed data about existing entities, 2) retrieving new entities’ data, 3) aligning the COURAGE dataset to a standard vocabulary, 4) running distributed SPARQL queries over several CIDOC-CRM datasets and testing the potential of distributed query answering using SPARQL. The next plan is to map CIDOC-CRM to other upper-level ontologies or large datasets (e.g., DBpedia, Wikidata) and address similar questions on a wide variety of knowledge bases.
Keywords: CIDOC CRM, cultural heritage data, COURAGE dataset, ontology alignment
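To give a flavor of the SPARQL querying mentioned in point 4 above, the sketch below (in Python, using SPARQLWrapper) retrieves person entities once they are aligned to the CIDOC-CRM class E21_Person; the endpoint URL and the use of rdfs:label for names are assumptions for illustration, not part of the COURAGE infrastructure.

```python
# A sketch of querying mapped data for CIDOC-CRM person entities; the endpoint
# URL and the rdfs:label convention for names are hypothetical.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/courage/sparql")   # hypothetical endpoint
endpoint.setQuery("""
    PREFIX crm: <http://www.cidoc-crm.org/cidoc-crm/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?person ?name WHERE {
        ?person a crm:E21_Person ;
                rdfs:label ?name .
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["person"]["value"], "-", row["name"]["value"])
```

Because the query is phrased against CIDOC-CRM classes rather than COURAGE-specific properties, the same query could in principle be federated over other CIDOC-CRM datasets.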
Procedia PDF Downloads 148
339 Indoleamines (Serotonin & Melatonin) in Edible Plants: Its Influence on Human Health
Authors: G. A. Ravishankar, A. Ramakrishna
Abstract:
Melatonin (MEL) and serotonin (SER), the latter also known as 5-hydroxytryptamine (5-HT), are reported to occur in a range of edible plant types. Their occurrence in plant species appears to be ubiquitous. Their presence in high quantities in plants assumes significance owing to their physiological effects upon consumption by human beings. MEL is a well-known animal hormone, mainly released by the pineal gland, known to influence circadian rhythm and sleep, apart from enhancing immunity. Similarly, SER is a neurotransmitter that regulates mood, sleep, and anxiety in mammals. It is implicated in memory, behavioral changes, scavenging of reactive oxygen species, antipsychotic effects, etc. Similarly, the role of SER and MEL in plant morphogenesis and various physiological processes is beginning to unfold through intense research. These molecules are found in common foods, viz. banana, pineapple, plum, nuts, milk, and grape wine. N-feruloyl serotonin and p-coumaroyl serotonin, found in certain seeds, possess antioxidant, anti-inflammatory, antitumor, antibacterial, and anti-stress potential, apart from reducing depression and anxiety. MEL is found in Mediterranean diets, nuts, cherries, tomato berries, and olive products. Consumption of foods rich in MEL is known to increase blood MEL levels, which have been implicated in a protective effect against cardiovascular damage and cancer initiation and growth. MEL is also found in wines, green tea, beer, olive oil, etc. Moreover, the presence of SER and MEL in coffee beans (green and roasted beans) and decoction has been reported by us. In this communication, we report the occurrence of indoleamines in edible plants and their implications for human health.
Keywords: serotonin, melatonin, edible plants, neurotransmitters, physiological effects
Procedia PDF Downloads 280
338 Meaning Interpretation of Persian Noun-Noun Compounds: A Conceptual Blending Approach
Authors: Bahareh Yousefian, Laurel Smith Stvan
Abstract:
Linguistic structures have two facades: form and meaning. These structures can have either literal meaning or figurative meaning (although this can also depend on the context in which the structure appears). The literal meaning is understood more easily, but with figurative meaning, a word or concept is understood through a different word or concept. In linguistic structures with a figurative meaning, it is more difficult to relate their forms to the meanings than in structures with literal meaning. In these cases, the relationship between form and figurative meaning can be studied from different perspectives. Various linguists have been curious about what happens in someone's mind to understand figurative meaning through the forms; they have used different perspectives and theories to explain this process. It has been studied through cognitive linguistics as well, in which the mind and mental activities are central. In this viewpoint, meaning (in other words, conceptualization) is considered a mental process. In this descriptive-analytic study, 20 Persian compound nouns with figurative meanings have been collected from the Persian-language Moeen Encyclopedic Dictionary and other sources. Examples include [“Sofreh Xaneh”] (traditional restaurant) and [“Dast Yar”] (assistant). These were studied in a cognitive semantics framework using “Conceptual Blending Theory”, which had not been tested on Persian compound nouns before. It was noted that “Conceptual Blending Theory” can account for the process of understanding the figurative meanings of Persian compound nouns. Many cognitive linguists believe that “Conceptual Blending” is not only a linguistic theory but also a basic human cognitive ability that plays important roles in thought, imagination, and even everyday life (though unconsciously). The ability to use mental spaces and conceptual blending (which is exclusive to humankind) is such a basic but unconscious ability that we are unaware of its existence and importance. What differentiates Conceptual Blending Theory from other ways of understanding figurative meaning is the emergence of new semantic aspects (the emergent structure) that lead to a more comprehensive and precise meaning. In this study, it was found that Conceptual Blending Theory can explain how the figurative meanings of Persian compound nouns are reached from their forms, such as [talkative for the compound word “Bolbol + Zabani” (nightingale + tongue)] and [wage for the compound word “Dast + Ranj” (hand + suffering)].
Keywords: cognitive linguistics, conceptual blending, figurative meaning, Persian compound nouns
Procedia PDF Downloads 79
337 The Challenges of Cloud Computing Adoption in Nigeria
Authors: Chapman Eze Nnadozie
Abstract:
Cloud computing, a technology that is made possible through virtualization within networks, represents a shift from the traditional ownership of infrastructure and other resources by distinct organizations to a more scalable pattern in which computer resources are rented online to organizations either on a pay-as-you-use basis or by subscription. In other words, cloud computing entails the renting of computing resources (such as storage space, memory, servers, applications, networks, etc.) by a third party to its clients on a pay-as-you-go basis. It is a new, innovative technology that is globally embraced because of its renowned benefits, the most profound of which is its cost-effectiveness for organizations engaged with its services. In Nigeria, the services are provided either directly to companies, mostly by the key IT players such as Microsoft, IBM, and Google, or in partnership with some other players such as Infoware, Descasio, and Sunnet. This enables organizations to rent IT resources on a pay-as-you-go basis, thereby saving them from the waste that accrues from the acquisition and maintenance of IT resources, such as ownership of a separate data centre. This paper intends to appraise the challenges of cloud computing adoption in Nigeria, bearing in mind the country's peculiarities in terms of infrastructural development. The methodologies used in this paper include the use of research questionnaires, a formulated hypothesis, and the testing of the formulated hypothesis. The major findings of this paper include the fact that there are some addressable challenges to the adoption of cloud computing in Nigeria. Furthermore, the country will gain significantly if the challenges, especially in the area of infrastructural development, are well addressed. This is because the research established that there are significant gains derivable from the adoption of cloud computing by organizations in Nigeria. However, these challenges can be overcome by concerted efforts on the part of government and other stakeholders.
Keywords: cloud computing, data centre, infrastructure, it resources, virtualization
Procedia PDF Downloads 354
336 The KAPSARC Energy Policy Database: Introducing a Quantified Library of China's Energy Policies
Authors: Philipp Galkin
Abstract:
Government policy is a critical factor in the understanding of energy markets. Regardless, it is rarely approached systematically from a research perspective. Gaining a precise understanding of what policies exist, their intended outcomes, geographical extent, duration, evolution, etc. would enable the research community to answer a variety of questions that, for now, are either oversimplified or ignored. Policy, on its surface, also seems a rather unstructured and qualitative undertaking. There may be quantitative components, but incorporating the concept of policy analysis into quantitative analysis remains a challenge. The KAPSARC Energy Policy Database (KEPD) is intended to address these two energy policy research limitations. Our approach is to represent policies within a quantitative library of the specific policy measures contained within a set of legal documents. Each of these measures is recorded into the database as a single entry characterized by a set of qualitative and quantitative attributes. Initially, we have focused on the major laws at the national level that regulate coal in China. However, KAPSARC is engaged in various efforts to apply this methodology to other energy policy domains. To ensure scalability and sustainability of our project, we are exploring semantic processing using automated computer algorithms. Automated coding can provide more convenient input data for human coders and serve as a quality control option. Our initial findings suggest that the methodology utilized in KEPD could be applied to any set of energy policies. It also provides a convenient tool to facilitate understanding in the energy policy realm, enabling the researcher to quickly identify, summarize, and digest policy documents and specific policy measures. The KEPD captures a wide range of information about each individual policy contained within a single policy document. This enables a variety of analyses, such as structural comparison of policy documents, tracing policy evolution, stakeholder analysis, and exploring interdependencies of policies and their attributes with exogenous datasets using statistical tools. The usability and broad range of research implications suggest a need for the continued expansion of the KEPD to encompass a larger scope of policy documents across geographies and energy sectors.
Keywords: China, energy policy, policy analysis, policy database
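As a rough sketch of what "a single entry characterized by a set of qualitative and quantitative attributes" might look like in practice, the Python data structure below records one hypothetical policy measure; the field names and values are illustrative assumptions and do not reproduce the actual KEPD schema.

```python
# Illustrative sketch of recording one policy measure as a structured entry with
# qualitative and quantitative attributes; field names and values are hypothetical.
from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class PolicyMeasure:
    document: str                      # legal document the measure comes from
    measure_text: str                  # the specific provision, as stated
    sector: str                        # e.g. "coal"
    geography: str                     # national, provincial, ...
    start_year: int
    end_year: Optional[int] = None     # open-ended if None
    target_value: Optional[float] = None
    target_unit: Optional[str] = None
    stakeholders: List[str] = field(default_factory=list)

measure = PolicyMeasure(
    document="(hypothetical) national coal regulation",
    measure_text="cap annual coal output",
    sector="coal",
    geography="national",
    start_year=2015,
    target_value=4.0,
    target_unit="billion tonnes/year",
    stakeholders=["(hypothetical) implementing agency"],
)
print(measure)
```

Storing measures in a uniform shape like this is what makes cross-document comparisons and joins with exogenous datasets straightforward.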
Procedia PDF Downloads 323
335 A Phenomenological Approach to Computational Modeling of Analogy
Authors: José Eduardo García-Mendiola
Abstract:
In this work, a phenomenological approach to the computational modeling of analogy processing is carried out. The paper considers the structure of analogy, based on the possibility of grounding the genesis of its elements in Husserl's genetic theory of association. Among the particular processes that take place in order to obtain analogical inferences, there is one that is crucial for enabling efficient retrieval of base cases from long-term memory, namely analogical transference grounded in familiarity. In general, it has been argued that analogical reasoning is a way by which a conscious agent tries to determine or define a certain scope of objects and relationships between them using previous knowledge of another, familiar domain of objects and relations. However, in seeking a complete description of the analogy process, a deeper consideration of a phenomenological nature is required insofar as its simulation by computational programs is the aim. One would also get an idea of how complex it would be to have a fully computational account of the elements of analogy. In fact, familiarity is not the result of a mere chain of repetitions of objects or events; it is generated insofar as the object, attribute, or event in question is integrable within a certain context that takes shape as functionalities and functional approaches or perspectives on the object are being defined. Its familiarity is generated not by the identification of its parts or objective determinations as if they were isolated from those functionalities and approaches. Rather, at the core of such familiarity between entities of different kinds lies the way they are functionally encoded. So, hoping to make deeper inroads into these topics, this essay considers how cognitive-computational perspectives can build on the phenomenological projection of the analogy process, reviewing achievements already obtained as well as exploring new theoretical-experimental configurations for the implementation of analogy models in special-purpose as well as general-purpose machines.
Keywords: analogy, association, encoding, retrieval
Procedia PDF Downloads 123
334 Cognitive Rehabilitation in Schizophrenia: A Review of the Indian Scenario
Authors: Garima Joshi, Pratap Sharan, V. Sreenivas, Nand Kumar, Kameshwar Prasad, Ashima N. Wadhawan
Abstract:
Schizophrenia is a debilitating disorder marked by cognitive impairment, which deleteriously impacts social and professional functioning along with the quality of life of patients and caregivers. Often the cognitive symptoms appear in the prodromal state and worsen as the illness progresses; they have proven to have good predictive value for the prognosis of the illness. It has been shown that intensive cognitive rehabilitation (CR) leads to improvements in healthy as well as cognitively impaired subjects. As the majority of the population in India falls in the lower to middle socio-economic strata and has low education levels, using the existing packages for cognitive rehabilitation, a majority of which were developed in the West, becomes difficult. The use of technology is also restricted due to the high costs involved and the limited availability of, and familiarity with, computers and other devices, which poses an impediment to continued therapy. Cognitive rehabilitation in India uses a plethora of retraining methods for patients with schizophrenia, targeting the functions of attention, information processing, executive functions, learning and memory, and comprehension, along with social cognition. Psychologists often have to follow an integrative therapy approach involving social skills training, family therapy, and psychoeducation in order to maintain the gains from cognitive rehabilitation in the long run. This paper reviews the methodologies and cognitive retraining programs used in India. It attempts to elucidate the evolution and development of the methodologies used, from traditional paper-pencil-based retraining to more sophisticated neuroscience-informed techniques for the cognitive rehabilitation of deficits in schizophrenia, delivered as home-based or supervised and guided programs.
Keywords: schizophrenia, cognitive rehabilitation, neuropsychological interventions, integrated approaches to rehabilitation
Procedia PDF Downloads 363333 Looking beyond Lynch's Image of a City
Authors: Sandhya Rao
Abstract:
Kevin Lynch’s theory of imageability lets one explore a city in terms of five elements: nodes, paths, edges, landmarks, and districts. What happens when we try to record the same data in an Indian context? What happens when we apply the same theory of imageability to the complex, shifting urban patterns of Indian cities, and how can we as urban designers demonstrate our role in the image-building ordeal of these cities? The organizational patterns formed through mental images of an Indian city are often diverse and intangible. They are also multi-layered and temporary in terms of the spirit of the place. The pattern of images formed is loaded with associative meaning and intrinsically linked with the history and socio-cultural dominance of the place. The embedded memory of a place in one’s mind often plays an even more important role in formulating these images. Thus, while deriving an image of a city, one is often confused or finds the result chaotic. Owing to this complexity, the images formed are also difficult to represent through a single medium. Under such a scenario, it is difficult both to derive an output of the image constructed and to make design interventions that enhance the legibility of a place. However, a combination of tools and methods can allow one to record the key elements of a place through time, space, and one’s interface with the place. There also has to be a clear understanding of the participant groups of a place and their time and period of engagement with it. How we can translate the results obtained into a design intervention is the main aim of the research. Could a multi-faceted cognitive mapping be an answer, or would it be a very transient mapping method that changes over time, place, and person? How does context influence the process of image building in one’s mind? These are the key questions that this research aims to answer.Keywords: imageability, organizational patterns, legibility, cognitive mapping
Procedia PDF Downloads 314332 Enhancing Project Performance Forecasting using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
Accurate forecasting of project performance metrics is crucial for successfully managing and delivering urban road reconstruction projects. Traditional methods often rely on static baseline plans and fail to consider the dynamic nature of project progress and external factors. This research proposes a machine learning-based approach to forecast project performance metrics, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category in an urban road reconstruction project. The proposed model utilizes time series forecasting techniques, including Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance based on historical data and project progress. The model also incorporates external factors, such as weather patterns and resource availability, as features to enhance the accuracy of forecasts. By applying the predictive power of machine learning, the performance forecasting model enables proactive identification of potential deviations from the baseline plan, which allows project managers to take timely corrective actions. The research aims to validate the effectiveness of the proposed approach using a case study of an urban road reconstruction project, comparing the model's forecasts with actual project performance data. The findings of this research contribute to the advancement of project management practices in the construction industry, offering a data-driven solution for improving project performance monitoring and control.Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, earned value management
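A minimal sketch of the per-WBS forecasting step described in this abstract is given below, assuming a pandas/statsmodels setup. The column names (cost_variance, rainfall_days, crew_availability), the WBS grouping key, and the ARIMA order are hypothetical placeholders rather than the authors' configuration; the LSTM variant is omitted for brevity.

```python
# Sketch (assumed setup): forecast cost variance per WBS category with an
# ARIMA model that accepts exogenous regressors (weather, resource availability).
# Column names and the (1, 1, 1) order are illustrative, not the paper's values.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def forecast_cost_variance(history: pd.DataFrame, future_exog: pd.DataFrame, steps: int = 4):
    """Fit one ARIMAX model per WBS category and forecast `steps` periods ahead."""
    forecasts = {}
    for wbs, group in history.groupby("wbs_category"):
        group = group.sort_values("period")
        endog = group["cost_variance"].to_numpy()                      # target metric
        exog = group[["rainfall_days", "crew_availability"]].to_numpy()  # external factors
        model = ARIMA(endog, exog=exog, order=(1, 1, 1)).fit()
        # Future exogenous values must cover exactly `steps` periods for this WBS.
        exog_f = future_exog.loc[future_exog["wbs_category"] == wbs,
                                 ["rainfall_days", "crew_availability"]].to_numpy()[:steps]
        forecasts[wbs] = model.forecast(steps=steps, exog=exog_f)
    return forecasts
```

In use, each returned series would be compared against the baseline plan for its WBS category, and a sustained gap between forecast and baseline would flag the deviations for which the abstract proposes timely corrective action.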
Procedia PDF Downloads 50331 Kou Jump Diffusion Model: An Application to the SP 500; Nasdaq 100 and Russell 2000 Index Options
Authors: Wajih Abbassi, Zouhaier Ben Khelifa
Abstract:
The present research addresses the empirical validation of three option valuation models: the ad-hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976), and the Kou jump-diffusion model (2002). Our empirical analysis is conducted on a sample of 26,974 options written on three indexes, the S&P 500, Nasdaq 100, and Russell 2000, traded during 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. Then we use the trust-region-reflective algorithm to estimate the structural parameters of these models from cross-sections of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model. This superiority arises from the ability of this model to portray the behavior of market participants and to come closest to the true distribution that characterizes the evolution of these indices. Indeed, the double-exponential distribution captures three interesting properties: the leptokurtic feature, the memoryless property, and the psychological behavior of market participants. Numerous empirical studies have shown that markets tend to show overreaction and underreaction to good and bad news, respectively. Despite these advantages, there are not many empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to have used nonlinear curve-fitting through the trust-region-reflective algorithm and cross-sections of option prices to estimate the structural parameters of the Kou jump-diffusion model.Keywords: jump-diffusion process, Kou model, leptokurtic feature, trust-region-reflective algorithm, US index options
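A hedged sketch of the calibration idea follows: SciPy's least-squares solver with the trust-region-reflective method ('trf') fits the Kou parameters to a cross-section of option quotes. Because the closed-form Kou valuation formula is involved, a simple Monte Carlo pricer is used here as a stand-in; parameter names, starting values, and bounds are illustrative assumptions rather than the paper's settings.

```python
# Sketch: calibrate Kou jump-diffusion parameters to a cross-section of option
# prices with SciPy's trust-region-reflective least-squares solver ('trf').
# The Monte Carlo pricer is a simple stand-in for the closed-form Kou formula.
import numpy as np
from scipy.optimize import least_squares

def kou_call_mc(S0, K, T, r, sigma, lam, p, eta1, eta2, n_paths=20000, seed=0):
    """European call under Kou (2002): diffusion plus compound Poisson jumps whose
    log-sizes are double-exponential (upward with prob. p, rates eta1/eta2)."""
    rng = np.random.default_rng(seed)
    # Risk-neutral jump compensator E[e^Y] - 1 for the double-exponential law.
    kappa = p * eta1 / (eta1 - 1.0) + (1.0 - p) * eta2 / (eta2 + 1.0) - 1.0
    drift = (r - 0.5 * sigma**2 - lam * kappa) * T
    diffusion = sigma * np.sqrt(T) * rng.standard_normal(n_paths)
    n_jumps = rng.poisson(lam * T, n_paths)
    jump_sum = np.zeros(n_paths)
    for i, n in enumerate(n_jumps):
        if n:
            up = rng.random(n) < p
            sizes = np.where(up, rng.exponential(1.0 / eta1, n),
                             -rng.exponential(1.0 / eta2, n))
            jump_sum[i] = sizes.sum()
    ST = S0 * np.exp(drift + diffusion + jump_sum)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

def calibrate(quotes, S0, r):
    """quotes: iterable of (strike, maturity, market_price) tuples."""
    def residuals(theta):
        sigma, lam, p, eta1, eta2 = theta
        return [kou_call_mc(S0, K, T, r, sigma, lam, p, eta1, eta2) - mkt
                for K, T, mkt in quotes]
    x0 = [0.2, 1.0, 0.5, 10.0, 5.0]                       # illustrative start values
    bounds = ([0.01, 0.0, 0.0, 1.01, 0.01], [2.0, 20.0, 1.0, 100.0, 100.0])
    return least_squares(residuals, x0, bounds=bounds, method="trf")
```

In practice the Monte Carlo pricer would be replaced by the closed-form Kou formula (and extended to puts), but the structure of the trust-region-reflective fit over a cross-section of quotes stays the same.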
Procedia PDF Downloads 429330 Measuring the Resilience of e-Governments Using an Ontology
Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips
Abstract:
The variability that exists across governments, their departments, and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also a need for assessment, prevention, preparation, response, and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage the risks or threats induced by reuse and integration, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in provisioning services as well as in reusing components across departments. It can therefore be said that resilience is responsible for the reduction in a government’s vulnerability to change. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is built around a well-defined construct for the taxonomy of resilience. A specific class known as ‘Resilience Requirements’ is added to the ontology; this class embeds the concept of resilience in the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, its reliability and resilience have become more complex and more critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them; these questions can be posed as queries over the ontology. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face. A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for emergencies and for the risks that may arise from the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, together with the output of these tools when applied to the domain. We introduce a classification of resilience using the defined taxonomy and model the existing relationships based on it. The ontology is constructed on formal theory and provides a semantic reference framework for the concept of resilience. Key terms that fall under the purview of resilience with respect to E-Governments are defined, and the relationships that exist between risks and resilience are made explicit. The overall aim of the ontology is for it to be used within standards that would be followed by all governments for government-based resilience measures.Keywords: E-Government, Ontology, Relationships, Resilience, Risks, Threats
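A minimal sketch of how such a taxonomy, its ‘Resilience Requirements’ class, and a preparedness question expressed as a query might be encoded is given below, using rdflib. The namespace, class names, properties, and SPARQL query are hypothetical illustrations, not the authors' ontology or rules.

```python
# Sketch (assumed vocabulary): a small slice of an E-Government resilience
# taxonomy encoded with rdflib and queried with SPARQL. The namespace, classes,
# and properties are hypothetical stand-ins for the paper's ontology.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EGOV = Namespace("http://example.org/egov-resilience#")
g = Graph()
g.bind("egov", EGOV)

# Taxonomy: resilience requirements sit alongside risks and threats.
g.add((EGOV.ResilienceRequirement, RDF.type, RDFS.Class))
g.add((EGOV.Risk, RDF.type, RDFS.Class))
g.add((EGOV.mitigates, RDF.type, RDF.Property))

# A department-integration risk and the requirement that addresses it.
g.add((EGOV.DataSharingFailure, RDF.type, EGOV.Risk))
g.add((EGOV.DataSharingFailure, RDFS.comment,
       Literal("Service outage caused by reuse of a shared component.")))
g.add((EGOV.RedundantServiceEndpoint, RDF.type, EGOV.ResilienceRequirement))
g.add((EGOV.RedundantServiceEndpoint, EGOV.mitigates, EGOV.DataSharingFailure))

# 'How prepared are we?' as a query: risks with no mitigating requirement.
query = """
PREFIX egov: <http://example.org/egov-resilience#>
SELECT ?risk WHERE {
  ?risk a egov:Risk .
  FILTER NOT EXISTS { ?req egov:mitigates ?risk . }
}
"""
for row in g.query(query):
    print("Unmitigated risk:", row.risk)
```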
Procedia PDF Downloads 338