Search results for: logical schema
414 Performance-Based Quality Evaluation of Database Conceptual Schemas
Authors: Janusz Getta, Zhaoxi Pan
Abstract:
Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. It is evident that different conceptual schemas provide different logical schemas, and the performance of user applications strongly depends on logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. First, we describe the format used to represent conceptual schemas. Then, the paper proposes a new specification of object algebra for the representation of conceptual-level database applications. Transformation of conceptual schemas and expressions of object algebra into an implementation schema, and implementation in a particular database system, allows for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. Finally, we describe an experiment as a proof of concept for the evaluation procedure presented in the paper.
Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing
Procedia PDF Downloads 292
413 Metric Suite for Schema Evolution of a Relational Database
Authors: S. Ravichandra, D. V. L. N. Somayajulu
Abstract:
The requirement of stakeholders to add more details to the database is the main cause of schema evolution in a relational database. Further, this schema evolution causes instability in the database. Hence, this work aims to define a metric suite for the schema evolution of a relational database. The metric suite calculates metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion, and component dependencies of the schema for the existing and evolved versions of the database. The metric suite also provides an indicator of problems related to the stability and usability of the evolved database. The degree of change in the schema of a database is presented in the form of graphs that act as indicators and also provide the relations between various parameters (metrics) related to the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics, with varying parameters, into a suitable metric suite are discussed. To validate the proposed metric suite, experiments have been performed on publicly available datasets.
Keywords: cohesion, coupling, entropy, metric suite, schema evolution
Procedia PDF Downloads 451
412 An Optimization Algorithm Based on Dynamic Schema with Dissimilarities and Similarities of Chromosomes
Authors: Radhwan Yousif Sedik Al-Jawadi
Abstract:
Optimization is necessary for finding appropriate solutions to a range of real-life problems. In particular, genetic (or, more generally, evolutionary) algorithms have proved very useful in solving many problems for which analytical solutions are not available. In this paper, we present an optimization algorithm called Dynamic Schema with Dissimilarity and Similarity of Chromosomes (DSDSC), which is a variant of the classical genetic algorithm. This approach constructs new chromosomes from a schema and pairs of existing ones by exploring their dissimilarities and similarities. To show the effectiveness of the algorithm, it is tested and compared with the classical GA on 15 two-dimensional optimization problems taken from the literature. We have found that, in most cases, our method is better than the classical genetic algorithm.
Keywords: chromosome injection, dynamic schema, genetic algorithm, similarity and dissimilarity
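The schema construction the abstract describes can be illustrated with a minimal sketch (our own illustration, not the authors' DSDSC implementation; the binary encoding and function names are assumptions): positions where two chromosomes agree become fixed genes, disagreements become wildcards, and new chromosomes are built by filling the wildcards.

```python
import random

def extract_schema(a, b):
    """Build a schema from two chromosomes: positions where they agree
    become fixed genes; positions where they differ become wildcards ('*')."""
    return [x if x == y else "*" for x, y in zip(a, b)]

def instantiate(schema):
    """Create a new chromosome by filling each wildcard with a random gene."""
    return [random.choice("01") if g == "*" else g for g in schema]

random.seed(0)
schema = extract_schema("110101", "100111")
print("".join(schema))   # → 1*01*1
child = instantiate(schema)
```

In the full DSDSC algorithm, such schema-derived chromosomes would be injected into the population alongside the standard genetic operators.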
Procedia PDF Downloads 346
411 Open Trial of Group Schema Therapy for the Treatment of Eating Disorders
Authors: Evelyn Smith, Susan Simpson
Abstract:
Background: Eating disorder (ED) treatment is complicated by high rates of chronicity, comorbidity, complex personality traits, and client dropout. Given these complexities, Schema Therapy (ST) has been identified as a suitable treatment option. The study primarily aims to evaluate the efficacy of group ST for the treatment of EDs. The study further evaluated the effectiveness of ST in reducing schemas and improving quality of life. Method: Participant suitability was ascertained using the Eating Disorder Examination. Following this, participants attended 90-minute weekly group sessions over 25 weeks. Groups consisted of six to eight participants and were facilitated by two psychologists, at least one of whom was trained in ST. Measures were completed at pre-, mid-, and post-treatment and assessed ED symptoms, cognitive schemas, schema mode presentations, quality of life, self-compassion, and psychological distress. Results: As predicted, measures of ED symptoms were significantly reduced following treatment. No significant changes were observed in early maladaptive schema severity; however, reductions in schema modes were observed. Participants did not report improvements in general quality-of-life measures following treatment, though improvement in psychological well-being was observed. Discussion: Overall, the findings from the current study support the use of group ST for the treatment of EDs. It is expected that lengthier treatment is needed for a reduction in schema severity. Given that participant dropout was considerably low, this has important treatment implications for the suitability of ST for the treatment of EDs.
Keywords: eating disorders, schema therapy, treatment, quality of life
Procedia PDF Downloads 79
410 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers
Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider
Abstract:
At a time when access to and sharing of information are crucial in the world of research, technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies may create platforms for information sharing if they respond to the need to disambiguate their data by assuring interoperability within and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for the low-cost attribution of PIDs with high granularity based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be interpreted as its logical consequence and proposes a research information management system developed from scratch. The system has been created on and around the Schema.org ontology, with a further articulation of the use of ARKs. It is thus built upon the infrastructure previously implemented (i.e., INCIPIT) in order to enhance the persistence of URIs. As a consequence, INCIPIT-CRIS aims to be the hinge between previously separate aspects, namely CRIS, ontologies, and PIDs, in order to produce a powerful system that resolves disambiguation problems using a combination of an ontology such as Schema.org and unique persistent identifiers such as ARKs, allows the sharing of information through a dedicated platform, and ensures the interoperability of the system by representing the entirety of the data as RDF triples. This paper aims to present the implemented solution as well as its simulation in real life. We describe the underlying ideas and inspirations while going through the logic and the different functionalities implemented, and their links with ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web
Procedia PDF Downloads 135
409 Translation Choices of Logical Meaning from Chinese into English: A Systemic Functional Linguistics Perspective
Authors: Xueying Li
Abstract:
Different from English, Chinese clauses are commonly related logically in an implicit way, without any conjunctions. This typological difference poses a great challenge for Chinese-English translators, as 1) translators may interpret logical meaning in different ways when there are no conjunctions in the Chinese Source Text (ST); 2) translators may be unsure whether to make Chinese implicit logical meaning explicit or to leave it implicit in the Target Text (TT), and whether other dimensions of logical meaning (e.g., the type of logical meaning) should be shifted or not. Against this background, this study examines a comprehensive range of Chinese-English translation choices of logical meaning to deal with this challenge in a systematic way. It compiles several ST-TT passages from a set of translation textbooks in a corpus, namely Ying Yu Bi Yi Shi Wu (Er Ji) [Translation Practice between Chinese and English: Intermediate Level] and its supporting training book, analyzes how logical meaning in the ST is translated in the TT across different text types with Systemic Functional Linguistics (SFL) as the theoretical framework, and finally draws a system network of translation choices of logical meaning from Chinese into English. Since translators probably think about semantic meaning rather than lexico-grammatical resources in translation, this study moves away from traditional lexico-grammatical choices and instead describes translation choices at the semantic level. The findings of this study can provide help and support for translation practitioners, so that they understand that, besides explicitation, a variety of possible linguistic choices are available for making informed decisions when translating Chinese logical meaning into English.
Keywords: Chinese-English translation, logical meaning, systemic functional linguistics, translation choices
Procedia PDF Downloads 180
408 Studying the Schema of Afghan Immigrants about Iranians; A Case Study of Immigrants in Tehran Province
Authors: Mohammad Ayobi
Abstract:
Afghans have been immigrating to Iran for many years; the re-establishment of the Taliban in Afghanistan caused a new flood of Afghan immigrants to Iran. One of the important issues related to the arrival of Afghan immigrants is the view that Afghan immigrants hold of Iranians. In this research, we seek to identify the schema that Afghan immigrants living in Iran have of Iranians. A schema is a set of data or generalized knowledge that is formed in connection with a particular group, a particular person, or even a particular nationality, and leads a person to pre-determined judgments about certain matters. The schemata held between nationalities have a direct impact on the formation of interactions between them and can be effective in establishing, or failing to establish, proper communication between Afghan immigrants and Iranians. For the scientific framing of the research, we use the theory of "schemata." The method of this study is qualitative; its data will be collected through semi-structured in-depth interviews and analyzed by thematic analysis. The expected finding of this study is that the schemata Afghan immigrants hold of Iranians are rather negative: Iranians are seen as self-centered and prejudiced toward Afghans, regarding Afghans merely as laborers.
Keywords: schema study, Afghan immigrants, Iranians, in-depth interview
Procedia PDF Downloads 86
407 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting the user in a data quality approach. The goal is to better extract, mix, interpret, and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows the automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
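The first step, assigning a column to a category, can be sketched as follows (a toy illustration under our own assumptions; the paper's approach also exploits metadata and inter-column relations, which this sketch omits):

```python
import re

# Hypothetical category detectors; a real profiler would use many more,
# plus metadata such as column names and declared types.
PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "date":  re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone": re.compile(r"^\+?\d[\d \-]{6,}$"),
}

def categorize_column(values, threshold=0.8):
    """Assign a column to the first category matched by at least
    `threshold` of its values; 'unknown' if none qualifies."""
    for name, pat in PATTERNS.items():
        hits = sum(bool(pat.match(v)) for v in values)
        if hits / len(values) >= threshold:
            return name
    return "unknown"

col = ["alice@example.org", "bob@mail.com", "carol@web.de"]
print(categorize_column(col))  # → email
```

Once every column carries a category, dependencies between columns (e.g., city determines country) can be tested to recover the semantics of the whole source.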
Procedia PDF Downloads 418
406 A Cognitive Schema of Architectural Designing Activity
Authors: Abdelmalek Arrouf
Abstract:
This article sets up a cognitive schema of the architectural designing activity. It begins by outlining, theoretically, an a priori model of its general cognitive mechanisms. The obtained theoretical framework represents the designing activity as a complex system composed of three interrelated subsystems of cognitive actions: a subsystem of meaning production, one of morphology production, and finally a subsystem of navigation between the former two. A protocol analysis that uses statistical and informational tools is then used to measure the validity of the built schema. The model thus achieved shows that the designer begins by conceiving abstract meanings, which he then translates into shapes. That is why we call it a semio-morphic model of the designing activity.
Keywords: designing actions, model of the design process, morphosis, protocol analysis, semiosis
Procedia PDF Downloads 172
405 Assessing Effectiveness of Schema Mode Therapy and Emotionally Focused Couples Therapy in Attachment Styles among Couples with Marital Conflict
Authors: Reza Johari Fard, Najmeh Cheraghi, Parvin Ehtesham Zadeh, Parviz Asgari
Abstract:
The aim of this study was to investigate and compare the effectiveness of schema mode therapy and emotionally focused couples therapy on attachment styles (secure, avoidant, and anxious) in couples with marital conflict, using a quasi-experimental pretest, posttest, and follow-up design with a control group. The statistical population of the study included all the couples with marital conflict who visited the Mehrana counseling center in Ahvaz, Iran, in 2019. Forty-five couples were selected by a voluntary sampling method and randomly divided into two experimental groups and one control group (15 couples in each group). The participants completed the Adult Attachment Scale (Hazan and Shaver). The experimental groups underwent schema mode therapy and emotionally focused couples therapy for 12 sessions, while the control group did not receive any intervention. The data were analyzed by repeated-measures analysis in SPSS-19 software. The results showed that both schema mode therapy and emotionally focused couples therapy are effective in increasing the secure attachment style and reducing avoidant and ambivalent attachment styles in couples with marital conflict. There was no significant difference in attachment styles between the schema mode therapy group and the emotionally focused couples therapy group. Therefore, it is recommended that therapists and family counselors use these therapies, along with other therapeutic interventions, to increase secure attachment styles and reduce marital conflicts.
Keywords: schema mode therapy, emotional focused couple therapy, attachment styles, marital conflict
Procedia PDF Downloads 111
404 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in the form of a disjunctive normal form (DNF); 4) transformation of the DNF into an orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability-estimation polynomial and quantifying reliability; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of optimal-structure systems are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
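The logical-probabilistic idea behind stages 3 to 5 can be illustrated with a small sketch (ours, for illustration only; it evaluates the operability function by brute-force state enumeration rather than the orthogonalization algorithm, which reaches the same polynomial symbolically and scales far better):

```python
from itertools import product

def reliability(operable, p):
    """Exact system reliability: sum the probabilities of all component-state
    vectors for which the logical operability function is true. The ODNF
    transformation in the paper computes this sum without full enumeration."""
    n = len(p)
    total = 0.0
    for states in product((0, 1), repeat=n):
        if operable(states):
            prob = 1.0
            for s, pi in zip(states, p):
                prob *= pi if s else (1 - pi)  # component up with prob. pi
            total += prob
    return total

# Example structure: components 0 and 1 in series, in parallel with component 2.
op = lambda x: (x[0] and x[1]) or x[2]
R = reliability(op, [0.9, 0.9, 0.8])
```

For this structure the exact value is 1 - (1 - 0.9 * 0.9) * (1 - 0.8) = 0.962, which the enumeration reproduces.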
Procedia PDF Downloads 72
403 Isotype and Logical Positivism: A Critical Understanding through Intersemiotic Translation
Authors: Satya Girish Goparaju, Sushmita Pareek
Abstract:
This paper examines two sets of pictograms published in Neurath's books Basic by Isotype and International Picture Language in order to investigate the reasons for pictorial language having become an end in itself, despite its potential to be relevant, especially in the 21st-century digital age of heightened interlingual engagement. Isotype was developed by Otto Neurath in the late 1920s to be an 'international language' (pictorial). It was derived from the philosophy of logical positivism (of the Vienna Circle), which held that language can be reduced to sets of direct experiences as bare symbols, devoid of emotive and expressive functions. In his book International Picture Language, Neurath noted that any language is less clear-cut in one way or another, and hence a pictorial language was justified. However, Isotype, as an ambitious version of logical positivism in practice, distanced itself from the semiotic theories of language, and its pictograms were therefore defined as an independent set of signs rather than signs that are part of a language. This paper attempts to investigate intersemiotic translation in the form of Isotype and to trace the effects of logical positivism on Neurath's concept of Isotype, the 'international language'.
Keywords: intersemiotic translation, isotype, logical positivism, Otto Neurath, translation studies
Procedia PDF Downloads 250
402 Calculation of Inflation from Salaries Instead of Consumer Products: A Logical Exercise
Authors: E. Dahlen
Abstract:
Inflation can be calculated either from the prices of consumer products or from salaries. This paper presents a logical exercise showing that it is easier to calculate inflation from salaries than from consumer products. While the prices of consumer products may change due to technological advancement, such as automation, which must be corrected for, salaries do not. If technological advancements are not accounted for in calculations based on consumer product prices, inflation can be confused with real wage changes, since both inflation and real wage changes affect the prices of consumer products. The method employed in this paper is a logical exercise. Logical arguments are presented that suggest the existence of many different feasible ways by which inflation can be determined. A short mathematical exercise then shows that one of these methods, using salaries, contains the fewest unknown parameters and hence is the preferred method, since the risk of mistakes is lower. From the results, it can be concluded that salaries, rather than consumer products, should be used to calculate inflation.
Keywords: inflation, logic, math, real wages
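The core of the argument can be stated as a one-line computation (our illustration with made-up figures, not the paper's own worked example): nominal salary growth decomposes into inflation and real wage growth, so inflation follows directly once real wage growth is known, with no correction for technological change in product prices.

```python
def inflation_from_salaries(nominal_wage_growth, real_wage_growth):
    """Decompose nominal wage growth into inflation and real wage growth:
    (1 + nominal) = (1 + inflation) * (1 + real), solved for inflation."""
    return (1 + nominal_wage_growth) / (1 + real_wage_growth) - 1

# Illustrative figures: salaries rose 5% while real purchasing power rose 2%.
pi = inflation_from_salaries(0.05, 0.02)
```

With these figures inflation comes out at roughly 2.9%; a price-based calculation would additionally need an unknown technology-correction parameter per product.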
Procedia PDF Downloads 330
401 Schema Therapy as Treatment for Adults with Autism Spectrum Disorder and Comorbid Personality Disorder: A Multiple Baseline Case Series Study Testing Cognitive-Behavioral and Experiential Interventions
Authors: Richard Vuijk, Arnoud Arntz
Abstract:
Rationale: To our knowledge, the treatment of comorbid personality disorders in adults with autism spectrum disorder (ASD) is understudied and still in its infancy: we do not know whether treatments for personality disorders are applicable to adults with ASD. In particular, it is unknown whether patients with ASD benefit from the experiential techniques that are part of schema therapy as developed for the treatment of personality disorders. Objective: The aim of the study is to investigate the efficacy of a schema mode focused treatment for adult clients with ASD and comorbid personality pathology (i.e., at least one personality disorder). Specifically, we investigate whether they can benefit from both cognitive-behavioral and experiential interventions. Study design: A multiple baseline case series study. Study population: Adult individuals (age > 21 years) with ASD and at least one personality disorder. Participants will be recruited from the Sarr expertise center for autism in Rotterdam. The study requires 12 participants. Intervention: The treatment protocol consists of 35 weekly sessions, followed by 10 monthly booster sessions. A multiple baseline design will be used, with baseline varying from 5 to 10 weeks, with weekly supportive sessions. After baseline, a 5-week exploration phase follows, with weekly sessions during which current and past functioning, psychological symptoms, and schema modes are explored, and information about the treatment is given. Then 15 weekly sessions with cognitive-behavioral interventions and 15 weekly sessions with experiential interventions will be given. Finally, there will be a 10-month follow-up phase with monthly booster sessions. Participants are randomly assigned to baseline length, respond weekly during treatment and monthly at follow-up on the belief strength of negative core beliefs (by VAS), and fill out the SMI, SCL-90, and SRS-A 7 times: during the screening procedure (i.e., before baseline), after baseline, after exploration, after the cognitive-behavioral interventions, after the experiential interventions, and after 5- and 10-month follow-up. The SCID-II will be administered during the screening procedure (i.e., before baseline) and at 5- and 10-month follow-up. Main study parameters: The primary study parameter is negative core beliefs. Secondary study parameters include schema modes, personality disorder manifestations, psychological symptoms, and social interaction and communication. Discussion: To the best of the authors' knowledge, no study has so far been published on the application of schema mode focused interventions in adult patients with ASD and comorbid PD(s). This study offers the first systematic test of the application of schema therapy for adults with ASD. The results will provide initial evidence for the effectiveness of schema therapy in treating adults with both ASD and PD(s), and the study intends to provide valuable information for the future development and implementation of therapeutic interventions for this group.
Keywords: adults, autism spectrum disorder, personality disorder, schema therapy
Procedia PDF Downloads 239
400 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and the evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in the form of a disjunctive normal form (DNF); 4) transformation of the DNF into an orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability-estimation polynomial and quantifying reliability; 6) calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of optimal-structure systems are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
Procedia PDF Downloads 66
399 Parallel Version of Reinhard’s Color Transfer Algorithm
Authors: Abhishek Bhardwaj, Manish Kumar Bajpai
Abstract:
An image, with its content and schema of colors, presents an effective mode of information sharing and processing. By changing its color schema, different visions and prospects are discovered by users. This phenomenon of color transfer is used by social media and other entertainment channels. The algorithm of Reinhard et al. was the first to solve this problem of color transfer. In this paper, we make this algorithm efficient by introducing domain parallelism among different processors. We also comment on the factors that affect the speedup of this problem. Finally, by analyzing the experimental data, we propose a novel and efficient parallel version of Reinhard's algorithm.
Keywords: Reinhard et al.'s algorithm, color transferring, parallelism, speedup
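Reinhard et al.'s method matches the per-channel mean and standard deviation of a source image to those of a target image, working in a decorrelated lαβ color space. A minimal single-channel sketch (ours; the color-space conversion is omitted, and the numbers are illustrative):

```python
import math

def mean_std(channel):
    """Mean and (population) standard deviation of a list of pixel values."""
    m = sum(channel) / len(channel)
    sd = math.sqrt(sum((v - m) ** 2 for v in channel) / len(channel))
    return m, sd

def transfer_channel(source, target):
    """Reinhard-style statistics matching on one channel: shift and scale the
    source so its mean and standard deviation match the target's."""
    ms, ss = mean_std(source)
    mt, st = mean_std(target)
    return [(v - ms) * (st / ss) + mt for v in source]

src = [10, 20, 30, 40]
tgt = [100, 110, 120, 130]
out = transfer_channel(src, tgt)   # → [100.0, 110.0, 120.0, 130.0]
```

Because each channel, and each pixel once the statistics are known, is processed independently, the work decomposes naturally across processors, which is the domain parallelism the abstract exploits.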
Procedia PDF Downloads 614
398 Designing the Lesson Instructional Plans for Exploring the STEM Education and Creative Learning Processes to Students' Logical Thinking Abilities with Different Learning Outcomes in Chemistry Classes
Authors: Pajaree Naramitpanich, Natchanok Jansawang, Panwilai Chomchid
Abstract:
The aim of this study is to compare students' logical thinking abilities under two instructional methods, namely STEM Education and the Creative Learning Process (CLP), using five lesson instructional plans designed for each method. A sample of 90 students from two 11th-grade chemistry classes with different learning outcomes at Wapi Phathum School was selected with the cluster random sampling technique. The learning environments were administered to a 45-student experimental group taught with the STEM Education method and a 45-student control group taught with the Creative Learning Process. Five instruments were used: the five lesson instructional plans for STEM Education and for the Creative Learning Process, and the Logical Thinking Ability Test (LTAT) on the Mineral issue. The efficiency of the Creative Learning Process (CLP) model and of the STEM Education innovations for each of the five instructional lesson plans was higher than the 80/80 standard criterion, according to IOC indices from expert educators. The average mean scores of students' learning achievement were assessed with pre- and post-test techniques and the LTAT, and dependent t-test analysis showed a significant difference between the CLP and STEM groups. Students' perceptions of their chemistry classroom environment, measured with the MCI, also differed between the CLP and STEM methods. Associations were found between students' perceptions of their chemistry classroom learning environment under the CLP model and the STEM Education learning designs and their logical thinking abilities toward chemistry: the predictive efficiency of the R2 values indicates that 68% and 76% of the variance in students' logical thinking abilities toward chemistry was accounted for in the control and experimental groups, respectively, with correlations significant at the .05 level. These results suggest that students taught with the CLP perceive their logical thinking abilities and their learning achievement in chemistry to be higher, differentiating their outcomes from those of the STEM Education group.
Keywords: design, the lesson instructional plans, the stem education, the creative learning process, logical thinking ability, different, learning outcome, student, chemistry class
Procedia PDF Downloads 321
397 Psychodidactic Strategies to Facilitate Flow of Logical Thinking in Preparation of Academic Documents
Authors: Deni Stincer Gomez, Zuraya Monroy Nasr, Luis Pérez Alvarez
Abstract:
The preparation of academic documents such as theses, articles, and research projects is one of the requirements of higher education. These documents demand logical argumentative thinking, which is exercised and executed with difficulty. To mitigate these difficulties, this study designed a thesis seminar, with which the authors have seven years of experience. It is taught in a graduate program in Psychology at the National Autonomous University of Mexico. In this study, the authors use the Toulmin model as a mental heuristic and apply a set of psychodidactic strategies that facilitate the development and completion of the thesis. The rate of obtaining the degree in the groups exposed to the seminar has increased to 94%, compared with the 10% in the generations that were not exposed to it. In this article, the authors emphasize the psychodidactic strategies used. The Toulmin model alone does not guarantee the success achieved; a set of actions of a psychological (almost psychotherapeutic) and didactic nature on the part of the teacher also seems to contribute. These are actions that derive from an understanding of the psychological, epistemological, and ontogenetic obstacles, and of the most frequent errors into which thought tends to fall when a logical course is demanded of it. The authors have grouped the strategies into three groups: 1) strategies to facilitate logical thinking, 2) strategies to strengthen the scientific self, and 3) strategies to facilitate the act of writing the text. In this work, the authors delve into each of them.
Keywords: psychodidactic strategies, logical thinking, academic documents, Toulmin model
Procedia PDF Downloads 179
396 Implementing a Database from a Requirement Specification
Abstract:
Creating a database schema is essentially a manual process. From a requirement specification, the information contained within has to be analyzed and reduced into a set of tables, attributes, and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a database schema from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that the first draft of a relational database schema can be extracted from a requirement specification by using NLP tools and techniques with minimum user intervention. Therefore, this method is a step forward in finding a solution that requires little or no user intervention.
Keywords: information extraction, natural language processing, relation extraction
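The idea of drafting a schema from text can be sketched with a toy extractor (entirely our illustration: two regular expressions standing in for the paper's Stanford CoreNLP pipeline, and the fixed sentence pattern is an assumption):

```python
import re

def extract_schema(requirements):
    """Toy relation extraction: turn sentences of the form
    'A <entity> has a <attr> and a <attr>.' into a table -> attributes map.
    A stand-in for an NLP pipeline, not the paper's actual method."""
    tables = {}
    for sent in re.split(r"\.\s*", requirements):
        m = re.match(r"An? (\w+) has (.+)", sent.strip(), re.IGNORECASE)
        if m:
            entity = m.group(1).capitalize()          # entity becomes a table
            attrs = re.findall(r"an? (\w+)", m.group(2), re.IGNORECASE)
            tables[entity] = attrs                    # possessions -> columns
    return tables

spec = "A customer has a name and an address. An order has a date and a total."
print(extract_schema(spec))
# → {'Customer': ['name', 'address'], 'Order': ['date', 'total']}
```

A real pipeline would rely on parse trees and dependency relations to handle free-form requirements; the sketch only shows the shape of the output, where entities become tables and their possessions become attributes.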
Procedia PDF Downloads 261
395 Effective Editable Emoticon Description Schema for Mobile Applications
Authors: Jiwon Lee, Si-hwan Jang, Sanghyun Joo
Abstract:
The popularity of emoticons has been on the rise since mobile messengers became widespread. At the same time, a few problems have also arisen from the innate characteristics of emoticons. Too many emoticons make it difficult for people to select one that is well suited to the user's intention; conversely, sometimes the user cannot find an emoticon that expresses his exact intention. Poor information delivery is another problem, since the majority of current emoticons focus on conveying emotion. In this situation, we propose a new concept of emoticon, the editable emoticon, to solve the above drawbacks. The user can edit the components inside the proposed editable emoticon and send it to express his exact intention. By doing so, the number of editable emoticons can be kept reasonable while still expressing the user's exact intention. Further, editable emoticons can be used as information deliverers according to the user's intention and editing skills. In this paper, we propose the concept of editable emoticons and a schema-based editable emoticon description method. The proposed description method is 200 times superior to the compared screen-capturing method in terms of transmission bandwidth. Further, the description method is designed for compatibility, since it follows the MPEG-UD international standard. The proposed editable emoticons can be exploited not only in mobile applications but also in various fields such as education and medicine.
Keywords: description schema, editable emoticon, emoticon transmission, mobile applications
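A hedged sketch of what a component-based description might look like (the field names here are invented; the paper's actual schema follows MPEG-UD): only the structured description is transmitted and edited, never pixels, which is why the bandwidth cost stays tiny compared with a screen capture:

```python
import json

# Hypothetical component-based description of an editable emoticon.
emoticon = {
    "base": "round_face",
    "components": {
        "eyes": "wink",
        "mouth": "smile",
        "accessory": None,
    },
}

# The sender edits individual components to convey an exact intention;
# the receiver renders the emoticon from the description.
edited = json.loads(json.dumps(emoticon))  # deep copy via serialization
edited["components"]["mouth"] = "open_laugh"

# The transmitted payload is the description itself: tens of bytes,
# versus kilobytes for a captured image of the same emoticon.
payload = json.dumps(edited)
```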
Procedia PDF Downloads 297
394 Background Knowledge and Reading Comprehension in ELT Classes: A Pedagogical Perspective
Authors: Davoud Ansari Kejal, Meysam Sabour
Abstract:
For a long time, there has been a belief that a reader can easily comprehend a text if he is strong enough in vocabulary and grammatical knowledge, with no account taken of the ability to understand different subjects based on the reader's understanding of the surrounding world, which is called world background knowledge. This paper attempts to investigate the reading comprehension process by applying schema theory as an influential factor in comprehending texts, in order to demonstrate the important role of background knowledge in reading comprehension. Based on the discussion, some teaching methods are suggested for employing world background knowledge in an elaborated teaching of reading comprehension in an active learning environment in EFL classes.
Keywords: background knowledge, reading comprehension, schema theory, ELT classes
Procedia PDF Downloads 456
393 The Theory of Number "0"
Authors: Iryna Shevchenko
Abstract:
The science of mathematics originated in the counting of objects and, subsequently, in the measurement of the size and quality of objects by logical or abstract means. The laws of mathematics are based on the study of absolute values. The number 0, or "nothing", is a purely logical (as opposed to absolute) value, as "nothing" must always assume the space of a something that had existed there; otherwise the "something" would never have come into existence. In this work we aim to prove that the number "0" is an abstract (logical) and not an absolute number and that it has the absolute value of "∞" (infinity). Therefore, the number "0" should not stand in the row of numbers that symbolically represents absolute values, as this would be mathematically incorrect. The symbolic value of the number "0" in the row of numbers could instead be represented by the symbol "∞" (infinity). As a result, we have the mathematical row of numbers: epsilon, ..., 4, 3, 2, 1, ∞. As conclusions of the theory of number "0", we present the statements: multiplication and division by fractions of numbers are illegal operations, and mathematical division by the number "0" is allowed.
Keywords: illegal operation of division and multiplication by fractions of number, infinity, mathematical row of numbers, theory of number "0"
Procedia PDF Downloads 552
392 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization and dynamic report building. The analysis of an economic organization's customers is based on information from the organization's transactional systems. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems; the owned data can be transformed into useful information about customers using a business intelligence tool. In a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
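As a minimal illustration of the star schema named in the keywords (the table and column names are hypothetical, not taken from the article), a fact table of transactions can be surrounded by dimensions, here just a customer dimension, and queried with an OLAP-style roll-up:

```python
import sqlite3

# Minimal star-schema data mart: one fact table plus a customer dimension,
# as would be fed by ETL from the operational systems.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        segment     TEXT
    );
    CREATE TABLE fact_sales (
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        amount      REAL
    );
""")
con.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "business"), (2, "Bob", "consumer")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 250.0), (2, 40.0)])

# A typical roll-up for a dynamic report: revenue per customer segment.
rows = con.execute("""
    SELECT d.segment, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_id)
    GROUP BY d.segment ORDER BY d.segment
""").fetchall()
```

In a BI tool the same roll-up would be defined once in the semantic layer and reused by self-service reports and dashboards.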
Procedia PDF Downloads 430
391 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses
Authors: Ouzayr Rabhi, Ibtissam Arrassen
Abstract:
To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a Model-Driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second through transformation rules carried out in the Query/View/Transformation (QVT) language. The proposal is validated by applying our approach to generating the multidimensional schema of a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and the strategies of an organization.
Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML
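A much-simplified stand-in for such transformation rules (written in Python rather than QVT, with an invented toy metamodel; the paper's actual rules operate on full UML and multidimensional meta-models) might map a UML class with numeric attributes to a fact with measures, and its associations to dimensions:

```python
# Toy UML-like model: classes with typed attributes and associations.
uml_model = {
    "Sale": {"attrs": {"amount": "float", "quantity": "int"},
             "associations": ["Product", "Store"]},
    "Product": {"attrs": {"name": "str"}, "associations": []},
    "Store": {"attrs": {"city": "str"}, "associations": []},
}

def to_multidimensional(model: dict, fact_class: str) -> dict:
    """Transformation rule sketch: numeric attributes of the chosen class
    become measures of the fact; associated classes become dimensions."""
    cls = model[fact_class]
    return {
        "fact": fact_class,
        "measures": [a for a, t in cls["attrs"].items()
                     if t in ("int", "float")],
        "dimensions": list(cls["associations"]),
    }

schema = to_multidimensional(uml_model, "Sale")
```

QVT expresses the same mapping declaratively, as relations between source and target metamodel elements, rather than as imperative code.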
Procedia PDF Downloads 160
390 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema
Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy
Abstract:
Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone: end users could program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they directly selected the cells using the mouse) than in ISPM. By bringing natural language to end-user software engineering, ISPM aims to overcome the present bottleneck of relying on professional developers.
Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet
Procedia PDF Downloads 311
389 The Probability Foundation of Fundamental Theoretical Physics
Authors: Quznetsov Gunn
Abstract:
In the study of the logical foundations of probability theory, it was found that the terms and equations of fundamental theoretical physics represent terms and theorems of classical probability theory, more precisely, of that part of the theory which considers the probability of dot events in 3 + 1 space-time. In particular, the masses, moments, energies, spins, etc. turn out to be parameters of the probability distributions of such events. The terms and equations of the electroweak and quark-gluon theories turn out to be theoretical-probabilistic terms and theorems. Here the relation of a neutrino to its lepton becomes clear, the W and Z boson masses turn out to be dynamic ones, and the cause of the asymmetry between particles and antiparticles is the impossibility of the birth of single antiparticles. In addition, phenomena such as confinement and asymptotic freedom receive their probabilistic explanation. And here we have the logical foundations of the theory of gravity, with the phenomena of dark energy and dark matter.
Keywords: classical theory of probability, logical foundation of fundamental theoretical physics, masses, moments, energies, spins
Procedia PDF Downloads 295
388 The Role of Attachment Styles, Gender Schemas, Sexual Self Schemas, and Body Exposures During Sexual Activity in Sexual Function, Marital Satisfaction, and Sexual Self-Esteem
Authors: Hossein Shareh, Farhad Seifi
Abstract:
The present study examined the role of attachment styles, gender schemas, sexual self-schemas, and body image during sexual activity in sexual function, marital satisfaction, and sexual self-esteem. Snowball sampling was used to select 765 married women living in Mashhad. Participants completed the Adult Attachment Scale (AAS), the Bem Sex Role Inventory (BSRI), the Sexual Self-Schema scale (SSS), the Body Exposure during Sexual Activity Questionnaire (BESAQ), the Female Sexual Function Index (FSFI), the short form of the Sexual Self-Esteem Inventory for Women (SSEI-W-SF) and the Enrich marital satisfaction inventory. Data were analyzed with Pearson correlation, hierarchical regression and case analysis using SPSS-19 software. The results showed significant correlations (P < 0.05) between attachment and sexual function (r = 0.342), marital satisfaction (r = 0.351) and sexual self-esteem (r = 0.292). A correlation (P < 0.05) was observed between sexual self-schema (r = 0.342) and sexual self-esteem (r = 0.31). A meaningful correlation (P < 0.05) exists between gender schemas and sexual function (r = 0.352). There was a significant inverse correlation (P < 0.05) between body image and performance during sexual activity (r = 0.41). No significant relationship was found between gender schemas, sexual schemas, or body image and marital satisfaction, and no relation was found between gender schemas or body image and sexual self-esteem. The regression also showed that sexual function can be predicted from attachment styles, gender schemas, sexual self-schemas, and body exposure during sexual activity; marital satisfaction can be predicted from attachment style and gender schemas; and, to some extent, sexual self-esteem can be predicted from attachment style and gender schemas.
Keywords: attachment styles, gender and sexual schemas, body image, sexual function, marital satisfaction, sexual self-esteem
Procedia PDF Downloads 39
387 Towards Security in Virtualization of SDN
Authors: Wanqing You, Kai Qian, Xi He, Ying Qian
Abstract:
In this paper, the potential security issues brought by the virtualization of Software Defined Networks (SDN) are analyzed. The virtualization of SDN is achieved with FlowVisor (FV). With FV, a physical network is divided into multiple isolated logical networks while the underlying resources are still shared by the different slices (isolated logical networks). However, along with the benefits brought by network virtualization, it also presents some security issues. By examining the security issues existing in an OpenFlow network that uses FlowVisor to slice it into multiple virtual networks, we hope to obtain some significant results and to prompt further discussion of the security of SDN virtualization.
Keywords: SDN, network, virtualization, security
Procedia PDF Downloads 428
386 The Game of Dominoes as Teaching-Learning Method of Basic Concepts of Differential Calculus
Authors: Luis Miguel Méndez Díaz
Abstract:
In this article, a mathematics teaching-learning strategy is presented, specifically for differential calculus in one variable, in a fun and competitive space in which action on the part of the student is manifested, rather than only the repetition of information on the part of the teacher. Said action refers to motivating, problematizing, summarizing, and coordinating a game of dominoes whose thematic cards are designed around the basic and main contents of differential calculus. Strategies for teaching this area are diverse, and the game of dominoes is precisely one of the most used in the practice of mathematics because it stimulates logical reasoning and mental abilities. The objective of this investigation is to identify the way in which the game of dominoes affects the learning and understanding of fundamental concepts of differential calculus in one variable, through experimentation carried out on students of the first semester of the School of Engineering and Sciences of the Technological Institute of Monterrey, Campus Querétaro. Finally, the results of this study are presented, and the use of this strategy in other topics in mathematics is recommended to facilitate logical and meaningful learning in students.
Keywords: collaborative learning, logical-mathematical intelligence, mathematical games, multiple intelligences
Procedia PDF Downloads 84
385 A Novel Algorithm for Parsing IFC Models
Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai
Abstract:
Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided design that architects, engineers and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among the various disciplines. The methods developed in previous studies require either an IFC schema or MVD and software applications, such as an IFC model server or a BIM authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
Procedia PDF Downloads 300
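Since IFC instance files are plain STEP (ISO 10303-21) text, in which each entity instance occupies a line such as `#12=IFCWALL('guid',#5,...);`, a schema-free parse can be sketched in a few lines (a simplified illustration, not the proposed algorithm): index each instance by its id and entity type, then filter by type to obtain a partial model.

```python
import re

# One STEP instance line: "#<id> = <ENTITY>(<args>);"
STEP_LINE = re.compile(r"#(\d+)\s*=\s*(\w+)\s*\((.*)\);")

def parse_ifc(text: str) -> dict:
    """Index every instance line by id, keeping its entity type and the
    raw argument list (no IFC schema or MVD is consulted)."""
    instances = {}
    for m in STEP_LINE.finditer(text):
        iid, ifc_type, args = m.groups()
        instances[int(iid)] = {"type": ifc_type, "args": args}
    return instances

# Hypothetical two-line fragment of an IFC instance model.
sample = """#1=IFCPROJECT('guid1',$,'Office',$,$,$,$,$,#9);
#2=IFCWALL('guid2',$,'Wall-01',$,$,$,$,$);"""

model = parse_ifc(sample)

# Partial-model extraction: select instances of one entity type.
walls = sorted(i for i, v in model.items() if v["type"] == "IFCWALL")
```

A complete extractor would also follow the `#n` references inside the argument lists to pull in each instance's dependencies, but the type index above is the starting point.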