Search results for: transposition tables
326 Model Estimation and Error Level for Okike’s Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba E. J. D.
Abstract:
The researcher has developed a new encryption technique known as the Merged Irregular Transposition Cipher. In this method of encryption, a message to be encrypted is split into parts and each part is encrypted separately. Before the encrypted message is transmitted to the recipient(s), the positions of the split parts in the encrypted message may be swapped to provide additional security. This work seeks to develop a model relating the split number, S, and the average number of characters per split, L, as the message under consideration is split into 2 through 10 parts. After the model has been developed, its error level is determined.
Keywords: merged irregular transposition, error level, model estimation, message splitting
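For illustration, the two quantities the model relates can be tabulated directly. The sketch below uses an invented sample message; the fitted model and its error level are not reproduced here.

```python
# Tabulate the split number S and the average number of characters per
# split, L, for S = 2..10, the range considered in the abstract.
# The sample message is illustrative only.

def split_stats(message: str, max_splits: int = 10):
    """Return (S, L) pairs, where L is the average characters per split."""
    stats = []
    for s in range(2, max_splits + 1):
        l_avg = len(message) / s  # average characters per split part
        stats.append((s, round(l_avg, 2)))
    return stats

msg = "INFORMATION SECURITY ALONG THE SUPERHIGHWAY"
for s, l in split_stats(msg):
    print(f"S={s}, L={l}")
```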
Procedia PDF Downloads 314
325 Determination of Complexity Level in Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba E. J. D.
Abstract:
Today, it has been observed that the security of information along the information superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted to conceal its real meaning. Many encryption techniques are available on the market; however, some of them are easily decrypted by adversaries. The researcher has developed an encryption technique intended to be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
Procedia PDF Downloads 345
324 Determination of Complexity Level in Okike's Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba E. J. D.
Abstract:
Today, it has been observed that the security of information along the information superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted to conceal its real meaning. Many encryption techniques are available on the market; however, some of them are easily decrypted by adversaries. The researcher has developed an encryption technique intended to be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed Okike’s Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
Procedia PDF Downloads 290
323 Comprehensive Studio Tables: Improving Performance and Quality of Student's Work in Architecture Studio
Authors: Maryam Kalkatechi
Abstract:
Architecture students spend most of their working time in studios during their years of study. As a piece of studio furniture, the studio table is important because it elevates the quality of projects and positively influences students' productivity. This paper first describes the aspects considered in designing a comprehensive studio table and later details each aspect. Comprehensive studio tables are meant to transform the studio space into an efficient yet immersive place of learning, collaboration, and participation. One aspect of these tables is that the surface becomes a place that accommodates design conversations; another is an efficient interactive platform for the tools. The discussion factors of the comprehensive studio include the setting of workspaces, the arrangement of the comprehensive studio tables, the collaboration aspects in the studio, and the studio display and lighting shaped by the tables.
Keywords: studio tables, student performance, productivity, hologram, 3D printer
Procedia PDF Downloads 189
322 Learning to Teach in Large Classrooms: Training Faculty Members from Milano Bicocca University, from Didactic Transposition to Communication Skills
Authors: E. Nigris, F. Passalacqua
Abstract:
In light of recent research in the field of faculty development, this paper presents a pilot training programme realized at the University of Milano-Bicocca to improve the teaching skills of faculty members. A total of 57 professors (both full and associate professors) were trained across three editions of the workshop, which focused on promoting skills for teaching large classes. The study takes into account: 1) the theoretical framework of the programme, which combines the recent tradition of professional development with research on in-service training of school teachers; 2) the structure and content of the training programme, organized as a 12-hour full-immersion workshop plus individual consultations; 3) the educational specificity of the training programme, which is based on the relation between 'general didactics' (active learning methodologies; didactic communication) and 'disciplinary didactics' (didactic transposition and reconstruction); 4) results about the impact of the training programme, related to both the workshop and the individual consultations. This study aims to provide insights mainly into two levels of the training programme's impact ('behaviour change' and 'transfer'), and for this reason learning outcomes are evaluated with different instruments: a questionnaire filled out by all 57 participants; 12 in-depth interviews; 3 focus groups; and transcriptions of workshop activities. Data analysis is based on a descriptive qualitative approach and is conducted through thematic analysis of the transcripts, using analytical categories derived principally from didactic transposition theory.
The results show that the training programme effectively developed three major skills regarding different stages of the 'didactic transposition' process: a) content selection, that is, a more accurate selection and reduction of the 'scholarly knowledge', corresponding to the first stage of the didactic transposition process; b) consideration of students' prior knowledge and misconceptions in lesson design, in order to connect the 'scholarly knowledge' effectively to the 'knowledge to be taught' (second stage of the process); c) ways of asking questions and managing discussion in large classrooms, in line with the transformation of the 'knowledge to be taught' into 'taught knowledge' (third stage of the process).
Keywords: didactic communication, didactic transposition, instructional development, teaching large classroom
Procedia PDF Downloads 139
321 A Conglomerate of Multiple Optical Character Recognition Table Detection and Extraction
Authors: Smita Pallavi, Raj Ratn Pranesh, Sumit Kumar
Abstract:
Representing information as tables is a compact and concise method that eases searching, indexing, and storage requirements. Extracting and cloning tables from parsable documents is easier and widely used; however, industry still faces challenges in detecting and extracting tables from OCR (Optical Character Recognition) documents or images. This paper proposes an algorithm that detects and extracts multiple tables from an OCR document. The algorithm uses a combination of image processing techniques, text recognition, and procedural coding to identify distinct tables in the same image and map the text to the corresponding cell in a dataframe, which can be stored as comma-separated values, a database, Excel, and multiple other usable formats.
Keywords: table extraction, optical character recognition, image processing, text extraction, morphological transformation
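As a rough illustration of the final mapping step (not the authors' algorithm): once OCR has produced word boxes of the form (text, x, y), cell text can be arranged into a grid by clustering the box coordinates into rows and columns and writing the result as CSV. The tolerances and sample boxes below are invented for the sketch.

```python
# Sketch: map OCR word boxes into a table grid and emit CSV.
# boxes: list of (text, x, y); row_tol/col_tol are illustrative tolerances.
import csv, io

def boxes_to_csv(boxes, row_tol=10, col_tol=40):
    """Cluster box coordinates into rows/columns; return CSV text."""
    rows = {}
    for text, x, y in boxes:
        # snap y to the nearest existing row centre within row_tol
        key = next((r for r in rows if abs(r - y) <= row_tol), y)
        rows.setdefault(key, []).append((x, text))
    # column positions from the union of x coordinates, merged by col_tol
    xs = sorted({x for _, x, _ in boxes})
    cols = []
    for x in xs:
        if not cols or x - cols[-1] > col_tol:
            cols.append(x)
    out = io.StringIO()
    writer = csv.writer(out)
    for y in sorted(rows):
        line = [""] * len(cols)
        for x, text in sorted(rows[y]):
            idx = min(range(len(cols)), key=lambda i: abs(cols[i] - x))
            line[idx] = text
        writer.writerow(line)
    return out.getvalue()

boxes = [("Name", 0, 0), ("Age", 100, 2), ("Ada", 1, 30), ("36", 101, 29)]
print(boxes_to_csv(boxes))
```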
Procedia PDF Downloads 145
320 Pattern in Splitting Sequence in Okike’s Merged Irregular Transposition Cipher for Encrypting Cyberspace Messages
Authors: Okike Benjamin, E. J. D. Garba
Abstract:
The protection of sensitive information against unauthorized access or fraudulent changes has been of prime concern throughout the centuries. Modern communication techniques, using computers connected through networks, make all data even more vulnerable to these threats. The researchers in this work propose a new encryption technique, to be known as the Merged Irregular Transposition Cipher. In this proposed technique, a message to be encrypted is first split into multiple parts depending on the length of the message. After the split, different keywords are chosen to encrypt different parts of the message. After all parts of the message have been encrypted, their positions may be swapped, thereby making the message very difficult for any unauthorized user to decrypt.
Keywords: information security, message splitting, pattern, sequence
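The split-encrypt-swap scheme described above can be sketched as follows. This is a hedged illustration, not Okike's exact method: the per-part cipher is assumed to be a standard columnar transposition, and the keywords, padding, and swap rule are invented for the example.

```python
# Sketch of a split-encrypt-swap transposition scheme (illustrative only):
# split the message into parts, encrypt each part with its own columnar
# transposition keyword, then swap the order of the encrypted parts.
import math

def columnar_encrypt(text: str, keyword: str) -> str:
    cols = len(keyword)
    rows = math.ceil(len(text) / cols)
    text = text.ljust(rows * cols, "X")           # pad the final row
    order = sorted(range(cols), key=lambda i: keyword[i])
    # read columns in alphabetical order of the keyword letters
    return "".join(text[r * cols + c] for c in order for r in range(rows))

def merged_irregular_encrypt(message: str, keywords, swap):
    n = len(keywords)
    size = math.ceil(len(message) / n)
    parts = [message[i:i + size] for i in range(0, len(message), size)]
    enc = [columnar_encrypt(p, k) for p, k in zip(parts, keywords)]
    return [enc[i] for i in swap]                 # swap part positions

ct = merged_irregular_encrypt("ATTACKATDAWNTOMORROW", ["ZEBRA", "CAT"], [1, 0])
print(ct)
```

A receiver who knows the keywords and the swap sequence can invert each step; the security argument in the abstract rests on an attacker not knowing the split pattern or the swapped positions.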
Procedia PDF Downloads 290
319 The Didactic Transposition in Brazilian High School Physics Textbooks: A Comparative Study of Didactic Materials
Authors: Leandro Marcos Alves Vaz
Abstract:
In this article, we analyze the different approaches to the topic of magnetism of matter in physics textbooks used in Brazilian schools. To this end, we compared the treatment of the concepts of the magnetic characteristics of materials (diamagnetism, paramagnetism, ferromagnetism, and antiferromagnetism) in different sources of information and at different levels of education, from higher education to high school. We used as reference the theory of didactic transposition of Yves Chevallard, a French educational theorist, who conceived three types of knowledge related to teaching practice: scholarly knowledge, knowledge to be taught, and taught knowledge. As a research methodology, based on a reading of the works used in teacher training and of those intended for basic education students, we compared a higher-education physics book, a scientific article published in a Brazilian educational journal, and four high school textbooks, in order to establish which show a greater or lesser degree of approximation to the knowledge produced by scholars (scholarly knowledge) or to the knowledge to be taught (that found in books intended for teaching). We thus evaluated the proximity between the subjects conveyed in high school and in higher education, as well as the relevance that some textbook authors give to the theme.
Keywords: Brazilian physics books, didactic transposition, magnetism of matter, teaching of physics
Procedia PDF Downloads 301
318 Madame Bovary in Transit: from Novel to Graphic Novel
Authors: Hania Pasandi
Abstract:
Since its publication in 1856, Madame Bovary has established itself as one of the most adapted texts of French literature. Some eighteen film adaptations and twenty-seven rewritings of Madame Bovary in fiction to date show a great enthusiasm for recreating Flaubert's masterpiece in a variety of mediums. Posy Simmonds' 1999 graphic novel Gemma Bovery stands out among these adaptations: with its visual and narrative structure, the graphic novel offers a new reading experience of Madame Bovary while combining Emma Bovary's elements with contemporary social, cultural, and artistic discourses. This paper studies the transposition of Flaubert's Madame Bovary (1857) to late twentieth-century Britain in Posy Simmonds' graphic novel Gemma Bovery, exploring how it borrows essential Flaubertian themes from its source text and incorporates them into contemporary cultural trends.
Keywords: graphic novel, Gemma Bovery, Madame Bovary, transposition
Procedia PDF Downloads 153
317 A Very Efficient Pseudo-Random Number Generator Based On Chaotic Maps and S-Box Tables
Authors: M. Hamdi, R. Rhouma, S. Belghith
Abstract:
Random numbers are mainly used to create secret keys or random sequences, and they can be generated by various techniques. In this paper, we present a very simple and efficient pseudo-random number generator (PRNG) based on chaotic maps and S-box tables. The technique adopts two main operations: one generates chaotic values using two logistic maps, and the second transforms them into binary words using random S-box tables. The simulation analysis indicates that our PRNG possesses excellent statistical and cryptographic properties.
Keywords: random numbers, chaotic map, S-box, cryptography, statistical tests
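A minimal sketch of the two-step construction described above: two logistic maps produce chaotic values, and a shuffled S-box turns them into output bytes. The seeds, logistic parameter, combining rule, and S-box are illustrative assumptions, not the parameters of Hamdi et al.

```python
# Toy chaotic PRNG sketch: two logistic maps + S-box substitution.
# All parameter choices here are illustrative only.
import random

def logistic(x, r=3.99):
    return r * x * (1.0 - x)

def chaotic_prng(seed1=0.123, seed2=0.456, n=8):
    rng = random.Random(42)                 # fixed "random" S-box for the demo
    sbox = list(range(256)); rng.shuffle(sbox)
    x, y, out = seed1, seed2, []
    for _ in range(n):
        x, y = logistic(x), logistic(y)
        # combine the two chaotic values into one index in [0, 255]
        idx = int((x + y) * 1e6) % 256
        out.append(sbox[idx])               # substitute through the S-box
    return out

print(chaotic_prng())
```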
Procedia PDF Downloads 365
316 Food Composition Tables Used as an Instrument to Estimate the Nutrient Ingest in Ecuador
Authors: Ortiz M. Rocío, Rocha G. Karina, Domenech A. Gloria
Abstract:
There are several tools to assess the nutritional status of a population. A main instrument commonly used to build those tools is the food composition table (FCT). Despite the importance of FCTs, there are many error sources and variability factors that can arise in building those tables and can lead to an under- or overestimation of the nutrient intake of a population. This work identified the different food composition tables used as instruments to estimate nutrient intake in Ecuador. Data for choosing FCTs were collected through key informants (self-completed questionnaires), supplemented with institutional web research. A questionnaire with general variables (origin, year of edition, etc.) and methodological variables (method of elaboration, information in the table, etc.) was applied to the identified FCTs. These variables were defined based on an extensive literature review. A descriptive content analysis was performed. Ten printed tables and three databases were reported, all of which were indistinctly treated as food composition tables. We managed to obtain information from 69% of the references; several informants referred to printed documents that were not accessible, and internet searches were not successful. Of the 9 final tables, 8 are from Latin America, and 5 of these were constructed by the indirect method (compilation of already published data), having as a main source of information a database from the United States Department of Agriculture (USDA). One FCT was constructed by the direct method (bromatological analysis) and has its origin in Ecuador. All of the tables made a clear distinction between a food and its method of cooking, 88% of the FCTs expressed nutrient values per 100 g of edible portion, 77% gave precise additional information about the use of the table, and 55% presented all the macro- and micronutrients in a detailed way. The most complete FCTs were those of INCAP (Central America) and Composition of Foods (Mexico).
The most frequently cited table was the Ecuadorian food composition table of 1965 (70%). The indirect method was used for most tables in this study. However, this method has the disadvantage of generating less reliable food composition tables, because foods vary in composition; a database therefore cannot accurately predict the composition of any isolated sample of a food product. In conclusion, weighing the pros and cons, and despite its being an FCT elaborated by the indirect method, we consider it appropriate to work with the FCT of INCAP (Central America), given that region's proximity to our country and a list of food items very similar to ours. It is also imperative to keep as a reference the Ecuadorian food composition table, which, although not updated, was constructed using the direct method with Ecuadorian foods. Hence, both tables will be used to elaborate a questionnaire for assessing the food consumption of the Ecuadorian population. In the case of disparate values, we will take only the INCAP values, because it is an updated table.
Keywords: Ecuadorian food composition tables, FCT elaborated by direct method, ingest of nutrients of Ecuadorians, Latin America food composition tables
Procedia PDF Downloads 432
315 Anaesthetic Management of Congenitally Corrected Transposition of Great Arteries with Complete Heart Block in a Parturient for Emergency Caesarean Section
Authors: Lokvendra S. Budania, Yogesh K Gaude, Vamsidhar Chamala
Abstract:
Introduction: Congenitally corrected transposition of the great arteries (CCTGA) is a complex congenital heart disease in which there is both atrioventricular and ventriculoarterial discordance, usually accompanied by other cardiovascular malformations. Case Report: A 24-year-old primigravida, a known case of CCTGA at 37 weeks of gestation, was referred to our hospital for safe delivery. Her electrocardiogram showed a heart rate of 40 bpm, and echocardiography showed an ejection fraction of 65% and CCTGA. A temporary pacemaker was inserted by the cardiologist in the catheterization laboratory before the trial of labour, in view of the complete heart block. She was planned for normal delivery, but emergency Caesarean section was decided upon due to non-reassuring foetal cardiotocography. Pre-operative vitals showed a pulse rate of 50 bpm with the temporary pacemaker, blood pressure of 110/70 mmHg, and SpO2 of 99% on room air. The nil-per-oral period was inadequate. The patency of two peripheral IV cannulae was checked, and a left radial arterial line was secured. Epidural anaesthesia was planned, and the catheter was placed at L2-L3. After a test dose, anaesthesia was provided with 5 mL + 5 mL of 2% lignocaine with 25 mcg fentanyl, and a further 2.5 mL of 0.5% bupivacaine was given to achieve a sensory level of T6. The Caesarean section was performed, and the baby was delivered. Cautery was avoided during the procedure. IV oxytocin (15 U) was added to 500 mL of Ringer's lactate. Hypotension was treated with phenylephrine boluses. The patient was shifted to the post-operative care unit and later to the high-dependency unit for monitoring. Post-operative vitals remained stable. The temporary pacemaker was removed 24 hours after surgery. Her post-operative period was uneventful, and she was discharged from hospital. Conclusion: Rare congenital cardiac disorders require detailed knowledge of the pathophysiology and the comorbidities associated with the disease.
A meticulously planned and carefully titrated neuraxial technique can be beneficial in such cases.
Keywords: congenitally corrected transposition of great arteries, complete heart block, emergency LSCS, epidural anaesthesia
Procedia PDF Downloads 131
314 Assessing the Mechanical Safety, Durability, Strength, and Stability of Wooden Furniture Produced in Ghana
Authors: Haruna Seidu, Francis Wilson Owusu, Michael Mensah, Felix Boakye, James Korang, Safia Ibrahim
Abstract:
Over the years, wooden furniture produced in Ghana had no means of being tested against standards, so it was difficult for furniture producers to know whether their products conformed to international standards. The setting up of an ISO 17025-compliant laboratory has provided a reference and access point for determining the quality of the furniture they produce. The objectives of the study include the determination of the mechanical safety, durability, strength, and stability of wooden furniture produced in Ghana. Twelve wooden furniture manufacturers were randomly selected to design furniture (chairs and tables) for testing: 9 of the 12 produced chairs, and three provided tables. Standard testing methods were used in this experiment, including GS EN 581-1, GS EN 581-2, and GS EN 581-3. Analysis of the test results indicates that 55.6% of the chairs tested passed all applicable tests, and 66.7% of the tables tested passed all applicable tests. The overall pass and failure rates across the 12 pieces of furniture were 58.3% and 41.7%, respectively. In conclusion, the chair manufacturers had good designs that withstood the standard tests of strength and durability; most failures occurred largely as a result of the poor stability designs adopted for the construction of the chairs and tables. It was also observed that the manufacturers did not use design software in designing their furniture.
Keywords: durability, international standards, mechanical safety, wooden furniture design
Procedia PDF Downloads 334
313 Spatial Data Mining by Decision Trees
Authors: Sihem Oujdi, Hafida Belbachir
Abstract:
Existing data mining methods cannot be applied directly to spatial data, because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the main data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, favors memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to spatial data in the accidentology domain. A comparative study of our approach with other work on classification by spatial decision trees is detailed.
Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining
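To make the join-materialization input concrete, here is a hedged sketch of how the three tables named above (target, neighbor, spatial join index) might be flattened into rows a standard decision-tree learner can consume. The table contents and attribute names are invented for the example.

```python
# Illustrative join materialization for spatial classification:
# flatten target table + neighbor table + spatial join index into one table.
target = {1: {"severity": "high"}, 2: {"severity": "low"}}       # accidents
neighbor = {10: {"type": "crossroad"}, 11: {"type": "highway"}}  # road objects
# spatial join index: (target_id, neighbor_id, spatial relationship)
join_index = [(1, 10, "near"), (2, 11, "on")]

def materialize(target, neighbor, join_index):
    """Join the three input tables into flat rows for classification."""
    rows = []
    for tid, nid, rel in join_index:
        row = dict(target[tid])     # target attributes
        row["relation"] = rel       # spatial relationship from the index
        row.update(neighbor[nid])   # neighbor attributes
        rows.append(row)
    return rows

for row in materialize(target, neighbor, join_index):
    print(row)
```

The alternative approach in the abstract would instead issue a query against `neighbor` and `join_index` each time the tree-growing procedure needs a spatial attribute, trading processing time for memory.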
Procedia PDF Downloads 613
312 Constructing White-Box Implementations Based on Threshold Shares and Composite Fields
Authors: Tingting Lin, Manfred von Willich, Dafu Lou, Phil Eisen
Abstract:
A white-box implementation of a cryptographic algorithm is a software implementation intended to resist extraction of the secret key by an adversary. To date, most white-box techniques are used to protect block cipher implementations. However, a large proportion of white-box implementations have been proven vulnerable to affine equivalence attacks and other algebraic attacks, as well as to differential computation analysis (DCA). In this paper, we identify a class of block ciphers for which we propose a method of constructing white-box implementations. Our method is based on threshold implementations and operations in composite fields. The resulting implementations consist of lookup tables and a few exclusive-OR operations. All intermediate values (inputs and outputs of the lookup tables) are masked. The threshold implementation makes the distribution of the masked values uniform and independent of the original inputs, and the operations in composite fields reduce the size of the lookup tables. The white-box implementations can provide resistance against algebraic attacks and DCA-like attacks.
Keywords: white-box, block cipher, composite field, threshold implementation
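A toy illustration of one idea above, namely that every lookup-table input and output stays masked so the table itself never exposes the underlying S-box. The masks are arbitrary demo values, the S-box is the 4-bit PRESENT S-box, and real threshold implementations split values into uniform shares, which this single-mask sketch omits entirely.

```python
# Toy masked lookup table: masked input -> masked output.
# Single XOR masks only; a genuine threshold implementation uses shares.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]   # PRESENT S-box (4-bit)
IN_MASK, OUT_MASK = 0x7, 0x9                       # demo masks

# Encoded table, built once at design time: T'(x ^ IN_MASK) = S(x) ^ OUT_MASK
MASKED_TABLE = [SBOX[x ^ IN_MASK] ^ OUT_MASK for x in range(16)]

def masked_lookup(masked_x: int) -> int:
    """Only the encoded table ships in the binary; values stay masked."""
    return MASKED_TABLE[masked_x]

x = 0x3
# unmasking the output of a masked lookup recovers S(x)
assert masked_lookup(x ^ IN_MASK) ^ OUT_MASK == SBOX[x]
```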
Procedia PDF Downloads 170
311 Design and Development of Real-Time Optimal Energy Management System for Hybrid Electric Vehicles
Authors: Masood Roohi, Amir Taghavipour
Abstract:
This paper describes a strategy for developing an energy management system (EMS) for a charge-sustaining power-split hybrid electric vehicle. This kind of hybrid electric vehicle (HEV) benefits from the advantages of both parallel and series architectures; however, it is relatively more complicated to manage the power flow between the battery and the engine optimally. The strategy applied in this paper is based on a nonlinear model predictive control approach. First, an appropriate control-oriented model, accurate enough yet simple, was derived. To make the controller usable in real time, the problem was solved off-line for a wide range of reference signals and initial conditions, and the computed manipulated variables were stored in look-up tables. Look-up tables require little memory, and the computational load decreases dramatically, because to find the required manipulated variables the controller only needs a simple interpolation between table entries.
Keywords: hybrid electric vehicles, energy management system, nonlinear model predictive control, real-time
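The real-time step described above can be sketched as a table lookup with interpolation. The grid axes (state of charge, power demand), the stored values, and the choice of bilinear interpolation are illustrative assumptions; the paper's actual NMPC solution tables are not reproduced here.

```python
# Sketch: off-line-computed manipulated variables stored in a 2-D look-up
# table, queried on-line by bilinear interpolation. All values are invented.
import bisect

soc_grid = [0.2, 0.4, 0.6, 0.8]            # battery state of charge
pdem_grid = [0.0, 20.0, 40.0]              # power demand, kW
# engine power command computed off-line, indexed [soc][pdem]
table = [[0.0, 18.0, 38.0],
         [0.0, 15.0, 34.0],
         [0.0, 10.0, 28.0],
         [0.0,  5.0, 22.0]]

def interp2(soc, pdem):
    """Bilinear interpolation in the stored table (clamped to the grid)."""
    i = max(0, min(bisect.bisect_right(soc_grid, soc) - 1, len(soc_grid) - 2))
    j = max(0, min(bisect.bisect_right(pdem_grid, pdem) - 1, len(pdem_grid) - 2))
    ts = (soc - soc_grid[i]) / (soc_grid[i + 1] - soc_grid[i])
    tp = (pdem - pdem_grid[j]) / (pdem_grid[j + 1] - pdem_grid[j])
    a = table[i][j] * (1 - ts) + table[i + 1][j] * ts
    b = table[i][j + 1] * (1 - ts) + table[i + 1][j + 1] * ts
    return a * (1 - tp) + b * tp

print(interp2(0.5, 30.0))   # engine power command between stored grid points
```

The on-line cost is a couple of binary searches and a handful of multiplications, which is why the abstract can claim a dramatic reduction in computational load compared with solving the NMPC problem at every time step.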
Procedia PDF Downloads 354
310 The Impact of University League Tables on the Development of Non-Elite Universities. A Case Study of England
Authors: Lois Cheung
Abstract:
This article examines the impact of league tables on non-elite universities in the English higher education system. The purpose of this study is to explore the use of rankings in strategic planning by low-ranked universities in this highly competitive higher education market. A sample of non-elite universities was selected for a content analysis based on the measures used by The Guardian rankings. Interestingly, these universities care about their rankings within a single national system, and content analysis proves an effective approach to investigating the presence of such influences. It is particularly noteworthy that all the sampled universities use these measurement terminologies in the strategic plans, missions, and news coverage on their institutional web pages. This analysis illustrates the key challenges that many low-ranked universities in England are likely facing in a highly competitive and diversified higher education market. These universities use rankings to communicate with their stakeholders, mainly students, in order to fill places and so secure their major source of funding. The study concludes with comments on the likely effects of the rankings paradigm in undermining the contributions of non-elite universities.
Keywords: league tables, measures, post-1992 universities, ranking, strategy
Procedia PDF Downloads 183
309 Commercial Winding for Superconducting Cables and Magnets
Authors: Glenn Auld Knierim
Abstract:
Automated robotic winding of high-temperature superconductors (HTS) addresses the precision, efficiency, and reliability critical to the commercialization of products. Today's HTS materials are mature and commercially promising but require manufacturing attention. In particular, given the exaggerated rectangular cross-section (very thin by very wide), winding precision is critical to managing the stress that can crack the fragile ceramic superconductor (SC) layer and destroy its SC properties. The potential for damage is highest during peak operation, when winding stress magnifies operational stress. Another challenge is operational parameters, such as magnetic field alignment, that affect design performance. Winding process performance, including precision, capability for geometric complexity, and efficient repeatability, is required for commercial production of current HTS. Due to winding limitations, current HTS magnets focus on simple pancake configurations; HTS motors, generators, MRI/NMR, fusion, and other projects are awaiting robotically wound solenoid, planar, and spherical magnet configurations. As with conventional power cables, full transposition winding is required for long-length alternating current (AC) and pulsed power cables. Robotic production is required for transposition, the periodic swapping of cable conductors and their placement into precise positions, which achieves the minimized reactance that power utilities require. A fully transposed SC cable, in theory, has no transmission length limits for AC and variable transient operation due to no resistance (a problem with conventional cables), negligible reactance (a problem for helically wound HTS cables), and no long-length manufacturing issues (a problem with both stamped and twisted stacked HTS cables). The Infinity Physics team is solving these manufacturing problems by developing automated manufacturing to produce the first reliable, utility-grade commercial SC cables and magnets.
Robotic winding machines combine mechanical and process design, specialized sensing and observers, and state-of-the-art optimization and control sequencing to carefully manipulate individual fragile SCs, especially HTS, into previously unattainable, complex geometries with electrical geometry equivalent to commercially available conventional conductor devices.
Keywords: automated winding manufacturing, high temperature superconductor, magnet, power cable
Procedia PDF Downloads 141
308 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory
Authors: Hiba El Assibi
Abstract:
This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized Minimax algorithm with alpha-beta pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project explores additional enhancements, such as symmetry checking and code optimizations, to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and could potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus artificial intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights on performance and computational efficiency. We also discuss the scalability of our approach, considering different board sizes (numbers of pits and stones) and rules (different variations), and study how these affect performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhance our understanding of human cognitive processes in game settings, and offer insights into creating balanced and engaging game experiences.
Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory
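The search core named in the keywords (minimax with alpha-beta pruning plus a transposition table) can be sketched compactly. The transposition table stores exact/lower/upper flags, since a value produced under a cutoff is only a bound and must not be reused as exact. The toy game below is a stand-in, not Kalah; only the search skeleton carries over.

```python
# Minimax with alpha-beta pruning and a flagged transposition table.
EXACT, LOWER, UPPER = 0, 1, 2

def alphabeta(state, depth, alpha, beta, maximizing, tt):
    key = (state, depth, maximizing)
    if key in tt:                              # transposition table probe
        flag, val = tt[key]
        if flag == EXACT:
            return val
        if flag == LOWER and val >= beta:
            return val
        if flag == UPPER and val <= alpha:
            return val
    moves = legal_moves(state)
    if depth == 0 or not moves:
        return evaluate(state)
    a0, b0 = alpha, beta
    best = float("-inf") if maximizing else float("inf")
    for move in moves:
        val = alphabeta(apply_move(state, move), depth - 1,
                        alpha, beta, not maximizing, tt)
        if maximizing:
            best = max(best, val); alpha = max(alpha, best)
        else:
            best = min(best, val); beta = min(beta, best)
        if beta <= alpha:                      # alpha-beta cutoff
            break
    flag = LOWER if best >= b0 else UPPER if best <= a0 else EXACT
    tt[key] = (flag, best)
    return best

# Toy stand-in game: the state is a counter; each move adds 1 or 2;
# play stops at 10 or at the depth limit; the score is the counter.
def legal_moves(s): return [1, 2] if s < 10 else []
def apply_move(s, m): return s + m
def evaluate(s): return s

print(alphabeta(0, 6, float("-inf"), float("inf"), True, {}))
```

In a real Kalah solver, the state would encode the pits and stores, and the table would let the search reuse evaluations of positions reachable by different move orders, which is where transposition tables earn their name.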
Procedia PDF Downloads 55
307 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table
Authors: David A. Swanson, Lucky M. Tedrow
Abstract:
Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to the mean age at death in a life table) and the variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that a simple linear model provides reasonably accurate estimates of the variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of the variance in age at death for six countries, three with high e0 values and three with lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of the variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population
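The modelling step can be sketched with invented numbers: fit a simple linear model predicting the life-table variance in age at death from e0, then use it for new e0 values. The sample points below are illustrative only, not the World Bank or Japanese data used in the paper.

```python
# Sketch: ordinary least squares fit of variance-in-age-at-death on e0.
# The (e0, variance) pairs are invented for illustration.
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

e0 = [60.9, 68.0, 75.0, 85.6]          # life expectancy at birth, years
var = [420.0, 330.0, 240.0, 120.0]     # variance in age at death (invented)
a, b = fit_line(e0, var)
print(f"variance ≈ {a:.1f} + {b:.2f} * e0")
print("predicted variance at e0=80:", round(a + b * 80, 1))
```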
Procedia PDF Downloads 330306 Training to Evaluate Creative Activity in a Training Context, Analysis of a Learner Evaluation Model
Authors: Massy Guillaume
Abstract:
Introduction: The implementation of creativity in educational policies or curricula raises several issues, including the evaluation of creativity and the means to do so. This doctoral research focuses on the appropriation and transposition of creativity assessment models by future teachers. Our objective is to identify the elements of the models that are most transferable to practice in order to improve their implementation in the students' curriculum, while seeking to create a new model for assessing creativity in the school environment. Methods: In order to meet our objective, this preliminary quantitative exploratory study by questionnaire was conducted at two points in the participants' training: at the beginning of the training module and throughout the practical work. The population is composed of 40 people of diverse origins with a mean age of 26 years (SD = 8.623). In order to stay as close as possible to our research objective and to test our questionnaires, we set up a pre-test phase during the spring semester of 2022. Results: The results presented focus on aspects of the OECD Creative Competencies Assessment Model. Overall, 72% of participants support the model's focus on skill levels as appropriate for the school context. More specifically, the data indicate that the separation of production and process in the rubric facilitates observation by the assessor. From the point of view of transposing the grid into teaching practice, the participants emphasised that production is easier to plan and observe in students than the process. This difference is reinforced by a lack of knowledge about certain concepts, such as innovation or risk-taking, in schools. Finally, the qualitative results indicate that adding multiple levels of competencies to the OECD rubric would allow for better implementation in the classroom.
Conclusion: The students' identification of the elements that allow creativity to be evaluated in the school environment generates an innovative approach to the training content. These first data, from the test phase of our research, demonstrate the gap that exists between the implementation of an evaluation model in a training program and its potential transposition by future teachers.Keywords: creativity, evaluation, schooling, training
Procedia PDF Downloads 95305 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions
Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente
Abstract:
Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks, managing resources, and measuring performance. In addition, outsourcing is a strategic IT service solution that complements the IT services provided internally in organizations. This paper proposes measurement tools for a new holistic maturity model based on the standards ISO/IEC 20000 and ISO/IEC 38500 and on the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model to facilitate adaptation to universities and achieve excellence in the outsourcing of IT services.Keywords: IT governance, IT management, IT services, outsourcing, maturity model, measurement tools
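One way the questionnaire-and-metrics-tables step could work in practice is sketched below. The process names, 0-5 scale, and weakest-link aggregation are illustrative assumptions, not the model's actual metrics:

```python
# Hypothetical sketch: aggregate questionnaire answers per outsourcing-related
# process into a maturity level on a CMM-style 0-5 scale. The processes and
# answers are invented for illustration; the paper's real metrics tables are
# grounded in ISO/IEC 20000, ISO/IEC 38500, ITIL and COBIT.
questionnaire = {
    "supplier management": [3, 4, 3, 2],       # answers on a 0-5 scale
    "service level management": [4, 4, 5, 3],
    "contract compliance": [2, 2, 3, 1],
}

def maturity_level(answers):
    # a process reaches level n only if its average answer rounds down to n
    return int(sum(answers) / len(answers))

levels = {p: maturity_level(a) for p, a in questionnaire.items()}
overall = min(levels.values())   # overall maturity limited by the weakest process
for process, level in levels.items():
    print(f"{process}: level {level}")
print("overall outsourcing maturity:", overall)
```

The `min` aggregation reflects a common maturity-model convention that an organization is only as mature as its weakest assessed process; a weighted average would be an equally plausible design choice.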
Procedia PDF Downloads 593304 Programming Language Extension Using Structured Query Language for Database Access
Authors: Chapman Eze Nnadozie
Abstract:
Relational databases constitute a very vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension, structured query language (SQL), to establish links to a relational database (Microsoft Access 2013) from the Visual C++ 9 programming environment. The methodology involves the creation of tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings of this paper reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieval of records with specified criteria, updating of records, and deletion of part or all of the records in a table.Keywords: data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table
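The paper's pipeline is Visual C++ querying Access 2013 over OLE. As a language-neutral sketch of the same idea, embedding SQL in a host program to create, retrieve, update, and delete records, the snippet below uses Python's built-in sqlite3 module in place of the Access/OLE stack; the `staff` table and its rows are made up for illustration:

```python
# Embedding SQL in a host program: create a table, then retrieve records with
# criteria, update records, and delete records, as in the paper's findings.
# sqlite3 stands in for the Microsoft Access 2013 / OLE stack used there.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, dept TEXT, salary REAL)")
cur.executemany("INSERT INTO staff (name, dept, salary) VALUES (?, ?, ?)",
                [("Ada", "IT", 70000), ("Ben", "HR", 50000), ("Cy", "IT", 65000)])

# retrieval of records with specified criteria
rows = cur.execute("SELECT name FROM staff WHERE dept = ? ORDER BY name", ("IT",)).fetchall()
print("IT staff:", [r[0] for r in rows])

# updating and deleting records
cur.execute("UPDATE staff SET salary = salary * 1.1 WHERE dept = 'IT'")
cur.execute("DELETE FROM staff WHERE name = 'Ben'")
remaining = cur.execute("SELECT COUNT(*) FROM staff").fetchone()[0]
print("remaining records:", remaining)
conn.close()
```

The `?` placeholders are the host-language side of the "extension": the program supplies values while SQL expresses what to fetch, which is the division of labour the abstract describes.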
Procedia PDF Downloads 187303 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014
Authors: Alexiou Dimitra, Fragkaki Maria
Abstract:
The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of the seven variables for the 28 countries for 2014. The data are manipulated using the CHIC Analysis V 1.1 software package. The results of this program, using MFCA and Ascending Hierarchical Classification, are given in arithmetic and graphical form. For comparison, the Factor procedure of the IBM SPSS 20 statistical package has been applied to the same data. The numerical and graphical results, presented in tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics
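Of the three methods named, Principal Component Analysis is the easiest to sketch compactly. The snippet below uses two hypothetical indicator values per country (the study uses seven Eurostat variables for all 28 member states) and extracts the leading component of the 2x2 covariance matrix in closed form:

```python
# Minimal PCA sketch: project hypothetical two-variable country data onto the
# leading eigenvector of the sample covariance matrix. The country labels and
# values are invented; they are not Eurostat figures.
import math

data = {  # hypothetical (economic, educational) scores per country
    "A": (1.0, 2.0), "B": (2.0, 2.5), "C": (3.0, 3.8), "D": (4.0, 4.1), "E": (5.0, 5.6),
}

xs = [v[0] for v in data.values()]
ys = [v[1] for v in data.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
syy = sum((y - my) ** 2 for y in ys) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

# leading eigenvalue and eigenvector of [[sxx, sxy], [sxy, syy]]
tr, det = sxx + syy, sxx * syy - sxy * sxy
lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
vx, vy = sxy, lam - sxx                 # unnormalized eigenvector direction
norm = math.hypot(vx, vy)
vx, vy = vx / norm, vy / norm

# each country's score on the first principal component orders them on one axis
scores = {c: (x - mx) * vx + (y - my) * vy for c, (x, y) in data.items()}
for country, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(country, round(s, 2))
```

The fraction `lam / tr` is the share of total variance the first component explains; with more variables and components, the countries' scores become the coordinates of the "groups or clouds" the abstract refers to.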
Procedia PDF Downloads 512302 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join
Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel
Abstract:
MapReduce is a programming model for handling and supporting massive data sets. With data sizes increasing rapidly, analyzing big data is among the most important issues today. MapReduce analyzes data and extracts useful information using two simple functions, map and reduce, which are the only parts written by the programmer; the framework provides load balancing, fault tolerance, and high scalability. Join is one of the most important operations in data analysis, but MapReduce does not support it directly. This paper explains two two-way map-reduce join algorithms, semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, which uses a hash table to increase performance by eliminating unused records as early as possible and by applying the join through hash-table lookups rather than matching join keys in a map function in the second phase. Using hash tables does not significantly affect memory size because only the matched records from the second table are saved. Our experimental results show that the hash semi-join algorithm outperforms the other two algorithms as the data size grows from 10 million to 500 million records, with running time increasing according to the number of joined records between the two tables.Keywords: map reduce, hadoop, semi join, two way join
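The core idea of the proposed hash semi-join, filtering through a hash table of join keys so unmatched records are dropped early and only matched records from the second table are retained, can be sketched in a few lines. Table contents and key names are illustrative:

```python
# Hash semi-join sketch in plain Python (the paper runs this as MapReduce jobs
# on Hadoop; here both phases are simulated in one process).
orders = [  # "left" table: (order_id, customer_id)
    (1, "c1"), (2, "c2"), (3, "c1"), (4, "c9"),
]
customers = [  # "right" table: (customer_id, name)
    ("c1", "Alice"), ("c2", "Bob"), ("c3", "Carol"),
]

# phase 1 (semi-join): hash the join keys that actually occur on the left
left_keys = {customer_id for _, customer_id in orders}

# keep only matching right-side records; unmatched ones are eliminated early
# and never enter the hash table, keeping memory use small
matched = {cid: name for cid, name in customers if cid in left_keys}

# phase 2: probe the hash table instead of re-matching join keys in a map function
joined = [(oid, cid, matched[cid]) for oid, cid in orders if cid in matched]
print(joined)
```

Note that order 4 (customer "c9") and customer "c3" are both discarded before the join proper, which is exactly the early-elimination benefit the abstract claims.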
Procedia PDF Downloads 514301 Information Processing and Visual Attention: An Eye Tracking Study on Nutrition Labels
Authors: Rosa Hendijani, Amir Ghadimi Herfeh
Abstract:
Nutrition labels are diet-related health policies. They help individuals improve food-choice decisions and reduce their intake of calories and unhealthy food elements, like cholesterol. However, many individuals do not pay attention to nutrition labels or fail to understand them appropriately. According to the literature, thinking and cognitive styles can have significant effects on attention to nutrition labels. To the authors' knowledge, the effect of global/local processing on attention to nutrition labels has not been previously studied. Global/local processing encourages individuals to attend to the whole/specific parts of an object and can have a significant impact on people's visual attention. In this study, this effect was examined with an experimental design using the eye-tracking technique. The research hypothesis was that individuals with local processing would pay more attention to nutrition labels, including nutrition tables and traffic lights. An experiment was designed with two conditions: global and local information processing. Forty participants were randomly assigned to either the global or the local condition, and their processing style was manipulated accordingly. Results supported the hypothesis for nutrition tables but not for traffic lights.Keywords: eye-tracking, nutrition labelling, global/local information processing, individual differences
Procedia PDF Downloads 161300 Effect of Perception on People’s Behavior in Public Space
Authors: Morteza Maleki
Abstract:
The present study monitors behavior in the environment, on the premise that the respective roles of the environment (assumed as a vessel) and human beings (assumed as occupants of this vessel) inevitably create effects that are expressed as various behaviors on the part of human beings. The mutual relationship between man and his environment is exhibited through perceptions, behaviors, subjective images, activities, etc. This study investigates the conceptual dimension in the form of four components, readability, sense of place, identity, and tenability, at the Ahmadabad Axis in Mashhad. The theoretical fundamentals and the data regarding the status quo were presented through the descriptive method, and the proposed policies were derived by analyzing the available status quo information. The required data were gathered from library resources and documents related to the studied area, as well as through field instruments such as questionnaires. Upon conducting the necessary investigation, the conceptual dimension within the design area was analyzed. A SWOT table was presented, and the results obtained for improving environmental perception were arranged in the form of policy-making tables and operational project tables for improving the sense of place, creating imagery, and the other investigated components.Keywords: public space, perception, environment, behavior
Procedia PDF Downloads 393299 Left to Right-Right Most Parsing Algorithm with Lookahead
Authors: Jamil Ahmed
Abstract:
The Left-to-right, Rightmost-derivation (LR) parsing algorithm is a widely used algorithm for syntax analysis. It is driven by a parsing table extracted from the grammar. The parsing table specifies the actions to be taken during parsing and must contain no action conflicts for the same input symbol. This requirement imposes a condition on the class of grammars over which LR algorithms work. However, there are grammars whose parsing tables hold action conflicts. In such cases, the algorithm needs the capability of scanning (looking ahead at) input symbols beyond the current one. In this paper, an LR parsing algorithm with lookahead capability is introduced. The look-ahead capability in the LR parsing algorithm is the major contribution of this paper. The practicality of the proposed algorithm is substantiated by a parser implementation for the context-free grammar (CFG) of an already proposed programming language, 'State Controlled Object Oriented Programming' (SCOOP). SCOOP's context-free grammar has 125 productions and 192 item sets. The algorithm parses SCOOP even though the grammar requires looking ahead at input symbols due to action conflicts in its parsing table. The proposed LR parsing algorithm with lookahead capability can be viewed as an optimization of the Simple LR (SLR) parsing algorithm.Keywords: left to right-right most parsing, syntax analysis, bottom-up parsing algorithm
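A table-driven shift-reduce loop of the kind the paper builds on can be sketched for a toy grammar, S -> ( S ) | x. The hand-written ACTION/GOTO tables below stand in for tables generated from SCOOP's 125-production grammar, and each reduce decision consults the next input symbol, i.e., one token of lookahead:

```python
# Table-driven LR parse loop for the toy grammar S -> ( S ) | x.
# ("s", n) = shift and go to state n; ("r", lhs, k) = reduce by popping k
# states for nonterminal lhs; ("acc",) = accept. Tables are hand-built here;
# a generator would derive them from the grammar's item sets.
ACTION = {
    (0, "("): ("s", 2), (0, "x"): ("s", 3),
    (1, "$"): ("acc",),
    (2, "("): ("s", 2), (2, "x"): ("s", 3),
    (3, ")"): ("r", "S", 1), (3, "$"): ("r", "S", 1),   # S -> x
    (4, ")"): ("s", 5),
    (5, ")"): ("r", "S", 3), (5, "$"): ("r", "S", 3),   # S -> ( S )
}
GOTO = {(0, "S"): 1, (2, "S"): 4}

def parse(tokens):
    stack = [0]
    tokens = list(tokens) + ["$"]      # "$" marks end of input
    i = 0
    while True:
        act = ACTION.get((stack[-1], tokens[i]))
        if act is None:
            return False               # syntax error: no table entry
        if act[0] == "acc":
            return True
        if act[0] == "s":              # shift: consume the token
            stack.append(act[1])
            i += 1
        else:                          # reduce: pop the handle, push GOTO state
            _, lhs, k = act
            del stack[-k:]
            stack.append(GOTO[(stack[-1], lhs)])

print(parse("((x))"), parse("(x"), parse("x)"))
```

In this conflict-free table the current token alone selects the action; the paper's contribution is extending exactly this lookup to consult further input symbols when a table cell holds more than one action.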
Procedia PDF Downloads 126298 Animated Poetry-Film: Poetry in Action
Authors: Linette van der Merwe
Abstract:
It is known that visual artists, performing artists, and literary artists have inspired each other since time immemorial. The enduring, symbiotic relationship between the various art genres is evident where words, colours, lines, and sounds act as metaphors, a physical separation of the transcendental reality of art. Simonides of Keos (c. 556-468 BC) confirmed this, stating that a poem is a talking picture, or, in a more modern expression, a picture is worth a thousand words. It can be seen as an ancient relationship, originating from the epigram (tombstone or artefact inscriptions), the carmen figuratum (figure poem), and the ekphrasis (a description in the form of a poem of a work of art). Visual artists, including Michelangelo, Leonardo da Vinci, and Goethe, wrote poems and songs. Goya, Degas, and Picasso are famous for their works of art and for trying their hands at poetry. Afrikaans writers whose fine art is often published together with their writing, as in the case of Andries Bezuidenhout, Breyten Breytenbach, Sheila Cussons, Hennie Meyer, Carina Stander, and Johan van Wyk, among others, are not a strange phenomenon either. Recasting one art form in another is a form of translation, transposition, contemplation, and discovery of artistic impressions, showing parallel interpretations rather than physical comparison. It is especially about the harmony that exists between the different art genres, i.e., a poem that describes a painting, or a visual text that portrays a poem and so becomes a translation, interpretation, and rediscovery of the verbal text, moving from word text to image text. Poetry-film, as a form of such a translation of word text into image text, can be considered a hybrid, transdisciplinary art form that connects poetry and film. Poetry-film is regarded as an intertwined entity of word, sound, and visual image.
It is an attempt to transpose and transform a poem into a new artwork that makes the poem more accessible to people who are not necessarily open to the written word and will, in effect, attract a larger audience to a genre that usually has a limited market. Poetry-film is considered a creative expression of an inverted ekphrastic inspiration, a visual description, interpretation, and expression of a poem. Research also emphasises that animated poetry-film is rarely regarded as a genre in its own right and is thus severely under-theorised. This paper will focus on Afrikaans animated poetry-films as a multimodal transposition of poem texts into animated poetry-films, with specific reference to the animated poetry-films in Filmverse I (2014) and Filmverse II (2016).Keywords: poetry film, animated poetry film, poetic metaphor, conceptual metaphor, monomodal metaphor, multimodal metaphor, semiotic metaphor, multimodality, metaphor analysis, target domain, source domain
Procedia PDF Downloads 66297 Formulating Rough Approximations in Information Tables with Possibilistic Information
Authors: Michinori Nakata, Hiroshi Sakai
Abstract:
A rough set, which consists of lower and upper approximations, is formulated in information tables containing possibilistic information. First, lower and upper approximations are shown on the basis of possible-world semantics, in the same way as Lipski did in the field of incomplete databases, in order to clarify the fundamentals of rough sets under possibilistic information. Possibility and necessity measures are used, as is done in possibilistic databases. As a result, each object has certain and possible membership degrees to the lower and upper approximations, and these degrees are the lower and upper bounds. Therefore, the degree to which an object belongs to the lower and upper approximations is expressed by an interval value, and the complementary property linking the lower and upper approximations holds, as is valid under complete information. Second, the approach based on indiscernibility relations, proposed by Dubois and Prade, is extended to three cases. In the first case, the objects used to approximate a set of objects are characterized by possibilistic information. In the second case, the objects used to approximate a set of objects with possibilistic information are characterized by complete information. In the third case, objects characterized by possibilistic information approximate a set of objects with possibilistic information. The extended approach creates the same results as the approach based on possible-world semantics, which justifies our extension.Keywords: rough sets, possibilistic information, possible world semantics, indiscernibility relations, lower approximations, upper approximations
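The complete-information baseline that the paper generalizes, lower and upper approximations induced by an indiscernibility relation, can be sketched as follows. The objects, the single attribute, and the target set are illustrative:

```python
# Lower/upper approximations from an indiscernibility relation.
# Objects 1-5 are described by one illustrative attribute, "colour";
# objects with equal attribute values are indiscernible.
colour = {1: "red", 2: "red", 3: "blue", 4: "blue", 5: "green"}
target = {1, 2, 3}                     # the set of objects to approximate

# equivalence classes of the indiscernibility relation
classes = {}
for obj, val in colour.items():
    classes.setdefault(val, set()).add(obj)

# lower approximation: classes certainly contained in the target
lower = set().union(*(c for c in classes.values() if c <= target))
# upper approximation: classes that possibly meet the target
upper = set().union(*(c for c in classes.values() if c & target))

# complementary (duality) property: upper(X) = U \ lower(U \ X)
universe = set(colour)
lower_of_complement = set().union(*(c for c in classes.values() if c <= universe - target))
assert upper == universe - lower_of_complement

print("lower:", sorted(lower))         # objects certainly in the target set
print("upper:", sorted(upper))         # objects possibly in the target set
```

Under possibilistic information, the crisp membership tests `c <= target` and `c & target` would be replaced by necessity and possibility degrees, making each object's membership an interval, as the abstract describes.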
Procedia PDF Downloads 321