Search results for: graphic analysis
27861 Exploring Disengaging and Engaging Behavior of Doctoral Students
Authors: Salome Schulze
Abstract:
The delay of students in completing their dissertations is a worldwide problem. At the University of South Africa, where this research was done, only about a third of the students complete their studies within the required period of time. This study explored the reasons why students interrupted their studies and why they resumed their research at a later stage. If this knowledge could be utilised to improve the throughput of doctoral students, it could have significant economic benefits for institutions of higher education while at the same time enhancing their academic prestige. To inform the investigation, attention was given to key theories concerning the learning of doctoral students, namely situated learning theory, social capital theory and self-regulated learning theory, based on the social cognitive theory of learning. Ten students in the Faculty of Education were purposefully selected on the grounds of their poor progress or of having been in the system for too long. The data collection followed a Finnish study, since the two studies had the same aims, namely to investigate student engagement and disengagement. Graphic elicitation interviews, based on visualisations, were considered appropriate for collecting the data. This method could stimulate the reflection and recall of the participants’ ‘stories’ with very little input from the interviewer. The interviewees were requested to visualise, on paper, their journeys as doctoral students from the time when they first registered. They were to indicate the significant events that occurred and that facilitated their engagement or disengagement. In the interviews that followed, they were requested to elaborate on these motivating or challenging events by explaining when and why they occurred, and what prompted them to resume their studies. The interviews were tape-recorded and transcribed verbatim. Information-rich data were obtained containing visual metaphors.
The data indicated that when the students suffered a period of disengagement, it was sometimes related to a lack of self-regulated learning, in particular a lack of autonomy and the inability to manage their time effectively. Disengagement also occurred when the students felt isolated from the academic community of practice. This included poor guidance by their supervisors, which deprived them of significant social capital. The study also revealed that situational factors at home or at work were often the main reasons for the students’ procrastinating behaviour. The students, however, remained in the system. They were motivated towards a renewed engagement with their studies if they were self-regulated learners and if they felt a connectedness with the academic community of practice because of positive relationships with their supervisors and participation in the activities of the community (e.g., in workshops or conferences). In support of their learning, networking with significant others who were sources of information provided the students with the necessary social capital. Generally, institutions of higher education cannot address the students’ personal issues directly, but they can deal with key institutional factors in order to improve the throughput of doctoral students. It is also suggested that graphic elicitation interviews be used more often in social research that investigates the learning and development of students.
Keywords: doctoral students, engaging and disengaging experiences, graphic elicitation interviews, student procrastination
Procedia PDF Downloads 193
27860 Design and Emotion: The Value of 1970s French Children’s Books in the Middle East
Authors: Tina Sleiman
Abstract:
In the early 1970s, a graphics revolution - in quantity and quality - marked the youth publications sector in France. The increased interest in youth publications was supported with the emergence of youth libraries and major publishing houses. In parallel, the 'Agence de Cooperation Culturelle et Technique' (currently the International Organization of the Francophonie) was created, and several Arab countries had joined as members. In spite of political turmoil in the Middle East, French schools in Arab countries were still functioning and some even flourishing. This is a testament that French culture was, and still is, a major export to the region. This study focuses on the aesthetic value of the graphic styles that characterize French children’s books from the 1970s, and their personal value to Francophone people who have consumed these artifacts, in the Middle East. The first part of the study looks at the artifact itself: starting from the context of creation and consumption of these books, and continuing to the preservation and remaining collections. The aesthetic value is studied and compared to similar types of visuals of juxtaposed time periods. The second part examines the audience’s response to the visuals in terms of style recognition or identification, along with emotional significance or associations, and the personal value the artifacts might hold to their consumers. The methods of investigation consist of a literature review, a survey of book collections, and a visual questionnaire, supported by personal interviews. As an outcome, visual patterns will be identified: elements from 1970s children’s books reborn in contemporary youth-based publications. 
Results of the study shall inform us directly on the aesthetic and personal value of illustrated French children’s books in the Middle East, and indirectly on the capacity of youth-targeted design to create a long-term emotional response from its audience.
Keywords: children’s books, French visual culture, graphic style, publication design, revival
Procedia PDF Downloads 167
27859 Quality Parameters of Offset Printing Wastewater
Authors: Kiurski S. Jelena, Kecić S. Vesna, Aksentijević M. Snežana
Abstract:
Samples of tap water and wastewater were collected in three offset printing facilities in Novi Sad, Serbia. Ten physicochemical parameters were analyzed within all collected samples: pH, conductivity, m-alkalinity, p-alkalinity, acidity, carbonate concentration, hydrogen carbonate concentration, active oxygen content, chloride concentration and total alkali content. All measurements were conducted using standard analytical and instrumental methods. Comparing the obtained results for tap water and wastewater, a clear quality difference was noticeable, since all physicochemical parameters were significantly higher within the wastewater samples. The study also involved the application of simple linear regression analysis on the obtained dataset. Using the software package ORIGIN 5, the pH value was correlated with the other physicochemical parameters. Based on the obtained values of the Pearson correlation coefficient, a strong negative correlation between chloride concentration and pH (r = -0.943), as well as between acidity and pH (r = -0.855), was determined. In addition, a statistically significant relationship with pH was obtained only for acidity and chloride concentration, since the values of the parameter F (247.634 and 182.536) were higher than Fcritical (5.59). In this way, the results of the statistical analysis highlighted the most influential parameters of water contamination in offset printing: acidity and chloride concentration. The results showed that the variable dependence could be represented by the general regression model y = a0 + a1x + k, which further resulted in matching graphic regressions.
Keywords: pollution, printing industry, simple linear regression analysis, wastewater
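The correlation and regression computations described above can be sketched in a few lines. The sketch below uses made-up pH and chloride values, not the study's measurements, and plain Python in place of the ORIGIN 5 package.

```python
# Illustrative sketch (hypothetical data, not the study's measurements):
# Pearson correlation coefficient and a simple linear regression y = a0 + a1*x.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient of paired samples x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def linfit(x, y):
    """Least-squares intercept a0 and slope a1 of y = a0 + a1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    a0 = my - a1 * mx
    return a0, a1

ph = [6.9, 7.1, 7.4, 7.8, 8.0]           # hypothetical pH readings
chloride = [310, 280, 240, 200, 180]      # hypothetical chloride conc. (mg/L)

r = pearson_r(ph, chloride)
a0, a1 = linfit(ph, chloride)
print(f"r = {r:.3f}, chloride = {a0:.1f} + {a1:.1f} * pH")
```

A negative r here, as in the abstract, means chloride concentration falls as pH rises.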
Procedia PDF Downloads 233
27858 Authentication of Physical Objects with Dot-Based 2D Code
Authors: Michał Glet, Kamil Kaczyński
Abstract:
Counterfeit goods and documents are a global problem, which needs more and more sophisticated methods to resolve it. Existing techniques using watermarking or embedding symbols on objects are not suitable for all use cases. To address those special needs, we created a complete system allowing authentication of paper documents and physical objects with a flat surface. Objects are marked using orientation-independent 2D graphic codes, named DotAuth, that are resistant to camera noise. Based on the identifier stored in the 2D code, the system is able to perform basic authentication and allows more sophisticated analysis methods to be conducted, e.g., relying on augmented reality and the physical properties of the object. In this paper, we present the complete architecture, algorithms and applications of the proposed system. Results of a feature comparison between the proposed solution and other products are presented as well, pointing to many advantages that increase usability and efficiency in protecting physical objects.
Keywords: anti-forgery, authentication, paper documents, security
Procedia PDF Downloads 132
27857 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method
Authors: Wassana Naiyapo, Atichat Sangtong
Abstract:
The software development process in Object Oriented methodology has many stages that take time and incur high costs. An inconceivable error in the system analysis process will affect the design and the implementation processes. An unexpected output is the reason why the previous process needs to be revised, and each rollback of a process adds expense and delay. Therefore, with a good test process from the early phases, the implemented software is efficient, reliable and also meets the user’s requirements. Unified Modelling Language (UML) is a tool which uses symbols to describe the work process in Object Oriented Analysis (OOA). This paper presents an approach for automatically generating UML use case diagrams and test cases. The UML use case diagram is generated from the event table, and test cases are generated from use case specifications and Graphic User Interfaces (GUI). Test cases are derived from the Classification Tree Method (CTM), which classifies data into nodes in a hierarchy structure. Moreover, this paper describes the program that generates the use case diagram and test cases. As a result, it can reduce work time and increase work efficiency.
Keywords: classification tree method, test case, UML use case diagram, use case specification
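As an illustration of the classification tree idea, the sketch below builds test cases as combinations of equivalence classes, one leaf per classification. The login-form classifications are hypothetical and are not taken from the authors' tool.

```python
# Minimal sketch of the Classification Tree Method (hypothetical GUI example):
# each input aspect is a "classification" whose leaves are equivalence classes;
# a test case picks one leaf from every classification.
from itertools import product

classifications = {
    "username": ["empty", "valid", "too long"],
    "password": ["empty", "valid", "wrong"],
    "remember_me": ["checked", "unchecked"],
}

def generate_test_cases(tree):
    """Enumerate every combination of one class per classification."""
    names = list(tree)
    return [dict(zip(names, combo)) for combo in product(*(tree[n] for n in names))]

cases = generate_test_cases(classifications)
print(len(cases))   # 3 * 3 * 2 = 18 combinations
print(cases[0])
```

Real tools prune this full cross-product with coverage rules (e.g., pairwise coverage) rather than enumerating every combination.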
Procedia PDF Downloads 161
27856 I²C Master-Slave Integration
Authors: Rozita Borhan, Lam Kien Sieng
Abstract:
This paper describes an I²C slave implementation using an I²C master obtained from the OpenCores website, which provides free Verilog and VHDL code to users. The design implementation for the I²C slave is in the Verilog language and uses ModelSim from Mentor Graphics, an EDA tool for ASIC design, for simulation and verification purposes. A common application for this I²C master-slave integration is also included. This paper also addresses the advantages and limitations of the said design.
Keywords: I²C, master, OpenCores, slave, Verilog, verification
Procedia PDF Downloads 441
27855 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphical scheme into a logical representation and modeling of the shortest ways of successful functioning of the system; 3) Description of the system operability condition with a logical function in the form of disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacing logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) Calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems of a complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
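Stage 5 above, evaluating a reliability polynomial over the component probabilities, can be illustrated with a brute-force equivalent that is exact for small systems. The minimal path sets and component reliabilities below are hypothetical, and the state enumeration stands in for the polynomial obtained from the orthogonalized DNF.

```python
# Hedged sketch (hypothetical system, not the authors' web application):
# exact system reliability from minimal path sets by enumerating all
# component states; for small systems this equals evaluating the
# reliability polynomial derived from the ODNF.
from itertools import product

def system_reliability(path_sets, p):
    """path_sets: list of sets of component ids; p: component reliabilities."""
    components = sorted({c for ps in path_sets for c in ps})
    total = 0.0
    for states in product([0, 1], repeat=len(components)):
        up = {c for c, s in zip(components, states) if s}
        # the system works if every element of some minimal path set is up
        if any(ps <= up for ps in path_sets):
            prob = 1.0
            for c, s in zip(components, states):
                prob *= p[c] if s else (1.0 - p[c])
            total += prob
    return total

# two series pairs placed in parallel: minimal paths {1,2} and {3,4}
paths = [{1, 2}, {3, 4}]
p = {1: 0.9, 2: 0.9, 3: 0.9, 4: 0.9}
print(round(system_reliability(paths, p), 4))  # 1 - (1 - 0.81)^2 = 0.9639
```

The value of the orthogonalization step in the paper is precisely to avoid this exponential enumeration for large structures.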
Procedia PDF Downloads 69
27854 Using T-Splines to Model Point Clouds from Terrestrial Laser Scanner
Authors: G. Kermarrec, J. Hartmann
Abstract:
Spline surfaces are a major representation of freeform surfaces in the computer-aided graphics industry and were recently introduced in the field of geodesy for processing point clouds from terrestrial laser scanners (TLS). Surface fitting consists of approximating a trustworthy mathematical surface to a large 3D point cloud. Standard B-spline surfaces lack local refinement due to their tensor-product construction. The consequences are oscillating geometry, particularly in the transition from low- to high-curvature parts, for scattered point clouds with missing data. The recently introduced T-splines are a more economic alternative in terms of the number of parameters for handling point clouds with a huge number of observations. As long as the partition of unity is guaranteed, their computational complexity is low and they are flexible. T-splines are implemented in a commercial package called Rhino, a 3D modeler which is widely used in computer-aided design to create and animate NURBS objects. We have applied T-spline surface fitting to terrestrial laser scanner point clouds from a bridge under load and a sheet pile wall with noisy observations. We will highlight their potential for modelling details with high trustworthiness, paving the way for further applications in terms of deformation analysis.
Keywords: deformation analysis, surface modelling, terrestrial laser scanner, T-splines
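The partition-of-unity property mentioned above can be illustrated with the Cox-de Boor recursion, the basis-function machinery that B-splines and T-splines share. The knot vector below is a toy example, unrelated to the authors' Rhino-based pipeline.

```python
# Background sketch (toy example): Cox-de Boor recursion for B-spline basis
# functions; inside the domain the basis values sum to 1 (partition of unity).
def bspline_basis(i, k, t, knots):
    """Value of the i-th B-spline basis function of degree k at parameter t."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right

knots = [0, 0, 0, 1, 2, 3, 3, 3]  # clamped knot vector for degree-2 basis functions
vals = [bspline_basis(i, 2, 1.5, knots) for i in range(len(knots) - 3)]
print(vals, sum(vals))  # nonnegative values summing to 1 inside the domain
```

T-splines keep this local basis but relax the global tensor-product grid, which is what enables the local refinement the abstract refers to.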
Procedia PDF Downloads 138
27853 Parkinson’s Disease Hand-Eye Coordination and Dexterity Evaluation System
Authors: Wann-Yun Shieh, Chin-Man Wang, Ya-Cheng Shieh
Abstract:
This study aims to develop an objective scoring system to evaluate hand-eye coordination and hand dexterity for Parkinson’s disease. This system contains three boards, each of which is implemented with sensors to sense a user’s finger operations. The operations include the peg test, the block test, and the blind block test. A user has to use vision, hearing, and tactile abilities to finish these operations, and the board records the results automatically. These results can help physicians evaluate a user’s reaction, coordination, and dexterity. The results are collected in a cloud database for further analysis and statistics. A researcher can use this system to obtain systematic, graphic reports for an individual or a group of users. In particular, a deep learning model is developed to learn the features of the data from different users. This model will help physicians assess Parkinson’s disease symptoms with a more intelligent algorithm.
Keywords: deep learning, hand-eye coordination, reaction, hand dexterity
Procedia PDF Downloads 64
27852 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies that are extensively utilized in various applications such as detection/extraction of man-made structures, monitoring of sensitive areas, creating graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, the building and the non-building regions (roads, vegetation, etc.) are investigated, with the focus mainly on building extraction. Once the whole landscape is collected, a trimming process is done to eliminate regions that arise from non-building objects. Finally, the label method is used to extract the building regions. The label method may be altered for efficient building extraction. The images used for the analysis are the ones extracted from sensors with a resolution of less than 1 meter (VHR). This method provides an efficient way to produce good results. The additional overhead of mid-processing is eliminated without compromising the quality of the output, easing the processing steps and reducing the time consumed.
Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery
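The label-and-trim steps can be sketched as connected-component labeling followed by removal of regions below a minimum area. The binary mask below is a toy stand-in for a real building mask, not the authors' implementation.

```python
# Toy sketch of a "label" step: 4-connected component labeling of a binary
# building mask, then trimming regions smaller than a minimum area.
from collections import deque

def label_regions(mask, min_area=2):
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    regions, next_label = {}, 1
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                queue, cells = deque([(r, c)]), []
                labels[r][c] = next_label
                while queue:   # breadth-first flood fill of one region
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                regions[next_label] = cells
                next_label += 1
    # trim small regions that likely stem from non-building objects
    return {lbl: cells for lbl, cells in regions.items() if len(cells) >= min_area}

mask = [[1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 1]]  # the lone pixel at (2, 3) is trimmed
print(sorted(len(c) for c in label_regions(mask).values()))
```

Production pipelines apply the same idea at image scale, typically with 8-connectivity and shape criteria in addition to area.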
Procedia PDF Downloads 312
27851 Study of the Energy Levels in the Structure of the Laser Diode GaInP
Authors: Abdelali Laid, Abid Hamza, Zeroukhi Houari, Sayah Naimi
Abstract:
This work relates to the study of the energy levels and the optimization of the intrinsic parameters (number of wells and their widths, width of the potential barrier, refractive index, etc.) and extrinsic parameters (temperature, pressure) in a laser diode structure containing GaInP. The methods of calculation used are the empirical pseudopotential method, to determine the electronic band structures, and a graphic method for optimization. The results found are in accord with those of experiment and theory.
Keywords: semiconductor, GaInP/AlGaInP, pseudopotential, energy, alloys
Procedia PDF Downloads 490
27850 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis
Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri
Abstract:
In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining more attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis aims to adopt a meticulous approach to achieve quantitative knowledge on the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation network analysis, page-rank analysis, co-citation analysis, and publication trends, have been used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings from the research illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis coupled with content analysis of the most cited articles identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks assessment and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot sizing problems. With the help of a graphic timeline, this exhaustive map of the field provides a visual representation of the evolution of publications, demonstrating a gradual shift from research interest in vendor selection problems to integrating sustainability into the supplier selection process.
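The page-rank analysis mentioned above can be sketched with a minimal PageRank iteration over a citation graph. The three-paper graph below is invented, not drawn from the authors' Scopus sample.

```python
# Minimal PageRank sketch (toy citation graph, not the authors' dataset):
# iterate the rank update until it settles; highly cited nodes gain rank.
def pagerank(graph, damping=0.85, iters=100):
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling node: spread its rank uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# papers A and B both cite C; C cites nothing
citations = {"A": ["C"], "B": ["C"], "C": []}
ranks = pagerank(citations)
print(max(ranks, key=ranks.get))  # the most-cited paper accumulates the most rank
```

Bibliometric tools run the same computation on tens of thousands of nodes; the damping factor of 0.85 is the conventional default.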
These clusters offer insights into a wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should offer a greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer
Procedia PDF Downloads 84
27849 Automatic Content Curation of Visual Heritage
Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz
Abstract:
Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justifying the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, of which 52’000 have been digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps taken to build an algorithm able to automatically create sequences of posters reflecting associations performed by curators and professional designers. This challenge shares similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques and, more specifically, deep-learning algorithms have been used to facilitate their generation. Promising results were found using Recurrent Neural Networks (RNN) trained on manually generated playlists and paired with clusters of features extracted from songs. We used the same principles to create the proposed algorithm but applied them to a challenging medium: posters. First, a convolutional autoencoder was trained to extract features of the posters, with the 52’000 digital posters used as the training set. The poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Gestaltung Museum of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster with the best proximity to the previous poster is selected. The mean squared distance between poster features was used to compute the proximity.
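The final selection step, picking the poster in the predicted cluster that is closest in feature space to the previous one, can be sketched as follows. The feature vectors are made up for illustration, not autoencoder output.

```python
# Hedged sketch of the selection step (invented feature vectors): within the
# predicted cluster, pick the poster minimizing the mean squared distance to
# the previously shown poster's features.
def mean_squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def pick_next(previous, cluster_posters):
    """cluster_posters: mapping of poster id -> feature vector."""
    return min(cluster_posters,
               key=lambda pid: mean_squared_distance(previous, cluster_posters[pid]))

prev_features = [0.2, 0.8, 0.1]
predicted_cluster = {
    "poster_17": [0.9, 0.1, 0.5],
    "poster_42": [0.25, 0.75, 0.15],
    "poster_93": [0.6, 0.6, 0.6],
}
print(pick_next(prev_features, predicted_cluster))  # poster_42 is nearest
```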
To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. The manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to the designer's sequences, and as reported by participants, the model and random sequences lacked thematic continuity. According to the results, the proposed model is able to generate better poster sequencing than random sampling. Occasionally, our algorithm is even able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility of creating sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and provide a tool to curate large sets of data, with a permanent renewal of the presented content.
Keywords: artificial intelligence, digital humanities, serendipity, design research
Procedia PDF Downloads 183
27848 An International Curriculum Development for Languages and Technology
Authors: Miguel Nino
Abstract:
When considering the challenges of a changing and demanding globalizing world, it is important to reflect on how university students will be prepared for the realities of internationalization, marketization and intercultural conversation. The present study is an interdisciplinary program designed to respond to the needs of the global community. The proposal bridges the humanities and the sciences through three different fields: languages, graphic design and computer science, specifically fundamentals of programming such as Python, JavaScript and software animation. The goal of the four-year program is therefore twofold. First, it enables students to communicate interculturally between English and other languages such as Spanish, Mandarin, French or German. Second, students acquire knowledge of practical software and relevant employable skills to collaborate on computer-assisted projects, which will most probably require an essential programming background in interpreted or compiled languages. In order to be inclusive and constructivist, the cognitive linguistics approach is suggested for the three fields, particularly for languages that rely on the traditional method of repetition. This methodology will help students develop their creativity and encourage them to become independent problem-solving individuals, as languages enhance their common ground of interaction for culture and technology. Participants in this course of study will be evaluated in their second language acquisition at the Intermediate-High level. For graphic design and computer science, students will apply their creative digital skills, as well as the critical thinking skills learned from the cognitive linguistics approach, to collaborate on a group project designed to find solutions for media web design problems or marketing experimentation for a company or the community. It is understood that it will be necessary to apply programming knowledge and skills to deliver the final product.
In conclusion, the program equips students with the linguistic knowledge and skills to be competent in intercultural communication, where English, the lingua franca, remains the medium for marketing and product delivery. In addition to their employability, students can expand their knowledge and skills in digital humanities or computational linguistics, or increase their portfolios in advertising and marketing. These students will be the global human capital for the competitive globalizing community.
Keywords: curriculum, international, languages, technology
Procedia PDF Downloads 442
27847 Intersubjectivity of Forensic Handwriting Analysis
Authors: Marta Nawrocka
Abstract:
In each legal proceeding in which expert evidence is presented, a major concern is the assessment of the evidential value of expert reports. Judicial institutions, while making decisions, rely heavily on expert reports, because they usually do not possess the 'special knowledge' from certain fields of science, which makes it impossible for them to verify the results presented in the reports. In handwriting studies, standards of analysis have been developed. They unify the procedures used by experts in comparing signs and in constructing expert reports. However, the methods used by experts are usually of a qualitative nature. They rely on the application of the knowledge and experience of the expert and, in effect, leave a significant margin of discretion in the assessment. Moreover, the standards used by experts are still not very precise, and the process of reaching conclusions is poorly understood. The above-mentioned circumstances indicate that expert opinions in the field of handwriting analysis may, for many reasons, not be sufficiently reliable. It is assumed that this state of affairs has its source in a very low level of intersubjectivity of the measuring scales and analysis procedures which constitute elements of this kind of analysis. Intersubjectivity is a feature of cognition which (in relation to methods) indicates the degree of consistency of the results that different people obtain using the same method. The higher the level of intersubjectivity, the more reliable and credible the method can be considered. The aim of the conducted research was to determine the degree of intersubjectivity of the methods used by experts in handwriting analysis. 30 experts took part in the study, and each of them received two signatures, with varying degrees of readability, for analysis.
Their task was to distinguish graphic characteristics in the signature, estimate the evidential value of the found characteristics and estimate the evidential value of the signature. The obtained results were compared with each other using Krippendorff’s alpha statistic, which numerically determines the degree of agreement of the results (assessments) that different people obtain under the same conditions using the same method. The estimation of the degree of agreement of the experts’ results for each of these tasks allowed the degree of intersubjectivity of the studied method to be determined. The study showed that during the analysis, the experts identified different signature characteristics and attributed different evidential value to them. In this scope, intersubjectivity turned out to be low. In addition, it turned out that the experts named and described the same characteristics in various ways, and the language used was often inconsistent and imprecise. Thus, significant differences were noted with regard to the language and applied nomenclature. On the other hand, the experts attributed a similar evidential value to the entire signature (the set of characteristics), which indicates that in this range, they were relatively consistent.
Keywords: forensic sciences experts, handwriting analysis, inter-rater reliability, reliability of methods
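The agreement statistic used above can be sketched for the simplest case, nominal data with no missing ratings. The expert ratings below are invented, not the study's data; alpha is 1 for perfect agreement and drops towards or below 0 for chance-level or systematic disagreement.

```python
# Hedged sketch of Krippendorff's alpha for nominal data, no missing values
# (invented ratings, not the study's data).
from collections import Counter

def krippendorff_alpha_nominal(units):
    """units: list of lists; each inner list holds all raters' values for one unit."""
    o = Counter()  # coincidence matrix of ordered value pairs within units
    for values in units:
        m = len(values)
        for i, c in enumerate(values):
            for j, k in enumerate(values):
                if i != j:
                    o[(c, k)] += 1.0 / (m - 1)
    n_c = Counter()
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    d_o = sum(w for (c, k), w in o.items() if c != k)          # observed disagreement
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)  # expected
    return 1.0 - d_o / d_e

# three hypothetical experts rating five signature characteristics
ratings = [["high", "high", "high"],
           ["low", "low", "high"],
           ["low", "low", "low"],
           ["high", "high", "low"],
           ["low", "low", "low"]]
print(round(krippendorff_alpha_nominal(ratings), 3))  # 13/27 ≈ 0.481, modest agreement
```

The full statistic also handles missing ratings and ordinal or interval distance metrics, which this sketch omits.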
Procedia PDF Downloads 148
27846 Hearing Aids Maintenance Training for Hearing-Impaired Preschool Children with the Help of Motion Graphic Tools
Authors: M. Mokhtarzadeh, M. Taheri Qomi, M. Nikafrooz, A. Atashafrooz
Abstract:
The purpose of the present study was to investigate the effectiveness of using motion graphics as a learning medium for training hearing-impaired children in hearing aids maintenance skills. The statistical population of this study consisted of all children with hearing loss in Ahvaz city, aged 4 to 7 years old. A sample of 60 children, selected by multistage random sampling, was randomly assigned to two groups: an experimental group (30 children) and a control group (30 children). The research method was experimental, and the design was pretest-posttest with a control group. The intervention consisted of a 2-minute motion graphics clip to train hearing aids maintenance skills. Data were collected using a 9-question researcher-made questionnaire. The data were analyzed using one-way analysis of covariance. Results showed that training hearing aids maintenance skills with motion graphics was significantly effective for those children. The results of this study can be used by educators, teachers, professionals, and parents to train children with disabilities or normal students.
Keywords: hearing aids, hearing aids maintenance skill, hearing impaired children, motion graphics
Procedia PDF Downloads 156
27845 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, which was the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphical scheme into a logical representation and modeling of the shortest ways of successful functioning of the system; 3) Description of the system operability condition with a logical function in the form of disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacing logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) Calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems of a complex structure is produced.
As a result, structural analysis of systems, research, and the design of systems with an optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
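The quantitative step of the pipeline above can be illustrated with a small sketch. For brevity, this uses inclusion-exclusion over the minimal path sets rather than the orthogonalization algorithm the paper employs; for independent components both turn the DNF of the operability function into a sum of disjoint probability terms and yield the same reliability polynomial. The example system is invented:

```python
from itertools import combinations

def system_reliability(path_sets, p):
    """Exact reliability of a monotone system from its minimal path sets,
    assuming independent components: inclusion-exclusion over unions of
    paths produces the reliability estimation polynomial."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, k):
            union = set().union(*combo)  # components that must all work
            prob = 1.0
            for comp in union:
                prob *= p[comp]
            total += (-1) ** (k + 1) * prob
    return total

# Invented example: two redundant branches, each a series pair
paths = [{"a", "b"}, {"c", "d"}]               # shortest paths of successful functioning
p = {"a": 0.9, "b": 0.9, "c": 0.8, "d": 0.8}   # component reliabilities
r = system_reliability(paths, p)               # 0.81 + 0.64 - 0.81 * 0.64
```

For large systems the number of inclusion-exclusion terms explodes, which is exactly why the paper's orthogonalization approach and computer support matter.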
Procedia PDF Downloads 65
27844 Logo Design of Pajamas, OTOP Product of Sainoi Community, Sainoi District, Nonthaburi Province
Authors: Witthaya Mekhum, Napasri Suwanajote, Isara Sangprasert
Abstract:
This research on the logo design of pajamas, an OTOP product of the Sainoi community, Sainoi district, Nonthaburi Province, is a participatory action research project aiming to find a logo for pajamas, an OTOP product of the Sainoi community. The sample of this research was 50 local residents from the Sainoi community in Sainoi district, Nonthaburi Province. The questionnaire consisted of 4 main parts. Part 1: factors that influence the decisions of consumers; Part 2: characteristics of the materials used in the design; Part 3: assessment of the attitudes and needs of consumers regarding logo design to develop marketing channels; Part 4: suggestions. Interviews were also conducted. For data analysis, checklist items were analyzed with frequency and percentage. Open-ended items were analyzed by summarizing, and rating-scale items were analyzed with mean and standard deviation. The research results showed that the design, cutting, and fabric affect the decision of the consumers. They want the design to be decent and beautiful. Illustrations used in logo design should be line drawings, fonts should use English letters, and the font should be a single color.
Keywords: design, logo, OTOP product, pajamas
Procedia PDF Downloads 269
27843 The Development of an Accident Causation Model Specific to Agriculture: The Irish Farm Accident Causation Model
Authors: Carolyn Scott, Rachel Nugent
Abstract:
The agricultural industry in Ireland and worldwide is one of the most dangerous occupations with respect to occupational health and safety accidents and fatalities. Many accident causation models have been developed in safety research to understand the underlying and contributory factors that lead to the occurrence of an accident. Due to the uniqueness of the agricultural sector, current accident causation theories cannot be applied. This paper presents an accident causation model named the Irish Farm Accident Causation Model (IFACM) which has been specifically tailored to the needs of Irish farms. The IFACM is a theoretical and practical model of accident causation that arranges the causal factors into a graphic representation of originating, shaping, and contributory factors that lead to accidents when unsafe acts and conditions are created that are not rectified by control measures. Causes of farm accidents were assimilated by means of a thorough literature review and were collated to form a graphical representation of the underlying causes of a farm accident. The IFACM was validated retrospectively through case study analysis and peer review. Participants in the case study (n=10) identified causes that led to a farm accident in which they were involved. A root cause analysis was conducted to understand the contributory factors surrounding the farm accident, traced back to the ‘root cause’. Experts relevant to farm safety accident causation in the agricultural industry have peer reviewed the IFACM. The accident causation process is complex. Accident prevention requires a comprehensive understanding of this complex process because to prevent the occurrence of accidents, the causes of accidents must be known. There is little research on the key causes and contributory factors of unsafe behaviours and accidents on Irish farms. The focus of this research is to gain a deep understanding of the causality of accidents on Irish farms. 
The results suggest that the IFACM framework is helpful for the analysis of the causes of accidents within the agricultural industry in Ireland. The research also suggests that there may be international applicability if further research is carried out. Furthermore, significant learning can be obtained from considering the underlying causes of accidents.
Keywords: farm safety, farm accidents, accident causation, root cause analysis
Procedia PDF Downloads 76
27842 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs
Authors: Wanessa G. Oliveira, Fernando C. Capovilla
Abstract:
Capovilla and Raphael’s Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla’s software SignTracking permits users to retrieve signs even when they do not know the corresponding gloss, and to discover the meaning of all 4,200 signs simply by clicking on graphic menus of the sign characteristics (phonemes). Duduchi and Capovilla have discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than are those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings can be discovered. Duduchi and Capovilla’s logarithmic model proved valid: the degree to which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio’s New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology
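Duduchi and Capovilla's logarithmic model lends itself to a simple sketch: score a sign by the arithmetic mean of the log of its phonemes' popularity. The phoneme names and frequency counts below are invented placeholders, not values from the dictionary:

```python
import math

def mean_log_popularity(sign_phonemes, phoneme_counts):
    """Arithmetic mean of the log popularity (corpus frequency) of a
    sign's component phonemes. Per the model, retrieval ease is an
    inverse function of this value: the higher the score, the more
    common the phonemes and the harder the sign is to pin down."""
    logs = [math.log(phoneme_counts[ph]) for ph in sign_phonemes]
    return sum(logs) / len(logs)

# Invented counts: how many corpus signs use each phoneme
counts = {"flat_hand": 900, "index_hand": 850, "claw_hand": 40, "chin": 35}
common_sign = ["flat_hand", "index_hand"]  # built from frequent phonemes
rare_sign = ["claw_hand", "chin"]          # built from rare, distinctive phonemes
```

Under the model, `rare_sign` scores lower and is therefore the easier of the two to retrieve through SignTracking's phoneme menus.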
Procedia PDF Downloads 344
27841 Undocumented Migrants on the Northern Border of Mexico: Social Imaginary, and Social Representations
Authors: César Enrique Jiménez Yañez, Yessica Martinez Soto
Abstract:
In the present work, the phenomenon of undocumented migration on the northern border of Mexico is analyzed through the graphic representation of the experience of people who migrate in an undocumented way to the United States. 33 of them drew what it meant for them to migrate. Our objective is to analyze the social phenomenon of migration through the drawings of migrants, using the concepts of social imaginary and social representations, identifying the different significant elements with which they symbolically build their experience. Drawing, as a methodological tool, will help us to understand the migratory experience beyond words.
Keywords: Mexico, social imaginary, social representations, undocumented migrants
Procedia PDF Downloads 399
27840 Forensic Methods Used for the Verification of the Authenticity of Prints
Authors: Olivia Rybak-Karkosz
Abstract:
This paper aims to present the results of scientific research on methods of forging art prints and their elements, such as the signature or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has seen significant interest in purchasing prints. They are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects might be achieved with drawings or xerox copies. The latter are easy to make using a home printer and are then offered on flea markets or internet auctions as genuine prints. This probable ease of forgery and, at the same time, the difficulty of distinguishing art print techniques were the main reasons why this research was undertaken. A lack of scientific methods dedicated to disclosing a forgery encouraged the author to verify the possibility of using forensic science methods known and used in other fields of expertise. The research methodology consisted of compiling representative forgery samples collected in selected museums based in Poland and a few in Germany and Austria. That allowed the author to present a typology of methods used to forge art prints. Given that banknotes and securities are among the most familiar examples of printed graphic design, it seems only appropriate to propose using methods of detecting counterfeit currency in print verification. These methods include the examination of ink, paper, and watermarks. On prints, additionally, signatures and imprints of stamps, etc., are forged as well, so the examination should be complemented with handwriting examination and forensic sphragistics. The paper concludes with a recommendation to conduct a complex analysis of authenticity with the participation of an art restorer, an art historian, and a forensic expert as the head of the team.
Keywords: art forgery, examination of an artwork, handwriting analysis, prints
Procedia PDF Downloads 127
27839 Theoretical Approaches to Graphic and Formal Generation from Evolutionary Genetics
Authors: Luz Estrada
Abstract:
The currents of evolutionary materialistic thought have argued that knowledge about an object is not obtained through the abstractive method. That is, the object cannot come to be understood if founded upon itself, nor does understanding take place through the encounter between form and matter. Following this affirmation, the research presented here identified as a problematic situation the absence of a comprehension of formal creation as a generative operation. This has been referred to as a recurrent lack in the production of objects and corresponds to the need to conceive the configurative process from the reality of its genesis. In this case, it is of interest to explore ways of creation that consider the object as if it were a living organism, as well as responding to the object’s experience as embodied in the designer, since it unfolds its genesis simultaneously with the ways of existence of those who are involved in the generative experience.
Keywords: architecture, theoretical graphics, evolutionary genetics, formal perception
Procedia PDF Downloads 116
27838 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations
Authors: Ram Mohan, Richard Haney, Ajit Kelkar
Abstract:
Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential for developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation will focus on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core, the characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance
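The sparse-matrix core the abstract identifies can be illustrated with a toy CSR (compressed sparse row) matrix-vector product. The indirect, data-dependent loads of `x` through `col_idx` are precisely what makes this kernel memory-bound and sensitive to GPU memory architecture; the matrix below is a made-up example, not data from the resin infusion model:

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix A stored in CSR format. The gather of
    x[col_idx[k]] is irregular and data-dependent, so performance is
    dominated by memory access patterns rather than arithmetic."""
    y = []
    for r in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[r], row_ptr[r + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

# Invented 2x3 matrix [[1, 0, 2], [0, 3, 0]] in CSR form
values = [1.0, 2.0, 3.0]   # nonzero entries, row by row
col_idx = [0, 2, 1]        # column of each nonzero
row_ptr = [0, 2, 3]        # where each row starts in values
y = csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0])
```

On a GPU, each row (or group of rows) maps to a thread, and uneven row lengths plus scattered column indices cause the load imbalance and uncoalesced memory traffic the paper analyzes.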
Procedia PDF Downloads 361
27837 The Use of Visual Drawing and Writing Techniques to Elicit Adult Perceptions of Sex Offenders
Authors: Sasha Goodwin
Abstract:
Public perceptions can play a crucial role in influencing criminal justice policy and legislation, particularly concerning sex offenders. Studies have found a close relationship between public perception and policy to manage the risks posed by sex offenders. A significant body of research on public perceptions of sex offenders primarily uses survey methods and standardised instruments such as the Community Attitude Towards Sex Offenders (CATSO) and Perceptions of Sex Offenders (PSO) scales, and finds a mostly negative and punitive attitude informed by common misconceptions. A transformative methodology from the emerging sub-field of visual criminology is one where the construction of offences and offenders is understood via novel ways of collecting and analysing data. This research paper examines public perceptions of sex offenders through a content analysis of drawings. The study aimed to disentangle the emotions, stereotypes, and myths embedded in public perceptions by analysing the graphic representations and specific characteristics depicted by participants. Preliminary findings highlight significant discrepancies between public perceptions and empirical profiles of sex offenders, shedding light on the misunderstandings surrounding this heterogeneous group. By employing visual data, this research contributes to a deeper understanding of the complex interplay between societal perceptions and the realities of sex offenders.
Keywords: emotions, figural drawings, public perception, sex offenders
Procedia PDF Downloads 68
27836 The Lexicographic Serial Rule
Authors: Thi Thao Nguyen, Andrew McLennan, Shino Takayama
Abstract:
We study the probabilistic allocation of finitely many indivisible objects to finitely many agents. Well-known allocation rules for this problem include random priority, the market mechanism proposed by Hylland and Zeckhauser [1979], and the probabilistic serial rule of Bogomolnaia and Moulin [2001]. We propose a new allocation rule, which we call the lexicographic (serial) rule, that is tailored for situations in which each agent's primary concern is to maximize the probability of receiving her favourite object. Three axioms, lex efficiency, lex envy-freeness, and fairness, are proposed and fully characterize the lexicographic serial rule. We also discuss how our axioms and the lexicographic rule are related to other allocation rules, particularly the probabilistic serial rule.
Keywords: efficiency, envy-free, lexicographic, probabilistic serial rule
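The probabilistic serial rule the abstract compares against can be sketched as Bogomolnaia and Moulin's simultaneous-eating algorithm. This illustrates the cited benchmark rule, not the authors' new lexicographic rule, whose definition the abstract does not reproduce:

```python
def probabilistic_serial(prefs):
    """Bogomolnaia-Moulin simultaneous-eating algorithm: over the time
    interval [0, 1], every agent 'eats' her favourite remaining object
    at unit speed. shares[i][o] is the probability agent i gets o.
    prefs maps agent -> objects listed from most to least preferred."""
    objects = {o for ranking in prefs.values() for o in ranking}
    remaining = {o: 1.0 for o in objects}
    shares = {i: {o: 0.0 for o in objects} for i in prefs}
    t = 0.0
    while t < 1.0 - 1e-12:
        # each agent targets her best object with supply left
        target = {}
        for i, ranking in prefs.items():
            for o in ranking:
                if remaining[o] > 1e-12:
                    target[i] = o
                    break
        if not target:
            break
        eaters = {}
        for i, o in target.items():
            eaters.setdefault(o, []).append(i)
        # advance until the horizon or the first targeted object runs out
        dt = min([1.0 - t] + [remaining[o] / len(e) for o, e in eaters.items()])
        for o, e in eaters.items():
            for i in e:
                shares[i][o] += dt
            remaining[o] -= dt * len(e)
        t += dt
    return shares

# Two agents with identical preferences split both objects equally
shares = probabilistic_serial({"ann": ["a", "b"], "bob": ["a", "b"]})
```

In this example both agents eat object a until it is exhausted at t = 0.5, then both eat b, so each ends with a half share of each object.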
Procedia PDF Downloads 145
27835 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality
Authors: David Sinfield, Thomas Cochrane, Marcos Steagall
Abstract:
While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are the ease of the user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused upon teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design-based learning, such as graphic design, illustration, photography, and design process, is heavily based on the traditional classroom environment, whereby student interaction takes place both at peer level and through teacher-based feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and thus we see a decline in the creative work of the students. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper will discuss the outcomes of a student project considering the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the visual communication model as it deals with the creative processing of ideas that needs to be shared in a collaborative manner.
This has proven to be a successful model over the years in the traditional form of design education, but it has more recently seen a shift in thinking as we move into a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality, when integrated into the learning experience, can improve the learning motivation and engagement of students. This paper will outline some of the processes used and the findings from the semester-long project that took place.
Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications
Procedia PDF Downloads 237
27834 The Portrayal of Violence Against Women in Bangladesh News Media: Seeing It Through Rumana Manzur’s Case
Authors: Zerrin Akter Anni
Abstract:
The media's role in shaping perceptions of violence against women (VAW) and their portrayal in news reporting significantly influences our understanding of this critical issue. My research delves into the portrayal of violence against women in mainstream media, using the prominent case of Dr. Rumana Manzur, a former UBC Fulbright Scholar from Bangladesh who suffered a brutal assault by her ex-husband in June 2011. Employing a qualitative research approach, this study uses an ethnographic media analysis method to scrutinize news reports of the aforementioned case from selected newspapers in Bangladesh. The primary objectives are to investigate how the popular news media in Bangladesh addresses the issue of violence against women and frames the victims of such violence. The findings of this research highlight that news media can perpetuate gender stereotypes and subtly shift blame onto the victim through various techniques, creating intricate interactions between the reader and the text. These techniques include sensationalized headlines, textual content, and graphic images. This victim-blaming process not only retraumatizes the survivor but also distorts the actual facts when presenting the case to a larger audience. Consequently, the representation of violence against women cases in media, particularly the portrayal of women as victims during reporting, significantly impacts our collective comprehension of this issue. In conclusion, this paper asserts that the Bangladeshi media, particularly news outlets, in conjunction with society, continue to follow a pattern of depicting gender-based violence in ways that devalue the image of women. This research underscores the need for critical analysis of media representations of violence against women cases, as they can perpetuate harmful stereotypes and hinder efforts to combat this pervasive problem. 
Therefore, the outcome of this research is a comprehension of the complex dynamics between media and violence against women, which is essential for fostering a more empathetic and informed society that actively works towards eradicating this problem.
Keywords: media representation, violence against women (vaw), ethnographic media analysis, victim-blaming, sensationalized headline
Procedia PDF Downloads 73
27833 The Impact of Technology on Architecture and Graphic Designs
Authors: Feby Zaki Raouf Fawzy
Abstract:
Nowadays, design and architecture are being affected by, and undergoing change with, the rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. The integration of design into the computational environment has revolutionized architecture, and unique perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, the analysis of the integration between technology and the history of the architectural process makes it possible to build a consensus on the idea of how architecture is to proceed. In this study, each period that occurs with the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture via technology are identified as important milestones, and predictions with regard to the future of architecture have been determined. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation via technology is supported by a detailed literature review, consolidated with an examination of the focal points of 20th-century architecture under the titles of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from the historical research between past and present that developments in architecture cannot keep up with the advancements in technology, and that recent developments in technology overshadow architecture; technology even decides the direction of architecture.
As a result, a scenario is presented with regard to the reach of technology in the future of architecture and the role of the architect.
Keywords: design and development the information technology architecture, enterprise architecture, enterprise architecture design result, TOGAF architecture development method (ADM)
Procedia PDF Downloads 68
27832 Principles and Practice of Therapeutic Architecture
Authors: Umedov Mekhroz, Griaznova Svetlana
Abstract:
The quality of life and well-being of patients, staff, and visitors are central to the delivery of health care. Architecture and design are becoming an integral part of the healing and recovery approach. The most significant point that can be implemented in hospital buildings is the therapeutic value of the artificial environment, including design and the integration of plants to bring the natural world into the healthcare environment. The hospital environment should feel like the comfort of home. The techniques that therapeutic architecture uses are very cheap but provide real benefit to patients, staff, and visitors, demonstrating that the difference lies not in cost but in design quality. The best environment is not necessarily the more expensive one: what matters is the deliberate use of light and color, the rational use of materials, and the flexibility of premises. All this forms innovative concepts in modern hospital architecture, whether in new construction, renovation, or expansion projects. The aim of the study is to identify the methods and principles of therapeutic architecture. The research methodology consists of studying and summarizing international experience from scientific research, literature, standards, methodological manuals, and project materials on the research topic. The result of the research is the development of graphic-analytical tables based on a systematic analysis of the processed information, and 3D visualizations of hospital interiors based on that information.
Keywords: therapeutic architecture, healthcare interiors, sustainable design, materials, color scheme, lighting, environment
Procedia PDF Downloads 123