Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12744


9174 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Authors: Morten Brøgger, Kim Wittchen

Abstract:

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as of potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage this complexity, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting by building type and building age is common, among other reasons because this information is often easily available, and this segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with a loss of detail: thermal characteristics are aggregated, while other characteristics that could affect the energy efficiency of a building are disregarded. Thus, a simplified representation of the building stock could come at the expense of model accuracy. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to emulate the average energy demands of the buildings they are meant to represent. This is done for the buildings' energy demands as a whole as well as for relevant sub-demands, both evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as of the accuracy lost in specific parts of the calculation due to the use of the archetype method.
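The accuracy check described in the abstract can be sketched in a few lines of Python: group buildings into (type, age) segments, let the segment mean play the role of the archetype, and measure how far each individual building's demand deviates from it. The grouping keys and demand figures below are hypothetical illustrations, not the authors' data or method.

```python
import numpy as np

def archetype_error(buildings):
    """buildings: list of (building_type, age_band, demand) tuples,
    demand e.g. in kWh/m2. Returns, per segment, the mean relative error
    of representing every building by the segment-average archetype."""
    segments = {}
    for btype, age, demand in buildings:
        segments.setdefault((btype, age), []).append(demand)
    errors = {}
    for seg, demands in segments.items():
        d = np.asarray(demands, dtype=float)
        archetype = d.mean()  # one archetype value represents the whole segment
        errors[seg] = float(np.mean(np.abs(d - archetype) / d))
    return errors
```

A segment whose buildings are homogeneous yields an error near zero; heterogeneous segments expose the detail lost to aggregation.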

Keywords: building stock energy modelling, energy-savings, archetype

Procedia PDF Downloads 154
9173 Barriers to Access among Indigenous Women Seeking Prenatal Care: A Literature Review

Authors: Zarish Jawad, Nikita Chugh, Karina Dadar

Abstract:

Introduction: This paper aims to identify barriers Indigenous women face in accessing prenatal care in Canada. It explores the differences in prenatal care received between Indigenous and non-Indigenous women. The objective is to look at changes or programs in Canada's healthcare system that could reduce barriers to accessing safe prenatal care for Indigenous women. Methods: A literature search of 12 papers was conducted using the following databases: PubMed, Medline, OVID, Google Scholar, and ScienceDirect. The studies included were written in English only and included Indigenous females between the ages of 19 and 35; review articles were excluded. Participants in the studies examined did not have any severe underlying medical conditions for the duration of the study, and the study designs included in the review are prospective cohort, cross-sectional, case report, and case-control studies. Results: The most significant barriers Indigenous women face in accessing prenatal care include a lack of culturally safe prenatal care, a lack of services in Indigenous communities, and the distance of prenatal facilities from Indigenous communities combined with the costs of transportation. Discussion: The study found three significant barriers Indigenous women face in accessing prenatal care in Canada: the geographical distribution of healthcare facilities, distrust between patients and healthcare professionals, and cultural sensitivity. Suggested solutions include building more birthing and prenatal care facilities in rural areas for Indigenous women, educating healthcare professionals on culturally sensitive healthcare, and involving Indigenous people in the decision-making process to reduce distrust and power imbalances. Conclusion: The involvement of Indigenous women and community leaders is important in making decisions regarding the implementation of effective healthcare and prenatal programs for Indigenous women. However, further research is required to understand the effectiveness of the proposed solutions and the barriers that make prenatal care less accessible for Indigenous women in Canada.

Keywords: indigenous, maternal health, prenatal care, barriers

Procedia PDF Downloads 152
9172 Window Opening Behavior in High-Density Housing Development in Subtropical Climate

Authors: Minjung Maing, Sibei Liu

Abstract:

This research discusses the results of a study of window opening behavior in large housing developments in the high-density megacity of Hong Kong. The methods used for the study involved field observations using photo documentation of the four cardinal elevations (north, south, east, and west) of two large housing developments in a very dense urban area of approximately 46,000 persons per square kilometre within the city of Hong Kong. The targeted housing developments (A and B) are large lower-income public housing estates, each with a population of about 13,000. However, the mean income level in development A is about 40% higher than in development B, and home ownership is 60% in development A and 0% in development B. Mapping of the surrounding amenities and the layout of the developments was also studied to understand the activities available to the residents. The photo documentation of the elevations was carried out from November 2016 to February 2018 to cover the full spectrum of seasons, in both the morning and the afternoon. From the photographs, window opening behavior was measured by counting the number of windows opened as a percentage of all the windows on that façade. For each survey date, weather data (temperature, humidity, and wind speed) was recorded from weather stations located in the same region. To further understand the behavior, simulation studies of the microclimate conditions of the housing developments were conducted using ENVI-met, a simulation tool widely used by researchers studying urban climate. Four major conclusions can be drawn from the data analysis and simulation results. Firstly, there is little change in the amount of window opening across the seasons within a temperature range of 10 to 35 degrees Celsius. This means that people who tend to open their windows have consistent window opening behavior throughout the year and a high tolerance of indoor thermal conditions. Secondly, for all four elevations, the lower-income development B opened more windows (almost twice as many units) than the higher-income development A, indicating that window opening behavior correlates strongly with income level. Thirdly, there is a lack of correlation between outdoor horizontal wind speed and window opening behavior, as changes in wind speed do not seem to affect the action of opening windows in most conditions. Similarly, vertical wind speed also cannot explain the window opening behavior of occupants. Fourthly, there is a slightly higher average of window opening on the south elevation than on the north elevation, which may be because the south elevation is well shaded from high-angle sun during the summer while admitting heat from lower-angle sun during the winter. These findings provide insight into how to better design urban environments and indoor thermal environments for a liveable high-density city.

Keywords: high-density housing, subtropical climate, urban behavior, window opening

Procedia PDF Downloads 125
9171 Evaluation of the Gasification Process for the Generation of Syngas Using Solid Waste at the Autónoma de Colombia University

Authors: Yeraldin Galindo, Soraida Mora

Abstract:

Solid urban waste represents one of the largest sources of global environmental pollution due to the large quantities produced every day; its elimination is thus a major problem for environmental authorities, who must look for alternatives that reduce the volume of waste, ideally with the possibility of energy recovery. At the Autónoma de Colombia University, approximately 423.27 kg/d of solid waste are generated, mainly paper, cardboard, and plastic. A large share of this waste ends up in the city's sanitary landfill, wasting its potential energy content; in addition, the emissions generated by its collection and transport increase atmospheric pollutants. One of the alternative processes used in recent years to generate electrical energy from solid waste such as paper, cardboard, plastic and, mainly, organic waste or biomass, replacing fossil fuels, is gasification. Gasification is a thermal conversion process for biomass whose objective is to generate a combustible gas through a series of chemical reactions driven by the addition of heat and reaction agents. This project was developed to give an energetic use to the waste (paper, cardboard, and plastic) produced inside the university, using it to generate a synthesis gas with a gasifier prototype. The gas produced was evaluated to determine its suitability for electricity generation or as a raw material for the chemical industry. Air was used as the gasifying agent. The synthesis gas was characterized by gas chromatography performed by the Chemical Engineering Laboratory of the National University of Colombia. Based on the results obtained, it was concluded that the generated gas is of acceptable quality in terms of the concentration of its components, but it has a low calorific value. For this reason, the syngas generated in this project is not viable for the production of electrical energy, but it is suitable for the production of methanol via the Fischer-Tropsch process.

Keywords: alternative energies, gasification, gasifying agent, solid urban waste, syngas

Procedia PDF Downloads 258
9170 Examining Employee Social Intrapreneurial Behaviour (ESIB) in Kuwait: Pilot Study

Authors: Ardita Malaj, Ahmad R. Alsaber, Bedour Alboloushi, Anwaar Alkandari

Abstract:

Organizations worldwide, particularly in Kuwait, are concerned with implementing a progressive workplace culture and fostering social innovation behaviours. The main aim of this research is to establish a thorough understanding of the relationships between an innovative organizational culture, employee intrapreneurial behaviour, authentic leadership, employee job satisfaction, and employee job commitment in the manufacturing sector of Kuwait, a developed economy. The literature review analyses the core concepts and related areas by scrutinizing their definitions, dimensions, and importance, and it uncovered major gaps in existing research. This study examines the reliability and validity of a newly developed questionnaire to determine its suitability for a large-scale investigation. A pilot investigation was carried out with a sample of 36 respondents selected randomly from a pool of 223. SPSS was used to calculate percentages for the participants' demographic characteristics, assess the credibility of the measurements, evaluate internal consistency, validate all agreements, and compute Pearson's correlations. The results indicated that the majority of participants were male (66.7%), aged between 35 and 44 (38.9%), and held a bachelor's degree (58.3%). Approximately 94.4% of the participants were employed full-time. 72.2% of the participants work in the electrical, computer, and ICT sector, whilst 8.3% work in the metal industry. Among departments, the human resource department had the highest level of engagement, at 13.9% of the total. Most participants (36.1%) had intermediate or advanced levels of experience, whilst 21% were classified as entry-level. Furthermore, 8.3% of respondents were first-level management, 22.2% middle management, and 16.7% executive or senior management. Around 19.4% of the participants have over a decade of professional experience. Pearson's correlation coefficients for all 5 components range from 0.4009 to 0.7183. The results indicate that all elements of the questionnaire were effectively verified, with Cronbach's alpha values predominantly exceeding 0.6, the criterion commonly accepted by researchers. Work on larger-scale testing and analysis can therefore continue.
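The two reliability statistics this pilot relies on, Cronbach's alpha and Pearson's correlation, are straightforward to compute. Below is a minimal Python/NumPy sketch (the item scores in the test are invented examples, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.
    scores: 2-D array, rows = respondents, columns = questionnaire items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def pearson_r(x, y):
    """Pearson correlation between two component score vectors."""
    return float(np.corrcoef(x, y)[0, 1])
```

Values of alpha above 0.6 (the threshold cited in the abstract) indicate that the items of a component move together consistently enough for large-scale use.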

Keywords: pilot study, ESIB, innovative organizational culture, Kuwait, validation

Procedia PDF Downloads 32
9169 Coffee Consumption and Glucose Metabolism: A Systematic Review of Clinical Trials

Authors: Caio E. G. Reis, Jose G. Dórea, Teresa H. M. da Costa

Abstract:

Objective: Epidemiological data show an inverse association of coffee consumption with the risk of type 2 diabetes mellitus. However, the clinical effects of coffee consumption on glucose metabolism biomarkers remain controversial. Thus, this paper reviews clinical trials that evaluated the effects of coffee consumption on glucose metabolism. Research Design and Methods: We identified studies published until December 2014 by searching electronic databases and reference lists. We included randomized clinical trials in which the intervention group received caffeinated and/or decaffeinated coffee, the control group received water or placebo treatments, and biomarkers of glucose metabolism were measured. The Jadad Score was applied to evaluate study quality, and studies scoring ≥ 3 points were considered for the analyses. Results: Seven clinical trials (237 subjects in total) were analyzed, involving healthy, overweight, and diabetic adults. The studies were divided into short-term (1 to 3 h) and long-term (2 to 16 weeks) duration. The results for short-term studies showed that caffeinated coffee consumption may increase the area under the curve for the glucose response, while in long-term studies caffeinated coffee may improve glycemic metabolism by reducing the glucose curve and increasing the insulin response. These results suggest that the benefits of coffee consumption occur in the long term, as reflected in the reduction of type 2 diabetes mellitus risk in epidemiological studies. Nevertheless, until the relationship between long-term coffee consumption and type 2 diabetes mellitus is better understood and the mechanisms involved are identified, it is premature to make claims about coffee preventing type 2 diabetes mellitus. Conclusion: The findings suggest that caffeinated coffee may impair glucose metabolism in the short term, but in the long term the studies indicate a reduction of type 2 diabetes mellitus risk.
More clinical trials with comparable methodology are needed to unravel this paradox.

Keywords: coffee, diabetes mellitus type 2, glucose, insulin

Procedia PDF Downloads 466
9168 Banking Union: A New Step towards Completing the Economic and Monetary Union

Authors: Marijana Ivanov, Roman Šubić

Abstract:

The single rulebook, together with the Single Supervisory Mechanism and the Single Resolution Mechanism as the two main pillars of the banking union, represents an important step towards completing the Economic and Monetary Union. It should provide consistent application of common rules and administrative standards for the supervision, recovery, and resolution of banks, with the final aim that the former practice of bail-out is replaced with a bail-in system through which bank failures are resolved with banks' own funds, i.e., with minimal costs for taxpayers and the real economy. It should reduce the financial fragmentation recorded in the crisis years as a result of divergent risk premia, lending activity, and interest rates between the core and the periphery. In addition, it should strengthen the effectiveness of the monetary transmission channels, in particular the credit channels and the flows of liquidity on the single interbank money market. However, contrary to all the positive expectations related to the future functioning of the banking union, low and unbalanced economic growth rates remain a challenge for maintaining financial stability in the euro area, and this problem cannot be resolved by single supervision alone. In many countries, bank assets exceed GDP several times over, and large banks remain a matter of concern because of their systemic importance for individual countries and the euro zone as a whole. The creation of the SSM and the SRM should increase the transparency of the banking system in the euro area and restore the confidence that was disturbed during the depression. It provides a new opportunity to strengthen economic and financial systems in the peripheral countries. On the other hand, there is a potential threat that the future focus of the ECB, the resolution mechanism, and other relevant institutions will be oriented predominantly towards large and significant banks (half of which operate in the core and most important euro area countries), while it is questionable to what extent the common resolution funds will be used for the rescue of less significant institutions.

Keywords: banking union, financial integration, single supervisory mechanism (SSM)

Procedia PDF Downloads 470
9167 Translanguaging and Cross-Language Analyses in Writing and Oral Production with Multilinguals: A Systematic Review

Authors: Maryvone Cunha de Morais, Lilian Cristine Hübner

Abstract:

Based on a translanguaging theoretical approach, which considers languages not as separate entities but as an entire repertoire available to bilingual individuals, this systematic review analyzed the methods (aims, samples investigated, types of stimuli, and analyses) adopted by studies on translanguaging practices associated with written and oral tasks (separately or integrated) in bilingual education. The PRISMA criteria for systematic reviews were adopted, with the descriptors "translanguaging", "bilingual education", and/or "written and oral tasks" used to search the PubMed/Medline, Lilacs, Eric, Scopus, PsycINFO, and Web of Science databases for articles published between 2017 and 2021. 280 records were found, and after the inclusion/exclusion criteria were applied, 24 articles were considered for this analysis. The results showed that translanguaging practices were investigated in four studies focused on written production, ten focused on oral production, and ten focused on both written and oral production. The majority of the studies followed a qualitative approach, while five attempted to study translanguaging with quantitative statistical measures. Several types of methods were used to investigate translanguaging practices in written and oral production, with different approaches and tools, indicating that the methods are still in development. Moreover, the findings showed that students' interactions have received significant attention, and studies have been developed not just in language classes in bilingual education but also in diverse educational and theoretical contexts such as Content and Language Integrated Learning, task repetition, science classes, collaborative writing, storytelling, peer feedback, Speech Act theory and collective thinking, language ideologies, conversational analysis, and discourse analysis. The studies, whether focused on written tasks, oral tasks, or both, carry significant research and pedagogical implications, grounded in the view of integrated languages in bi- and multilinguals.

Keywords: bilingual education, oral production, translanguaging, written production

Procedia PDF Downloads 126
9166 Effectiveness of Technology Enhanced Learning in Orthodontic Teaching

Authors: Mohammed Shaath

Abstract:

Aims: Technological advancements in teaching and learning have improved significantly over the past decade and have been incorporated in institutions to aid the learner's experience. This review aims to assess whether Technology Enhanced Learning (TEL) pedagogy is more effective at improving students' attitudes and knowledge retention in orthodontic training than traditional methods. Methodology: The searches comprised systematic reviews (SRs) comparing TEL with traditional teaching methods in the following databases: PubMed, SCOPUS, Medline, and Embase. One researcher performed the screening, data extraction, and analysis and assessed risk of bias and quality using A Measurement Tool to Assess Systematic Reviews 2 (AMSTAR-2). Kirkpatrick's 4-level evaluation model was used to evaluate educational value. Results: A total of 34 SRs were identified; after removal of duplicates and irrelevant SRs, 4 fit the inclusion criteria. On Level 1, students responded positively to TEL methods, although the harder a platform was to use, the less favourably it was received; nonetheless, students still showed high levels of acceptability. Level 2 showed no significant overall advantage in knowledge gain for TEL methods, although one SR showed that certain areas of study within orthodontics deliver a statistical improvement with TEL. Level 3 was the least reported on; results showed that, without time restrictions, TEL methods may be advantageous. Level 4 showed that both methods are equally effective, but TEL has the potential to overtake traditional methods in the future as a form of active, student-centered learning. Conclusion: TEL has a high level of acceptability and the potential to improve learning in orthodontics. Current reviews could be improved, but the biggest issue to address lies in the primary studies, which show a lower level of evidence and heterogeneity in their results. As it stands, replacing traditional methods with TEL cannot be fully supported in an evidence-based manner. However, the potential of TEL methods has been recognized, and there is already some evidence that they can be more effective in some aspects of learning, catering to a more technology-savvy generation.

Keywords: TEL, orthodontic, teaching, traditional

Procedia PDF Downloads 42
9165 Structured Cross System Planning and Control in Modular Production Systems by Using Agent-Based Control Loops

Authors: Simon Komesker, Achim Wagner, Martin Ruskowski

Abstract:

In times of volatile markets with fluctuating demand and uncertain global supply chains, flexible production systems are the key to the efficient implementation of a desired production program. In this publication, the authors present a holistic information concept that takes various influencing factors into account in order to operate towards the global optimum. A strategy is developed for implementing multi-level planning for a flexible, reconfigurable production system with an alternative production concept in the automotive industry. The main contribution of this work is a system structure mixing central and decentralized planning and control, evaluated in a simulation framework. The information system structure of current production systems in the automotive industry is rigidly and hierarchically organized in monolithic systems. The production program is created rule-based, with the premise of achieving a uniform cycle time. This program then provides the information basis for execution in subsystems at the station and process execution levels. In today's era of mixed-(car-)model factories, complex conditions and conflicts arise in achieving logistics, quality, and production goals. There is no provision for feeding results back from the process execution level (resources) and the process-supporting (quality and logistics) systems for reconsideration in the planning systems. To enable a robust production flow, the complexity of production system control is artificially reduced by the line structure, which results, for example, in material-intensive processes (buffers and safety stocks; the two-container principle applied even to different variants). The limited degrees of freedom of line production have produced the principle of progress-figure control, which results in one-time sequencing, sequential order release, and relatively inflexible capacity control. As a consequence, modularly structured production systems, such as modular production according to known approaches with more degrees of freedom, are currently difficult to represent in terms of information technology. The remedy is an information concept that supports cross-system and cross-level information processing for centralized and decentralized decision-making. Through an architecture of hierarchically organized but decoupled subsystems, the paradigm of hybrid control is applied, and a holonic manufacturing system is offered that enables flexible information provisioning and processing support. In this way, the influences of quality, logistics, and production processes can be linked holistically with the advantages of mixed centralized and decentralized planning and control. Modular production systems also require modularly networked information systems with semi-autonomous optimization for a robust production flow. Dynamic prioritization of key figures between subsystems should lead the production system to an overall optimum. The tasks and goals of the quality, logistics, process, resource, and product areas in a cyber-physical production system are designed as an interconnected multi-agent system. The result is an alternative system structure that executes centralized process planning and decentralized processing. Agent-based manufacturing control is used to enable different flexibility and reconfigurability states and manufacturing strategies in order to find optimal partial solutions of subsystems that lead to a near-global optimum for hybrid planning. This allows robust, near-to-plan execution with integrated quality control and intralogistics.
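The interplay of centralized planning and decentralized agent scoring can be illustrated with a minimal Python sketch. The agent names, job attributes, and weights are hypothetical stand-ins, not the authors' control architecture: each subsystem agent (quality, logistics, ...) scores a job locally, and a central planning step orders the jobs by the combined score.

```python
from typing import Dict, List

class SubsystemAgent:
    """Semi-autonomous agent (e.g. quality or logistics) that scores a job
    from its local point of view via attribute weights."""
    def __init__(self, name: str, weights: Dict[str, float]):
        self.name = name
        self.weights = weights
    def score(self, job: Dict) -> float:
        return sum(w * job.get(attr, 0.0) for attr, w in self.weights.items())

def plan_sequence(jobs: List[Dict], agents: List[SubsystemAgent]) -> List[str]:
    """Central planning step: order jobs by the summed decentralized agent
    scores, highest combined priority first."""
    return [j["id"]
            for j in sorted(jobs, key=lambda j: -sum(a.score(j) for a in agents))]
```

The point of the sketch is the division of labour: scoring stays local to each subsystem, while only the aggregation and sequencing decision is centralized.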

Keywords: holonic manufacturing system, modular production system, planning and control, system structure

Procedia PDF Downloads 169
9164 About the Case Portfolio Management Algorithms and Their Applications

Authors: M. Chumburidze, N. Salia, T. Namchevadze

Abstract:

This work deals with case processing problems in business. The task of strategic management of a credit requirements case portfolio is discussed. An information model of credit requirements as a binary tree diagram is considered. Algorithms for prioritizing clusters of cases in business have been investigated, and an implementation of priority queues to support case management operations is presented, together with the corresponding pseudocode for a programming application. The tools applied in this development are based on binary tree ordering algorithms, optimization theory, and business management methods.
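A priority queue supporting case management operations of the kind described can be sketched in Python on top of the standard-library binary heap. The class and field names are illustrative, not the authors' pseudocode:

```python
import heapq

class CasePortfolioQueue:
    """Priority queue over a case portfolio: a lower priority number means a
    more urgent credit-requirement case."""
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order for equal priorities
    def push(self, priority: int, case_id: str) -> None:
        heapq.heappush(self._heap, (priority, self._counter, case_id))
        self._counter += 1
    def pop(self) -> str:
        """Remove and return the most urgent case."""
        return heapq.heappop(self._heap)[2]
    def __len__(self) -> int:
        return len(self._heap)
```

Both push and pop run in O(log n), which is what makes the priority-queue approach attractive for large portfolios.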

Keywords: credit network, case portfolio, binary tree, priority queue, stack

Procedia PDF Downloads 150
9163 Study of End-Effect Inhibition Based on AR-Model Prediction Combining Data Extension and a Window Function

Authors: Pan Hongxia, Wang Zhenhua

Abstract:

In this paper, the end effect arising in empirical mode decomposition (EMD) is effectively inhibited by combining two methods: data extension, in which an AR model is used to predict the continuation of the signal, and a window function method. Simulation on a synthetic signal yielded the desired effect, and the method was then applied to gearbox test data, where it also performed well, improving the calculation accuracy of the subsequent data processing. This lays a good foundation for gearbox fault diagnosis under various working conditions.
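The AR-based data extension step can be sketched as follows: fit AR coefficients to the signal by least squares, then recursively predict samples beyond the endpoint before EMD is applied. This is a minimal illustration of the general technique, not the authors' specific implementation; order and extension length are arbitrary choices here.

```python
import numpy as np

def ar_extend(x, order=10, n_extend=50):
    """Extend signal x forward by n_extend samples using an AR(order) model
    fitted by least squares -- a simple endpoint-effect mitigation for EMD."""
    x = np.asarray(x, dtype=float)
    # Regression: x[t] ~ sum_k a[k] * x[t-1-k], most recent sample first.
    rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
    A = np.array(rows)
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    ext = list(x)
    for _ in range(n_extend):
        # Predict the next sample from the last `order` samples.
        ext.append(np.dot(a, ext[-1:-order - 1:-1]))
    return np.array(ext)
```

After the intrinsic mode functions are extracted from the extended signal, the artificial extension is discarded, so the distortion near the true endpoints is reduced.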

Keywords: gearbox, fault diagnosis, AR model, end effect

Procedia PDF Downloads 366
9162 Development of Technologies for Biotransformation of Aquatic Biological Resources for the Production of Functional, Specialized, Therapeutic, Preventive, and Microbiological Products

Authors: Kira Rysakova, Vitaly Novikov

Abstract:

An improved method of obtaining enzymatic collagen hydrolysate from the tissues of marine hydrobionts is proposed, which allows hydrolysate to be obtained without prior isolation of pure collagen. The method can be used to isolate enzymatic collagen hydrolysate from the waste of industrial processing of red king crab and from non-traditional objects, marine holothurians. Comparative analysis of the collagen hydrolysates has shown the possibility of their use in a number of nutrient media, although this requires additional optimization of their composition and biological tests on wide sets of test strains of microorganisms.

Keywords: collagen hydrolysate, marine hydrobionts, red king crab, marine holothurians, enzymes, exclusive HPLC

Procedia PDF Downloads 169
9161 Investigation of Oscillation Mechanism of a Large-scale Solar Photovoltaic and Wind Hybrid Power Plant

Authors: Ting Kai Chia, Ruifeng Yan, Feifei Bai, Tapan Saha

Abstract:

This research presents a real-world power system oscillation incident in 2022 that originated from a hybrid solar photovoltaic (PV) and wind renewable energy farm with a rated capacity of approximately 300 MW in Australia. The voltage and reactive power outputs recorded at the point of common coupling (PCC) oscillated in the sub-synchronous frequency region, and the oscillation was sustained for approximately five hours in the network. The reactive power oscillation gradually increased over time and reached a recorded maximum of approximately 250 MVar peak-to-peak (from inductive to capacitive). The network service provider was not able to quickly identify the location of the oscillation source because the issue was widespread across the network. After the incident, the original equipment manufacturer (OEM) concluded that the oscillation was caused by incorrect setting recovery of the hybrid power plant controller (HPPC) in the voltage and reactive power control loop after a loss-of-communication event. The voltage controller normally outputs a reactive power (Q) reference value to the Q controller, which controls the Q dispatch setpoint of the PV and wind plants in the hybrid farm. Meanwhile, a feed-forward (FF) configuration is used to bypass the Q controller in case of a loss of communication. Further study found that the FF control mode was still engaged when communication was re-established, which ultimately resulted in the oscillation event. However, there was no detailed explanation of why the FF control mode can cause instability in the hybrid farm, and the event was not duplicated in simulation to analyze the root cause of the oscillation. Therefore, this research aims to model and replicate the oscillation event in a simulation environment and investigate the underlying behavior of the HPPC and the consequent oscillation mechanism during the incident.
The outcome of this research will provide significant benefits to the safe operation of large-scale renewable energy generators and power networks.
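
The mode-handover logic described above can be sketched as a small state machine. The class and method names below are illustrative assumptions, not the OEM's implementation; the incident corresponds to the FF flag not being cleared when communication returns.

```python
# A hedged sketch of the HPPC control-mode handover: route the voltage
# controller's Q reference through the closed-loop Q controller while
# communication is healthy, and fall back to a feed-forward (FF) bypass
# when it is lost. All names here are hypothetical.

class HPPC:
    def __init__(self):
        self.ff_mode = False        # healthy default: closed-loop control

    def on_comms_lost(self):
        self.ff_mode = True         # bypass the Q controller

    def on_comms_restored(self):
        self.ff_mode = False        # the reset that was missing in the incident

    def dispatch(self, q_reference, q_controller_output):
        # FF mode forwards the raw reference, skipping closed-loop control.
        return q_reference if self.ff_mode else q_controller_output

hppc = HPPC()
hppc.on_comms_lost()
hppc.on_comms_restored()
print(hppc.dispatch(10.0, 8.0))  # 8.0: closed-loop output, as intended
```

If `on_comms_restored` is never called (the failure mode in the incident), `dispatch` keeps returning the uncontrolled reference after reconnection.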

Keywords: PV, oscillation, modelling, wind

Procedia PDF Downloads 37
9160 The Platform for Digitization of Georgian Documents

Authors: Erekle Magradze, Davit Soselia, Levan Shughliashvili, Irakli Koberidze, Shota Tsiskaridze, Victor Kakhniashvili, Tamar Chaghiashvili

Abstract:

Since the beginning of active publishing activity in Georgia, voluminous printed material has been accumulated, the digitization of which is an important task. Digitized materials will be available to the audience, and it will be possible to search their text and conduct various factual research. Digitization involves scanning documents, extracting text from the scans, and passing the text through a language model for the corresponding language to detect inaccuracies and grammatical errors. Implementing these stages requires a unified, scalable, and automated platform, where the digital service developed for each stage performs the task assigned to it; at the same time, it must be possible to develop these services dynamically so that there is no interruption in the operation of the platform.
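
The staged design above can be sketched as a chain of independent services, so each stage can be swapped or redeployed without interrupting the rest of the platform. The stage functions below are hypothetical placeholders, not the platform's actual services.

```python
# A minimal sketch of a staged digitization pipeline with three assumed
# stages: OCR, spell-checking, and grammar correction. Each stage is an
# independent function (in production, an independent service).

def ocr_stage(scanned_page: bytes) -> str:
    # Placeholder: a real service would call an OCR engine here.
    return scanned_page.decode("utf-8")

def spellcheck_stage(text: str) -> str:
    # Placeholder: a real service would query a Georgian language model.
    return text.replace("  ", " ")

def grammar_stage(text: str) -> str:
    # Placeholder for grammar correction; here it just trims whitespace.
    return text.strip()

PIPELINE = [ocr_stage, spellcheck_stage, grammar_stage]

def digitize(scanned_page: bytes) -> str:
    result = scanned_page
    for stage in PIPELINE:
        result = stage(result)
    return result
```

Because stages share only a simple text interface, a stage can be replaced (for example, by a better OCR model) by editing the `PIPELINE` list alone.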

Keywords: NLP, OCR, BERT, Kubernetes, transformers

Procedia PDF Downloads 145
9159 Determination of Myocardial Function Using Heart Accumulated Radiopharmaceuticals

Authors: C. C. D. Kulathilake, M. Jayatilake, T. Takahashi

Abstract:

The myocardium is composed of specialized muscle which relies mainly on fatty acid and sugar metabolism and contributes widely to heart function. Changes in the cardiac energy-producing system during heart failure have been demonstrated using autoradiography techniques. This study focused on evaluating sugar and fatty acid metabolism in the myocardium, as the heart's energy-supply system, using heart-accumulated radiopharmaceuticals. Two sets of autoradiographs of heart cross-sections of Lewis male rats were analyzed, and time-accumulation curves were obtained using MATLAB image processing software to evaluate fatty acid and sugar metabolic functions.
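
The time-accumulation analysis can be illustrated as follows: from a stack of autoradiograph frames, the curve is the mean tracer intensity inside a region of interest (ROI) at each time point. The synthetic stack and mask below are illustrative (the study used MATLAB rather than Python).

```python
import numpy as np

# Sketch of a time-accumulation curve: mean ROI intensity per frame.
def time_accumulation_curve(frames, roi_mask):
    # frames: (T, H, W) array of images; roi_mask: (H, W) boolean mask.
    return np.array([frame[roi_mask].mean() for frame in frames])

# Synthetic data: 5 frames whose intensity grows linearly with time.
frames = np.stack([np.full((4, 4), t, dtype=float) for t in range(5)])
roi = np.zeros((4, 4), dtype=bool)
roi[1:3, 1:3] = True                       # a 2x2 region of interest

print(time_accumulation_curve(frames, roi))  # [0. 1. 2. 3. 4.]
```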

Keywords: autoradiographs, fatty acid, radiopharmaceuticals, sugar

Procedia PDF Downloads 451
9158 Refined Edge Detection Network

Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni

Abstract:

Edge detection is one of the most challenging tasks in computer vision, due to the difficulty of detecting edges or boundaries in real-world images that contain objects of different types and scales, such as trees and buildings, against varied backgrounds. Edge detection is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved edge detection compared with traditional methods such as Sobel and Canny. However, images of complex scenes still represent a challenge for these methods. In addition, the edges detected by existing approaches suffer from unrefined results, with output images containing many erroneous edges. To overcome this, in this paper, a refined edge detection network (RED-Net) is proposed using the mechanism of residual learning. By maintaining the high resolution of edges during the training process and conserving the resolution of the edge image through the network stages, we connect the pooling outputs at each stage with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for homogeneous regions in the image. The proposed method is evaluated on the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform existing edge detection networks in terms of performance metrics and quality of the output images.
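
The residual idea referenced above, a stage's output added to the previous layer's output via a skip connection, followed by an affine normalization, can be sketched generically. This is not the authors' RED-Net architecture; the linear map standing in for a convolution and all shapes are assumptions for illustration.

```python
import numpy as np

def affine_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize features, then apply a learnable affine transform
    # (scale gamma, shift beta), as in affine batch normalization.
    mu, var = x.mean(), x.var()
    return gamma * (x - mu) / np.sqrt(var + eps) + beta

def residual_stage(x, weights):
    # Stand-in for a conv layer: a simple linear map of the features.
    f_x = x @ weights
    # Skip connection: add the input back, preserving its resolution.
    return affine_norm(f_x + x)

rng = np.random.default_rng(42)
x = rng.standard_normal((8, 8))        # toy feature map
w = 0.1 * rng.standard_normal((8, 8))  # toy layer weights
y = residual_stage(x, w)
print(y.shape)  # (8, 8): resolution is conserved through the stage
```

The skip connection means the stage only has to learn a residual correction `f(x)` rather than the full mapping, which is what makes deep refinement stacks trainable.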

Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone

Procedia PDF Downloads 102
9157 Using Audio-Visual Aids and Computer-Assisted Language Instruction (CALI) to Overcome Learning Difficulties of Listening in Students of Special Needs

Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Ayman Al Yaari, Montaha Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa

Abstract:

Background & Aims: Audio-visual aids and computer-assisted language instruction (CALI) have been documented to improve receptive skills, namely listening, in typically developing students. The improvement in listening has been attributed to better understanding of other interlocutors' speech, but recent experiments have suggested that audio-visual aids and CALI should be tested on the listening of students with special needs to see the effects of the former on the latter. This investigation describes the effect of audio-visual aids and CALI on the performance of these students. Methods: Pre- and post-tests were administered to 40 students with special needs of both sexes, aged between 8 and 18 years, at al-Malādh school for students of special needs. A comparison was made between this group of students and a similar group (control group). Whereas the former group underwent a listening course using audio-visual aids and CALI, the latter studied the same course with the same speech-language therapist (SLT) using the classical method. The outcomes of the two tests for the two groups were qualitatively and quantitatively analyzed. Results: Significant improvement in performance was found in the first group (treatment group) (post-test = 72.45% vs. pre-test = 25.55%) in comparison to the second (control) (post-test = 25.55% vs. pre-test = 23.72%). The females' scores were higher than the males' (1487 vs. 1411). These results support the necessity of using audio-visual aids and CALI in teaching listening at schools for students with special needs.

Keywords: listening, receptive skills, audio-visual aids, CALI, special needs

Procedia PDF Downloads 48
9156 High Level Expression of Fluorinase in Escherichia Coli and Pichia Pastoris

Authors: Lee A. Browne, K. Rumbold

Abstract:

The first fluorinating enzyme, 5'-fluoro-5'-deoxyadenosine synthase (fluorinase), was isolated from the soil bacterium Streptomyces cattleya. Such an enzyme, with the ability to catalyze C-F bond formation, presents great potential as a biocatalyst. Naturally fluorinated compounds are extremely rare in nature; as a result, the number of fluorinases identified remains relatively small, and the field of fluorination is almost completely synthetic. However, with the increasing demand for fluorinated organic compounds of commercial value in the agrochemical, pharmaceutical and materials industries, it has become necessary to utilize biologically based methods such as biocatalysts. A key step in this process is the large-scale production of the fluorinase enzyme in quantities sufficient for industrial applications. Thus, this study aimed to optimize expression of the fluorinase enzyme in both prokaryotic and eukaryotic expression systems in order to obtain high protein yields. The fluorinase gene was cloned into the pET 41b(+) and pPinkα-HC vectors and used to transform the expression hosts E. coli BL21(DE3) and Pichia pastoris (PichiaPink™ strains), respectively. Expression trials were conducted to select optimal conditions in both expression systems. Fluorinase catalyses a reaction between S-adenosyl-L-methionine (SAM) and fluoride ion to produce 5'-fluoro-5'-deoxyadenosine (5'-FDA) and L-methionine. The activity of the enzyme was determined by HPLC, measuring the reaction product 5'-FDA. A gradient mobile phase, from 95:5 v/v 50 mM potassium phosphate buffer to a final mobile phase of 80:20 v/v 50 mM potassium phosphate buffer and acetonitrile, was used. This resulted in the complete separation of SAM and 5'-FDA, which eluted at 1.3 minutes and 3.4 minutes, respectively, proving that the fluorinase enzyme was active.
Optimization of fluorinase expression was successful, with high expression levels achieved in both E. coli and PichiaPink™. Protein production will be scaled up in PichiaPink™ using fermentation to achieve large-scale production. High-level expression of the protein is essential in biocatalysis to make enzymes available for industrial applications.

Keywords: biocatalyst, expression, fluorinase, PichiaPink™

Procedia PDF Downloads 552
9155 The State of Oral Health after COVID-19 Lockdown: A Systematic Review

Authors: Faeze Omid, Morteza Banakar

Abstract:

Background: The COVID-19 pandemic has had a significant impact on global health and healthcare systems, including oral health. The lockdown measures implemented in many countries have led to changes in oral health behaviors, access to dental care, and the delivery of dental services. However, the extent of these changes and their effects on oral health outcomes remain unclear. This systematic review aims to synthesize the available evidence on the state of oral health after the COVID-19 lockdown. Methods: We conducted a systematic search of electronic databases (PubMed, Embase, Scopus, and Web of Science) and grey literature sources for studies reporting on oral health outcomes after the COVID-19 lockdown. We included studies published in English between January 2020 and March 2023. Two reviewers independently screened the titles, abstracts, and full texts of potentially relevant articles and extracted data from included studies. We used a narrative synthesis approach to summarize the findings. Results: Our search identified 23 studies from 12 countries, including cross-sectional surveys, cohort studies, and case reports. The studies reported on changes in oral health behaviors, access to dental care, and the prevalence and severity of dental conditions after the COVID-19 lockdown. Overall, the evidence suggests that the lockdown measures had a negative impact on oral health outcomes, particularly among vulnerable populations. There were decreases in dental attendance, increases in dental anxiety and fear, and changes in oral hygiene practices. Furthermore, there were increases in the incidence and severity of dental conditions, such as dental caries and periodontal disease, and delays in the diagnosis and treatment of oral cancers.
Conclusion: The COVID-19 pandemic and associated lockdown measures have had significant effects on oral health outcomes, with negative impacts on oral health behaviors, access to care, and the prevalence and severity of dental conditions. These findings highlight the need for continued monitoring and interventions to address the long-term effects of the pandemic on oral health.

Keywords: COVID-19, oral health, systematic review, dental public health

Procedia PDF Downloads 80
9154 Characterization of Surface Microstructures on Bio-Based PLA Fabricated with Nano-Imprint Lithography

Authors: D. Bikiaris, M. Nerantzaki, I. Koliakou, A. Francone, N. Kehagias

Abstract:

In the present study, the formation of structures in poly(lactic acid) (PLA) has been investigated with respect to producing areas of regular, superficial features with dimensions comparable to those of cells or biological macromolecules. Nanoimprint lithography, a method of pattern replication in polymers, has been used to produce features ranging from tens of micrometers, covering areas up to 1 cm², down to hundreds of nanometers. Both micro- and nanostructures were faithfully replicated. PLA potentially has wide uses within biomedical fields, from implantable medical devices, including screws and pins, to membrane applications, such as wound covers, and even as an injectable polymer, for example for lipoatrophy. The possibility of fabricating structured PLA surfaces, with structures of the dimensions associated with cells or biological macromolecules, is of interest in fields such as cellular engineering. Imprint-based technologies have demonstrated the ability to selectively imprint polymer films over large areas, resulting in 3D imprints over flat, curved or pre-patterned surfaces. Here, we compare non-patterned PLA films with films nano-patterned by nanoimprint lithography (NIL). A silicon nanostructured stamp (provided by the Nanotypos company) having positive and negative protrusions was used to pattern PLA films by means of thermal NIL. The polymer film was heated to between 40 °C and 60 °C above its Tg and embossed at a pressure of 60 bar for 3 min. The stamp and substrate were demolded at room temperature. Scanning electron microscope (SEM) images showed good replication fidelity of the Si stamp. Contact-angle measurements suggested that positive microstructuring of the polymer (where features protrude from the polymer surface) produced a more hydrophilic surface than negative microstructuring. The ability to structure the surface of poly(lactic acid) is allied to the polymer's post-processing transparency and proven biocompatibility.
Films produced in this way were also shown to enhance the aligned attachment and proliferation of Wharton's jelly mesenchymal stem cells, leading to the observed growth contact guidance. The attachment patterns of some bacteria highlighted that the nano-patterned PLA structure can reduce the propensity of bacteria to attach to the surface, with greater bactericidal activity demonstrated against Staphylococcus aureus cells. These biocompatible, micro- and nano-patterned PLA surfaces could be useful for polymer-cell interaction experiments at dimensions at, or below, that of individual cells. Indeed, post-fabrication modification of the microstructured PLA surface with materials such as collagen (which can further reduce the hydrophobicity of the surface) will extend the range of applications, possibly through the use of PLA's inherent biodegradability. Further study is being undertaken to examine whether these structures promote cell growth on the polymer surface.

Keywords: poly(lactic acid), nano-imprint lithography, anti-bacterial properties, PLA

Procedia PDF Downloads 330
9153 Development of a New Method for the Evaluation of Heat Tolerant Wheat Genotypes for Genetic Studies and Wheat Breeding

Authors: Hameed Alsamadany, Nader Aryamanesh, Guijun Yan

Abstract:

Heat is one of the major abiotic stresses limiting wheat production worldwide. To identify heat-tolerant genotypes, a newly designed system was developed and used to study heat tolerance, involving a large plastic box holding many layers of filter papers positioned vertically, with wheat seeds sown in between, for the easy screening of large numbers of wheat genotypes. A collection of 499 wheat genotypes was screened under heat-stress (35 ºC) and non-stress (25 ºC) conditions using the new method. Compared with non-stress conditions, a substantial and highly significant reduction in seedling length (SL) under heat stress was observed, with an average reduction of 11.7 cm (P < 0.01). A damage index (DI) for each genotype, based on SL at the two temperatures, was calculated and used to rank the genotypes. Three hexaploid genotypes of Triticum aestivum [Perenjori (DI = -0.09), Pakistan W 20B (-0.18) and SST16 (-0.28)], all growing better at 35 ºC than at 25 ºC, were identified as extremely heat tolerant (EHT). Two hexaploid genotypes of T. aestivum [Synthetic wheat (0.93) and Stiletto (0.92)] and two tetraploid genotypes of T. turgidum ssp. dicoccoides [G3211 (0.98) and G3100 (0.93)] were identified as extremely heat susceptible (EHS). Another 14 genotypes were classified as heat tolerant (HT) and 478 as heat susceptible (HS). Extremely heat-tolerant and heat-susceptible genotypes were used to develop recombinant inbred line populations for genetic studies. Four major QTLs, HTI4D, HTI3B.1, HTI3B.2 and HTI3A, located on wheat chromosomes 4D, 3B (x2) and 3A and explaining up to 34.67%, 28.93%, 13.46% and 11.34% of the phenotypic variation, respectively, were detected. The four QTLs together accounted for 88.40% of the total phenotypic variation.
Random wheat genotypes possessing the four heat-tolerance alleles performed significantly better under heat stress than those lacking them, indicating the importance of the four QTLs in conferring heat tolerance in wheat. Molecular markers are being developed for marker-assisted breeding of heat-tolerant wheat.
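
The damage-index ranking can be sketched numerically. The abstract does not give the exact formula; one common choice, assumed here, is the relative reduction in seedling length, DI = 1 - SL_heat / SL_control, so a negative DI means the genotype grew better under heat stress (as reported for Perenjori, DI = -0.09).

```python
# Hedged sketch of a damage-index (DI) calculation and genotype ranking.
# The DI formula is an assumption, not taken from the paper.

def damage_index(sl_control: float, sl_heat: float) -> float:
    # Relative reduction in seedling length under heat stress.
    return 1.0 - sl_heat / sl_control

def rank_genotypes(measurements: dict) -> list:
    # measurements: {name: (SL at 25 C, SL at 35 C)}; most tolerant first.
    return sorted(measurements, key=lambda g: damage_index(*measurements[g]))

# Made-up example: genotype "A" grows 9% longer under heat (DI = -0.09),
# genotype "B" loses half its seedling length (DI = 0.50).
trial = {"A": (20.0, 21.8), "B": (20.0, 10.0)}
print(rank_genotypes(trial))  # ['A', 'B']
```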

Keywords: bread wheat, heat tolerance, screening, RILs, QTL mapping, association analysis

Procedia PDF Downloads 551
9152 A Simplified Model of the Control System with PFM

Authors: Bekmurza H. Aitchanov, Sholpan K. Aitchanova, Olimzhon A. Baimuratov, Aitkul N. Aldibekova

Abstract:

This work considers an automated control system (ACS) for milk quality during magnetic field processing of the milk. To achieve a high level of quality control, methods for transforming complex nonlinear systems into linearized systems with a simpler structure were applied. The presented ACS is adjustable by seven parameters: mass fraction of fat, mass fraction of dry skim milk residues (DSMR), density, mass fraction of added water, temperature, mass fraction of protein, and acidity.
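
The linearization step mentioned above can be illustrated generically: a nonlinear map f(x) is approximated near an operating point x0 by f(x) ≈ f(x0) + J(x0)(x - x0), where J is the Jacobian. The toy model f below is illustrative only, not the milk-quality ACS.

```python
import numpy as np

# Generic sketch of linearizing a nonlinear system about an operating
# point using a numerically estimated Jacobian (central differences).

def f(x):
    # Toy nonlinear state map (illustrative, not the paper's model).
    return np.array([x[0] * x[1], np.sin(x[0]) + x[1] ** 2])

def jacobian(func, x0, h=1e-6):
    n = len(x0)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (func(x0 + e) - func(x0 - e)) / (2 * h)
    return J

x0 = np.array([1.0, 2.0])
J = jacobian(f, x0)

# Linear prediction vs true value for a small perturbation:
dx = np.array([0.01, -0.02])
lin = f(x0) + J @ dx
print(np.allclose(lin, f(x0 + dx), atol=1e-3))  # True
```

Near the operating point, the simpler linear model tracks the nonlinear one closely, which is what makes the reduced-complexity controller structure workable.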

Keywords: fluids magnetization, nuclear magnetic resonance, automated control system, dynamic pulse-frequency modulator, PFM, nonlinear systems, structural model

Procedia PDF Downloads 375
9151 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines

Authors: Alexander Guzman Urbina, Atsushi Aoyama

Abstract:

The sustainability of traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. In making decisions related to the safety of industrial infrastructure, the values of accidental risk are becoming relevant points for discussion. However, the challenge is the reliability of the models employed to obtain the risk data, as such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using Artificial Intelligence (AI), and more specifically hybrid systems such as Neuro-Fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional energy and chemical infrastructure technologies constitutes one of the major challenges that today's societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, it can be argued that the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition to these social consequences, and considering the industrial sector as critical infrastructure because of the large impact a failure would have on the economy, industrial safety has become a critical issue for contemporary society. Regarding this safety concern, pipeline operators and regulators have been performing risk assessments in an attempt to evaluate accurately the probabilities of failure of the infrastructure and the consequences associated with those failures.
However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, capable of dealing efficiently with this complexity and uncertainty. The advantage of deep learning on near-miss accident data is that it can serve in risk assessment as an efficient engineering tool for treating the uncertainty of risk values in complex environments. The basic idea of the near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating the human expertise of scoring risks and setting tolerance levels. In summary, the method involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters using polynomial theory.
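
The GMDH building block named above can be sketched concretely: each candidate neuron fits a second-order (Ivakhnenko) polynomial to a pair of inputs by least squares, and GMDH keeps the best-performing pairs layer by layer. The data below is synthetic; this shows only the single-neuron fit, not the paper's full model.

```python
import numpy as np

# One GMDH neuron: fit y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2
# to a pair of input variables by least squares.

def design_matrix(x1, x2):
    return np.column_stack(
        [np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2]
    )

def fit_neuron(x1, x2, y):
    A = design_matrix(x1, x2)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
y = 1.0 + 2.0 * x1 - 0.5 * x1 * x2   # synthetic underlying relation

a = fit_neuron(x1, x2, y)
print(np.round(a, 3))  # recovers approximately [1, 2, 0, -0.5, 0, 0]
```

In a full GMDH run, this fit is repeated for every input pair, the best neurons (by validation error) survive to the next layer, and the process repeats until accuracy stops improving, which is how the "optimal configuration" of the model is selected.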

Keywords: deep learning, risk assessment, neuro fuzzy, pipelines

Procedia PDF Downloads 292
9150 A Survey of the Applications of Sentiment Analysis

Authors: Pingping Lin, Xudong Luo

Abstract:

Natural language often conveys the emotions of speakers. Therefore, sentiment analysis of what people say is prevalent in the field of natural language processing and has great application value for many practical problems. To help people understand this application value, in this paper we survey various applications of sentiment analysis, in online and offline business as well as in other domains. In particular, we give some application examples from intelligent customer service systems in China. We also compare the applications of sentiment analysis on Twitter, Weibo, Taobao and Facebook. Finally, we point out the challenges faced in applying sentiment analysis and the work that is worth studying in the future.

Keywords: application, natural language processing, online comments, sentiment analysis

Procedia PDF Downloads 263
9149 Porcelain Paste Processing by Robocasting 3D: Parameters Tuning

Authors: A. S. V. Carvalho, J. Luis, L. S. O. Pires, J. M. Oliveira

Abstract:

Additive manufacturing (AM) technologies have experienced remarkable growth in recent years due to the development and diffusion of a wide range of three-dimensional (3D) printing techniques. Nowadays, techniques such as fused filament fabrication are available to non-industrial users, while techniques such as 3D printing, polyjet, selective laser sintering and stereolithography are mainly found in industry. Robocasting (R3D) shows great potential due to its ability to shape materials over a wide range of viscosities. Industrial porcelain compositions showing different rheological behaviour can be prepared and used as candidate materials for processing by R3D, yet the use of this AM technique in industry remains very limited. In this work, a specific porcelain composition with suitable rheological properties was processed by R3D, and a systematic study of printing parameter tuning is presented. The porcelain composition was formulated from an industrial spray-dried porcelain powder, whose particle size and morphology were analysed. The powder was mixed with water and an organic binder in a ball mill at 200 rpm for 24 hours. The batch viscosity was adjusted by the addition of an acid solution, followed by further mixing. The paste density, viscosity, zeta potential, particle size distribution and pH were determined. In the R3D system, different speed and pressure settings were studied to assess their impact on the fabrication of porcelain models. These models were dried at 80 °C for 24 hours and sintered in air at 1350 °C for 2 hours. The stability of the models and the quality of their walls and surfaces were studied, and their physical properties were assessed. The microstructure and layer adhesion were observed by SEM. The studied processing parameters have a strong impact on model quality and, in particular, on the stacking of the filaments.
Adequate tuning of these parameters has a major influence on the final properties of the porcelain models. This work contributes to a better assimilation of AM technologies in the ceramic industry. Acknowledgments: The RoboCer3D project, a project of rapid additive manufacturing through the 3D printing of ceramic material (POCI-01-0247-FEDER-003350), financed by Compete 2020, PT 2020 and the European Regional Development Fund (FEDER) through the International and Competitive Operational Program (POCI) under the PT2020 partnership agreement.

Keywords: additive manufacturing, porcelain, robocasting, R3D

Procedia PDF Downloads 163
9148 Hand in Hand with Indigenous People Worldwide through the Discovery of Indigenous Entrepreneurial Models: A Systematic Literature Review of International Indigenous Entrepreneurship

Authors: Francesca Croce

Abstract:

Governmental development strategies have targeted entrepreneurship as a major resource for the economic development and poverty reduction of indigenous people. As initiatives and programs are locally based, there is a need to better understand the contextual factors of indigenous entrepreneurial models. The purpose of this paper is, therefore, to analyze and integrate the indigenous entrepreneurship literature in order to identify the main models of indigenous entrepreneurship. To this end, a systematic literature review was conducted. Relevant articles were identified in selected electronic databases (ABI/Inform Global, Business Source Premier, Web of Science, International Bibliography of the Social Sciences, Academic Search, Sociological Abstracts, Entrepreneurial Studies Source and Bibliography of Native North America) and in selected electronic reviews. Starting from 1st January 1995 (the first International Day of the World’s Indigenous People), 59 academic articles were selected from 1411. Through a systematic analysis of cultural, social and organizational variables, the paper highlights that a typology of indigenous entrepreneurial models is possible through the concept of the entrepreneurial ecosystem, which includes the geographical position and environment of indigenous communities. The results show three models of indigenous entrepreneurship: urban, semi-urban, and rural indigenous entrepreneurship. After the introduction, the paper is organized as follows. The first part presents the theoretical and practical needs for a systematic literature review on indigenous entrepreneurship. The second part explains the methodology and the selection and evaluation of the articles. The third part presents the findings and discusses the characteristics of each indigenous entrepreneurial model.
The results of this study offer a new theorization of indigenous entrepreneurship and may be useful for scientists in the field seeking to push past the cognitive boundary of indigenous business models, which remain too little known. The study is also addressed to policy makers in charge of indigenous entrepreneurial development strategies that are more focused on contextual factors.

Keywords: community development, entrepreneurial ecosystem, indigenous entrepreneurship model, indigenous people, systematic literature review

Procedia PDF Downloads 280
9147 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences

Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi

Abstract:

Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend of designing office buildings with a high proportion of glazing, which relatively increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour in computer simulation, providing a comprehensive lighting analysis. In this research, a detailed computer simulation model was built using Radiance and Daysim and then validated against measurements and user feedback. The case study building is the school of science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort incorporating the effects of gaze-shift patterns and views, with the goal of designing effective layouts for office spaces.

Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort

Procedia PDF Downloads 213
9146 Efficacy and Mechanisms of Acupuncture for Depression: A Meta-Analysis of Clinical and Preclinical Evidence

Authors: Yimeng Zhang

Abstract:

Major depressive disorder (MDD) is a prevalent mental health condition with a substantial economic impact and limited treatment options. Acupuncture has gained attention as a promising non-pharmacological intervention for alleviating depressive symptoms. However, its mechanisms and clinical effectiveness remain incompletely understood. This meta-analysis aims to (1) synthesize existing evidence on the mechanisms and clinical effectiveness of acupuncture for depression and (2) compare these findings with pharmacological interventions, providing insights for future research. Evidence from animal models and clinical studies indicates that acupuncture may enhance hippocampal and network neuroplasticity and reduce brain inflammation, potentially alleviating depressive disorders. Clinical studies suggest that acupuncture can effectively relieve primary depression, particularly in milder cases, and is beneficial in managing post-stroke depression, pain-related depression, and postpartum depression, both as a standalone and adjunctive treatment. Notably, combining acupuncture with antidepressant pharmacotherapy appears to enhance treatment outcomes and reduce medication side effects, addressing a critical issue in conventional drug therapy's high dropout rates. This meta-analysis, encompassing 12 studies and 710 participants, draws data from eight digital databases (PubMed, EMBASE, Web of Science, EBSCOhost, CNKI, CBM, Wangfang, and CQVIP) covering the period from 2012 to 2022. Utilizing Stata software 15.0, the meta-analysis employed random-effects and fixed-effects models to assess the distribution of depression in Traditional Chinese Medicine (TCM). The results underscore the substantial evidence supporting acupuncture's beneficial effects on depression. 
However, the small sample sizes of many clinical trials raise concerns about the generalizability of the findings, indicating a need for further research to validate these outcomes and optimize acupuncture's role in treating depression.
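
The fixed-effects pooling step used in such meta-analyses can be sketched as inverse-variance weighting: each study's effect estimate is weighted by w_i = 1/v_i, giving the pooled effect sum(w_i * theta_i) / sum(w_i). The effects and variances below are made-up illustrations, not values from the included studies.

```python
import math

# Fixed-effect (inverse-variance) pooling of per-study effect estimates.
def fixed_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))   # standard error of the pooled effect
    return pooled, se

# Illustrative three-study example: more precise studies weigh more.
effects = [0.30, 0.50, 0.40]
variances = [0.04, 0.02, 0.08]
pooled, se = fixed_effect(effects, variances)
print(round(pooled, 3), round(se, 3))  # 0.429 0.107
```

A random-effects model (also used in the analysis above) extends this by adding a between-study variance term to each v_i before weighting.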

Keywords: Chinese medicine, acupuncture, depression, meta-analysis

Procedia PDF Downloads 35
9145 Cut-Out Animation as a Technique and Its Development within the Historical Process

Authors: Armagan Gokcearslan

Abstract:

The art of animation has developed very rapidly, in script, sound and music, motion, character design, techniques, and technological tools, from its first years until today. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, early animations commonly used the flash-sketch technique, in which animation artists created scenes by drawing them on a blackboard with chalk. The flash-sketch technique was used by pioneering animation artists such as Émile Cohl, Winsor McCay and Blackton. Later, tools such as the magic lantern, thaumatrope, phenakistiscope, and zoetrope were developed and used intensively in the first years of the art of animation. Today, the art of animation is shaped by developments in computer technology, and it is possible to create two- and three-dimensional animations with the help of various computer software. The cut-out technique is among the important techniques used in the art of animation and is based on the art of paper cutting. Examining cut-out animations, it is observed that they technically resemble the art of paper cutting, which has a deep-rooted history: the oldest samples of paper cutting can be found in China in the period after the 2nd century B.C., when the Chinese invented paper. The most prominent artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled 'Cut-Out Animation as a Technique and Its Development within the Historical Process', will embrace the art of paper cutting, the relationship between the art of paper cutting and cut-out animation, its development within the historical process, animation artists producing artworks in this field, important cut-out animations, and their technical properties.

Keywords: cut-out, paper art, animation, technique

Procedia PDF Downloads 276