Search results for: corpus design methodology
16100 Optimization of High Flux Density Design for Permanent Magnet Motor
Authors: Dong-Woo Kang
Abstract:
This paper presents an optimal magnet shape for a spoke-shaped interior permanent magnet synchronous motor using ferrite magnets. Generally, a permanent magnet motor that uses ferrite magnets has lower output power and efficiency than a rare-earth magnet motor, because the ferrite magnet has lower magnetic energy than the rare-earth magnet. Nevertheless, ferrite magnet motors are used in many industrial products owing to their cost effectiveness. In this paper, the authors propose a high power density design of a ferrite permanent magnet synchronous motor. Furthermore, because the motor design has to take the manufacturing process into account, the design is simulated using the finite element method to analyze demagnetization, magnetizing, and structural stiffness. In particular, the magnet shape and dimensions are chosen to satisfy these properties. Finally, the authors design an optimal motor for application in our system. That final design is manufactured and evaluated experimentally.
Keywords: demagnetization, design optimization, magnetic analysis, permanent magnet motors
Procedia PDF Downloads 377
16099 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games
Authors: Micael Sousa
Abstract:
Board games (BGs) are thriving as new designs emerge from the hobby community and reach greater audiences all around the world. Although digital games gather most of the attention in game studies and the serious games research field, the post-digital movement helps to explain why, in a world dominated by digital technologies, analog experiences remain unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and unique game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established SG frameworks, which treat SGs as fun games implemented to achieve predefined goals, need further development, especially when considering modern BGs. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy to adapt and to grasp by non-expert players in experimental approaches, with the possibility of easily adapting them to players' profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field lacking theoretical development and the systematization of experimental practices. Using BGs as case studies promises to help develop these frameworks. Departing from the Design, Experience, and Play (DPE) framework and considering the Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for the adaptation and development of modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done through the systematization of the DPE and CDST approaches applied in two case studies, where two different sequences of adapted BGs were employed to establish a collaborative DT process. These two sessions occurred with different participants and in different contexts, also using different sequences of games for the same DT approach. The first session took place at the Faculty of Economics at the University of Coimbra in a training session on serious games for project development. The second session took place at Casa do Impacto, through The Great Village Design Jam Light. Both sessions had the same duration and were designed to progressively achieve DT goals, using BGs as SGs in a collaborative process. The results from the sessions show that a sequence of BGs, when properly adapted to address the DPET framework, can generate a viable and innovative process of collaborative DT that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions could be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.
Keywords: board games, design thinking, methodology, serious games
Procedia PDF Downloads 114
16098 Damage Detection in a Cantilever Beam under Different Excitation and Temperature Conditions
Authors: A. Kyprianou, A. Tjirkallis
Abstract:
Condition monitoring of structures in service is very important, as it provides information about the risk of damage development. One of the essential constituents of structural condition monitoring is the damage detection methodology. In the context of condition monitoring of in-service structures, a damage detection methodology analyses data obtained from the structure while it is in operation. Usually, this means that the data could be affected by operational and environmental conditions in a way that could mask the effects of possible damage on the data. Depending on the damage detection methodology, this could lead to false alarms or to missing existing damage. In this article, a damage detection methodology based on the spatio-temporal continuous wavelet transform (SPT-CWT) analysis of a sequence of experimental time responses of a cantilever beam is proposed. The cantilever is subjected to white and pink noise excitation to simulate different operating conditions. In addition, in order to simulate changing environmental conditions, the cantilever is subjected to heating by a heat gun. The response of the cantilever beam is measured by a high-speed camera. Edges are extracted from the series of images of the beam response captured by the camera. Subsequent processing of the edges gives a series of time responses at 439 points on the beam. This sequence is then analyzed using the SPT-CWT to identify damage. The proposed algorithm was able to clearly identify damage under any condition when the structure was excited by white noise. In addition, in the case of white noise excitation, the analysis could also reveal the position of the heat gun when it was used to heat the structure. The analysis could distinguish the different operating conditions, i.e., between responses due to white noise excitation and responses due to pink noise excitation. During pink noise excitation, although damage and changing temperature were identified, it was not possible to clearly separate the effect of damage from that of temperature. The methodology proposed in this article for damage detection enables the separation of the damage effect from those due to temperature and excitation in data obtained from measurements of a cantilever beam. This methodology does not require information about the a priori state of the structure.
Keywords: spatiotemporal continuous wavelet transform, damage detection, data normalization, varying temperature
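To give a concrete picture of the spatial wavelet analysis described above, the following Python sketch builds a spatial CWT map from camera-derived beam responses. It illustrates the general technique only; the array shapes, scale range, wavelet choice ('mexh'), and the toy data are assumptions, not the authors' algorithm or parameters.

    import numpy as np
    import pywt

    def spatial_cwt_map(responses, scales=np.arange(1, 33)):
        """responses: (n_time, n_points) array of beam deflections extracted from the
        camera images. A CWT is applied along the beam axis at every time step and the
        coefficient magnitudes are averaged over time; a persistent localized peak at
        fine scales hints at a stiffness discontinuity (possible damage)."""
        n_time, _ = responses.shape
        acc = 0.0
        for t in range(n_time):
            coeffs, _ = pywt.cwt(responses[t], scales, 'mexh')  # Mexican-hat wavelet
            acc = acc + np.abs(coeffs)
        return acc / n_time

    # Toy usage: 200 frames of a noisy first bending mode sampled at 439 points.
    x = np.linspace(0.0, 1.0, 439)
    frames = np.outer(np.sin(2 * np.pi * 5 * np.linspace(0, 1, 200)), np.sin(np.pi * x))
    frames += 0.001 * np.random.randn(*frames.shape)
    print(spatial_cwt_map(frames).shape)  # (n_scales, n_points)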
Procedia PDF Downloads 279
16097 Converse to the Sherman Inequality with Applications in Information Theory
Authors: Ana Barbir, S. Ivelic Bradanovic, D. Pecaric, J. Pecaric
Abstract:
We prove a converse to Sherman's inequality. Using the concept of f-divergence, we obtain some inequalities for well-known entropies, such as the Shannon entropy, which has many applications in applied sciences, for example in information theory, biology, and economics. The Zipf-Mandelbrot law improves the account of low-rank words in a corpus. Applications of the Zipf-Mandelbrot law can be found in linguistics and information sciences, and it is also widely applicable in ecological field studies. We also introduce an entropy by applying the Zipf-Mandelbrot law and derive some related inequalities.
Keywords: f-divergence, majorization inequality, Sherman inequality, Zipf-Mandelbrot entropy
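For readers unfamiliar with the Zipf-Mandelbrot law named in the abstract, the commonly used formulation and the entropy built on it are sketched below in LaTeX; these are the standard definitions and may differ in notation from the paper.

    % Zipf-Mandelbrot law for N ranked items (e.g., word types in a corpus),
    % with shift q >= 0 and exponent s > 0:
    f(i; N, q, s) = \frac{(i+q)^{-s}}{H_{N,q,s}}, \qquad
    H_{N,q,s} = \sum_{k=1}^{N} (k+q)^{-s}, \qquad i = 1, \dots, N.

    % The associated Zipf-Mandelbrot entropy (the Shannon entropy of this law):
    Z(H; q, s) = \frac{s}{H_{N,q,s}} \sum_{k=1}^{N} \frac{\ln(k+q)}{(k+q)^{s}}
                 + \ln H_{N,q,s}.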
Procedia PDF Downloads 170
16096 Use of Six-sigma Concept in Discrete Manufacturing Industry
Authors: Ignatio Madanhire, Charles Mbohwa
Abstract:
Efficiency in manufacturing is critical in raising the value of exports so as to trade gainfully on regional and international markets. Continuous improvement strategies appear to be increasingly popular among manufacturing entities, but this research study established that a similar popularity has not been accorded to the Six Sigma methodology. Thus, this work was conducted to investigate the applicability, effectiveness, usefulness, and suitability of the Six Sigma methodology as a competitiveness option for a discrete manufacturing entity. Development of a Six Sigma centre in the country providing continuous improvement information would go a long way in benefiting the entire industry.
Keywords: discrete manufacturing, six-sigma, continuous improvement, efficiency, competitiveness
Procedia PDF Downloads 466
16095 Design and Stability Analysis of Fixed Wing – VTOL UAV
Authors: Omar Eldenali, Ahmed M. Bufares
Abstract:
There are primarily two types of Unmanned Aerial Vehicles (UAVs), namely multirotor and fixed wing, and each type has its own advantages. This study introduces the design of a fixed-wing vertical take-off and landing (VTOL) UAV. The design is classified as a ready-to-fly (RTF) fixed-wing UAV. This means that the UAV is capable not only of taking off, landing, and hovering like a multirotor aircraft but also of cruising like a fixed-wing UAV. In this study, the conceptual design of a 15 kg takeoff-weight, twin-tail-boom FW-VTOL aircraft is carried out, the initial sizing of the aircraft is conducted, and both the horizontal and vertical tail configurations are estimated. Moreover, the power required for each stage of flight is determined. Finally, the stability analysis of the aircraft based on this design is performed; the results show that the design is stable for the suggested flight mission and can be utilized.
Keywords: FW-VTOL, initial sizing, constraint analysis, stability
Procedia PDF Downloads 88
16094 Design Development, Fabrication, and Preliminary Specifications of Multi-Fingered Prosthetic Hand
Authors: Mogeeb A. El-Sheikh
Abstract:
The study has developed the authors' previous design of an artificial anthropomorphic humanoid hand and adapted it as a prosthetic hand. The main specifications of this design are determined. The development of the previous design involves the main artificial hand's parts and subassemblies: palm, fingers, and thumb. In addition, the study presents an adaptable socket design for a transradial amputee. The hand has three fingers and a thumb. It offers improved reliability, cosmetics, modularity, and ease of assembly. Its size and weight are almost those of a natural hand. The socket cavity can accommodate transradial amputees of different sizes. The study implements the developed design using rapid prototyping and specifies its main specifications by using a data glove and the finite element method.
Keywords: adaptable socket, prosthetic hand, transradial amputee, data glove
Procedia PDF Downloads 262
16093 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem
Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo
Abstract:
Today, there are several control problems where the main objective is to obtain the best possible control so as to decrease the error in the application. Many techniques can be used to address these problems, such as neural networks, PID control, fuzzy logic, optimization techniques, and many more. In this work, fuzzy logic, through a fuzzy system, and an optimization technique are used to control the case of study. Specifically, Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve control of the velocity in this problem using ALO. First, a simple fuzzy system was used to control the velocity of the treadmill; it has two inputs (error and error change) and one output (desired speed). Results were obtained, but to decrease the error, ALO was then applied to optimize the fuzzy system of the treadmill. With the optimization in place, the simulation was performed, and the results show that, using ALO, the control of the velocity was better than with the conventional fuzzy system. This paper describes some basic concepts to help understand the idea of the work and the methodology of the investigation (control problem, fuzzy system design, optimization); the results are presented, and the optimization is applied to the fuzzy system. A comparison between the simple fuzzy system and the optimized fuzzy system is presented, showing that the optimization improved the control with good results. The major finding of the study is that ALO is a good alternative for improving control, because it helped to decrease the error in control applications regardless of the control technique being optimized. As a final statement, it is important to mention that the selected methodology was appropriate because the control of the treadmill was improved using the optimization technique.
Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system
Procedia PDF Downloads 403
16092 On the Semantics and Pragmatics of 'Be Able To': Modality and Actualisation
Authors: Benoît Leclercq, Ilse Depraetere
Abstract:
The goal of this presentation is to shed new light on the semantics and pragmatics of be able to. It presents the results of a corpus analysis based on data from the BNC (British National Corpus) and discusses these results in light of a specific stance on the semantics-pragmatics interface, taking into account recent developments. Be able to is often discussed in relation to can and could, all of which can be used to express ability. Such an onomasiological approach often results in the identification of usage constraints for each expression. In the case of be able to, it is the formal properties of the modal expression (unlike can and could, be able to has non-finite forms) that are in the foreground, and the modal expression is described as the verb that conveys future ability. Be able to is also argued to express actualised ability in the past (I was able to/could open the door). This presentation aims to provide a more accurate pragmatic-semantic profile of be able to, based on extensive data analysis and embedded in a very explicit view on the semantics-pragmatics interface. A random sample of 3000 examples (1000 for each modal verb) extracted from the BNC was analysed to address the following issues. First, the challenge is to identify the exact semantic range of be able to. The results show that, contrary to general assumption, be able to does not only express ability but shares most of the root meanings usually associated with the possibility modals can and could. The data reveal that what is called opportunity is, in fact, the most frequent meaning of be able to. Second, attention will be given to the notion of actualisation. It is commonly argued that be able to is the preferred form when the residue actualises: (1) The only reason he was able to do that was because of the restriction (BNC, spoken). (2) It is only through my imaginative shuffling of the aces that we are able to stay ahead of the pack. (BNC, written) Although this notion has been studied in detail within formal semantic approaches, empirical data is crucially lacking, and it is unclear whether actualisation constitutes a conventional (and distinguishing) property of be able to. The empirical analysis provides solid evidence that actualisation is indeed a conventional feature of the modal. Furthermore, the dataset reveals that be able to expresses actualised 'opportunities' and not actualised 'abilities'. In the final part of this paper, attention will be given to the theoretical implications of the empirical findings, and in particular to the following paradox: how can the same expression encode both modal meaning (non-factual) and actualisation (factual)? It will be argued that this largely depends on one's conception of the semantics-pragmatics interface, and that this need not be an issue when actualisation (unlike modality) is analysed as a generalised conversational implicature and thus considered part of the conventional pragmatic layer of be able to.
Keywords: actualisation, modality, pragmatics, semantics
Procedia PDF Downloads 133
16091 Automotive Emotions: An Investigation of Their Natures, Frequencies of Occurrence and Causes
Authors: Marlene Weber, Joseph Giacomin, Alessio Malizia, Lee Skrypchuk, Voula Gkatzidou
Abstract:
Technological and sociological developments in the automotive sector are shifting the focus of design towards developing a better understanding of driver needs, desires, and emotions. Human-centred design methods are being applied more frequently to automotive research, including the use of systems to detect human emotions in real time. One method for non-contact measurement of emotion with low intrusiveness is Facial Expression Analysis (FEA). This paper describes a research study investigating the emotional responses of 22 participants in a naturalistic driving environment by applying a multi-method approach. The research explored the possibility of investigating emotional responses and their frequencies during naturalistic driving through real-time FEA. Observational analysis was conducted to assign causes to the collected emotional responses. In total, 730 emotional responses were measured in the collective study time of 440 minutes. Causes were assigned to 92% of the measured emotional responses. This research establishes and validates a methodology for the study of emotions and their causes in the driving environment, through which systems and factors causing positive and negative emotional effects can be identified.
Keywords: affective computing, case study, emotion recognition, human computer interaction
Procedia PDF Downloads 204
16090 Estimation of Elastic Modulus of Soil Surrounding Buried Pipeline Using Multi-Response Surface Methodology
Authors: Won Mog Choi, Seong Kyeong Hong, Seok Young Jeong
Abstract:
The stress on a buried pipeline under pavement is significantly affected by vehicle loads and by the elastic modulus of the soil surrounding the pipeline. The correct elastic modulus of the soil has to be applied to the finite element model to investigate the effect of vehicle loads on the buried pipeline using finite element analysis. The purpose of this study is to establish an approach to calculating the correct elastic modulus of the soil using an optimization process. The optimal elastic modulus of the soil, which minimizes the difference between the strain measured in a vehicle driving test at a velocity of 35 km/h and the strain calculated from finite element analyses, was calculated through an optimization process using multi-response surface methodology. Three elastic moduli of the soil surrounding the pipeline (road layer, original soil, dense sand) were defined as the variables for the optimization. Further analyses with the optimal elastic moduli at velocities of 4.27 km/h, 15.47 km/h, and 24.18 km/h were performed and compared to the test results to verify the applicability of multi-response surface methodology. The results indicated that the strain of the buried pipeline was mostly affected by the elastic modulus of the original soil, followed by the dense sand and the road layer, and the results of the further analyses with the optimal elastic moduli show good agreement with the tests.
Keywords: pipeline, optimization, elastic modulus of soil, response surface methodology
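The following Python sketch illustrates the general response-surface calibration idea described above: fit a second-order surface to finite-element strain results and then search for the soil moduli that reproduce the measured strain. All design points, strain values, bounds, and the 250 microstrain target are hypothetical placeholders, not data from the study.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical design points (E_road, E_soil, E_sand) in MPa and the pipeline
    # strain (microstrain) returned by a finite-element run for each combination.
    X = np.array([[ 50, 20,  80], [100, 20,  80], [ 50, 60,  80], [ 50, 20, 160],
                  [100, 60,  80], [100, 20, 160], [ 50, 60, 160], [100, 60, 160],
                  [ 75, 40, 120], [ 75, 40,  80], [ 75, 20, 120], [ 50, 40, 120]], float)
    fe_strain = np.array([310., 295., 240., 300., 232., 288., 235., 228.,
                          262., 265., 298., 270.])

    def quad_features(x):
        e1, e2, e3 = x[:, 0], x[:, 1], x[:, 2]
        return np.column_stack([np.ones_like(e1), e1, e2, e3, e1*e2, e1*e3, e2*e3,
                                e1**2, e2**2, e3**2])

    beta, *_ = np.linalg.lstsq(quad_features(X), fe_strain, rcond=None)  # fit the surface

    measured = 250.0  # strain from the 35 km/h driving test (placeholder value)

    def misfit(x):
        predicted = quad_features(x.reshape(1, 3)) @ beta
        return float((predicted[0] - measured) ** 2)

    res = minimize(misfit, x0=np.array([75., 40., 120.]),
                   bounds=[(50, 100), (20, 60), (80, 160)])
    print("calibrated moduli (road, original soil, dense sand):", np.round(res.x, 1))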
Procedia PDF Downloads 387
16089 Part of Speech Tagging Using Statistical Approach for Nepali Text
Authors: Archit Yajnik
Abstract:
Part-of-speech tagging has always been a challenging task in the era of natural language processing. This article presents POS tagging for Nepali text using a Hidden Markov Model and the Viterbi algorithm. From the annotated Nepali corpus, training and testing data sets are randomly separated. Both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate as compared to HMM. An accuracy of 95.43% is achieved using the Viterbi algorithm. An error analysis of where the mismatches took place is discussed in detail.
Keywords: hidden markov model, natural language processing, POS tagging, viterbi algorithm
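As an illustration of the decoding step named in the abstract, here is a minimal Viterbi decoder for an HMM POS tagger in Python. The tag set, toy vocabulary, and probabilities are invented for the example and are not the author's model or corpus.

    import numpy as np

    tags = ["NOUN", "VERB", "ADJ"]
    start_p = np.log(np.array([0.6, 0.2, 0.2]))          # P(tag at sentence start), toy values
    trans_p = np.log(np.array([[0.3, 0.5, 0.2],          # P(next tag | current tag)
                               [0.6, 0.2, 0.2],
                               [0.7, 0.2, 0.1]]))
    vocab = {"ram": 0, "mitho": 1, "bhat": 2, "khancha": 3}
    emit_p = np.log(np.array([[0.4, 0.05, 0.5, 0.05],    # P(word | tag), toy values
                              [0.05, 0.05, 0.1, 0.8],
                              [0.1, 0.8, 0.05, 0.05]]))

    def viterbi(words):
        obs = [vocab[w] for w in words]
        T, N = len(obs), len(tags)
        delta = np.full((T, N), -np.inf)   # best log-prob of any path ending in tag j at t
        psi = np.zeros((T, N), dtype=int)  # back-pointers
        delta[0] = start_p + emit_p[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + trans_p + emit_p[:, obs[t]][None, :]
            psi[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0)
        path = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(psi[t][path[-1]]))
        return [tags[i] for i in reversed(path)]

    print(viterbi(["ram", "mitho", "bhat", "khancha"]))  # e.g. NOUN ADJ NOUN VERB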
Procedia PDF Downloads 330
16088 FengShui Paradigm as Philosophy of Sustainable Design
Authors: E. Erdogan, H. A. Erdogan
Abstract:
FengShui, an old Chinese discipline dating back more than 5000 years, is one of the design principles that aim at creating habitable and sustainable spaces in harmony with nature by systematizing data within its own structure. Having emerged from Chinese mysticism and embodying elements of faith in its principles, FengShui argues that positive energy in the environment channels human behavior and psychology. This argument is supported with the thesis of quantum physics that 'everything is made up of energy' and thereby gains an important place. In living and working spaces arranged according to its principles and systematized rules, FengShui promises a happier, more peaceful, and more comfortable life by influencing human psychology, acts, and soul as well as the professional and social life of the individual. Observing these design properties in houses, workplaces, offices, the environment, and daily life as a design paradigm is significant. In this study, how FengShui, a Central Asian culture emanating from Chinese mysticism, shapes design and how it is used as an element of sustainable design will be explained.
Keywords: Feng Shui, design principle, sustainability, philosophy
Procedia PDF Downloads 542
16087 Photo-Fenton Decolorization of Methylene Blue Adsolubilized on Co2+ -Embedded Alumina Surface: Comparison of Process Modeling through Response Surface Methodology and Artificial Neural Network
Authors: Prateeksha Mahamallik, Anjali Pal
Abstract:
In the present study, Co(II)-adsolubilized surfactant-modified alumina (SMA) was prepared, and methylene blue (MB) degradation was carried out on the Co-SMA surface by a visible-light photo-Fenton process. The entire reaction proceeded on the solid surface, as MB was embedded on the Co-SMA surface. The reaction followed zero-order kinetics. Response surface methodology (RSM) and an artificial neural network (ANN) were used to model the decolorization of MB by the photo-Fenton process as a function of the dose of Co-SMA (10, 20, and 30 g/L), the initial concentration of MB (10, 20, and 30 mg/L), the concentration of H2O2 (174.4, 348.8, and 523.2 mM), and the reaction time (30, 45, and 60 min). The prediction capabilities of the two methodologies (RSM and ANN) were compared on the basis of the correlation coefficient (R2), root mean square error (RMSE), standard error of prediction (SEP), and relative percent deviation (RPD). Owing to lower values of RMSE (1.27), SEP (2.06), and RPD (1.17) and a higher value of R2 (0.9966), ANN proved to be more accurate than RSM in predicting decolorization efficiency.
Keywords: adsolubilization, artificial neural network, methylene blue, photo-fenton process, response surface methodology
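The comparison metrics listed in the abstract can be computed as in the Python sketch below, using one common set of definitions for SEP and RPD; the paper may define them slightly differently, and the observed/predicted values shown are hypothetical.

    import numpy as np

    def comparison_metrics(y_obs, y_pred):
        y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
        resid = y_obs - y_pred
        r2 = 1.0 - np.sum(resid**2) / np.sum((y_obs - y_obs.mean())**2)
        rmse = np.sqrt(np.mean(resid**2))
        sep = 100.0 * rmse / y_obs.mean()             # standard error of prediction, %
        rpd = 100.0 * np.mean(np.abs(resid / y_obs))  # relative percent deviation, %
        return {"R2": r2, "RMSE": rmse, "SEP": sep, "RPD": rpd}

    # Hypothetical decolorization efficiencies (%): observed vs. model-predicted.
    observed  = [62.1, 70.4, 81.3, 55.7, 90.2, 74.8]
    predicted = [60.8, 71.9, 80.1, 57.2, 88.7, 75.5]
    print(comparison_metrics(observed, predicted))  # lower RMSE/SEP/RPD and higher R2 = better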
Procedia PDF Downloads 255
16086 Structural Reliability of Existing Structures: A Case Study
Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol
Abstract:
A reliability-based methodology for the analysis, assessment, and evaluation of reinforced concrete structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment for structural elements are verified by the results obtained from deterministic methods. The outcomes of the reliability-based analysis are compared against the safety limits of the required reliability index β according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables that are relevant to the subject matter and that are used in the performance-function equation(s) related to the structural elements under study. These techniques yield the reliability index β, commonly known as the reliability index or reliability measure, which can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. These methods can also yield revised partial safety factor values for certain target reliability indices, which can be used for the purpose of redesigning the reinforced concrete elements of the building and which could assist in considering other remedial actions to improve the safety and functionality of the member.
Keywords: structural reliability, concrete structures, FORM, Monte Carlo simulation
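As a minimal illustration of how a reliability index can be obtained by Monte Carlo simulation (one of the methods named in the keywords), consider the Python sketch below. The performance function, distributions, and parameter values are illustrative assumptions, not the structural model used in the study.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Performance function g = R - S for a reinforced concrete member:
    # resistance R and load effect S treated as random variables (assumed distributions).
    R = rng.lognormal(mean=np.log(250.0), sigma=0.12, size=n)  # member resistance, kN*m
    S = rng.normal(loc=150.0, scale=30.0, size=n)              # load effect, kN*m

    pf = np.mean(R - S <= 0.0)   # probability of failure
    beta = -norm.ppf(pf)         # reliability index: beta = -Phi^{-1}(pf)
    print(f"Pf = {pf:.2e}, beta = {beta:.2f}")  # compare beta with the code's target value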
Procedia PDF Downloads 518
16085 Analysis of Importance of Culture in Distributed Design Based on the Case Study at the University of Strathclyde
Authors: Zixuan Yang
Abstract:
This paper presents an analysis of the necessary consideration of culture in distributed design through a thorough literature review and a case study. The literature review identified that the need for understanding cultural differences in product design and user evaluations is highlighted by analyses of cross-cultural influences; culture plays a significant role in distributed work, particularly in establishing team cohesion, trust, and credibility early in the project. By applying Geert Hofstede's dimensions and Fukuyama's trust analysis, a case study of a global design project, i.e., multicultural distributed teamwork solving the problem of reducing the risk of deep vein thrombosis, showcases cultural dynamics, emphasizing trust-building and decision-making. The lessons learned emphasize the importance of cultural awareness, adaptability, and the utilization of scientific theories to enable effective cross-cultural collaboration in global design, providing valuable insights into navigating cultural diversity within design practices.
Keywords: culture, distributed design, global design, Geert Hofstede's dimensions, Fukuyama's trust analysis
Procedia PDF Downloads 71
16084 Integration of Design Management in the Product Development Process in SME's
Authors: Vitor Carneiro, Augusto Barata Da Rocha, Barbara Rangel, Jorge Lino Alves
Abstract:
In the European Union countries, small and medium-sized enterprises (SMEs) make an important contribution to economic activity and to the gross domestic product (GDP). The implementation of design practices in SMEs is often a difficult task due to resource limitations. Unlike large companies, their product development and innovation processes frequently lack adequate planning and systematic procedures. Interest in design management has grown exponentially in recent years, but as it is a recent topic, there is an absence of systematic methodologies to implement design management in SMEs with little or no design experience. This work presents a contribution to improve and optimize the process of design integration and management in SMEs. A review analysis is presented to select relevant articles on the subject and to review and classify the main published contributions. Based on the content of the selected articles, it was possible to identify five main themes related to the subject under analysis: design function organization, design management integration, design management capabilities, managing design projects, and tools and methods. Design management is discussed from different perspectives depending on the focus on which it is placed, whether a design or a management perspective, leading to different visions and definitions: from a more upstream strand at the intersection of design and the organization's strategic management (strategic design management) to a more downstream strand related to project management and the design process (operational design management). The review analysis of the selected articles allowed the identification of a high level of complexity of connections and parameters in design management during the product development process in the context of SMEs. Within each of the five main themes, several sub-themes, directly or indirectly related, should be considered. Sub-connections also occur between sub-themes of different themes, creating a complex and intricate web of connections. This complexity of connections is often the main obstacle to conducting design management and product development efficiently. This work proposes the formulation of a systematic methodological approach to optimize integrated projects and the management and control of the product development process among SMEs. The implementation of this formulation will improve the integration of design management in the product development and innovation process in SMEs.
Keywords: design management, product development, product innovation, SMEs
Procedia PDF Downloads 224
16083 How Is a Machine-Translated Literary Text Organized in Coherence? An Analysis Based upon Theme-Rheme Structure
Abstract:
With the ultimate goal of automatically generating translated texts of high quality, machine translation has made tremendous improvements. However, its translations of literary works are still plagued with problems of coherence, especially for translation between distant language pairs. One of the causes of these problems is probably the lack of linguistic knowledge incorporated into the training of machine translation systems. In order to enable readers to better understand the coherence problems of machine translation, to seek out the potential knowledge to be incorporated, and thus to improve the quality of machine translation products, this study applies Theme-Rheme structure to examine how a machine-translated literary text is organized and developed in terms of coherence. Theme-Rheme structure in Systemic Functional Linguistics is a useful tool for the analysis of textual coherence. The Theme is the departure point of a clause, and the Rheme is the rest of the clause. In a text, as Themes and Rhemes may be connected with each other in meaning, they form thematic and rhematic progressions throughout the text. Based on this structure, we can look into how a text is organized and developed in terms of coherence. Methodologically, we chose Chinese and English as the language pair to be studied. Specifically, we built a comparable corpus with two modes of English translation, viz. machine translation (MT) and human translation (HT), of one Chinese literary source text. The translated texts were annotated with Themes, Rhemes, and their progressions throughout the texts. The annotated texts were analyzed in two respects: the different types of Themes functioning differently in achieving coherence, and the different types of thematic and rhematic progressions functioning differently in constructing texts. By analyzing and contrasting the two modes of translation, it is found that, compared with the HT, 1) the MT features 'pseudo-coherence', with many ill-connected fragments of information joined using 'and'; 2) the MT system produces a static and less interconnected text that reads like a list; these two points, in turn, lead to the less coherent organization and development of the MT compared with the HT; and 3) novel to traditional and previous studies, Rhemes do contribute to textual connection and coherence, though less than Themes do, and are thus worthy of notice in further studies. Hence, the findings suggest that Theme-Rheme structure be applied to measuring and assessing the coherence of machine translation and be incorporated into the training of machine translation systems, and that Rhemes be taken into account when studying the textual coherence of both MT and HT.
Keywords: coherence, corpus-based, literary translation, machine translation, Theme-Rheme structure
Procedia PDF Downloads 207
16082 Participatory Monitoring Strategy to Address Stakeholder Engagement Impact in Co-creation of NBS Related Project: The OPERANDUM Case
Authors: Teresa Carlone, Matteo Mannocchi
Abstract:
In the last decade, a growing number of international organizations have been pushing toward green solutions for adaptation to climate change. This is particularly true in the fields of disaster risk reduction (DRR) and land planning, where nature-based solutions (NBS) have been sponsored through funding programs and planning tools. Stakeholder engagement and co-creation of NBS are growing as a practice and research field in environmental projects, fostering the consolidation of a multidisciplinary socio-ecological approach to addressing hydro-meteorological risk. Even though research and financial interest are constantly spreading, the NBS mainstreaming process is still at an early stage, as innovative concepts and practices make it difficult for them to be fully accepted and adopted by a multitude of different actors to produce wide-scale societal change. The monitoring and impact evaluation of stakeholders' participation in these processes represent a crucial aspect and should be seen as a continuous and integral element of the co-creation approach. However, setting up a fit-for-purpose monitoring strategy for different contexts is not an easy task, and multiple challenges emerge. In this scenario, the Horizon 2020 OPERANDUM project, designed to address the major hydro-meteorological risks that negatively affect European rural and natural territories through the co-design, co-deployment, and assessment of nature-based solutions, represents a valid case study for testing a monitoring strategy from which to derive a broader, general, and scalable monitoring framework. Applying a participative monitoring methodology, based on a selected list of indicators that combines quantitative and qualitative data developed within the activities of the project, the paper proposes an experimental in-depth analysis of the impact of stakeholder engagement in the co-creation process of NBS. The main focus will be to spot and analyze which factors increase knowledge, social acceptance, and mainstreaming of NBS, also promoting an experience-based guideline that could be integrated with the stakeholder engagement strategy in current and future environmental projects based on a similarly strongly collaborative approach, such as OPERANDUM. Measurement will be carried out through surveys submitted at different timescales to the same sample of stakeholders: policy makers, businesses, researchers, and interest groups. Changes will be recorded and analyzed through focus groups in order to highlight causal explanations and to assess the proposed list of indicators to steer the conduct of similar activities in other projects and/or contexts. The idea of the paper is to contribute to the construction of a more structured and shared corpus of indicators that can support the evaluation of the activities of involvement and participation of various levels of stakeholders in the co-production, planning, and implementation of NBS to address climate change challenges.
Keywords: co-creation and collaborative planning, monitoring, nature-based solution, participation & inclusion, stakeholder engagement
Procedia PDF Downloads 115
16081 Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems
Authors: Baris Can Yalcin
Abstract:
Motion sensors have been commonly used as a valuable component in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. The design of software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU 6050 motion sensor (gyroscope and accelerometer in 3 axes) and an Arduino Mega2560 microcontroller. The design parameters are the calibration of the sensor, the identification of and communication between the sensor and the data acquisition card, and the interpretation of the data collected by the sensor.
Keywords: design, mechatronics, motion sensor, data acquisition
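On the host side, logging the sensor stream can be as simple as the Python sketch below, which assumes the Arduino firmware prints one comma-separated 'ax,ay,az,gx,gy,gz' line per sample; the port name, baud rate, and line format are assumptions, not details taken from the paper.

    import csv
    import serial  # pyserial

    PORT, BAUD = "/dev/ttyACM0", 115200  # placeholder port; on Windows e.g. "COM3"

    with serial.Serial(PORT, BAUD, timeout=1.0) as link, \
         open("motion_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ax", "ay", "az", "gx", "gy", "gz"])
        for _ in range(1000):  # log roughly 1000 samples
            line = link.readline().decode(errors="ignore").strip()
            fields = line.split(",")
            if len(fields) == 6:  # skip incomplete or corrupted lines
                writer.writerow([float(v) for v in fields])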
Procedia PDF Downloads 588
16080 Tolerance and Perspective towards Disability: A Mixed Methods Study
Authors: L. Koštić, P. Karaman
Abstract:
Society has a lot of diversity in terms of sex, age, religion, abilities or disabilities, education, etc. Whatever the differences, everybody needs to be tolerated and equally included in society. In order to provide quality inclusion, society needs to tolerate differences. This study relates to differences in disability. To examine tolerance towards disability and inclusion, this study was conducted with students attending regular elementary and high schools. The main goal was to examine their attitudes towards their classmates and towards elderly people with disabilities. The study begins with the hypothesis that the environment has a highly developed tolerance towards people with disabilities, regardless of age. The sample was divided according to the tasks and the methodology of analysis. Students attending regular elementary school were asked to make drawings of their classmates with disabilities. The drawings were analyzed using quantitative methodology according to the colors the children used and the position of the character on the paper. Students attending high school and members of the general population were asked to complete a questionnaire designed for this study during a workshop held on the International Day for Tolerance. Responses were analyzed using qualitative methodology. The hypothesis was confirmed.
Keywords: classmates, disability, students, tolerance
Procedia PDF Downloads 312
16079 Optimization of Diluted Organic Acid Pretreatment on Rice Straw Using Response Surface Methodology
Authors: Rotchanaphan Hengaroonprasan, Malinee Sriariyanun, Prapakorn Tantayotai, Supacharee Roddecha, Kraipat Cheenkachorn
Abstract:
Lignocellulosic material is resistant to degradation by microorganisms or hydrolysis enzymes. To be used as a material for biofuel production, it needs a pretreatment process to improve the efficiency of hydrolysis. In this work, chemical pretreatments of rice straw using three diluted organic acids, namely acetic acid, citric acid, and oxalic acid, were optimized. Using response surface methodology (RSM), the effects of three pretreatment parameters, acid concentration, treatment time, and reaction temperature, on pretreatment efficiency were statistically evaluated. The results indicated that dilute oxalic acid pretreatment led to the highest enhancement of enzymatic saccharification by commercial cellulase and yielded sugar of up to 10.67 mg/ml when using 5.04% oxalic acid at 137.11 °C for 30.01 min. In comparison, pretreatment with acetic acid, citric acid, and hydrochloric acid gave maximum sugar yields of 7.07, 6.30, and 8.53 mg/ml, respectively. It is demonstrated here that organic acids can be used for the pretreatment of lignocellulosic materials to enhance the hydrolysis process, which could be integrated into other applications for various biorefinery processes.
Keywords: lignocellulosic biomass, pretreatment, organic acid, response surface methodology, biorefinery
Procedia PDF Downloads 654
16078 Design of Evaluation for Ehealth Intervention: A Participatory Study in Italy, Israel, Spain and Sweden
Authors: Monika Jurkeviciute, Amia Enam, Johanna Torres Bonilla, Henrik Eriksson
Abstract:
Introduction: Many evaluations of eHealth interventions conclude that the evidence for improved clinical outcomes is limited, especially when the intervention is short, such as one year. Often, the evaluation design does not address the feasibility of achieving clinical outcomes. Evaluations are designed to reflect the clinical goals of the intervention without utilizing the opportunity to illuminate effects on organizations and cost. A comprehensive evaluation design can better support decision-making regarding the effectiveness and potential transferability of eHealth. Hence, the purpose of this paper is to present a feasible and comprehensive evaluation design for an eHealth intervention, including the design process in different contexts. Methodology: The situation of limited feasibility of clinical outcomes was foreseen in the European Union funded project 'DECI' ('Digital Environment for Cognitive Inclusion'), which is run under the 'Horizon 2020' program with the aim of defining and testing a digital environment platform within corresponding care models that help elderly people live independently. A complex intervention of eHealth implementation into elaborate care models in four different countries was planned for one year. To design the evaluation, a participative approach was undertaken using Pettigrew's lens of change and transformation, including context, process, and content. Through a series of workshops, observations, interviews, and document analysis, as well as a review of the scientific literature, a comprehensive evaluation design was created. Findings: The findings indicate that, in order to obtain evidence on clinical outcomes, eHealth interventions should last longer than one year. The content of the comprehensive evaluation design includes a collection of qualitative and quantitative methods for data gathering that illuminates non-medical aspects. Furthermore, it contains communication arrangements to discuss the results and continuously improve the evaluation design, as well as procedures for monitoring and improving the data collection during the intervention. The process of the comprehensive evaluation design consists of four stages: (1) analysis of the current state in different contexts, including measurement systems, expectations and profiles of stakeholders, organizational ambitions to change due to eHealth integration, and the organizational capacity to collect data for evaluation; (2) a workshop with project partners to discuss the as-is situation in relation to the project goals; (3) development of general and customized sets of relevant performance measures, questionnaires, and interview questions; and (4) setting up procedures and monitoring systems for the interventions. Lastly, strategies are presented on how challenges can be handled during the design process of the evaluation in four different countries. The evaluation design needs to consider contextual factors such as project limitations and differences between pilot sites in terms of eHealth solutions, patient groups, care models, national and organizational cultures, and settings. This implies a need for a flexible approach to evaluation design to enable judgment of the effectiveness and the potential for adoption and transferability of eHealth. In summary, this paper provides learning opportunities for future evaluation designs of eHealth interventions in different national and organizational settings.
Keywords: ehealth, elderly, evaluation, intervention, multi-cultural
Procedia PDF Downloads 324
16077 Development and Automation of Medium-Scale NFT Hydroponic Systems: Design Methodology and State of the Art Review
Authors: Oscar Armando González-Marin, Jhon F. Rodríguez-León, Oscar Mota-Pérez, Jorge Pineda-Piñón, Roberto S. Velázquez-González., Julio C. Sosa-Savedra
Abstract:
Over the past six years, the World Meteorological Organization (WMO) has recorded the warmest years since 1880, primarily attributed to climate change. In addition, the overexploitation of agricultural lands, combined with food and water scarcity, has highlighted the urgent need for sustainable cultivation methods. Hydroponics has emerged as a sustainable farming technique that enables plant cultivation using nutrient solutions without the requirement for traditional soil. Among hydroponic methods, the Nutrient Film Technique (NFT) facilitates plant growth by circulating a nutrient solution continuously. This approach allows the monitoring and precise control of nutritional parameters, with potential for automation and technological integration. This study aims to present the state of the art of automated NFT hydroponic systems, discussing their design methodologies and considerations for implementation. Moreover, a medium-scale NFT system developed at CICATA-QRO is introduced, detailing its current manual management and progress toward automation.
Keywords: automation, hydroponics, nutrient film technique, sustainability
Procedia PDF Downloads 46
16076 A Resistant-Based Comparative Study between Iranian Concrete Design Code and Some Worldwide Ones
Authors: Seyed Sadegh Naseralavi, Najmeh Bemani
Abstract:
In most countries, including Iran, design should inevitably be carried out according to the native code. Since the Iranian concrete code is not implemented in structural design software, most engineers in this country analyze structures using commercial software but design the structural members manually. This point motivated us to establish a correspondence between the Iranian code and some other well-known codes to create a facility for engineers. Finally, this paper proposes the so-called interpretation charts, which help specify the position of the Iranian code in comparison with some worldwide codes.
Keywords: beam, concrete code, strength, interpretation charts
Procedia PDF Downloads 527
16075 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design
Authors: Kenny Raharjo, Ramon Lawrence
Abstract:
Game designers have the challenging task of building games that engage players to spend their time and money on the game. There is an infinite number of game variations and design choices, and it is hard to systematically determine the game design choices that will produce positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with the highest play-time metrics and can be a useful technique in a game designer's toolkit.
Keywords: game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics
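To make the approach concrete, the following Python sketch shows an epsilon-greedy bandit treating each game-design variation as an arm and a play-time metric as the reward. It is a generic illustration of the technique; the variation count, reward model, and epsilon value are assumptions, not the authors' implementation.

    import random

    class EpsilonGreedyBandit:
        def __init__(self, n_arms, epsilon=0.1):
            self.epsilon = epsilon
            self.counts = [0] * n_arms    # plays per variation
            self.values = [0.0] * n_arms  # running mean play time per variation

        def select_variation(self):
            if random.random() < self.epsilon:
                return random.randrange(len(self.counts))  # explore a random variation
            return max(range(len(self.counts)), key=lambda a: self.values[a])  # exploit

        def update(self, arm, reward):
            self.counts[arm] += 1
            self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

    # Simulated deployment: variation 2 yields the longest average play time (minutes).
    true_mean_play_time = [6.0, 7.5, 9.0]
    bandit = EpsilonGreedyBandit(n_arms=3)
    for _ in range(5000):
        arm = bandit.select_variation()
        bandit.update(arm, random.gauss(true_mean_play_time[arm], 2.0))
    print("estimated play time per variation:", [round(v, 2) for v in bandit.values])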
Procedia PDF Downloads 511
16074 The Visual Side of Islamophobia: A Social-Semiotic Analysis
Authors: Carmen Aguilera-Carnerero
Abstract:
Islamophobia, the unfounded hostility towards Muslims and Islam, has been deeply studied in the last decades from different perspectives ranging from anthropology, sociology, and media studies to linguistics. In the past few years, we have witnessed how the birth of social media has transformed formerly passive audiences into an active group that not only receives and digests information but also creates and comments publicly on any event of their interest. In this way, average citizens have been endowed with the power of becoming potential opinion leaders. This rise of social media in recent years gave way to a different form of Islamophobia, the so-called 'cyberIslamophobia'. Considerably less attention, however, has been given to the study of islamophobic images that accompany texts in social media. This paper analyses a corpus of 300 images of an islamophobic nature taken from social media (Twitter and Facebook) from the years 2014-2017 to see: a) how hate speech is visually constructed, b) how cyberIslamophobia is articulated through images and whether there are differences/similarities between the textual and the visual elements, c) the impact of those images on the audience and the audience's reaction to them, and d) whether visual cyberIslamophobia has undergone any process of permeating popular culture (for example, through memes) and what its real impact is. To carry out this task, we have used Critical Discourse Analysis as the most suitable theoretical framework for analysing and criticizing the dominant discourses that affect inequality, injustice, and oppression. The images were studied according to the theoretical framework provided by visual framing theory and visual design grammar, leading to the conclusion that memes are subtle but very powerful tools to spread Islamophobia and foster hate speech under the guise of humour within popular culture.
Keywords: cyberIslamophobia, visual grammar, social media, popular culture
Procedia PDF Downloads 170
16073 Discovering Groundbreaking Geopolymer-Based Materials with Versatile Designs, Ideal for the Construction and Infrastructure Industry
Authors: Maryam Kiani
Abstract:
Geopolymer has gained significant prominence worldwide and is now widely regarded as a potential alternative to conventional Portland cement. Nevertheless, for it to be widely accepted and incorporated into national and international standards, it is crucial to establish precise definitions and dependable mix design methodologies for geopolymer materials. The lack of a common definition and methodology has led to inconsistencies and perplexity across various areas of research. Addressing this concern is imperative for several reasons. To overcome the existing inconsistencies and confusion, concerted efforts should be made to establish clear definitions and robust mix design methodologies for geopolymer materials. This can be achieved through collaborative research, knowledge sharing, and engagement with industry experts. By doing so, we can pave the way for the widespread acceptance and utilization of geopolymer materials, revolutionizing the construction and infrastructure industry in a sustainable and environmentally friendly manner. The primary goal of this article is to offer clear explanations regarding the different meanings of geopolymer and the various methodologies used in geopolymer processes. Its main aim is to improve comprehension of both unary and binary geopolymer systems. By thoroughly exploring existing research, this article strives to illuminate the diverse methods and techniques utilized in the exciting field of geopolymer science.
Keywords: geopolymer, nanomaterials, structural materials, mechanical properties
Procedia PDF Downloads 115
16072 Transferring of Digital DIY Potentialities through a Co-Design Tool
Authors: Marita Canina, Carmen Bruno
Abstract:
Digital Do It Yourself (DIY) is a contemporary socio-technological phenomenon enabled by technological tools. The nature and potential long-term effects of this phenomenon have been widely studied within the framework of the EU-funded project 'Digital Do It Yourself', in which the authors have created and experimented with a specific Digital Do It Yourself (DiDIY) co-design process. The phenomenon was first studied through literature research to understand its multiple dimensions and complexity. Then, co-design workshops were used to investigate the phenomenon by involving people, to achieve a complete understanding of DiDIY practices and their enabling factors. These analyses allowed the definition of the fundamental DiDIY factors, which were then translated into a design tool. The objective of the tool is to shape design concepts by transferring these factors into different environments to achieve innovation. The aim of this paper is to present the 'DiDIY Factor Stimuli' tool, describing the research path and the findings behind it.
Keywords: co-design process, digital DIY, innovation, toolkit
Procedia PDF Downloads 179
16071 Battery Grading Algorithm in 2nd-Life Repurposing LI-Ion Battery System
Authors: Ya L. V., Benjamin Ong Wei Lin, Wanli Niu, Benjamin Seah Chin Tat
Abstract:
This article introduces a methodology that improves the reliability and cyclability of 2nd-life Li-ion battery systems repurposed as energy storage systems (ESS). Most of the 2nd-life retired battery systems on the market have a module/pack-level state-of-health (SOH) indicator, which is used to guide the appropriate depth of discharge (DOD) in ESS applications. Due to the lack of cell-level SOH indication, the different degradation behaviors among the various cells cannot be identified upon reaching retired status; in the end, considering end-of-life (EOL) loss and pack-level DOD, the repurposed ESS has to be oversized by more than 1.5 times to meet the application requirements of reliability and cyclability. The proposed battery grading algorithm, using a non-invasive methodology, is able to detect outlier cells based on historical voltage data and to calculate cell-level historical maximum temperature data using a semi-analytic methodology. In this way, each individual battery cell in the 2nd-life battery system can be graded in terms of SOH on the basis of its historical voltage fluctuation and estimated historical maximum temperature variation. These grades have corresponding DOD grades in the application of the repurposed ESS, enhancing system reliability and cyclability. In all, the introduced battery grading algorithm is non-invasive, compatible with all kinds of retired Li-ion battery systems that lack cell-level SOH indication, and can potentially be embedded into battery management software for preventive maintenance and real-time cyclability optimization.
Keywords: battery grading algorithm, 2nd-life repurposing battery system, semi-analytic methodology, reliability and cyclability
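A minimal sketch of the non-invasive grading idea, in Python, is given below: cells are flagged and graded from their historical voltage fluctuation alone. The thresholds, grade labels, and toy data are illustrative assumptions; the temperature-estimation part of the methodology is not reproduced here.

    import numpy as np

    def grade_cells(voltage_history):
        """voltage_history: (n_samples, n_cells) array of cell voltages logged in service.
        Flags outlier cells whose voltage deviates persistently from the pack median and
        assigns a coarse SOH grade from the spread of each cell's fluctuation."""
        v = np.asarray(voltage_history, float)
        deviation = np.mean(np.abs(v - np.median(v, axis=1, keepdims=True)), axis=0)
        spread = np.std(v, axis=0)
        grades = []
        for d, s in zip(deviation, spread):
            if d > 0.05:    # > 50 mV mean deviation from the pack median: outlier cell
                grades.append("C (outlier, shallow DOD only)")
            elif s > 0.08:  # large fluctuation: moderate DOD grade
                grades.append("B (moderate DOD)")
            else:
                grades.append("A (full repurposed DOD)")
        return grades

    # Toy history: 8 cells over 500 samples, with the last cell drifting low.
    rng = np.random.default_rng(1)
    hist = 3.7 + 0.03 * rng.standard_normal((500, 8))
    hist[:, -1] -= 0.08
    print(grade_cells(hist))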
Procedia PDF Downloads 204