Search results for: embedded logic
Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach
Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika
Abstract:
Both society and education teach people to communicate well in order to build interpersonal skills. Everyone has the capacity to understand something new, whether well or poorly, and poor understanding produces language errors when people interact for the first time without prior knowledge of one another, for instance because of geographical distance. The movie “The Space between Us” tells a love-adventure story between a boy raised on Mars and a girl on Earth, and many of their conversations miscarry because of their different climates and environments. Moviegoers must also rely on the subtitles to enjoy the film, yet the Indonesian subtitles and the English dialogue still show overlapping, inconsistent understanding in translation. Translation here involves a source language (SL, the English dialogue) and a target language (TL, the Indonesian subtitles). This research gap is formulated in the research question of how language errors occur in the movie and how they affect translation quality, analyzed in depth through a translation study informed by a discourse analysis approach. The research goal is to map the language errors and their translation quality in order to create a good atmosphere in movie media. The study uses an embedded research design within a qualitative approach. The research boundary comprises the setting, participants, and events. The data sources are the movie “The Space between Us” and informants (translation quality raters). Sampling is criterion-based (purposive) sampling. Data were collected through content analysis and questionnaires, and validated through data source and method triangulation. Data analysis comprises domain, taxonomy, componential, and cultural theme analysis. The language errors found in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors.
The discussion of their effects on translation quality concentrates on the translation techniques observed in the findings: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.
Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments
Procedia PDF Downloads 219
Integrating a Six Thinking Hats Approach into the Prewriting Stage of Argumentative Writing in English as a Foreign Language: A Chinese Case Study of Generating Ideas in Action
Abstract:
Argumentative writing is the most prevalent genre in diverse writing tests. Constructing academic arguments is often regarded as a difficult task by most English as a foreign language (EFL) learners. A failure to generate enough ideas and organise them coherently and logically, as well as a lack of competence in supporting arguments with relevant evidence, are frequent problems faced by EFL learners when approaching an English argumentative writing task. Overall, these problems are closely related to planning, and planning argumentative writing at the pre-writing stage plays a vital role in a good academic essay. However, apart from brainstorming, how teachers can effectively guide students to generate ideas in planning English argumentative writing is rarely discussed. Brainstorming has been a common practice used by teachers to help students generate ideas, but its limitations suggest that while it can produce many ideas, those ideas are not necessarily coherent and logical, and it can sometimes impede production. This calls for exploring effective instructional strategies at the pre-writing stage of English argumentative writing. This paper first examines how a Six Thinking Hats approach can be used to provide a dialogic space for EFL learners to experience and collaboratively generate ideas from multiple perspectives at the pre-writing stage. Part of the findings of a twelve-week intervention (March to July 2021) on students learning to generate ideas through group discussions using the Six Thinking Hats will then be reported. The research design is based on sociocultural theory. The findings present evidence from a mixed-methods approach and fifty-nine participants from two first-year undergraduate natural classes in a Chinese university. Analysis of pre- and post-questionnaires suggests that participants had a positive attitude toward the Six Thinking Hats approach.
The approach fosters their understanding of prewriting and argumentative writing and helps them generate more ideas, not only from multiple perspectives but also in a systematic way. A comparison of participants' writing plans confirms an improvement in generating counterarguments and rebuttals to support their arguments. Above all, the visual and transcript data of group discussions collected across the weeks of the intervention enable teachers and researchers to 'see' the hidden process of learning to generate ideas in action.
Keywords: argumentative writing, innovative pedagogy, six thinking hats, dialogic space, prewriting, higher education
Procedia PDF Downloads 89
Start-Up: The Perception of Brazilian Entrepreneurs about the Start-Up Brasil Program
Authors: Fernando Nobre Cavalcante
Abstract:
In Brazil, and more recently in the city of Fortaleza, a new form of entrepreneurship focused on the information and communication technology service sector is drawing the attention of young people, investors, governments, authors, and media companies: the start-up movement. Today it is considered a driving force behind the creative economy. Rooted in progressive discourse, the words enterprise and innovation seduce new economic agents motivated by success stories from Silicon Valley, along with increasing commercial activity in digital goods and services. This article assesses this new productive wave from a sociological point of view, problematized in the light of Manuel Castells' informational capitalism. Considering both skeptical and optimistic opinions about the impact of this new entrepreneurial rearrangement, the following question is asked: how do Brazilian entrepreneurs evaluate the Brazilian Federal Government's public policy incentives for startups? The hypotheses raised are based on employability factors as well as cultural, economic, and political matters related to innovation and technology. The study produced a nationwide quantitative assessment with a special focus on the reality of firms in Ceará, together with comparative qualitative interviews on the Brazilian experiences of the identified agents. This article outlines the federal government's public incentive policy, the Start-up Brasil Program, from the perspective of these companies and details the disciplinary methods of the new entrepreneurial model born in the United States. Startups are very young companies headed towards the economic sustainment of the productive service sector.
These companies are sowing the seeds of a re-enchantment that may bring young people back to participation in political debate; they relieve and reheat the job market; and they democratize the entrepreneurial 'Do-It-Yourself' culture. They capitalize on the pivot of the Wall Street wolves and of agents charged with new masks, offering a prophylaxis of developmental logic in the face of dreadful innovation stagnation. However, the lack of continuity in Brazilian governmental politics, together with cultural nuances related to entrepreneurship, is barring the desired regional success of this ecosystem.
Keywords: creative economy, entrepreneurship, informationalism, innovation, startups, Start-up Brasil Program
Procedia PDF Downloads 369
The Changing Role of Technology-Enhanced University Library Reform in Improving College Student Learning Experience and Career Readiness – A Qualitative Comparative Analysis (QCA)
Authors: Xiaohong Li, Wenfan Yan
Abstract:
Background: While the university library is widely considered to play a critical role in fulfilling the institution's mission and providing students a learning experience beyond the classroom, how technology-enhanced library reform has changed college students' learning experience has not been thoroughly investigated. The purpose of this study is to explore how technology-enhanced library reform affects students' learning experience and career readiness, and further to identify the factors and conditions that enable quality learning outcomes for Chinese college students. Methodology: This study adopted the qualitative comparative analysis (QCA) method to explore the effects of technology-enhanced university library reform on college students' learning experience and career readiness. QCA is unique in explaining the complex relationships among multiple factors from a holistic perspective. Compared with traditional quantitative and qualitative analysis, QCA not only adds some quantitative logic but also inherits qualitative research's focus on the heterogeneity and complexity of samples. Shenyang Normal University (SNU) was selected as a typical comprehensive university in China, one that focuses on students' learning and application of professional knowledge and trains professionals to different levels of expertise. A total of 22 current university students and 30 graduates who had joined the Library Readers Association of SNU between 2011 and 2019 were selected for semi-structured interviews. Based on the data collected from these participants, a QCA comprising univariate necessity analysis and multi-configuration analysis was conducted. Findings and Discussion: The QCA results indicated that the influence of the technology-enhanced restructuring and reorganization of the university library on student learning experience and career readiness is the result of multiple interacting factors.
Technology-enhanced library equipment and other hardware, restructured to meet college students' learning needs, played an important role in improving the student learning experience and learning persistence. More importantly, the soft characteristics of the reform, such as the library's service innovation space and culture space, had a positive impact on students' career readiness and development. Technology-enhanced university library reform is thus a change not only in a building's appearance and facilities but also in library service quality and capability. The study also provides suggestions for policy, practice, and future research.
Keywords: career readiness, college student learning experience, qualitative comparative analysis (QCA), technology-enhanced library reform
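The univariate necessity analysis step of QCA can be illustrated with Ragin's consistency measure for fuzzy-set QCA, which scores how nearly a condition is necessary for an outcome. The sketch below is a minimal illustration with invented membership scores, not the study's actual data or software:

```python
def necessity_consistency(condition, outcome):
    """Ragin's consistency of 'condition is necessary for outcome' in
    fuzzy-set QCA: sum(min(x_i, y_i)) / sum(y_i), memberships in [0, 1]."""
    assert len(condition) == len(outcome)
    num = sum(min(x, y) for x, y in zip(condition, outcome))
    return num / sum(outcome)

# Invented fuzzy membership scores for six interviewees: degree of use of
# the reformed library space (condition) and of reported improvement in
# learning experience (outcome).
space_use = [0.9, 0.8, 0.6, 0.3, 0.7, 1.0]
improved = [0.8, 0.9, 0.5, 0.2, 0.6, 0.9]

score = necessity_consistency(space_use, improved)
print(round(score, 3))  # a common rule of thumb treats >= 0.9 as "necessary"
```

In a full analysis this score would be computed for each candidate condition before moving on to the multi-configuration (sufficiency) analysis.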
Procedia PDF Downloads 79
Urban Logistics Dynamics: A User-Centric Approach to Traffic Modelling and Kinetic Parameter Analysis
Authors: Emilienne Lardy, Eric Ballot, Mariam Lafkihi
Abstract:
Efficient urban logistics requires a comprehensive understanding of traffic dynamics, particularly the kinetic parameters influencing energy consumption and trip-duration estimates. While real-time traffic information is increasingly accessible, current high-precision forecasting services embedded in route planning often function as opaque 'black boxes' for users. These services, typically relying on AI-processed counting data, fall short in accommodating the open design parameters essential for management studies, notably within Supply Chain Management. This work revisits the modelling of traffic conditions in the context of city logistics, emphasizing its significance from the user's point of view, with two focuses. Firstly, the focus is not on the vehicle flow but on the vehicles themselves and the impact of traffic conditions on their driving behaviour. This means opening the range of studied indicators beyond vehicle speed to describe extensively the kinetic and dynamic aspects of driving behaviour. To achieve this, we leverage the Art.Kinema parameters, which are designed to characterize driving cycles. Secondly, this study examines how the driving context (i.e., factors exogenous to the traffic flow) determines that driving behaviour. Specifically, we explore how accurately the kinetic behaviour of a vehicle can be predicted from a limited set of exogenous factors, such as time, day, road type, orientation, slope, and weather conditions. To answer this question, statistical analysis was conducted on real-world driving data, which include high-frequency measurements of vehicle speed. A factor analysis and a generalized linear model were established to link the kinetic parameters with independent categorical contextual variables.
The results include an assessment of the models' goodness of fit and robustness, as well as an overview of their outputs.
Keywords: factor analysis, generalised linear model, real-world driving data, traffic congestion, urban logistics, vehicle kinematics
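The second modelling step can be sketched in miniature: a Gaussian GLM with an identity link reduces to ordinary least squares, so a kinetic parameter can be regressed on dummy-coded categorical context variables with plain linear algebra. The parameter, factor levels, and numbers below are all invented for illustration and are not the paper's data:

```python
import numpy as np

# Invented records: (road type, weather) -> mean positive acceleration
# (m/s^2), standing in for one Art.Kinema-style kinetic parameter.
records = [
    ("urban",    "dry",  0.61), ("urban",    "rain", 0.52),
    ("motorway", "dry",  0.33), ("motorway", "rain", 0.30),
    ("rural",    "dry",  0.45), ("rural",    "rain", 0.41),
    ("urban",    "dry",  0.58), ("motorway", "dry",  0.35),
]

roads = sorted({r for r, _, _ in records})      # first level = reference
weathers = sorted({w for _, w, _ in records})

def encode(road, weather):
    """Intercept plus dummy coding of the two categorical factors."""
    return ([1.0]
            + [1.0 if road == r else 0.0 for r in roads[1:]]
            + [1.0 if weather == w else 0.0 for w in weathers[1:]])

X = np.array([encode(r, w) for r, w, _ in records])
y = np.array([a for _, _, a in records])

# A Gaussian GLM with identity link is ordinary least squares.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta
print(dict(zip(["intercept"] + roads[1:] + weathers[1:], np.round(beta, 3))))
```

A real analysis would use many more records, additional factors (time, slope, orientation), and non-identity links where the response distribution warrants them.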
Procedia PDF Downloads 67
Building Student Empowerment through Live Commercial Projects: A Reflective Account of Participants
Authors: Nilanthi Ratnayake, Wen-Ling Liu
Abstract:
Prior research indicates a widening gap between the skills and capabilities of graduates and those demanded in the contemporary workplace across the globe. The challenge of addressing this issue lies primarily in the hands of higher education institutions/universities. In particular, surveys of UK employers and retailers found that soft skills, including communication, numeracy, teamwork, confidence, analytical ability, digital/IT skills, business sense, languages, and social skills, are highly valued by graduate employers, and to develop them, various assessed and non-assessed learning exercises have already been embedded into university curricula. To this end, this research study explores the reflections of postgraduate students who participated in a live commercial project (i.e., designing an advertising campaign for open days, summer school, etc.) implemented with the intention of offering a transformative experience. A qualitative research methodology was followed, collecting data from three types of target audiences (students, academics, and employers) via a series of personal interviews and focus group discussions. Recorded data were transcribed, entered into NVivo, and analysed using meaning condensation and content analysis. Students reported that the project had a very positive impact on their self-efficacy, especially in relation to soft skills and confidence in seeking employment opportunities. In addition, the project reduced cultural barriers to general communication for international students. Academic staff and potential employers who attended the presentation day expressed their gratitude for the lifelong experience offered to students and believed that projects of this type contribute significantly to enhancing students' skills and capabilities to meet the demands of employers.
In essence, the key findings demonstrate that integrating knowledge-based skills into a live commercial project facilitates the transition from education to employment, in terms of skills, abilities, and work behaviours, more effectively than some other activities/assignments currently in place in higher education institutions/universities.
Keywords: soft skills, commercially live project, higher education, student participation
Procedia PDF Downloads 361
Influence of Deficient Materials on the Reliability of Reinforced Concrete Members
Authors: Sami W. Tabsh
Abstract:
The strength of reinforced concrete depends on the member dimensions and material properties. The properties of the concrete and steel materials are not constants but random variables. The variability of concrete strength is due to batching errors, variations in mixing, cement quality uncertainties, differences in the degree of compaction, and disparities in curing. Similarly, the variability of steel strength is attributed to the manufacturing process, rolling conditions, characteristics of the base material, uncertainties in chemical composition, and the microstructure-property relationships. To account for such uncertainties, codes of practice for reinforced concrete design impose resistance factors to ensure structural reliability over the useful life of the structure. In this investigation, the effects on structural reliability of reductions in concrete and reinforcing steel strengths from their nominal values, beyond those accounted for in the structural design codes, are assessed. The considered limit states are flexure, shear, and axial compression, based on the ACI 318-11 structural concrete building code. Structural safety is measured in terms of a reliability index. Probabilistic resistance and load models are compiled from the available literature. The study showed that there is a wide variation in the reliability index for reinforced concrete members designed for flexure, shear, or axial compression, especially when the live-to-dead load ratio is low. Furthermore, variations in concrete strength have a minor effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a severe effect on the reliability of columns in axial compression. On the other hand, changes in steel yield strength have a great effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a mild effect on the reliability of columns in axial compression.
Based on these outcomes, it can be concluded that the reliability of beams is sensitive to changes in the yield strength of the steel reinforcement, whereas the reliability of columns is sensitive to variations in concrete strength. Since the target reliability embedded in structural design codes results in lower structural safety for beams than for columns, large reductions in material strengths compromise the structural safety of beams much more than that of columns.
Keywords: code, flexure, limit states, random variables, reinforced concrete, reliability, reliability index, shear, structural safety
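The reliability index used as the safety measure can be sketched with the first-order second-moment formula beta = (mu_R - mu_Q) / sqrt(sigma_R^2 + sigma_Q^2), valid for independent, approximately normal resistance R and load effect Q. The statistics below are invented and only illustrate how a drop in material strength beyond the nominal value lowers beta:

```python
import math

def reliability_index(mu_R, sigma_R, mu_Q, sigma_Q):
    """First-order second-moment reliability index for independent,
    approximately normal resistance R and load effect Q."""
    return (mu_R - mu_Q) / math.sqrt(sigma_R ** 2 + sigma_Q ** 2)

# Invented statistics for a flexural member (units kN*m): the load effect
# stays fixed while the mean resistance drops with deficient steel.
mu_Q, sigma_Q = 100.0, 18.0
beta_nominal = reliability_index(160.0, 16.0, mu_Q, sigma_Q)
beta_deficient = reliability_index(140.0, 16.0, mu_Q, sigma_Q)  # ~12% loss

print(round(beta_nominal, 2), round(beta_deficient, 2))
```

The study's actual analysis uses code-calibrated probabilistic resistance and load models rather than these illustrative numbers.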
Procedia PDF Downloads 430
Biophysical Assessment of the Ecological Condition of Wetlands in the Parkland and Grassland Natural Regions of Alberta, Canada
Authors: Marie-Claude Roy, David Locky, Ermias Azeria, Jim Schieck
Abstract:
It is estimated that up to 70% of the wetlands in the Parkland and Grassland natural regions of Alberta have been lost to various land-use activities, and with them the ecosystem functions and services they once provided. The remaining wetlands are often embedded in a matrix of human-modified habitats, and despite efforts to protect them, the effects of land use on wetland condition and function remain largely unknown. We used biophysical field data and remotely sensed human footprint data collected at 322 open-water wetlands by the Alberta Biodiversity Monitoring Institute (ABMI) to evaluate the impact of surrounding land use on the physico-chemical characteristics and plant functional traits of wetlands. Eight physico-chemical parameters were assessed: water depth, water temperature, pH, salinity, dissolved oxygen, total phosphorus, total nitrogen, and dissolved organic carbon. Three plant functional traits were evaluated: 1) origin (native or non-native), 2) life history (annual, biennial, or perennial), and 3) habitat requirement (obligate-wetland or obligate-upland). Land-use intensity was quantified within a 250-meter buffer around each wetland. Ninety-nine percent of wetlands in the Grassland and Parkland regions of Alberta have land-use activities in their surroundings, most of them agriculture-related. Total phosphorus in wetlands increased with the cover of surrounding agriculture, while salinity, total nitrogen, and dissolved organic carbon were positively associated with the degree of soft-linear land uses (e.g., pipelines, trails). The abundance of non-native and annual/biennial plants increased with the amount of agriculture, while urban-industrial land use lowered the abundance of native, perennial, and obligate-wetland plants. Our study suggests that the land-use types surrounding wetlands affect their physico-chemical and biological condition.
This research suggests that reducing human disturbances through the reclamation of wetland buffers may enhance the condition and function of wetlands in agricultural landscapes.
Keywords: wetlands, biophysical assessment, land use, grassland and parkland natural regions
Procedia PDF Downloads 334
A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques
Authors: Imed Feki, Faouzi Msahli
Abstract:
The selection of relevant parameters from a high-dimensional process operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the most relevant fabric physical parameters for each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first uses a fuzzy sensitivity criterion, exploiting from experimental data the relationship between the physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, another panel of experts provides ranking lists of physical features according to their professional knowledge. By again applying OWA and a fuzzy aggregation model, the data-sensitivity-based ranking list and the knowledge-based ranking list are combined, using our proposed percolation technique, to determine the final ranking list. The key idea of the percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters. It generates thresholds automatically, which effectively reduces the human subjectivity and arbitrariness involved in manually chosen thresholds. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. Having applied the percolation technique to a real example, a well-known finished textile product, stonewashed denim, usually considered the most important quality criterion in jeans' evaluation, we separate the relevant physical features from the irrelevant ones for each sensory descriptor.
The originality and performance of the proposed relevant feature selection method are shown by the variability in the number of physical features in the selected relevant sets. Instead of selecting identical numbers of features with a predefined threshold, the proposed method adapts to the specific nature of the complex relations between sensory descriptors and physical features, proposing lists of relevant features of different sizes for different descriptors. To obtain more reliable results in the selection of relevant physical features, the percolation technique has been applied to combine the fuzzy global relevancy and OWA global relevancy criteria, so as to clearly distinguish the scores of relevant physical features from those of irrelevant ones.
Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique
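The OWA step of the aggregation can be shown in a few lines: an Ordered Weighted Averaging operator applies its weights to rank positions of the sorted scores rather than to particular evaluators, which is what makes it suitable for aggregating ranking lists. The scores below are invented, not the paper's fabric data:

```python
def owa(values, weights):
    """Ordered Weighted Averaging: the weights act on rank positions of the
    descending-sorted values, not on particular evaluators."""
    assert len(values) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Invented relevancy scores (0 to 1) given by four evaluators to a single
# physical parameter for one sensory descriptor.
scores = [0.9, 0.4, 0.7, 0.6]

mean_like = owa(scores, [0.25, 0.25, 0.25, 0.25])  # equal weights = plain mean
optimistic = owa(scores, [0.5, 0.3, 0.15, 0.05])   # favors the top scores
print(round(mean_like, 2), round(optimistic, 2))
```

Choosing the weight vector tunes the operator between the minimum, the mean, and the maximum, which is how the aggregation can be made more or less demanding about evaluator agreement.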
Procedia PDF Downloads 605
Determinants of Youth Engagement with Health Information on Social Media Platforms in United Arab Emirates
Authors: Niyi Awofeso, Yunes Gaber, Moyosola Bamidele
Abstract:
Since most social media platforms are accessible anytime and anywhere an Internet connection and a smartphone are available, the invisibility of the reader raises questions about the accuracy, appropriateness, and comprehensibility of social media communication. Furthermore, the identities and motives of the individuals and organizations who post articles on social media sites are not always transparent. In the health sector, socially networked platforms constitute a common source of health-related information, given their purported wealth of information. Nevertheless, fake blogs and sponsored postings marketing 'natural cures' pervade the most commonly used social media platforms, complicating readers' ability to access and understand trustworthy health-related information. This purposive sampling study of 120 participants aged 18 to 35 years in the UAE was conducted between September and December 2017 and explored the commonly used social media platforms, the frequency of use of social media for accessing health-related information, and approaches for assessing the trustworthiness of health information on social media platforms. Results indicate that WhatsApp (95%), Instagram (87%), and YouTube (82%) were the most commonly used social media platforms among respondents. The majority of respondents (81%) indicated that they regularly access social media for health-related information. More than half of the respondents (55%) without chronic health conditions relied on unsolicited messages to obtain health-related information. Doctors' health blogs (21%) and the social media sites of international healthcare organizations (20%) constitute the most trusted sources of health information among respondents, with UAE government health agencies' social media accounts trusted by 15% of respondents. Cardiovascular diseases, diabetes, and hypertension were the most commonly searched topics on social media (29%), followed by nutrition (20%) and skin care (16%).
A plurality of respondents (41%) rely on the ranking of hits from Google searches to judge reliability, 22% check health information only from 'reliable' social media sites, while 8% use 'logic' to ascertain the reliability of health information. As social media has rapidly become an integral part of the health landscape, it is important that health care policy makers, healthcare providers, and social media companies collaborate to promote the positive aspects of social media for young people, whilst mitigating the potential negatives. Using popular social media platforms to post reader-friendly health information will achieve high coverage. Improving youth digital literacy will facilitate easier access to trustworthy information on the Internet.
Keywords: social media, United Arab Emirates, youth engagement, digital literacy
Procedia PDF Downloads 120
Beam Deflection with Unidirectionality Due to Zeroth Order and Evanescent Wave Coupling in a Photonic Crystal with a Defect Layer without Corrugations under Oblique Incidence
Authors: Evrim Colak, Andriy E. Serebryannikov, Thore Magath, Ekmel Ozbay
Abstract:
Single beam deflection and unidirectional transmission are examined for oblique incidence in a Photonic Crystal (PC) structure which employs a defect layer instead of surface corrugations at the interfaces. In all of the studied cases, the defect layer is placed so that the symmetry is broken. Two types of deflection are observed, depending on whether the zeroth order is coupled or not. These two scenarios can be distinguished by considering the simulated field distribution in the PC. In the first deflection type, a Floquet-Bloch mode enables zeroth-order coupling; the energy of the zeroth order is redistributed among the diffraction orders at the defect layer, producing deflection. In the second type, when the zeroth order is not coupled, strong diffraction causes blazing and the evanescent waves deliver energy to higher-order diffraction modes. Simulated isofrequency contours can be used to estimate the coupling behavior. The defect layer is placed at varying rows, preserving the asymmetry of the PC while evanescent waves can still couple to higher-order modes. Even for a deeply buried defect layer, asymmetric transmission and beam deflection are still observed when the zeroth order is not coupled. We assume ε = 11.4 (a refractive index close to those of GaAs and Si) for the PC rods. Possible operation wavelengths lie within the microwave and infrared ranges, and since the suggested material is low-loss, the structure can be scaled down to operate at higher frequencies; a sample operation wavelength of 1.5 μm is therefore selected. Although the structure employs no surface corrugations, a transmission value of T ≈ 0.97 can be achieved by means of the diffraction order m = -1. Moreover, by using an extra line defect, T can be increased up to 0.99 under oblique incidence, even if the line defect layer is deeply embedded in the photonic crystal.
The latter configuration can be used to obtain deflection in one frequency range and to realize another functionality, such as defect-mode waveguiding, in another frequency range, still using the same structure.
Keywords: asymmetric transmission, beam deflection, blazing, bi-directional splitting, defect layer, dual beam splitting, Floquet-Bloch modes, isofrequency contours, line defect, oblique incidence, photonic crystal, unidirectionality
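The redistribution of energy among diffraction orders at a periodic interface is governed by the grating equation sin(theta_m) = sin(theta_i) + m * lambda / a. The sketch below lists the propagating orders for a hypothetical geometry; the period and incidence angle are invented, and only the 1.5 μm wavelength comes from the abstract:

```python
import math

def propagating_orders(wavelength, period, theta_inc_deg):
    """Diffraction orders m that propagate at a periodic interface, from the
    grating equation sin(theta_m) = sin(theta_inc) + m * wavelength / period."""
    s0 = math.sin(math.radians(theta_inc_deg))
    orders = {}
    for m in range(-5, 6):                 # a symmetric scan of orders
        s = s0 + m * wavelength / period
        if abs(s) < 1.0:                   # |sin| < 1: propagating, not evanescent
            orders[m] = round(math.degrees(math.asin(s)), 2)
    return orders

# Invented geometry: 1.5 um wavelength (as in the abstract), a hypothetical
# 2.5 um interface period, 30 deg oblique incidence.
print(propagating_orders(1.5, 2.5, 30.0))
```

Orders outside |sin| < 1 correspond to the evanescent waves discussed above, which carry no energy to the far field unless they are coupled to propagating modes.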
Procedia PDF Downloads 262
The Implementation of Inclusive Education in Collaboration between Teachers of Special Education Classes and Regular Classes in a Preschool
Authors: Chiou-Shiue Ko
Abstract:
As explicitly stipulated in Article 7 of the Enforcement Rules of the Special Education Act as amended in 1998, "in principle, children with disabilities should be integrated with normal children for preschool education". Since then, all cities and counties have been committed to promoting preschool inclusive education. The Education Department of the New Taipei City Government has been actively recruiting advisory groups of professors to assist in the implementation of inclusive education in preschools since 2001, and the author of this study has been guiding Preschool Rainbow in implementing inclusive education since 2011. Through field observations, meetings, and teaching demonstration seminars, this study explored how inclusive education has been successfully implemented through collaboration between the teachers of the special education classes and the regular classes at Preschool Rainbow. The implementation phases for inclusive education within a single academic year are as follows: 1) Preparatory stage. Prior to implementation, teachers of the special education and regular classes discuss ways of conducting inclusive education and organize reading clubs to read books on curriculum modifications that integrate the eight education strategies, early treatment and education, and early childhood education programs, so as to enhance their capacity to implement inclusive education and compose teaching plans for it. In addition to the general objectives of inclusive education, objectives of inclusive education for special children are also embedded into the Individualized Education Program (IEP). 2) Implementation stage. Initially, a promotional program for special education is implemented to allow all the children in the preschool to understand their own special qualities and those of special children.
After three weeks of reverse inclusion, the children in the special education classes are put into groups and enter the regular classes twice a week, with adjustments made to their inclusion in the learning areas and the curriculum. In 2013, further cooperation was carried out with adjacent hospitals to perform developmental screening activities for the early detection of children with developmental delays. 3) Review and reflection stage. After the implementation of inclusive education, all teachers in the preschool are divided into two groups to record their teaching plans and the lessons learned during implementation. The effectiveness of meeting the objectives of inclusive education is also reviewed. Through the collaboration of all its teachers, Preschool Rainbow won New Taipei City's "Preschool Light" award in 2015 as an exceptional model of inclusive education. Its model of implementing inclusive education can be used as a reference by other preschools.
Keywords: collaboration, inclusive education, preschool, teachers, special education classes, regular classes
Procedia PDF Downloads 430
310 Technology for Good: Deploying Artificial Intelligence to Analyze Participant Response to Anti-Trafficking Education
Authors: Ray Bryant
Abstract:
3Strands Global Foundation (3SGF), a non-profit with a mission to mobilize communities to combat human trafficking through prevention education and reintegration programs, launched a groundbreaking study that demonstrates the use and benefits of artificial intelligence in the fight against human trafficking. Having gathered more than 30,000 stories from counselors and school staff who have gone through its PROTECT Prevention Education program, 3SGF sought to develop a methodology to measure the effectiveness of the training, which helps educators and school staff identify physical signs and behaviors indicating a student is being victimized. The program further illustrates how to recognize and respond to trauma, teaches the steps to take to report human trafficking, and shows how to connect victims with the proper professionals. 3SGF partnered with Levity, a leader in no-code artificial intelligence (AI) automation, to create a research study utilizing natural language processing, a branch of artificial intelligence, to measure the effectiveness of the prevention education program. Applying the logic created for the study, the platform analyzed and categorized each story. If a story, directly from an educator, demonstrated one or more of the desired outcomes (Increased Awareness, Increased Knowledge, or Intended Behavior Change), the corresponding label was applied. The system then added a confidence level for each identified label. The study results were generated with a 99% confidence level. Preliminary results from the 30,000 stories gathered make it overwhelmingly clear that a significant majority of participants now have increased awareness of the issue, demonstrated better knowledge of how to help prevent the crime, and expressed an intention to change how they approach what they do daily.
In addition, it was observed that approximately 30% of the stories contained comments from educators expressing that they wished they had had this knowledge sooner, as they could think of many students they would have been able to help. Objectives of Research: To solve the problem of analyzing and accurately categorizing more than 30,000 data points of participant feedback in order to evaluate the success of a human trafficking prevention program using AI and natural language processing. Methodologies Used: In conjunction with our strategic partner, Levity, we created our own NLP analysis engine specific to our problem. Contributions to Research: The intersection of AI and human rights, and how to utilize technology to combat human trafficking.
Keywords: AI, technology, human trafficking, prevention
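The labeling scheme the abstract describes, multi-label outcomes with a per-label confidence score, can be illustrated with a deliberately simplified sketch. The keyword lists and the confidence heuristic below are invented for illustration only; the actual study used a trained natural-language-processing model built on the Levity platform, not keyword matching.

```python
# Toy multi-label story classifier with per-label confidence scores.
# Keyword sets and scoring are illustrative, not the study's method.
LABEL_KEYWORDS = {
    "Increased Awareness": {"aware", "awareness", "realize"},
    "Increased Knowledge": {"learned", "signs", "identify"},
    "Intended Behavior Change": {"will", "report", "change"},
}

def label_story(story: str) -> dict:
    """Return each matched outcome label with a crude confidence score."""
    words = set(story.lower().split())
    labels = {}
    for label, keywords in LABEL_KEYWORDS.items():
        hits = words & keywords
        if hits:
            labels[label] = len(hits) / len(keywords)  # fraction of keywords hit
    return labels

out = label_story("I learned the signs and will report suspected trafficking")
```

Here the sample story matches the knowledge and behavior-change labels but not the awareness label; a real system would replace the keyword heuristic with a learned text classifier that outputs calibrated confidences.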
Procedia PDF Downloads 60
309 Design and Development of Permanent Magnet Quadrupoles for Low Energy High Intensity Proton Accelerator
Authors: Vikas Teotia, Sanjay Malhotra, Elina Mishra, Prashant Kumar, R. R. Singh, Priti Ukarde, P. P. Marathe, Y. S. Mayya
Abstract:
Bhabha Atomic Research Centre, Trombay is developing a low energy high intensity proton accelerator (LEHIPA) as a pre-injector for a 1 GeV proton accelerator for an accelerator-driven sub-critical reactor system (ADSS). LEHIPA consists of an RFQ (Radio Frequency Quadrupole) and a DTL (Drift Tube Linac) as its major accelerating structures. The DTL is an RF resonator operating in the TM010 mode and provides a longitudinal E-field for the acceleration of charged particles. The RF design of the drift tubes of the DTL was carried out to maximize the shunt impedance; this demands that the diameter of the drift tubes (DTs) be as low as possible. The width of a DT is, however, determined by the particle β and a trade-off between the transit time factor and the effective accelerating voltage in the DT gap. The array of drift tubes inside the DTL shields the accelerated particles from the decelerating RF phase and provides transverse focusing to the charged particles, which otherwise tend to diverge due to Coulombic repulsion and the transverse E-field at the entry of the DTs. The magnetic lenses housed inside the DTs control the transverse emittance of the beam. Quadrupole magnets are preferred over solenoid magnets due to the relatively high focusing strength of the former over the latter. The small volume available inside the DTs for housing magnetic quadrupoles has motivated the use of permanent magnet quadrupoles (PMQs) rather than electromagnetic quadrupoles (EMQs). This provides another advantage, as Joule heating is avoided, which would otherwise have added thermal load in a continuous-cycle accelerator. The beam dynamics requires the uniformity of the integral magnetic gradient to be better than ±0.5%, with a nominal value of 2.05 tesla. The paper describes the magnetic design of the PMQ using Sm2Co17 rare earth permanent magnets. The paper discusses the results of the fabrication and qualification of five pre-series prototype permanent magnet quadrupoles and a full-scale DT developed with embedded PMQs.
The paper discusses the magnetic pole design for optimizing the uniformity of the integral gradient ∫G dl and the values of higher order multipoles. A novel but simple method of tuning the integral ∫G dl is discussed.
Keywords: DTL, focusing, PMQ, proton, rare earth magnets
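As context for why the tolerance above is placed on the integral gradient rather than on the gradient alone, the standard thin-lens relations of beam optics (textbook relations, not taken from the paper) read:

```latex
k \;=\; \frac{q\,G}{p}, \qquad \frac{1}{f} \;\approx\; k\,L_q \;=\; \frac{q}{p}\int G\,\mathrm{d}l ,
```

where $G$ is the field gradient, $q$ and $p$ are the proton charge and momentum, $L_q$ is the quadrupole length, and $f$ is the focal length. The focusing delivered to the beam thus depends on $\int G\,\mathrm{d}l$, which is why its uniformity is specified to ±0.5%.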
Procedia PDF Downloads 472
308 Shaping of World-Class Delhi: Politics of Marginalization and Inclusion
Authors: Aparajita Santra
Abstract:
In the context of the government's vision of turning Delhi into a green, privatized and slum-free city with a world-class image on par with the global cities of the world, this paper investigates the processes and politics behind defining spaces in the city and attributing an aesthetic image to it. The paper explores two cases that were forged primarily through the forces of one particular type of power relation: first, the modernist movement adopted by the Nehruvian government post-independence, and second, special periods such as the Emergency and the Commonwealth Games. The study of these cases helps one discern how city spaces were reconfigured in the name of 'good governance'. In this process, it also became important to analyze the double nature of law, both as a protector of people's rights and as a threat to people. What was interesting to note through the study was that, in the process of nation building and creating an image for the city, the government's policies and programs were mostly aimed at the richer sections of society, while the poorer sections and lower income groups kept getting marginalized, subdued, and pushed further away, even geographically. The reconfiguration of city space and the attribution of an aesthetic character to it led to an alteration not only in the way citizens perceived and engaged with these spaces, but also in the way they envisioned their place in the city.
Ironically, it was found that every attempt to build any kind of facility for the city's elite in turn led to an inevitable removal of the marginalized sections of society as a necessary step toward achieving a clean, green and world-class city. The paper questions the claim made by the government of creating a just, equitable city and granting rights to all. An argument is put forth that, in the politics of the redistribution of space, the city that has been designed is meant only for the aspirational middle class and elite, who are ideally primed to live in world-class cities. Thus, the aim is to study city spaces, urban form, and the associated politics and power plays involved within, and to understand whether segmented cities are being built in the name of creating sensible, inclusive cities.
Keywords: aesthetics, ambivalence, governmentality, power, world-class
Procedia PDF Downloads 119
307 An Exploratory Study on the Impact of Video-stimulated Reflection on Novice EFL Teachers' Professional Development
Authors: Ibrahima Diallo
Abstract:
The literature on teacher education foregrounds reflection as an important aspect of professional practice. Reflection for a teacher consists in critically analysing and retrospectively evaluating a lesson to see what worked, what did not work, and how to improve it in the future. Many teacher education programmes worldwide now consider the ability to reflect one of the hallmarks of an effective educator. However, in some contexts, like Senegal, reflection has not been given due consideration in teacher education programmes. In contexts where it has been part of the education landscape for some time, reflection is mostly depicted as an individual written activity, and many teacher trainees have become disenchanted by repeated enactments of a task solely intended to satisfy course requirements. This has resulted in whitewashing weaknesses or even 'faking' reflection. Besides, the 'one-size-fits-all' approach to reflection could not flourish because how reflection impacts practice is still unproven. Therefore, reflective practice needs to be contextualised and made more thought-provoking through dialogue and the use of classroom data. There is also a need to highlight the change reflection brings to teachers' practice. This study thus introduces reflection in a new context and aims to show evidenced change in novice EFL teachers' practice through dialogic, data-led reflection. The study also aims to contribute to the scarce literature on reflection in sub-Saharan Africa by bringing new perspectives on contextualised, teacher-led reflection. Eight novice EFL teachers participated in this qualitative longitudinal study, and data were gathered online through post-lesson reflection recordings and lesson videos over a period of four months. The data were then thematically analysed using NVivo to systematically organize and manage the large amount of data. The analysis followed the six-step approach to thematic analysis.
Major themes related to the teachers' classroom practice and their conception of reflection emerged from the analysis. The results showed that post-lesson reflection with a peer can help novice EFL teachers gain greater awareness of their classroom practice. Dialogic reflection also helped them evaluate their lessons and seek improvement. The analysis also gave insight into teachers' conception of reflection in an EFL context. It was found that teachers were more engaged in reflection when using their lesson video recordings. Change in teaching behaviour as a result of reflection was evidenced by the analysis of the lesson video recordings. This study has shown that video-stimulated reflection is a practical form of professional development that can be embedded in teachers' professional lives.
Keywords: novice EFL teachers, practice, professional development, video-stimulated reflection
Procedia PDF Downloads 100
306 Agroecology: Rethink the Local in the Global to Promote the Creation of Novelties
Authors: Pauline Cuenin, Marcelo Leles Romarco Oliveira
Abstract:
Based in their localities and following their ecological rationality, family-based farmers have experimented, adapted and innovated to continuously improve their production systems for millennia. With the technology-transfer processes of the so-called Green Revolution, farmers have become increasingly dependent on ready-made "recipes" built from so-called "universal", global knowledge to face the problems that emerge in the management of local agroecosystems, thus reducing their creative and experimental capacities. However, the production of novelties within farms is fundamental to the transition to more sustainable agro-food systems. Indeed, as the fruits of local knowledge and/or of the contextualization of exogenous knowledge, novelties are seen as seeds of transition. By presenting new techniques, new organizational forms and new epistemological approaches, agroecology has been pointed to as a way to encourage and promote the creative capacity of farmers. From this perspective, this theoretical work aims to analyze how agroecology encourages the innovative capacity of farmers and, more generally, the production of novelties. To this end, the theoretical and methodological bases of agroecology were analyzed through a literature review, looking specifically at the way it articulates the local with the global, complemented by an analysis of Brazilian agroecological experiences. It was emphasized that, based on the peasant way of doing agriculture, that is, on ecological/social co-evolution, also called co-production (the interaction between human beings and living nature), agroecology recognizes and revalues the peasant mode of farming, which involves the deep interactions of the farmer with his site (bio-physical and social).
As a "place science", practice and movement, it specifically takes into consideration the local and empirical knowledge of farmers, which allows questioning and modifying the paradigms underpinning the current agriculture that has disintegrated farmers' creative processes. Beyond revaluing the local, agroecology allows local knowledge to dialogue with global knowledge, which is essential in the process of change needed to break out of the dominant logic of thought and give shape to new experiences. In order to achieve this articulation, agroecology involves new methodological focuses, seeking participatory methods of study and intervention that express themselves in the form of horizontal spaces of socialization and collective learning involving several actors with different kinds of knowledge. These processes promoted by agroecology favor the production of novelties at the local level for expansion to other levels, such as the global, through translocal agroecological networks.
Keywords: agroecology, creativity, global, local, novelty
Procedia PDF Downloads 225
305 Advances in Machine Learning and Deep Learning Techniques for Image Classification and Clustering
Authors: R. Nandhini, Gaurab Mudbhari
Abstract:
Ranging from health care to self-driving cars, machine learning and deep learning algorithms have revolutionized their fields through the proper utilization of images and visual data. Segmentation, regression, classification, clustering, dimensionality reduction, etc., are some of the tasks that have helped machine learning and deep learning models become state-of-the-art for domains where images are the key datasets. Among these tasks, classification and clustering are essential but difficult because of the intricate, high-dimensional characteristics of image data. This study examines and assesses advanced techniques in supervised classification and unsupervised clustering for image datasets, emphasizing the relative efficiency of Convolutional Neural Networks (CNNs), Vision Transformers (ViTs), Deep Embedded Clustering (DEC), and self-supervised learning approaches. Due to the distinctive structural attributes present in images, conventional methods often fail to effectively capture spatial patterns, motivating models that utilize more advanced architectures and attention mechanisms. In image classification, we investigated both CNNs and ViTs. The CNN, well known for its ability to detect spatial hierarchies, serves as one core model in our study. The ViT, the other core model, reflects a modern classification method built on a self-attention mechanism, which makes it more robust by allowing it to learn global dependencies in images without relying on convolutional layers. This paper evaluates the performance of these two architectures based on accuracy, precision, recall, and F1-score across different image datasets, analyzing their appropriateness for various categories of images.
In the domain of clustering, we assess DEC, Variational Autoencoders (VAEs), and conventional clustering techniques such as k-means applied to embeddings derived from CNN models. DEC, a prominent clustering model, has gained the attention of many ML engineers because of its ability to combine feature learning and clustering into a single framework, with the main goal of improving clustering quality through better feature representation. VAEs, on the other hand, are well known for using latent embeddings to group similar images without requiring prior labels, via a probabilistic clustering approach.
Keywords: machine learning, deep learning, image classification, image clustering
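As a concrete sketch of the conventional pipeline mentioned above (k-means applied to embeddings derived from a CNN), the following minimal NumPy k-means clusters toy 2-D vectors standing in for CNN feature embeddings. The data, dimensionality, and initialization are invented for illustration; a real pipeline would feed in high-dimensional features from a pretrained network.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: returns cluster labels and centers for the rows of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign every embedding to its nearest center (Euclidean distance)
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned points
        centers = np.stack([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

# two well-separated toy "embedding" clouds standing in for CNN features
X = np.vstack([
    np.random.default_rng(1).normal(0.0, 0.1, size=(20, 2)),
    np.random.default_rng(2).normal(5.0, 0.1, size=(20, 2)),
])
labels, centers = kmeans(X, k=2)
```

In a DEC-style setup, this assignment step would instead be folded into the training loop, so that the embedding network and the cluster centers are optimized jointly rather than clustering fixed features after the fact.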
Procedia PDF Downloads 17
304 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, the shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately', or with allowances for some variability, rather than 'exactly'. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit.
A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods that yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability
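The contrast the paper draws between a single deterministic FS and a sampled PF can be made concrete with a small Monte-Carlo sketch. The Mohr-Coulomb resisting term and the parameter distributions below are invented for illustration and are not values from the case study:

```python
import math
import random

def sample_fs(rng):
    """Draw one factor-of-safety sample from assumed input distributions."""
    cohesion = rng.gauss(50.0, 10.0)   # kPa, assumed normal distribution
    friction = rng.gauss(35.0, 3.0)    # degrees, assumed normal distribution
    # Mohr-Coulomb resisting stress against an assumed normal stress of 60 kPa
    resisting = cohesion + 60.0 * math.tan(math.radians(friction))
    driving = 70.0                     # kPa, treated deterministically
    return resisting / driving

rng = random.Random(42)
n = 100_000
pf = sum(sample_fs(rng) < 1.0 for _ in range(n)) / n  # probability of failure
mean_fs = sum(sample_fs(rng) for _ in range(n)) / n
```

The mean FS here sits comfortably above 1, yet the sampled PF is a few percent, which is precisely why PF appears alongside FS in design acceptance criteria: the deterministic number hides the input variability.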
Procedia PDF Downloads 208
303 Assessment of Environmental Risk Factors of Railway Using Integrated ANP-DEMATEL Approach in Fuzzy Conditions
Authors: Mehrdad Abkenari, Mehmet Kunt, Mahdi Nourollahi
Abstract:
Evaluating environmental risk factors is a key component of analyzing the effects of transportation. Various definitions of risk can be found in scientific sources, each depending on a specific perspective or dimension. The effects of the potential risks present along the proposed routes and existing infrastructure of large transportation projects, such as railways, should be studied within comprehensive engineering frameworks. Despite the various definitions provided for 'risk', all share a uniform concept: two aspects, loss and unreliability, appear in every definition of the term, while a third aspect, selection (how one perceives the risk), is usually implied. Conducting engineering studies on the environmental effects of railway projects has now become obligatory under the Environmental Assessment Act in developing countries. Considering the longitudinal nature of these projects and the probable passage of railways through various ecosystems, scientific research on their environmental risk has become of great interest. Although some areas of expertise, such as road construction in developing countries, have not yet seriously committed to these studies, attention to these subjects has become an inseparable part of this wave of research. The present study took the environmental risks identified in previous studies and at existing stations for use in the next step. The second step proposes a new hybrid approach of the analytic network process (ANP) and DEMATEL under fuzzy conditions for the assessment of the identified risks. Since evaluating the identified risks was no easy task, a network structure was an appropriate approach for analyzing such complex systems and was accordingly employed for problem description and modeling.
The researchers faced a shortage of real field data, and because experts' opinions and judgments were ambiguous, they were expressed as linguistic variables instead of numerical ones. Since fuzzy logic is appropriate for ambiguity and uncertainty, formulating the experts' opinions as fuzzy numbers was a suitable approach. The fuzzy DEMATEL method was used to extract the relations between major and minor risk factors. Considering the internal relations of the major risk factors and their sub-factors in the fuzzy network analysis, the weights of the major risk factors and sub-factors were determined. In general, the findings of the present study, in which effective railway environmental risk indicators were theoretically identified and rated through the first use of a combined model of DEMATEL and fuzzy network analysis, indicate that environmental risks can be evaluated more accurately and employed in railway projects.
Keywords: DEMATEL, ANP, fuzzy, risk
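After the fuzzy expert scores are defuzzified, the crisp core of the DEMATEL step reduces to a few matrix operations. The 3x3 direct-influence matrix below is invented for illustration; the study's actual factors and scores are not reproduced here.

```python
import numpy as np

# defuzzified direct-influence matrix: D[i, j] = influence of factor i on j
D = np.array([[0.0, 3.0, 2.0],
              [1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0]])

N = D / D.sum(axis=1).max()                  # normalize by the largest row sum
T = N @ np.linalg.inv(np.eye(3) - N)         # total-relation matrix: N (I - N)^-1
prominence = T.sum(axis=1) + T.sum(axis=0)   # D + R: how important each factor is
relation = T.sum(axis=1) - T.sum(axis=0)     # D - R: positive marks a cause factor
```

In the fuzzy variant used in the study, each entry of D would be a fuzzy number aggregated across experts and defuzzified before this step; the ANP side then uses these internal relations to weight the major factors and sub-factors.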
Procedia PDF Downloads 415
302 Efficient Field-Oriented Motor Control on Resource-Constrained Microcontrollers for Optimal Performance without Specialized Hardware
Authors: Nishita Jaiswal, Apoorv Mohan Satpute
Abstract:
The increasing demand for efficient, cost-effective motor control systems in the automotive industry has driven the need for advanced, highly optimized control algorithms. Field-Oriented Control (FOC) has established itself as the leading approach for motor control, offering precise and dynamic regulation of torque, speed, and position. However, as energy efficiency becomes more critical in modern applications, implementing FOC on low-power, cost-sensitive microcontrollers poses significant challenges due to the limited availability of computational and hardware resources. Currently, most solutions rely on high-performance 32-bit microcontrollers or Application-Specific Integrated Circuits (ASICs) equipped with Floating Point Units (FPUs) and Hardware Accelerated Units (HAUs). These advanced platforms enable rapid computation and simplify the execution of complex control algorithms like FOC. However, these benefits come at the expense of higher costs, increased power consumption, and added system complexity. These drawbacks limit their suitability for embedded systems with strict power and budget constraints, where achieving energy and execution efficiency without compromising performance is essential. In this paper, we present an alternative approach that utilizes optimized data representation and computation techniques on a 16-bit microcontroller without FPUs or HAUs. By carefully optimizing data formats and employing fixed-point arithmetic, we demonstrate how the precision and computational efficiency required for FOC can be maintained in resource-constrained environments. This approach eliminates the performance overhead associated with floating-point operations and hardware acceleration, providing a more practical solution in terms of cost, scalability and execution time efficiency, allowing a faster response in motor control applications.
Furthermore, it enhances system design flexibility, making it particularly well-suited for applications that demand stringent control over power consumption and costs.
Keywords: field-oriented control, fixed-point arithmetic, floating point unit, hardware accelerator unit, motor control systems
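The fixed-point representation at the heart of this approach can be sketched in a few lines. Q15 is a common 16-bit format (one sign bit, 15 fractional bits, so values span roughly -1 to +1); the helper names and the sample Park-transform term are illustrative, not the authors' implementation.

```python
# Q15 fixed-point sketch of the kind of arithmetic used to run FOC math
# on a 16-bit MCU without an FPU. Names and values are illustrative.
Q = 15
ONE = 1 << Q  # 1.0 in Q15 is 32768

def to_q15(x: float) -> int:
    """Convert a float in [-1, 1) to its Q15 integer representation."""
    return int(round(x * ONE))

def q15_mul(a: int, b: int) -> int:
    """Q15 multiply: take the widened 32-bit product, then shift back down."""
    return (a * b) >> Q

# e.g. one term of the Park transform: i_d += i_alpha * cos(theta)
i_alpha = to_q15(0.5)
cos_theta = to_q15(0.25)
term = q15_mul(i_alpha, cos_theta)  # represents 0.125 in Q15
```

On a 16-bit MCU the same idea maps to a hardware 16x16-to-32-bit multiply plus a shift, replacing a software floating-point multiply; the cost is managing scaling and overflow by hand.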
Procedia PDF Downloads 20
301 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'
Authors: Anthony Coogan
Abstract:
Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his particle-in-a-box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of the basic ideas: part of Schrödinger's 'particle in a box' theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived following from the theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that they did not exist. Complex wave forms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events.
Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe', but this logic is opposed to the views of Newton, Hooke and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the nature of the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.
Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle
Procedia PDF Downloads 200
300 Contestation and Coexistence: An Exploratory Study of the Interactions between Formal and Informal Sectors within eThekwini City Centre
Authors: Mulaudzi Tshimbiluni Annah
Abstract:
South African city centres are dynamic urban spaces that reflect complex interactions between multiple actors: the state, formal businesses and informal street traders, each competing for territorial claims and spatial dominance. The objective of the study is to explore how space is contested, negotiated and occupied between the formal and informal sectors, and consequently to understand the implications this has for spatial planning and spatial justice. Through a case-study analysis of the eThekwini city centre, this research examines the spatial arrangements, coexistence and conflicts that shape the urban fabric. The study employs spatial justice as a theoretical lens to highlight the inequalities embedded within urban planning policies and the resilience of street traders in the face of harsh, restrictive spatial frameworks. Little evidence exists on how urban planning frameworks can integrate informal street traders in city centres and recognize them as legitimate stakeholders. The study investigates how spatial planning frameworks can be reimagined to promote spatial justice and further facilitate coexistence between formal and informal stakeholders in city centres. Primary data collection included interviews with key stakeholders, and NVivo software was used to analyse the interview data. Observations were conducted through transect walks, which allowed insight into spatial dynamics and daily interactions. Areas of contestation, as well as areas where formal and informal activities intersect, were visualized using GIS mapping. Furthermore, secondary data from the literature enabled a comparative analysis of similar case studies through precedent studies. The study revealed continuous contestation by formal businesses and the state, who are for the most part prioritized by planning frameworks, while street traders are often marginalized regardless of their contribution to economic development.
This study therefore proposes spatial planning strategies that support an integrative urban framework, ensuring equitable access and a reduction in the marginalization of street traders within urban spaces. The study aims to contribute to the understanding of urban coexistence and further advocates for spatial planning approaches that integrate informal street traders as legitimate actors in the urban landscape while fostering spatial justice within city centres.
Keywords: coexistence, contestation, integration, spatial justice, spatial planning, street traders
Procedia PDF Downloads 16
299 Being Reticent for Healing – Singularity and Non-Verbalization in Indigenous Medical Practices in Sri Lanka
Authors: Ayami Umemura
Abstract:
The purpose of this paper is to examine the meaning of verbalization in clinical practice through the keywords silence and singularity. A patient's experience of illness and treatment is singular, irreplaceable, and irreproducible, and ultimately cannot be compared with that of others. In his book Difference and Repetition, Gilles Deleuze positioned irreplaceable singularity as the opposite of particularity, understood as a generalizable, substitutable property, and matched the former with universality. He also said that singularity cannot be represented because of its irreplaceable nature. Representation, or verbalization, is a procedure that converts an irreplaceable, idiosyncratic reality into something that can be substituted. Considering the act of verbalizing a medical diagnosis in this light, diagnosis can be said to be the practice of decontextualizing and generalizing, as a disease, the suffering embedded in a patient's irreplaceable life history. This paper examines the above through the key concept of the practice of "non-verbalization" in traditional medical practices in Sri Lanka. In the practice of Sri Lankan traditional medicine and in the transmission of medical knowledge and techniques of care, there is a tendency to avoid verbalizing specific matters or stating them aloud. Specifically, the following are avoided: the healer informing the patient of the name of the disease; mentioning the names of the herbs used in front of the patient; explaining the patient's condition to the healer; and referring to the names of poisonous animals, such as the venomous snakes that caused the harm. Furthermore, when passing on medical knowledge and skills, practitioners may likewise avoid verbalizing knowledge of medicinal herbs and treatment methods or explaining them verbally.
Behind this lies not only the local belief in Sri Lanka that language itself harbors a soul, but also the distinctive view of the human body and personhood in Sri Lankan traditional medicine, which is premised on a singularity that emerges in relation to the movements of celestial bodies and the supernatural realm. In other words, the "silence" of Sri Lankan indigenous medicine is grounded in this emphasis on singularity. Furthermore, "non-verbalization" can be seen as a practice aimed at healing itself. Based on these discussions, this paper focuses on the unique relationships between practitioners and patients that are rendered invisible by verbalization and therefore overlooked by clinical medicine, where informed consent, transparency, and audit culture are dominant. By examining the experience of treatment, the paper aims to relativize clinical medicine and the audit cultures on which it rests.
Keywords: audit cultures, indigenous medicine, singularity, verbalization
Procedia PDF Downloads 88

298 An Inexhaustible Will of Infinite, or the Creative Will in the Psychophysiological Artistic Practice: An Analysis through Nietzsche's Will to Power
Authors: Filipa Cruz, Grecia P. Matos
Abstract:
An Inexhaustible Will of Infinite is ongoing practice-based research focused on a psychophysiological conception of the body and on the creative will, seeking to examine the possibility of art being simultaneously a pacifier and an intensifier in physiological artistic production. It is a study where philosophy and art converge: a commentary on how the concept of will to power has affected the art world, developed through Nietzsche's writings, the analysis of case studies, and reflection arising from artistic practice. Through Nietzsche, concepts that communicate with artistic practice are compared, since creation is an intensification and engenders perspectives. The practice is also highly embedded in the body, in the non-verbal, in the physiology of art, and in the coexistence between the sensorial and thought. It is asked whether the physiology of art could be thought of as a thinking-feeling with no primacy of thought over the sensorial. Art, as a manifestation of the will to power, participates in a comprehension of the world. In this article, art is taken as a privileged means of communication, implicating the corporeal, the sensorial, and the conceptual, and of connection between humans. Dream and drunkenness are problematized as intensifications and expressions of life's comprehension. Art is therefore perceived as suggestion and invention, where artistic intoxication breaks limits in the experience of life, and the artist, dominated by creative forces, claims, orders, obeys, and proclaims love for life. The intention is also to consider how one can start from pain to create, and how one can generate new and endless artistic forms through nightmares, daydreams, impulses, intoxication, enhancement, and intensification in a plurality of subjects and matters. Artistic creation is taken to be something intensified corporeally, expanded, continuously generated, and acting on bodies.
It is inextinguishable: a constant movement intertwining the Apollonian and Dionysian instincts of destruction and creation of new forms. The concept of love also appears associated with conquest, which, in a process of intensification and drunkenness, impels the artist to generate and transform matter. Just like a love relationship, love in Nietzsche requires time, patience, effort, courage, conquest, seduction, obedience, and command, potentiating the amplification of knowledge of the other and the world. Interlacing Nietzsche's philosophy not with Modern Art but with Contemporary Art, it is argued that intoxication, the will to power (strongly connected with the creative will), and love still have a place in artistic production as creative agents.
Keywords: artistic creation, body, intensification, psychophysiology, will to power
Procedia PDF Downloads 121

297 Automated, Objective Assessment of Pilot Performance in Simulated Environment
Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt
Abstract:
Nowadays flight simulators offer tremendous possibilities for safe and cost-effective pilot training by utilizing powerful computational tools. Because technology has outpaced methodology, the vast majority of training-related work is still done by human instructors, which makes assessment inefficient and vulnerable to instructors' subjectivity. This research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification, Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of gOAT is an enhanced algorithm that provides the instructor with a new set of information: objective flight parameters fused with a report on the psychophysiological state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), and measurements of changes in the pilot's psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor's station in a dedicated graphical interface. The tool is based on open-source solutions and is flexible to edit: additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used allow not only basic stress-level measurement but also a significant reduction of the instructor's workload. The tool can be used for training purposes as well as for periodic checks of aircrew.
Flexibility and ease of modification allow wide-ranging further development and customization of the tool. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
Keywords: automated assessment, flight simulator, human factors, pilot training
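The MTE-based rating described above can be sketched in outline: each flight parameter registered by the simulator is compared against a tolerance band for the manoeuvre. This is an illustrative reconstruction under assumptions, not the actual gOAT algorithm; the parameter names and the desired/adequate bands (loosely inspired by ADS-33 performance standards) are invented for the example.

```python
# Illustrative sketch only: rates recorded flight-parameter deviations
# against desired/adequate tolerance bands, in the spirit of ADS-33 MTEs.
# Parameter names and band values are assumptions, not gOAT internals.

def mte_score(samples, tolerances):
    """For each parameter, compute the fraction of samples whose absolute
    deviation stays inside the 'desired' and 'adequate' bands."""
    scores = {}
    for param, values in samples.items():
        desired, adequate = tolerances[param]
        scores[param] = {
            "desired": sum(abs(v) <= desired for v in values) / len(values),
            "adequate": sum(abs(v) <= adequate for v in values) / len(values),
        }
    return scores

# Hypothetical hover task: altitude deviation [m], heading deviation [deg]
samples = {
    "altitude_dev": [0.2, 0.5, 1.1, 0.3],
    "heading_dev": [1.0, 2.5, 6.0, 0.5],
}
tolerances = {"altitude_dev": (0.6, 1.2), "heading_dev": (3.0, 6.0)}
print(mte_score(samples, tolerances))
```

In the real system, per-parameter scores of this kind would then be fused with the psychophysiological measurements (heart rate, temperature, sweating) before being shown at the instructor's station.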
Procedia PDF Downloads 150

296 Tunnel Convergence Monitoring by Distributed Fiber Optics Embedded into Concrete
Authors: R. Farhoud, G. Hermand, S. Delepine-lesoille
Abstract:
The future French radioactive waste disposal facility, named Cigeo, is designed to store France's intermediate-level and high-level long-lived radioactive waste. Intermediate-level waste cells are tunnel-like, about 400 m long with a 65 m² cross-section, equipped with several concrete layers, which can be grouted in situ or composed of pre-grouted tunnel elements. The operating space inside the cells, which allows waste containers to be emplaced or retrieved, must be monitored for several decades without any maintenance. To provide the required information, a design was developed and tested in situ in Andra's underground research laboratory (URL), 500 m below the surface. Based on distributed optical fiber sensors (OFS) interrogated by Brillouin backscattering for strain and Raman backscattering for temperature, the design consists of two OFS loops, at two different radii, around the monitored section (orthoradial strains) and along the tunnel axis (longitudinal strains). Strains measured by the distributed OFS cables were compared to classical vibrating wire extensometers (VWE) and platinum probes (Pt). The installation comprised two cables sensitive to both strain and temperature and one sensitive to temperature only. To reduce cost, all cables were connected to the instruments through hybrid cables, joined to the sensitive parts by one of two techniques: splicing the fibers in situ after installation, or preparing each fiber with a connector and simply plugging them together in situ. Another challenge was installing the OFS cables along a tunnel built in several sections without interrupting the fibers between sections. The first success is the survival rate of the sensors after installation and the quality of the measurements: 100% of the OFS cables intended for long-term monitoring survived installation. A few new configurations were tested with relative success, and the measurements obtained were very promising.
Indeed, after three years of data, no difference was observed between the OFS cables and connection methods, and the strains fit well with the VWE and Pt sensors placed at the same locations. Data from the Brillouin instrument, which is sensitive to both strain and temperature, were compensated with data provided by the Raman instrument, which is sensitive to temperature only and interrogates a separate fiber. These results provide confidence for the next steps of the qualification process, which consist of testing several data treatment approaches for direct analysis.
Keywords: monitoring, fiber optic, sensor, data treatment
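The temperature compensation described above can be sketched numerically: the Brillouin frequency shift responds to both strain and temperature, so the Raman temperature profile is used to remove the thermal contribution before converting the remainder to strain. The sensitivity coefficients below are typical literature values for standard single-mode fiber at 1550 nm, not calibration values from this installation.

```python
# Sketch of Brillouin/Raman cross-compensation. The sensitivity
# coefficients are typical textbook values for standard single-mode
# fiber at 1550 nm, NOT calibration values from the Andra installation.

C_EPS = 0.05  # Brillouin strain sensitivity [MHz per microstrain] (assumed)
C_T = 1.0     # Brillouin temperature sensitivity [MHz per degC] (assumed)

def compensated_strain(delta_nu_mhz, delta_t_degc):
    """Strain [microstrain] once the Raman-measured temperature change
    has been subtracted from the total Brillouin frequency shift."""
    return (delta_nu_mhz - C_T * delta_t_degc) / C_EPS

# Example: a 7.5 MHz Brillouin shift while Raman reports 5 degC warming;
# the thermal term accounts for 5 MHz, leaving 2.5 MHz of true strain.
print(compensated_strain(7.5, 5.0))  # -> 50.0 microstrain
```

Running the correction point by point along the fiber yields the distributed strain profile that is then compared with the co-located VWE and Pt readings.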
Procedia PDF Downloads 130

295 Fabric-Reinforced Cementitious Matrix (FRCM)-Repaired Corroded Reinforced Concrete (RC) Beams under Monotonic and Fatigue Loads
Authors: Mohammed Elghazy, Ahmed El Refai, Usama Ebead, Antonio Nanni
Abstract:
Rehabilitating corrosion-damaged reinforced concrete (RC) structures has been accomplished using various techniques such as steel plating, external post-tensioning, and external bonding of fiber-reinforced polymer (FRP) composites. This paper reports on the use of an innovative technique to strengthen corrosion-damaged RC structures using fabric-reinforced cementitious matrix (FRCM) composites. FRCM consists of a dry-fiber fabric embedded in a cement-based matrix. Twelve large-scale RC beams were constructed and tested under monotonic and fatigue flexural loading. Prior to testing, ten specimens were subjected to an accelerated corrosion process for 140 days, leading to an average mass loss in the tensile steel bars of 18.8%. Corrosion was restricted to the main reinforcement located in the middle third of the beam span. Eight corroded specimens were repaired and strengthened, while two virgin and two corroded-unrepaired/unstrengthened beams were used as benchmarks for comparison purposes. The test parameters included the FRCM material (carbon-FRCM, PBO-FRCM), the number of FRCM plies, the strengthening scheme, and the type of loading (monotonic or fatigue). The effects of these parameters on the flexural response, the mode of failure, and the fatigue life are reported. Test results showed that corrosion reduced the yield and ultimate strength of the beams. The corroded-unrepaired specimen failed to meet the crack-width provisions of the ACI-318 code. The use of FRCM significantly increased the ultimate strength of the corroded specimens, by 21% to 65% over that of the corroded-unrepaired specimen. Corrosion significantly decreased the fatigue life of the corroded-unrepaired beam, by 77% relative to that of the virgin beam. The fatigue life of the FRCM-repaired corroded beams increased to 1.5 to 3.8 times that of the corroded-unrepaired beam but remained lower than that of the virgin specimen.
The specimens repaired with U-wrapped PBO-FRCM strips showed longer fatigue lives than those repaired with end-anchored bottom strips having a similar number of PBO-FRCM layers. PBO-FRCM was more effective than carbon-FRCM in restoring the fatigue life of the corroded specimens.
Keywords: corrosion, concrete, fabric-reinforced cementitious matrix (FRCM), fatigue, flexure, repair
Procedia PDF Downloads 296

294 Profile of Programmed Death Ligand-1 (PD-L1) Expression and PD-L1 Gene Amplification in Indonesian Colorectal Cancer Patients
Authors: Akterono Budiyati, Gita Kusumo, Teguh Putra, Fritzie Rexana, Antonius Kurniawan, Aru Sudoyo, Ahmad Utomo, Andi Utama
Abstract:
The presence of programmed death ligand-1 (PD-L1) has been used in multiple clinical trials and approved as a biomarker for selecting patients more likely to respond to immune checkpoint inhibitors. However, PD-L1 expression is regulated in different ways, which leads to different significance of its presence. Positive PD-L1 within tumors may result from two mechanisms: PD-L1 expression induced by T-cell presence, or a genetic mechanism that leads to constitutive PD-L1 expression. Amplification of the PD-L1 gene is one genetic mechanism known to increase PD-L1 expression. In colorectal cancer (CRC), immune checkpoint inhibitors have been recommended for patients with microsatellite instability (MSI). Although the correlation between PD-L1 expression and MSI status has been widely studied, the precise mechanism of PD-L1 gene activation in CRC patients, particularly in the MSI population, has yet to be clarified. In the present study, we profiled 61 archived formalin-fixed, paraffin-embedded CRC specimens from patients admitted to Medistra Hospital, Jakarta, in 2010-2016. Immunohistochemistry was performed to measure PD-L1 expression in tumor cells as well as MSI status, using antibodies against PD-L1 and the MMR proteins (MLH1, MSH2, PMS2, and MSH6), respectively. PD-L1 expression was measured on tumor cells with a cutoff of 1%, whereas loss of nuclear MMR protein expression in tumor cells, but not in normal or stromal cells, indicated MSI. The subset of PD-L1-positive patients was then assessed for copy number variations (CNVs) using single-tube TaqMan Copy Number Assays for the PD-L1 gene (CD274). We also profiled KRAS mutations as a possible genetic mechanism related to the presence or absence of PD-L1 expression. Analysis of the 61 CRC patients revealed that 15 patients (24%) expressed PD-L1 on their tumor cell membranes.
The prevalence of surface membrane PD-L1 was significantly higher in patients with MSI (87%; 7/8) than in patients with microsatellite-stable (MSS) tumors (15%; 8/53) (P=0.001). Although high-level amplification of the PD-L1 gene was not found among PD-L1-positive patients, low-level amplification was more commonly observed in MSS patients (75%; 6/8) than in MSI patients (43%; 3/7). Additionally, we found that 26% of CRC patients harbored KRAS mutations (16/61); the distribution of KRAS status did not correlate with PD-L1 expression. Our data suggest that gene amplification is not the mechanism underlying upregulation of PD-L1 expression in these CRC patients. However, further studies are warranted to confirm these results.
Keywords: colorectal cancer, gene amplification, microsatellite instable, programmed death ligand-1
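The significance of the MSI/MSS difference reported above (7/8 vs. 8/53 PD-L1-positive; P=0.001) can be sanity-checked with an exact calculation on the 2x2 contingency table. The abstract does not name the statistical test actually used, so the following stdlib-only sketch of a one-sided hypergeometric (Fisher-type) tail probability is illustrative only.

```python
# Illustrative one-sided hypergeometric (Fisher-type) tail probability for
# the PD-L1 x MSI contingency: 61 patients, 15 PD-L1-positive overall,
# 8 MSI patients of whom 7 are positive. The test used in the abstract is
# not specified; this is a sanity-check sketch, not the authors' analysis.
from math import comb

def hypergeom_p_at_least(k, N, K, n):
    """P(X >= k) for X ~ Hypergeometric(population N, successes K, draws n)."""
    return sum(
        comb(K, x) * comb(N - K, n - x) for x in range(k, min(K, n) + 1)
    ) / comb(N, n)

p = hypergeom_p_at_least(7, 61, 15, 8)
print(f"one-sided p = {p:.1e}")  # comes out well below 0.001
```

Seeing 7 or more PD-L1-positive patients among the 8 MSI cases by chance is thus vanishingly unlikely, consistent with the significance the abstract reports.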
Procedia PDF Downloads 223

293 Cultural Statistics in Governance: A Comparative Analysis between the UK and Finland
Authors: Sandra Toledo
Abstract:
There is an increasing tendency in governments toward more evidence-based policy-making and stricter auditing of the public sphere. Especially when budgets are tight and taxpayers demand greater scrutiny of the use of available resources, statistics and numbers appear as an effective tool for producing data that supports the investments made, as well as for evaluating public policy performance. This pressure has not spared the cultural and arts fields. Finland, like the rest of the Nordic countries, has kept its welfare-state principles, whilst the UK seems to be heading in the opposite direction, relying more and more on the private sector and foundations as the state folds back. The boom of the creative industries, along with the managerial trend introduced by Thatcher in the UK, brought as a result a commodification of the arts within a market logic, where sponsorship and commercial viability were the keynotes. Finland, for its part, despite following a more protectionist approach to the arts, seems to be heading in a similar direction. Additionally, there is a growing international interest in cultural participation studies and in the cross-country comparability of their results. Nonetheless, the application of cultural surveys has not yet been standardized: there are differences not only in the timing and frequency with which these surveys are applied, but also in who conducts them. Therefore, one hypothesis considered in this research is that behind the differences between countries in the application of cultural surveys and the production and utilization of cultural statistics lies the cultural policy model adopted by the government. In other words, the main goal of this research is to answer the following: what are the differences and similarities between Finland and the UK regarding the role cultural surveys have in cultural policy making?
Secondary questions include: how does the cultural policy model followed by each country influence the role of cultural surveys in cultural policy making, and what are the differences at the local level? To answer these questions, strategic cultural policy documents and interviews with key informants will be used and analyzed as source data, using content analysis methods. Cultural statistics per se will not be compared, but rather their use as instruments of governing and their relation to the cultural policy model. Aspects such as the execution of cultural surveys, funding, periodicity, and the use of statistics in formal reports and publications will be studied in the written documents, while the interviews will target other elements, such as the perceptions of those involved in collecting cultural statistics or in policy making, the distribution of tasks and hierarchies among cultural and statistical institutions, and a general overview. A limitation identified beforehand, and expected to be encountered throughout the process, is the language barrier posed by Finnish official documents; it will be tackled by interviewing the authors of those documents and selecting key extracts of them for translation.
Keywords: Finland, cultural statistics, cultural surveys, United Kingdom
Procedia PDF Downloads 236