Search results for: Marcos Valdez Alexander Junior
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 586

226 Microstructure Study of Melt Spun Mg₆₅Cu₂₅Y₁₀

Authors: Michael Regev, Shai Essel, Alexander Katz-Demyanetz

Abstract:

Magnesium alloys are characterized by good physical properties: they exhibit high strength, are lightweight, and have good damping capacity and good thermal and electrical conductivity. Amorphous magnesium alloys, moreover, exhibit higher strength, hardness and a large elastic domain in addition to having excellent corrosion resistance. These advantages make magnesium-based metallic glasses attractive for industrial use. Among the various existing magnesium alloys, the Mg₆₅Cu₂₅Y₁₀ alloy is known to be one of the best glass formers. In the current study, Mg₆₅Cu₂₅Y₁₀ ribbons were produced by melt spinning, and their microstructure was investigated in the as-cast condition, after pressing under 0.5 GPa for 5 minutes at different temperatures (RT, 50 °C, 100 °C, 150 °C and 200 °C), and after five-minute exposure to the above temperatures without pressing. The microstructure was characterized by means of X-ray Diffraction (XRD), Differential Scanning Calorimetry (DSC), High Resolution Scanning Electron Microscopy (HRSEM) and High Resolution Transmission Electron Microscopy (HRTEM). XRD and DSC studies showed that the as-cast material had an amorphous character and that the material crystallized during exposure to temperature, with or without applied stress. HRTEM revealed that as-cast Mg₆₅Cu₂₅Y₁₀, although known to be one of the best glass formers, is nano-crystalline rather than amorphous. The current study casts light on the question of what an amorphous alloy is and whether there is any clear borderline between amorphous and nano-crystalline alloys.

Keywords: metallic glass, magnesium, melt spinning, amorphous alloys

Procedia PDF Downloads 211
225 The Impact of Music on Social Identity Formation and Intergroup Relations in American-Born Korean Skaters in 2018 Winter Olympics

Authors: Sehwan Kim, Jepkorir Rose Chepyator Thomson

Abstract:

Music provides opportunities to affirm social identities and facilitates the internalization of one’s identity. The purpose of this study was to examine the role of music in breaking down boundaries between in-group and out-group sport participants. Social identity theory was used to guide an understanding of two American-born South Korean skaters—Yura Min and Alexander Gamelin—who skated to a representative traditional Korean folk song, Arirang, at the 2018 Winter Olympics. This was an interpretive case study of 2018 Winter Olympic participants whose performance and use of music were understood through the lenses of Koreans. Semi-structured interviews were conducted with 15 Korean audience members who watched the two skaters’ performances. Data analysis involved the determination of themes in the data collected. The findings of this study are as follows: First, Koreans viewed the skaters as an out-group based on ethnic appearances and stereotypes. Second, Koreans’ intergroup bias against the skaters was mitigated after watching them perform to Arirang. Implications of this study include the importance of music as an instrument of unity across diverse populations, including intergroup relations. Music can also offer ways to understand people’s cultures and bridge gaps of age and gender across categories of naturalization.

Keywords: impact of music, intergroup relations, naturalized athletes, social identity theory

Procedia PDF Downloads 187
224 Thrombocytopenia and Prolonged Prothrombin Time in Neonatal Septicemia

Authors: Shittu Bashirat, Shittu Mujeeb, Oluremi Adeolu, Orisadare Olayiwola, Jikeme Osameke, Bello Lateef

Abstract:

Septicemia in neonates refers to generalized bacterial infection documented by a positive blood culture in the first 28 days of life and is one of the leading causes of neonatal mortality in sub-Saharan Africa. Thrombocytopenia in newborns is a result of increased platelet consumption, and sepsis was found to be the most common risk factor. The objective of the study was to determine whether there are organism-specific platelet responses between the two groups of bacterial agents, Gram-positive and Gram-negative bacteria, and also to examine the association of platelet count and prothrombin time with neonatal septicemia. 232 blood samples were collected for this study. The blood cultures were performed using the Bactec 9050, an instrumented blood culture system. The platelet count and prothrombin time were measured using an Abacus Junior 5 hematology analyzer and an i-STAT 1 analyzer, respectively. Of the 231 neonates hospitalized with clinical sepsis, blood culture reports were positive in 51 cases (21.4%). Klebsiella spp. (35.3%) and Staphylococcus aureus (27.5%) were the most common Gram-negative and Gram-positive isolates, respectively. Thrombocytopenia was observed in 30 (58.8%) of the neonates with septicemia. Of the 9 (17.6%) patients with severe thrombocytopenia, seven (77.8%) had Klebsiella spp. septicemia. Of the 21 (63.6%) thrombocytopenia cases produced by Gram-negative isolates, 17 (80.9%) had increased prothrombin time. In conclusion, Gram-negative organisms accounted for the highest number of cases of severe thrombocytopenia and prolonged PT. This study has helped to establish a disturbance in hemostatic systems in neonates with septicemia. Further studies, however, may be required to assess other hemostasis parameters in order to understand their interaction with the infectious organisms in neonates.

Keywords: neonates, septicemia, thrombocytopenia, prolonged prothrombin time, platelet count

Procedia PDF Downloads 379
223 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone: end users could program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they could select the cells directly with the mouse. These results suggest that natural language interfaces for end-user software engineering can help overcome the present bottleneck of relying on professional developers.
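The schema-inference idea described above can be sketched in a few lines. The following is an illustrative toy, not the authors' actual ISPM (which uses machine learning): it assumes the header is the first row whose cells are all non-numeric strings, and resolves a column named in a query by simple substring matching.

```python
# Toy sketch of table-schema inference plus a natural-language-ish query,
# illustrating the idea behind ISPM (all names here are hypothetical).

def infer_schema(grid):
    """Treat the first row whose cells are all non-empty strings as the
    header row; everything below it is the data range."""
    for i, row in enumerate(grid):
        if row and all(isinstance(c, str) and c.strip() for c in row):
            return {"header_row": i, "columns": row, "data": grid[i + 1:]}
    raise ValueError("no header row found")

def column_sum(schema, query):
    """Resolve a query like 'sum of Sales' against the inferred columns."""
    for j, name in enumerate(schema["columns"]):
        if name.lower() in query.lower():
            return sum(row[j] for row in schema["data"])
    raise KeyError("no column matched the query")

grid = [
    ["Region", "Sales", "Units"],
    ["North", 120.0, 3],
    ["South", 80.0, 2],
]
schema = infer_schema(grid)
print(column_sum(schema, "what is the sum of Sales?"))  # 200.0
```

Because the column name is recovered from the schema rather than from cell coordinates, the user never has to address a range like B2:B3 explicitly, which is the core convenience the paper describes.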

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 282
222 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients

Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani

Abstract:

Knee orthotics play an important role in aiding the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary question motivating this project was: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective of this research was to design a knee orthotic for athletes with knee injuries at a low cost, under $100, and to evaluate its effectiveness. The initial design of the orthotic was done in SolidWorks, a computer-aided design (CAD) software available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was utilized to understand how normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25, based on the data gathered during FEA and on literature sources. Once the FEA was completed and the orthotic redesigned accordingly, the next step was to 3D-print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to accommodate the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
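The factor-of-safety target of 3.25 drives the redesign loop described above. A minimal sketch of that check, using illustrative material values that are not taken from the study:

```python
# Hypothetical factor-of-safety check of the kind used to iterate the
# orthotic design; the material and stress values below are illustrative,
# not measurements from the study.
yield_strength_mpa = 50.0          # e.g., a typical 3D-printing polymer
target_fos = 3.25                  # design requirement cited in the abstract

# Maximum stress the design may carry while still meeting the target FOS:
max_allowed_stress = yield_strength_mpa / target_fos
print(round(max_allowed_stress, 2))   # 15.38 (MPa)

fea_peak_stress_mpa = 14.0            # peak stress from a (mock) FEA run
achieved_fos = yield_strength_mpa / fea_peak_stress_mpa
print(achieved_fos >= target_fos)     # True: design meets the requirement
```

If the FEA peak stress exceeds the allowed value, the geometry is thickened or reinforced and the analysis repeated, which is the adjust-and-redesign cycle the abstract describes.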

Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing

Procedia PDF Downloads 151
221 Diagnostic Value of Different Noninvasive Criteria of Latent Myocarditis in Comparison with Myocardial Biopsy

Authors: Olga Blagova, Yuliya Osipova, Evgeniya Kogan, Alexander Nedostup

Abstract:

Purpose: to quantify the value of various clinical, laboratory and instrumental signs in the diagnosis of myocarditis in comparison with morphological studies of the myocardium. Methods: in 100 patients (65 men, 44.7±12.5 years) with 'idiopathic' arrhythmias (n = 20) and dilated cardiomyopathy (DCM, n = 80), 71 endomyocardial biopsies (EMB), 13 intraoperative biopsies, 5 studies of explanted hearts and 11 autopsies were performed, with viral investigation (real-time PCR) of the blood and myocardium. Anti-heart antibodies (AHA) were also measured, as well as cardiac CT (n = 45), MRI (n = 25) and coronary angiography (n = 47). The comparison group included 50 patients (25 men, 53.7±11.7 years) with non-inflammatory heart diseases who underwent open heart surgery. Results: Active/borderline myocarditis was diagnosed in 76.0% of the study group and in 21.6% of patients in the comparison group (p < 0.001). The myocardial viral genome was observed more frequently in patients of the comparison group than in the study group (65.0% vs. 40.2%; p < 0.01). The diagnostic value of noninvasive markers of myocarditis was evaluated. The panel of anti-heart antibodies had the greatest importance in identifying myocarditis: sensitivity was 81.5%, and the positive and negative predictive values were 75.0% and 60.5%, respectively. The diagnostic value of the non-invasive markers was defined, and a diagnostic algorithm providing an individual assessment of the likelihood of myocarditis was developed. Conclusion: AHA have the greatest significance in the diagnosis of latent myocarditis in patients with 'idiopathic' arrhythmias and DCM. The use of a complex of noninvasive criteria makes it possible to estimate the probability of myocarditis and to determine the indications for EMB.

Keywords: myocarditis, "idiopathic" arrhythmias, dilated cardiomyopathy, endomyocardial biopsy, viral genome, anti-heart antibodies

Procedia PDF Downloads 149
220 Fashion Performing/Fashioning Performances: Catwalks as Communication Tools between Market, Branding and Performing Art

Authors: V. Linfante

Abstract:

Catwalks are one of the key moments in fashion: the first and most relevant display where brands stage their collections, products, ideas, and style. The garment is 'the star' of the catwalk and must show itself not just as a product but as the result of a design process that has lasted several months. All contents developed within this process become ingredients for connecting scenography, music, lights, and direction into a unique fashion narrative. According to the spirit of different ages, fashion shows have been transformed and shaped into peculiar formats: from Pandoras to the presentations organized by Parisian couturiers, across the 'marathons' typical of the beginning of the modern fashion system, up to the present structure of fashion weeks, with their complex organization and related creative and technical businesses. The paper introduces the evolution of the fashion system through its unique process of seasonally staging and showing its production, analysing the evolution of fashion shows from the intimacy of ballrooms at the beginning of the 20th century, passing through the exuberant attitude typical of the '70s and '80s, to finally depict our present. In this last scenario, catwalks are no longer a standard presentation of collections but have become one of the most exciting expressions of contemporary culture (and sub-cultures), ranging from sophisticated performances (such as Karl Lagerfeld's Chanel shows) to real artistic happenings (such as the events of Viktor&Rolf, Alexander McQueen, Off-White, Vetements, and Martin Margiela), often involving contemporary architecture, the digital world, technology, social media, performing art and artists.

Keywords: branding, communication, fashion, new media, performing art

Procedia PDF Downloads 127
219 Study of Storms on the Javits Center Green Roof

Authors: Alexander Cho, Harsho Sanyal, Joseph Cataldo

Abstract:

A quantitative analysis of the different variables on both the South and North green roofs of the Jacob K. Javits Convention Center was undertaken to find mathematical relationships between net radiation and evapotranspiration (ET), average outside temperature, and lysimeter weight. Groups of datasets were analyzed, and the relationships were plotted on linear and semi-log graphs to find consistent relationships. Antecedent conditions for each rainstorm were also recorded and plotted against the volumetric water difference within the lysimeter. The first relation found was an inverse parabolic relationship between the lysimeter weight and the net radiation and ET: the peaks and valleys of the lysimeter weight corresponded to valleys and peaks in the net radiation and ET, respectively, with the 8/22/15 and 1/22/16 datasets showing this trend. The U-shaped and inverse U-shaped plots of the two variables coincided, indicating an inverse relationship between them. Cross-variable relationships were examined through graphs with lysimeter weight as the dependent variable on the y-axis; 10 out of 16 plots of lysimeter weight vs. outside temperature had R² values > 0.9. Antecedent conditions were also recorded for rainstorms, categorized by the amount of precipitation accumulating during the storm. Plotted against the change in the volumetric water difference within the lysimeter, a logarithmic regression was found with large R² values. The datasets were compared using the Mann-Whitney U-test at a 5% significance level to determine whether they were statistically different; all comparisons yielded U-test statistics that failed to reject the null hypothesis, indicating that the datasets were not statistically different.
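The Mann-Whitney U-test used in the comparison is straightforward to reproduce. A minimal sketch with made-up samples (not the Javits Center data), using SciPy's implementation:

```python
# Illustrative Mann-Whitney U-test on two made-up roof measurement samples;
# the values below are invented for demonstration only.
from scipy.stats import mannwhitneyu

south_roof = [2.1, 2.4, 2.2, 2.8, 2.5, 2.9, 2.3]
north_roof = [2.0, 2.6, 2.2, 2.7, 2.4, 2.5, 2.1]

u_stat, p_value = mannwhitneyu(south_roof, north_roof, alternative="two-sided")

# At a 5% significance level, p >= 0.05 means we fail to reject the null
# hypothesis that the two samples come from the same distribution.
print(f"U = {u_stat}, p = {p_value:.3f}")
print("significantly different" if p_value < 0.05 else "not significantly different")
```

The test is non-parametric, so it suits environmental measurements like these, whose distributions need not be normal.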

Keywords: green roof, green infrastructure, Javits Center, evapotranspiration, net radiation, lysimeter

Procedia PDF Downloads 85
218 Investigating Students' Understanding about Mathematical Concept through Concept Map

Authors: Rizky Oktaviana

Abstract:

The main purpose of learning lies in improving students’ understanding. Teachers usually use written tests to measure students’ understanding of learning material, especially in mathematics. This common method has a weakness: in mathematics, written tests only show the procedural steps used to solve problems, so teachers are unable to see whether students actually understand mathematical concepts and the relations between them. One of the best tools for observing students’ understanding of mathematical concepts is the concept map. The goal of this research is to describe junior high school students’ understanding of mathematical concepts through concept maps, based on differences in mathematical ability. There were three steps in this research. The first step was choosing the research subjects by giving a mathematical ability test to students; the subjects are three students with different mathematical abilities: high, intermediate and low. The second step was giving concept mapping training to the chosen subjects. The last step was giving the subjects a concept mapping task about functions, in which nodes representing concepts of functions were provided and had to be used in the concept map. Based on data analysis, the results show that the subject with high mathematical ability has formal understanding: that subject could see the connections between concepts of functions and arranged the concepts into a concept map with a valid hierarchy. The subject with intermediate mathematical ability has relational understanding: that subject could arrange all the given concepts and gave appropriate labels between concepts, though the labels did not yet represent the connections specifically. The subject with low mathematical ability, in contrast, has a poor understanding of functions, as can be seen from a concept map that used only a few of the given concepts, because the subject could not see the connections between them. All subjects have instrumental understanding of the relations between the linear function concept, the quadratic function concept, and domain, codomain and range.

Keywords: concept map, concept mapping, mathematical concepts, understanding

Procedia PDF Downloads 250
217 Rationalizing the Utilization of Interactive Engagement Strategies in Teaching Specialized Science Courses of STEM and GA Strands in the Academic Track of Philippine Senior High School Curriculum

Authors: Raul G. Angeles

Abstract:

The Philippine government instituted major reforms in its educational system. The Department of Education is pushing the K to 12 program, which makes kindergarten mandatory and adds two years of senior high school to the country's basic education. In essence, the stay of students in basic education, particularly those who intend to go to college, is extended. The majority of the students expressed that they will be taking the Academic Track of the Senior High School curriculum, specifically the Science, Technology, Engineering and Mathematics (STEM) and General Academic (GA) strands. Almost certainly, instruction should match the students' styles, and thus, through this descriptive study, a city survey was conducted to explore the teaching strategy preferences of junior high school students and teachers who will be promoted to senior high school during the Academic Year 2016-2017. This study was conducted in selected public and private secondary schools in Metro Manila. Questionnaires were distributed to students and teachers, and a series of follow-up interviews was also carried out to generate additional information. The preferences of students center on employing innovations such as technology, cooperative learning and problem-based learning. While the students will still be covered by basic education, their interest in science is growing to a point where the usual teaching styles may no longer work for them; for that reason, altering the teaching methods is recommended to create a teacher-student style match. Other effective strategies must likewise be implemented.

Keywords: curriculum development, effective teaching strategies, problem-based learning, senior high school, science education, technology

Procedia PDF Downloads 233
216 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality

Authors: David Sinfield, Thomas Cochrane, Marcos Steagall

Abstract:

While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are ease of user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused upon teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Design-based learning in visual communication, such as graphic design, illustration, photography and design process, is heavily based on the traditional classroom environment, in which student interaction takes place both at peer level and through teacher-based feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and we then see a decline in the creative work of students. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper discusses the outcomes of a student project that considered the virtual classroom and the techniques involved. The atelier learning environment is especially suited to the visual communication model, as it deals with the creative processing of ideas that needs to be shared in a collaborative manner. This has proven to be a successful model over the years in the traditional form of design education, but there has more recently been a shift in thinking as we move to a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality, when integrated into the learning experience, can improve students' learning motivation and engagement. This paper outlines the processes used and the findings from the semester-long project.

Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications

Procedia PDF Downloads 221
215 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics

Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl

Abstract:

Large Eddy Simulation (LES) numerically resolves the large energy-containing eddies of a turbulent flow, while modelling the small dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract; they are also responsible for transporting the turbines’ wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could otherwise be altered artificially through numerical dissipation introduced by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause the premature recovery of turbine wakes, leading to an overprediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly, in order to gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands.
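The defining property of an energy-conserving spatial discretisation can be shown in a few lines. This is a generic sketch, not the Energy-Conserving Navier-Stokes code itself: a periodic central-difference operator D is skew-symmetric (Dᵀ = −D), so the convective term uᵀDu contributes exactly zero to the discrete kinetic energy budget, i.e., the spatial scheme adds no numerical dissipation.

```python
# Generic illustration of an energy-conserving discretisation: build the
# periodic central-difference operator, verify its skew-symmetry, and show
# that it contributes nothing to the discrete kinetic energy d/dt(0.5*||u||^2).
import numpy as np

n, dx = 64, 1.0 / 64
D = np.zeros((n, n))
for i in range(n):
    D[i, (i + 1) % n] = 1.0 / (2 * dx)   # forward neighbour (periodic)
    D[i, (i - 1) % n] = -1.0 / (2 * dx)  # backward neighbour (periodic)

rng = np.random.default_rng(0)
u = rng.standard_normal(n)               # arbitrary velocity field

assert np.allclose(D.T, -D)              # skew-symmetry of the operator
energy_rate = u @ D @ u                  # convective contribution to energy
print(abs(energy_rate) < 1e-10)          # True: no numerical dissipation
```

A dissipative scheme (e.g., one-sided upwind differencing) would break the skew-symmetry and make this term strictly negative, which is exactly the artificial wake-energy loss the abstract warns about.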

Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer

Procedia PDF Downloads 445
214 Methane versus Carbon Dioxide Mitigation Prospects

Authors: Alexander J. Severinsky, Allen L. Sessoms

Abstract:

Atmospheric carbon dioxide (CO₂) has dominated the discussion about the causes of climate change. This is a reflection of the 100-year time horizon that the IPCC has adopted as the norm for planning. Recently, it has become clear that a 100-year time horizon is much too long, and yet almost all mitigation efforts, including those in the near-term horizon of 30 years, are geared toward it. In this paper, we show that, for a 30-year time horizon, methane (CH₄) is the greenhouse gas whose radiative forcing exceeds that of CO₂. In our analysis, we used the radiative forcing (RF) of greenhouse gases in the atmosphere, since it directly affects the temperature rise on Earth. In 2019, the radiative forcing of methane was ~2.5 W/m² and that of carbon dioxide ~2.1 W/m². Under a business-as-usual (BAU) scenario until 2050, these forcings would be ~2.8 W/m² and ~3.1 W/m², respectively. There is a substantial spread in the data for anthropogenic and natural methane emissions, as well as for CH₄ leakages from production to consumption. We estimated the minimum and maximum effects of reducing these leakages: such action may reduce the annual radiative forcing of all CH₄ emissions by between ~15% and ~30%, which translates into a reduction of the RF by 2050 from ~2.8 W/m² to ~2.5 W/m² in the case of the minimum effect, and to ~2.15 W/m² in the case of the maximum. Under BAU, we found that the RF of CO₂ would increase from ~2.1 W/m² today to ~3.1 W/m² by 2050. We assumed a 50% reduction of anthropogenic emissions, applied linearly over the next 30 years; that would reduce the radiative forcing from ~3.1 W/m² to ~2.9 W/m². In the case of ‘net zero,’ the other 50% of the reduction of anthropogenic emissions would come either from capture at the sources of emissions or from removal directly from the atmosphere, bringing the total reduction from ~3.1 to ~2.7 W/m², i.e., ~0.4 W/m².
To achieve the same radiative forcing as in the scenario of maximum reduction of methane leakages, ~2.15 W/m², an additional reduction of the radiative forcing of CO₂ of approximately 2.7 − 2.15 = 0.55 W/m² would be needed. This is a much larger value than is expected from ‘net zero.’ In total, one would need to remove ~660 Gt of CO₂ from the atmosphere to match the maximum reduction of current methane leakages, and ~270 Gt to achieve ‘net zero,’ which amounts to over 900 Gt in total.
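The arithmetic above can be checked directly. The values are in W/m², as quoted in the abstract; this is only a restatement of its numbers:

```python
# Worked check of the radiative-forcing arithmetic quoted in the abstract
# (all values in W/m^2, taken from the text above).
rf_co2_2050_bau = 3.1        # CO2 forcing by 2050, business as usual
rf_after_50pct_cut = 2.9     # after a linear 50% cut in anthropogenic emissions
rf_net_zero = 2.7            # after 'net zero'
rf_ch4_max_fix = 2.15        # CH4 forcing with maximum leakage reduction

# Reduction achieved by 'net zero' alone:
print(round(rf_co2_2050_bau - rf_net_zero, 2))   # 0.4
# Extra CO2 forcing reduction needed to match the best methane scenario:
print(round(rf_net_zero - rf_ch4_max_fix, 2))    # 0.55
```

The 0.55 W/m² gap is the paper's central point: matching the best methane-leakage scenario demands substantially more CO₂ removal than 'net zero' alone delivers.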

Keywords: methane leakages, methane radiative forcing, methane mitigation, methane net zero

Procedia PDF Downloads 121
213 Analysis of the Learning Effectiveness of the Steam-6e Course: A Case Study on the Development of Virtual Idol Product Design as an Example

Authors: Mei-Chun Chang

Abstract:

STEAM (Science, Technology, Engineering, Art, and Mathematics) represents a cross-disciplinary and learner-centered teaching model that cultivates students' ability to link theory with the presentation of real situations, thereby improving a range of their abilities. This study explores students' learning performance after the 6E model was used in STEAM teaching for a professional course in the digital media design department of a technical college, as well as the difficulties faced in STEAM curriculum design and implementation and the countermeasures adopted. Through industry experts' work experience, activity exchanges, course teaching and hands-on experience, learners can think about the design and development value of virtual idol products that meet the needs of users, and can employ AR/VR technology to innovate their product applications. Applying action research, the investigation took as its subjects 35 junior students from the department of digital media design at the school where the researcher teaches. The teaching research was conducted over two stages spanning ten weeks and 30 sessions. This research collected data and conducted quantitative and qualitative analyses through a design draft sheet, student interview records, the STEAM Product Semantic Scale, and the Creative Product Semantic Scale (CPSS). Research conclusions are presented, and relevant suggestions are proposed as a reference for teachers and follow-up researchers. The contribution of this study is to teach college students to develop original virtual idols and product designs, to improve learning effectiveness through STEAM teaching activities, and to effectively cultivate innovative and practical cross-disciplinary design talent.

Keywords: STEAM, 6E model, virtual idol, learning effectiveness, practical courses

Procedia PDF Downloads 99
212 Efficacy of Computer Mediated Power Point Presentations on Students' Learning Outcomes in Basic Science in Oyo State, Nigeria

Authors: Sunmaila Oyetunji Raimi, Olufemi Akinloye Bolaji, Abiodun Ezekiel Adesina

Abstract:

The lingering poor performance of students in basic science spells doom for the vibrant scientific and technological development that underpins the economic, social and physical upliftment of any nation. This calls for identifying appropriate strategies for imparting basic science knowledge and attitudes to the teeming youths in secondary schools. This study, therefore, determined the impact of computer-mediated PowerPoint presentations on students' achievement in basic science in Oyo State, Nigeria. A pre-test, post-test, control group quasi-experimental design was adopted for the study. Two hundred and five junior secondary two students, selected using a stratified random sampling technique, participated in the study. Three research questions and three hypotheses guided the study. Two evaluative instruments, the Students' Basic Science Attitudes Scale (SBSAS, r = 0.91) and the Students' Knowledge of Basic Science Test (SKBST, r = 0.82), were used for data collection. Descriptive statistics (mean, standard deviation) and inferential statistics (ANCOVA, Scheffé post-hoc test) were used to analyse the data. The results indicated a significant main effect of treatment on students' cognitive (F(1,200) = 171.680; p < 0.05) and attitudinal (F(1,200) = 34.466; p < 0.05) achievement in basic science, with the experimental group having a higher mean gain than the control group. Gender had a significant main effect (F(1,200) = 23.382; p < 0.05) on students' cognitive outcomes but not on attitudinal achievement in basic science. The study therefore recommended, among other things, that computer-mediated PowerPoint presentations should be incorporated into the curriculum methodology of basic science in secondary schools.

Keywords: basic science, computer mediated power point presentations, gender, students’ achievement

Procedia PDF Downloads 406
211 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process

Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander

Abstract:

A culture of continuous improvement in organizations is very important, as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies that formed cross-functional teams, using a dynamic model of knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness", defined as an obstacle to the transfer of knowledge. The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created, thus contributing to an increase in sharing. The concept of adapting knowledge to new environments is highlighted, as it is essential for companies to translate and modify information so that it fits the context of the receiving organizations. Adaptation ensures that the transfer process is carried out smoothly by preventing "stickiness". When the transfer process is developed in cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. Such cross-functional teams also generate synergy due to the array of different work backgrounds of their members. When synergy is established, a culture of continuous improvement is created.

Keywords: knowledge transfer, continuous improvement, teamwork, cognitive assets

Procedia PDF Downloads 304
210 'Get the DNR': Exploring the Impact of an Educational eModule on Internal Medicine Residents' Attitudes and Approaches to Goals of Care Conversations

Authors: Leora Branfield Day, Stephanie Saunders, Leah Steinberg, Shiphra Ginsburg, Christine Soong

Abstract:

Introduction: Discordance between patients' expressed and documented preferences at the end of life is common. Although junior trainees frequently lead goals of care (GOC) conversations, lack of training can result in poor communication. Based on a needs assessment, we developed an interactive electronic learning module (eModule) for conducting patient-centred GOC discussions. The purpose of this study was to evaluate the impact of the eModule on residents’ attitudes towards GOC conversations. Methods: First-year internal medicine residents (n=11) from the University of Toronto, selected using purposive sampling, underwent semi-structured interviews before and after completing the GOC eModule. Interviews were anonymized, transcribed and open-coded using NVivo. Using a constructivist grounded theory approach, we developed a framework to understand residents' attitudes to GOC conversations before and after viewing the module. Results: Before the module, participants described limited training and negative emotions towards GOC conversations. Many focused on code status and procedure choices (e.g., ventilation) instead of eliciting patient-centred values. Pressure to “get the DNR" led to conflicting feelings and distress. After the module, participants approached conversations with a greater focus on patient values and process. They felt more prepared and comfortable, recognizing the complexity of the conversations and the importance of patient-centredness. Conclusions: A novel GOC eModule allowed residents to develop a patient-centred and standardized approach to GOC conversations while improving confidence and preparedness. This resource could be an effective strategy toward attaining a critical communication competency among learners, with the potential to enhance accurate GOC documentation.

Keywords: goals of care conversations, communication skills, emodule, medical education

Procedia PDF Downloads 118
209 Content Analysis of ‘Junk Food’ Content in Children’s TV Programmes: A Comparison of UK Broadcast TV and Video-On-Demand Services

Authors: Shreesh Sinha, Alexander B. Barker, Megan Parkin, Emma Wilson, Rachael L. Murray

Abstract:

Background and Objectives: Exposure to HFSS imagery is associated with the consumption of foods high in fat, sugar or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children's television channels in the UK and the other of a selection of children's programmes available on video-on-demand (VOD) streaming sites. Methods: Content analysis of three days' worth of programmes (including advertisements) on two popular children's television channels broadcast on UK television (CBeebies and Milkshake) as well as a sample of the 40 highest-rated children's programmes available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; 'Milkshake' had a significantly higher proportion of programmes/adverts containing HFSS content than 'CBeebies'. On VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programmes containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children's television channels and children's programmes on VOD services. Since previous research has shown that HFSS content in the media affects HFSS consumption, children's television programmes broadcast either on TV or VOD services are likely to affect HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.

Keywords: public health, junk food, children's TV, HFSS

Procedia PDF Downloads 69
208 Select Communicative Approaches and Speaking Skills of Junior High School Students

Authors: Sonia Arradaza-Pajaron

Abstract:

Speaking English as a medium of instruction poses a real challenge for students who are non-native English speakers to achieve proficiency, especially since it is a requirement in most communicative classroom instruction. It becomes a real burden for students whose English language orientation is not well facilitated and encouraged by teachers in national high schools. This study, which utilized a descriptive-correlational research design, examined the relationship between the communicative approaches commonly utilized in classroom instruction and the level of speaking skills among the identified high school students. Survey questionnaires, interviews, and observation sheets were the research instruments used to generate salient information. Data were analysed and treated statistically using weighted means for speaking-skill levels and Pearson r to determine the relationship between the two identified variables of the study. Findings revealed that the level of English speaking skills of the high school students is just average. Further, in the identified speaking sub-skills, namely grammar, pronunciation and fluency, the students were at an above-average level. There was also a clear relationship between some communicative approaches and the respondents' speaking skills. Most notable among the selected approaches was role-playing, compared to storytelling, informal debate, brainstorming, oral reporting, and others, perhaps because role-playing is the most commonly used approach in the classroom. This implies that when these high school students are given enough time and autonomy in how they express their ideas or their comprehension of a lesson, they show a spontaneous manner of expression through maximization of the second language. It can be concluded that high school students have the capacity to express ideas even in the second language, provided they are encouraged and well facilitated by teachers. Also, when a better communicative approach is identified and implemented well, students' classroom engagement will improve.
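The Pearson r at the core of this correlational design can be computed directly. A minimal sketch with hypothetical paired values (e.g., how often an approach is used versus a mean speaking-skill rating), not the study's data:

```python
# Pearson product-moment correlation between two paired variables.
import math

def pearson_r(x, y):
    """Pearson r: covariance divided by the product of standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# hypothetical: frequency of role-playing use vs. mean speaking-skill rating
use = [1, 2, 3, 4, 5, 6]
skill = [2.1, 2.4, 2.9, 3.2, 3.8, 4.0]
r = pearson_r(use, skill)  # close to +1 indicates a strong positive relationship
```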

Keywords: communicative approaches, comprehension, role playing, speaking skills

Procedia PDF Downloads 151
207 Urgent Care Centres in the United Kingdom

Authors: Mohammad Ansari, Satinder Mann, Ahmed Ismail

Abstract:

Primary care patients in Emergency Departments (EDs) have been a topic of discussion in the United Kingdom since 1998. Numerous retrospective studies of ED attendances suggest that between one third and one half of patients attend the ED with problems that could be managed appropriately in general practice or minor injuries units. The pattern of ED usage appears to be international: departments in Australia and many in the United States include walk-in facilities staffed by physicians on family practice residency programmes. It is clear that in the United Kingdom EDs have to accept that patients with primary care problems will attend, and facilities will have to be provided to see and treat them. Urgent care centres were introduced in the United Kingdom nearly a decade ago to reduce the pressure on EDs, most of them situated near pre-existing EDs. Unfortunately, these centres failed to have the desired effect of reducing the number of patients visiting EDs; indeed, when more patients were seen in urgent care centres, ED attendances increased as well. A new model of urgent care centre was started in the ED of George Eliot Hospital, Nuneaton, UK. We examined the working of the centre by comparing the number of patients seen there daily against total attendances in the ED, and studied the number and type of patients seen by the urgent care doctor. All medical records were reviewed and the time patients spent in the urgent care centre was recorded. The total number of patients seen during this study was 1532, of whom 219 (14.3%) were seen within our urgent care centre. None of the patients waited over four hours to be seen. It is recognised that primary care patients make up a major part of ED attendances, and unless these patients are seen in urgent care centres, overcrowding and long waits cannot be avoided. It has been shown that employing primary care physicians in urgent care centres reduces overall cost because they do not carry out as many investigations as junior doctors. In our study, over 14% of patients were seen by urgent care physicians and none waited more than four hours; we feel that the care provided by the urgent care centre was highly effective and satisfying for patients.

Keywords: urgent care centres, primary care physicians, overcrowding, cost

Procedia PDF Downloads 416
206 Compliance of Systematic Reviews in Plastic Surgery with the PRISMA Statement: A Systematic Review

Authors: Seon-Young Lee, Harkiran Sagoo, Katherine Whitehurst, Georgina Wellstead, Alexander Fowler, Riaz Agha, Dennis Orgill

Abstract:

Introduction: Systematic reviews attempt to answer research questions by synthesising the data within primary papers. They are an increasingly important tool within evidence-based medicine, guiding clinical practice, future research and healthcare policy. We sought to determine the reporting quality of recent systematic reviews in plastic surgery. Methods: This systematic review was conducted in line with the Cochrane handbook, reported in line with the PRISMA statement and registered at the Research Registry (UIN: reviewregistry18). MEDLINE and EMBASE databases were searched for systematic reviews published in 2013 and 2014 in five major plastic surgery journals. Screening, identification and data extraction were performed independently by two teams. Results: From an initial set of 163 articles, 79 met the inclusion criteria. The median PRISMA score was 16 out of 27 items (59.3%; range 6-26, 95% CI 14-17). Compliance with individual PRISMA items showed high variability. It was poorest for items related to the use of a review protocol (item 5; 5%) and presentation of data on the risk of bias of each study (item 19; 18%), and highest for description of rationale (item 3; 99%), sources of funding and other support (item 27; 95%), and structured summary in the abstract (item 2; 95%). Conclusion: The reporting quality of systematic reviews in plastic surgery requires improvement. ‘Hard-wiring’ of compliance through journal submission systems, as well as improved education, awareness and a cohesive strategy among all stakeholders, is called for.
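The item-level compliance figures and the median total score come from a simple tally over a per-review checklist. A sketch of that aggregation with made-up data (three hypothetical reviews, five items instead of the real 27):

```python
# Aggregate a PRISMA-style checklist: % compliance per item, and the
# median total score per review. Data below are illustrative only.
from statistics import median

def compliance_by_item(scores):
    """Percentage of reviews reporting each checklist item."""
    n = len(scores)
    return [100 * sum(review[i] for review in scores) / n
            for i in range(len(scores[0]))]

def total_scores(scores):
    """Number of items reported by each review (the PRISMA total)."""
    return [sum(review) for review in scores]

# each inner list: True if the review reported that item
scores = [
    [True, False, True, True, False],
    [True, True, False, True, False],
    [True, False, False, True, True],
]
per_item = compliance_by_item(scores)
med = median(total_scores(scores))
```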

Keywords: PRISMA, reporting quality, plastic surgery, systematic review, meta-analysis

Procedia PDF Downloads 272
205 Estimation of Endogenous Brain Noise from Brain Response to Flickering Visual Stimulation

Authors: Alexander N. Pisarchik, Parth Chholak

Abstract:

Intrinsic brain noise was estimated from magnetoencephalograms (MEG) recorded during perception of flickering visual stimuli with frequencies of 6.67 and 8.57 Hz. First, we measured the mean phase difference between the flicker signal and the steady-state event-related field (SSERF) in the occipital area, where the brain response at the flicker frequencies and their harmonics appeared in the power spectrum. Then, we calculated the probability distribution of the phase fluctuations in the regions of frequency locking and computed its kurtosis. Since kurtosis is a measure of the distribution’s sharpness, we suppose that inverse kurtosis is related to intrinsic brain noise. In our experiments, the kurtosis value varied among subjects from K = 3 to K = 5 for 6.67 Hz and from 2.6 to 4 for 8.57 Hz. The majority of subjects demonstrated leptokurtic distributions (K > 3), i.e., the distribution tails approached zero more slowly than Gaussian. In addition, we found a strong correlation between kurtosis and brain complexity measured as the correlation dimension, such that the MEGs of subjects with higher kurtosis exhibited lower complexity. The obtained results are discussed in the framework of nonlinear dynamics and complex network theories. Specifically, in a network of coupled oscillators, phase synchronization is mainly determined by two antagonistic factors: noise and coupling strength. While noise worsens phase synchronization, coupling improves it. If we assume that each neuron and each synapse contribute to brain noise, a larger neuronal network should have stronger noise, and therefore phase synchronization should be worse, resulting in smaller kurtosis. The described method of brain noise estimation can be useful for diagnostics of brain pathologies associated with abnormal brain noise.
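The noise estimate rests on Pearson's (non-excess) kurtosis of the phase-fluctuation distribution, for which a Gaussian gives K = 3. A minimal sketch of that computation on synthetic phase values standing in for the MEG-derived ones:

```python
# Pearson's kurtosis: fourth central moment divided by the squared variance.
# Gaussian data give K ~ 3; K > 3 indicates a leptokurtic distribution.
import math
import random

def kurtosis(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2

random.seed(0)
# synthetic stand-in for phase differences in a frequency-locking region
phases = [random.gauss(0.0, 0.3) for _ in range(20000)]
k = kurtosis(phases)  # close to 3 for Gaussian phase noise
```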

Keywords: brain, flickering, magnetoencephalography, MEG, visual perception, perception time

Procedia PDF Downloads 114
204 Sports Business Services Model: A Research Model Study in Regional Sport Authority of Thailand

Authors: Siriraks Khawchaimaha, Sangwian Boonto

Abstract:

The Sport Authority of Thailand (SAT) is a state enterprise that promotes and supports all kinds of sport, both professional and amateur, for competitions, and is administered under government policy by government officers. All financial flows, whether cash inflows or outflows, are therefore strictly committed to the government budget and limited to projects planned at least 12 to 16 months in advance, resulting in inefficiency in sport events, administration and competitions. In order to remain competitive in sports worldwide, SAT needs its own sports business services model for each stadium, region and athlete competency. Based on the HMK model of Khawchaimaha, S. (2007), this research study examines each of the ten regional stadiums in detail, covering the characteristics of fans, athletes, coaches, equipment and facilities, and stadiums. The research design comprises, first, an evaluation of external factors: the hardware of competition and practice stadiums, playgrounds, facilities and equipment. Second, an examination of the software: organization structure, staff and management, administrative model, rules and practices, together with budget allocation and budget administration under operating and expenditure plans. The third step identifies issues and limitations that require an action plan for further development and support, or a decision to discontinue unviable sports. In the final step, based on the HMK model and the business model canvas of Alexander O. and Yves P. (2010), templates are generated for a Sports Business Services Model for each of SAT's ten regional stadiums.

Keywords: HMK model, not for profit organization, sport business model, sport services model

Procedia PDF Downloads 284
203 Trading off Accuracy for Speed in Powerdrill

Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica

Abstract:

In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for interactive exploration of log data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 when compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of using sampling on accuracy and propose a simple heuristic for annotating individual result-values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries this effectively brings down the 95th latency percentile from 30 to 4 seconds.
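The kind of heuristic described for annotating sampled result-values might look like the following sketch: scale the sampled count up by the sampling rate and flag it as accurate only if its estimated relative error is small. The threshold, the Poisson-style error model, and the numbers are our assumptions for illustration, not PowerDrill's actual rule:

```python
# Annotate a count observed in a sample as accurate (or not) based on an
# estimated relative standard error. Illustrative only.
import math

def annotate(sample_count, sampling_rate, max_rel_err=0.05):
    """Return (scaled estimate, is_accurate) for a count seen in a sample."""
    estimate = sample_count / sampling_rate
    if sample_count == 0:
        return 0, False
    # Poisson-style relative standard error; scaling preserves relative error
    rel_err = 1.0 / math.sqrt(sample_count)
    return estimate, rel_err <= max_rel_err

est, ok = annotate(sample_count=10000, sampling_rate=0.01)  # tight estimate
est2, ok2 = annotate(sample_count=9, sampling_rate=0.01)    # too noisy to trust
```

Marking low-support values as inaccurate is what lets a user interpret intermediate results safely while the full query is still running.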

Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries

Procedia PDF Downloads 233
202 A Content Analysis of ‘Junk Food’ Content in Children’s TV Programs: A Comparison of UK Broadcast TV and Video-On-Demand Services

Authors: Alexander B. Barker, Megan Parkin, Shreesh Sinha, Emma Wilson, Rachael L. Murray

Abstract:

Objectives: Exposure to HFSS imagery is associated with consumption of foods high in fat, sugar, or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children’s television channels in the UK and the other of a selection of children’s programs available on video-on-demand (VOD) streaming sites. Design: Content analysis of three days’ worth of programs (including advertisements) on two popular children’s television channels broadcast on UK television (CBeebies and Milkshake) as well as a sample of the 40 highest-rated children’s programs available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Setting: United Kingdom. Participants: None. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; ‘Milkshake’ had a significantly higher proportion of programs/adverts containing HFSS content than ‘CBeebies’. On VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programs containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children’s television channels and children's programs on VOD services. Since previous research has shown that HFSS content in the media affects HFSS consumption, children’s television programs broadcast either on TV or VOD services are likely to affect HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.

Keywords: public health, epidemiology, obesity, content analysis

Procedia PDF Downloads 152
201 Effects of Educational Technology Integration in Classroom Instruction to the Math Performance of Generation Z Students of a Private High School in the Philippines

Authors: May Maricel De Gracia

Abstract:

Different generations respond differently to instruction because of their diverse characteristics, learning styles and study habits. Teaching strategies that were effective many years ago may not be effective now, especially for the current generation, Gen Z. Using a quantitative research design, the main goal of this paper is to determine the impact of educational technology integration in a private high school on the math performance of its Junior High School (JHS) students in SY 2014-2018, based on their periodical exam performance and their final math grades. In support, a survey on the use of technology was administered to determine the characteristics of both students and teachers of SY 2017-2018. Another survey on study habits was also administered to the students to determine their readiness with regard to note-taking skills, time management, test-taking/preparation skills, reading, writing and math skills. Teaching strategies were recommended based on the needs of the current Gen Z JHS students. A total of 712 JHS students and 12 math teachers participated in answering the different surveys. Periodic exam means and final math grades between the school years without technology (SY 2004-2008) and with technology (SY 2014-2018) were analyzed through correlation and regression analyses. Results show that the periodic exam mean has a 35.29% impact on the final grade of the students. In addition, the z-test result, where p > 0.05, shows that the periodical exam results do not differ significantly between the school years without integration of technology and with integration of technology. However, with p < 0.01, a significant positive difference was observed in the final math grades of students between the school years without technology integration and with technology integration.
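The comparison of exam means across school-year blocks rests on a two-sample z-test. A minimal sketch with hypothetical summary numbers standing in for the study's data:

```python
# Two-sample z-test for the difference between two independent means.
# The means, SDs and group sizes below are illustrative, not the study's.
import math

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    """z statistic: mean difference over the standard error of the difference."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

z = two_sample_z(mean1=78.2, sd1=9.5, n1=700,   # with technology (hypothetical)
                 mean2=77.6, sd2=9.8, n2=712)   # without technology (hypothetical)
significant = abs(z) > 1.96  # two-tailed test at alpha = 0.05
```

With these illustrative numbers the difference is not significant, mirroring the p > 0.05 finding for periodical exams.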

Keywords: classroom instruction, technology, generation z, math performance

Procedia PDF Downloads 124
200 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria

Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov

Abstract:

This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
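The prediction step the abstract reports for KNN can be sketched compactly. The features and labels below are toy stand-ins for the Benpoly admission records, not the study's data:

```python
# Minimal k-nearest-neighbours classifier: predict the majority label
# among the k training records closest to the query point.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((feature, ...), label); returns the majority label."""
    nearest = sorted(train, key=lambda rec: math.dist(rec[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

# hypothetical (normalized exam score, normalized age) -> admission status
train = [
    ((0.90, 0.2), "admitted"), ((0.80, 0.3), "admitted"),
    ((0.85, 0.5), "admitted"), ((0.30, 0.4), "rejected"),
    ((0.20, 0.6), "rejected"), ((0.35, 0.7), "rejected"),
]
pred = knn_predict(train, (0.82, 0.4))
```

In the study itself, the features fed to the classifier are the ones the SCM ontology validated via the conditional independence tests.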

Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model

Procedia PDF Downloads 28
199 Modified Clusterwise Regression for Pavement Management

Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella

Abstract:

Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. Prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters by including both characteristics and a performance measure. This grouping is not always possible due to limited information. It is impractical to include all the potentially significant factors because some of them are unobserved or difficult to measure. Historical performance of pavement segments can be used as a proxy to incorporate the effect of the missing significant factors in the clustering process. The current state-of-the-art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously. CLR incorporates the effect of significant factors as well as a performance measure. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used. International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. Results illustrate the advantage of using CLR. Previous studies have used CLR with experimental data; this study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
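The clusterwise-regression idea can be sketched as a simple alternating procedure: assign each segment to the cluster whose regression line fits it best, then refit each cluster's line. This is a toy heuristic version with one explanatory variable (e.g., pavement age) and IRI-like responses, not the mathematical program the study formulates:

```python
# Toy clusterwise linear regression: alternate assignment and refitting.
# Data are synthetic; a real solver would handle multiple variables and
# use a proper optimization formulation.

def fit_line(points):
    """Ordinary least squares fit of y = a + b * x; returns (a, b)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((x - mx) * (y - my) for x, y in points) /
         sum((x - mx) ** 2 for x, _ in points))
    return my - b * mx, b

def clusterwise_regression(points, k=2, iters=20):
    """Alternate cluster assignment and per-cluster refitting."""
    # seed with contiguous chunks; real implementations use random restarts
    step = max(1, len(points) // k)
    clusters = [points[i * step:(i + 1) * step] for i in range(k - 1)]
    clusters.append(points[(k - 1) * step:])
    lines = [fit_line(c) for c in clusters]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            errs = [(y - (a + b * x)) ** 2 for a, b in lines]
            clusters[errs.index(min(errs))].append((x, y))
        lines = [fit_line(c) if len(c) > 1 else lines[i]
                 for i, c in enumerate(clusters)]
    return lines

# two latent groups: slow vs. fast roughness growth with pavement age
data = [(x, 1.0 + 0.1 * x) for x in range(1, 8)] + \
       [(x, 1.0 + 0.5 * x) for x in range(1, 8)]
lines = clusterwise_regression(data, k=2)
slopes = sorted(b for _, b in lines)  # recovers the two deterioration rates
```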

Keywords: clusterwise regression, pavement management system, performance model, optimization

Procedia PDF Downloads 228
198 More Precise: Patient-Reported Outcomes after Stroke

Authors: Amber Elyse Corrigan, Alexander Smith, Anna Pennington, Ben Carter, Jonathan Hewitt

Abstract:

Background and Purpose: Morbidity secondary to stroke is highly heterogeneous, but it is important to both patients and clinicians in post-stroke management and adjustment to life after stroke. Post-stroke morbidity has been poorly measured, both clinically and from the patient perspective. Patient-reported outcome measures (PROs) in morbidity assessment help close this knowledge gap. The primary aim of this study was to consider the association between PRO outcomes and stroke predictors. Methods: A multicenter prospective cohort study assessed 549 stroke patients at 19 hospital sites across England and Wales during 2019. Following a stroke event, demographic, clinical, and PRO measures were collected. Prevalence of morbidity within PRO measures was calculated with associated 95% confidence intervals. Predictors of domain outcome were calculated using a multilevel generalized linear model. Associated P-values and 95% confidence intervals are reported. Results: Data were collected from 549 participants, 317 men (57.7%) and 232 women (42.3%), with ages ranging from 25 to 97 (mean 72.7). PRO morbidity was high post-stroke: 93.2% of the cohort reported post-stroke PRO morbidity. Previous stroke, diabetes, and gender were associated with worse patient-reported outcomes across both the physical and cognitive domains. Conclusions: This large-scale multicenter cohort study illustrates the high proportion of morbidity in PRO measures. Further, we demonstrate key predictors of adverse outcomes (diabetes, previous stroke, and gender), congruent with clinical predictors. The PRO has been shown to be an informative and useful tool when considering patient-reported outcomes and has wider implications for the use of PROs in clinical management. Future longitudinal follow-up with PROs is needed to assess associations with long-term morbidity.
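The prevalence-with-95%-CI calculation reported above can be sketched with the normal (Wald) approximation for a proportion. The count 512/549 reproduces the cohort's ~93.2% figure, but the specific CI method is our assumption:

```python
# Point estimate and Wald 95% confidence interval for a proportion.
import math

def proportion_ci(successes, n, z=1.96):
    """Return (p, (lower, upper)) using the normal approximation."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, (p - half, p + half)

p, (lo, hi) = proportion_ci(512, 549)  # 512/549 ~ 0.9326, i.e. ~93.2%
```

For proportions this close to 1, an exact or Wilson interval would be a more robust choice, but the Wald form shows the idea.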

Keywords: morbidity, patient-reported outcome, PRO, stroke

Procedia PDF Downloads 107
197 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms

Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann

Abstract:

Within the Space@Sea project, funded by the Horizon 2020 programme, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders. The connection is critical with respect to the safety of the whole system. Therefore, fault detection systems are investigated that could detect early warning signs of a possible failure in the connection elements. Previously, a model-based method using an Extended Kalman Filter was developed to detect reductions in rope stiffness. This method detected several types of faults reliably, but other types were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault and the accuracy is not always satisfactory. It is therefore necessary to develop a more accurate and robust technique that can detect all rope faults under a wide range of operational conditions. Building on this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation using the acceleration data from each platform but also estimate the contributions of the individual acceleration sensors using methods from explainable AI. In order to adapt to different operational conditions, the domain adaptation technique DANN is applied. The proposed model can accurately estimate rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contributions of each acceleration sensor.

Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI

Procedia PDF Downloads 155