Search results for: Garzon Alvarado Diego Alexander
211 Stimulation of Stevioside Accumulation on Stevia rebaudiana (Bertoni) Shoot Culture Induced with Red LED Light in TIS RITA® Bioreactor System
Authors: Vincent Alexander, Rizkita Esyanti
Abstract:
Leaves of Stevia rebaudiana contain steviol glycosides, which mainly comprise stevioside, a natural sweetener compound that is 100-300 times sweeter than sucrose. The current cultivation method of Stevia rebaudiana in Indonesia has yet to reach the efficiency and productivity needed to produce stevioside as a safe sugar-substitute sweetener for people with diabetes. An alternative method that is not limited by environmental factors is the in vitro temporary immersion system (TIS) culture method using the recipient for automated immersion (RITA®) bioreactor. The aim of this research was to evaluate the effect of red LED light induction on shoot growth and stevioside accumulation in the TIS RITA® bioreactor system, as an endeavour to increase secondary metabolite synthesis. The results showed that stevioside accumulation in the TIS RITA® bioreactor system induced with red LED light for one hour during the night was higher than that in the TIS RITA® bioreactor system without red LED light induction, i.e. 71.04 ± 5.36 μg/g and 42.92 ± 5.40 μg/g respectively. The biomass growth rate reached as high as 0.072 ± 0.015/day for the red-LED-light-induced TIS RITA® bioreactor system, whereas that of the system without induction was only 0.046 ± 0.003/day. The productivity of Stevia rebaudiana shoots induced with red LED light was 0.065 g/L medium/day, whilst that of shoots without any induction was 0.041 g/L medium/day. Sucrose, salt, and inorganic nutrient consumption in both bioreactor media increased as biomass increased. It can be concluded that Stevia rebaudiana shoots in the TIS RITA® bioreactor induced with red LED light produce more biomass and accumulate a higher stevioside concentration than in the bioreactor without any light induction.
Keywords: LED, Stevia rebaudiana, Stevioside, TIS RITA
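For illustration only, per-day biomass growth rates of the kind quoted above can be obtained from a specific (exponential) growth-rate calculation; the sketch below assumes that interpretation, and the masses and culture period in it are invented placeholders rather than values from the study.

```python
import math

def specific_growth_rate(initial_mass_g, final_mass_g, days):
    """Specific growth rate mu = ln(W_final / W_initial) / t, in 1/day."""
    return math.log(final_mass_g / initial_mass_g) / days

# Hypothetical masses chosen only to illustrate the order of magnitude
# of the reported rates (~0.07/day over an assumed 30-day culture cycle).
print(round(specific_growth_rate(1.0, 8.8, 30), 3))  # ~0.072 per day
```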
Procedia PDF Downloads 371
210 Microstructure Study of Melt Spun Mg₆₅Cu₂₅Y₁₀
Authors: Michael Regev, Shai Essel, Alexander Katz-Demyanetz
Abstract:
Magnesium alloys are characterized by good physical properties: they exhibit high strength, are lightweight, and have good damping absorption and good thermal and electrical conductivity. Amorphous magnesium alloys, moreover, exhibit higher strength, hardness and a large elastic domain in addition to having excellent corrosion resistance. These advantages make magnesium-based metallic glasses attractive for industrial use. Among the various existing magnesium alloys, the Mg₆₅Cu₂₅Y₁₀ alloy is known to be one of the best glass formers. In the current study, Mg₆₅Cu₂₅Y₁₀ ribbons were produced by melt spinning, and their microstructure was investigated in the as-cast condition, after pressing under 0.5 GPa for 5 minutes at different temperatures - RT, 50°C, 100°C, 150°C and 200°C - and after five-minute exposure to the above temperatures without pressing. The microstructure was characterized by means of X-ray Diffraction (XRD), Differential Scanning Calorimetry (DSC), High Resolution Scanning Electron Microscopy (HRSEM) and High Resolution Transmission Electron Microscopy (HRTEM). XRD and DSC studies showed that the as-cast material had an amorphous character and that the material crystallized during exposure to temperature with or without applied stress. HRTEM revealed that the as-cast Mg₆₅Cu₂₅Y₁₀, although known to be one of the best glass formers, is nano-crystalline rather than amorphous. The current study casts light on the question of what an amorphous alloy is and whether there is any clear borderline between amorphous and nano-crystalline alloys.
Keywords: metallic glass, magnesium, melt spinning, amorphous alloys
Procedia PDF Downloads 236
209 The Impact of Music on Social Identity Formation and Intergroup Relations in American-Born Korean Skaters in 2018 Winter Olympics
Authors: Sehwan Kim, Jepkorir Rose Chepyator Thomson
Abstract:
Music provides opportunities to affirm social identities and facilitates the internalization of one’s identity. The purpose of this study was to examine the role of music in breaking down boundaries between in-group and out-group sport participants. Social identity theory was used to guide an understanding of two American-born South Korean skaters—Yura Min and Alexander Gamelin—who used Arirang, a representative traditional Korean folk song, at the 2018 Winter Olympics. This was an interpretive case study that focused on 2018 Winter Olympic participants whose performance and use of music were understood through the lenses of Koreans. Semi-structured interviews were conducted with 15 Korean audience members who watched the two American-born South Korean skaters’ performances. Data analysis involved the determination of themes in the data collected. The findings of this study are as follows: First, Koreans viewed the skaters as the out-group based on ethnic appearances and stereotypes. Second, Koreans’ inter-group bias against the skaters was mitigated after Koreans watched the skaters use the Arirang song in performance. Implications of this study include the importance of music as an instrument of unity across diverse populations, including intergroup relations. Music can also offer ways to understand people’s cultures and bridge gaps between age and gender across categories of naturalization.
Keywords: impact of music, intergroup relations, naturalized athletes, social identity theory
Procedia PDF Downloads 207
208 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
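A minimal sketch of the general pipeline described above: encode image patches with a convolutional autoencoder and cluster the latent codes with K-means. The patch size, latent dimension, number of clusters and training loop are assumptions for illustration; the proposed trainable inverse average layer and the actual patch-selection strategy are not reproduced here.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class PatchAutoencoder(nn.Module):
    """Convolutional autoencoder for 32x32 grayscale echo patches (illustrative sizes)."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),    # 8 -> 16
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),  # 16 -> 32
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Train on reconstruction loss, then cluster the latent codes into view-related groups.
model = PatchAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
patches = torch.rand(256, 1, 32, 32)  # placeholder for real echocardiographic patches
for _ in range(5):
    recon, _ = model(patches)
    loss = nn.functional.mse_loss(recon, patches)
    optimizer.zero_grad(); loss.backward(); optimizer.step()

with torch.no_grad():
    _, codes = model(patches)
view_labels = KMeans(n_clusters=6, n_init=10).fit_predict(codes.numpy())
```

In practice, real echocardiographic patches would replace the random tensors and the number of clusters would reflect the number of standard views.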
Procedia PDF Downloads 32
207 Optimal Concentration of Fluorescent Nanodiamonds in Aqueous Media for Bioimaging and Thermometry Applications
Authors: Francisco Pedroza-Montero, Jesús Naín Pedroza-Montero, Diego Soto-Puebla, Osiris Alvarez-Bajo, Beatriz Castaneda, Sofía Navarro-Espinoza, Martín Pedroza-Montero
Abstract:
Nanodiamonds have been widely studied for their physical properties, including chemical inertness, biocompatibility, optical transparency from the ultraviolet to the infrared region, high thermal conductivity, and mechanical strength. In this work, we studied how the fluorescence spectrum of nanodiamonds quenches with respect to concentration in aqueous solutions, systematically ranging from 0.1 to 10 mg/mL. Our results demonstrated a non-linear fluorescence quenching as the concentration increases for both of the NV zero-phonon lines; the 5 mg/mL concentration shows the maximum fluorescence emission. Furthermore, this behaviour is theoretically explained as an electronic recombination process that modulates the intensity in the NV centres. Finally, to gain more insight, the FRET methodology is used to determine the fluorescence efficiency in terms of the fluorophores' separation distance. Thus, the concentration level is simulated as follows: a small distance between nanodiamonds would be considered a highly concentrated system, whereas a large distance would represent a low-concentration one. Although the 5 mg/mL concentration shows the maximum intensity, our main interest is focused on the concentration of 0.5 mg/mL, for which our studies demonstrate optimal human cell viability (99%). In this respect, this concentration has the feature of being as biocompatible as water, giving the possibility to internalize the nanodiamonds in cells without harming the living medium. To this end, not only can we track nanodiamonds on the surface or inside the cell with excellent precision due to their fluorescence intensity, but we can also perform thermometry tests transforming a fluorescence contrast image into a temperature contrast image.
Keywords: nanodiamonds, fluorescence spectroscopy, concentration, bioimaging, thermometry
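The distance dependence invoked above is commonly modelled with the standard Förster relation E = 1/(1 + (r/R₀)⁶). The sketch below assumes that form and uses invented distances and an invented Förster radius purely to show how a short separation (standing in for high concentration) yields high transfer efficiency and a long separation (low concentration) yields low efficiency; it is not the study's fitted model.

```python
def fret_efficiency(r_nm, r0_nm):
    """Standard Foerster relation: E = 1 / (1 + (r/R0)^6)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Illustrative only: assumed donor-acceptor distances in nm and an assumed R0 of 5 nm.
for r in (2.0, 5.0, 10.0):
    print(r, round(fret_efficiency(r, r0_nm=5.0), 3))  # 0.996, 0.5, 0.015
```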
Procedia PDF Downloads 405
206 Thermal Simulation for Urban Planning in Early Design Phases
Authors: Diego A. Romero Espinosa
Abstract:
Thermal simulations are used to evaluate the comfort and energy consumption of buildings. However, the performance of different urban forms cannot be assessed precisely if an environmental control system and user schedules are considered, because the outcome of such an analysis combines the building use, operation, services, envelope, orientation and the density of the urban fabric. The influence of these factors varies during the life cycle of a building. The orientation, as well as the surroundings, can be considered constant during the lifetime of a building. The structure affects the thermal inertia and has the longest lifespan of all the building components. On the other hand, the building envelope is the most frequently renovated component of a building, since it has a great impact on energy performance and comfort. Building services have a shorter lifespan and are replaced regularly. With the purpose of assessing the performance of an urban form with a specific orientation and density, a thermal simulation method was developed. Solar irradiation is taken into consideration depending on the outdoor temperature. Incoming irradiation at low temperatures has a positive impact, increasing the indoor temperature. Consequently, overheating would be the combination of high outdoor temperature and high irradiation at the façade. On this basis, the indoor temperature is simulated for a specific orientation of the evaluated urban form. Thermal inertia and building envelope performance are additionally considered as the materiality of the building. The results of different thermal zones are summarized using the degree-day method for cooling and heating. During the early phase of a design process for a project, such as a masterplan, conclusions regarding urban form, density and materiality can be drawn by means of this analysis.
Keywords: building envelope, density, masterplanning, urban form
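A minimal sketch of the degree-day summary step mentioned above: heating and cooling degree days accumulated from a series of mean temperatures. The 18 °C base temperature and the one-week series are assumptions for illustration, not values from the study, which applies the method to the simulated indoor temperatures of its thermal zones.

```python
def degree_days(daily_mean_temps_c, base_temp_c=18.0):
    """Heating and cooling degree days summed over a list of daily mean temperatures."""
    hdd = sum(max(0.0, base_temp_c - t) for t in daily_mean_temps_c)
    cdd = sum(max(0.0, t - base_temp_c) for t in daily_mean_temps_c)
    return hdd, cdd

# Assumed one-week series of simulated mean temperatures, purely illustrative.
temps = [12.0, 14.5, 17.0, 19.5, 22.0, 24.5, 16.0]
print(degree_days(temps))  # (12.5, 12.0)
```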
Procedia PDF Downloads 145
205 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema
Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy
Abstract:
Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone. It enables end users to program their own devices and extend the functionality of the existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data by using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also in Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they directly selected the cells using the mouse) than in ISPM. By using natural language for end-user software engineering, the aim is to overcome the present bottleneck of professional developers.
Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet
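A toy sketch of the underlying idea: infer which row is the header and resolve a natural-language reference to a column before aggregating it. The heuristic used here (the first row with no numeric cells is the header) and the keyword-matching query resolver are simplifications invented for illustration; ISPM itself uses a machine learning technique for schema inference.

```python
# Rows as read from a spreadsheet; the first all-text row is taken as the header.
rows = [
    ["Product", "Region", "Sales"],
    ["Alpha", "North", 120],
    ["Beta", "South", 80],
    ["Alpha", "South", 95],
]

def infer_header(rows):
    """Assume the first row containing no numeric cells is the header row."""
    for i, row in enumerate(rows):
        if not any(isinstance(cell, (int, float)) for cell in row):
            return i, row
    raise ValueError("no header row found")

def answer(query, rows):
    """Toy resolver for queries like 'sum of Sales' against the inferred schema."""
    header_idx, header = infer_header(rows)
    column = next(name for name in header if name.lower() in query.lower())
    col = header.index(column)
    return sum(row[col] for row in rows[header_idx + 1:])

print(answer("what is the sum of sales?", rows))  # 295
```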
Procedia PDF Downloads 311
204 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients
Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani
Abstract:
Knee orthotics play an important role in aiding the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary reason for this project was to answer the question: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective of this research project was to design a knee orthotic for athletes with knee injuries at a low cost under $100 and evaluate its effectiveness. The initial design for the orthotic was done in SolidWorks, a computer-aided design (CAD) software available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was utilized to understand how normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25 based on the data gathered during FEA and literature sources. Once the FEA was completed and the orthotic was redesigned based on the data gathered, the next step was to 3D-print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to accommodate the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing
Procedia PDF Downloads 181
203 Diagnostic Value of Different Noninvasive Criteria of Latent Myocarditis in Comparison with Myocardial Biopsy
Authors: Olga Blagova, Yuliya Osipova, Evgeniya Kogan, Alexander Nedostup
Abstract:
Purpose: to quantify the value of various clinical, laboratory and instrumental signs in the diagnosis of myocarditis in comparison with morphological studies of the myocardium. Methods: in 100 patients (65 men, 44.7±12.5 years) with «idiopathic» arrhythmias (n = 20) and dilated cardiomyopathy (DCM, n = 80), 71 endomyocardial biopsies (EMB), 13 intraoperative biopsies, 5 studies of explanted hearts and 11 autopsies were performed, with virus investigation (real-time PCR) of the blood and myocardium. Anti-heart antibodies (AHA) were also measured, as well as cardiac CT (n = 45), MRI (n = 25) and coronary angiography (n = 47). The comparison group included 50 patients (25 men, 53.7±11.7 years) with non-inflammatory heart diseases who underwent open heart surgery. Results: Active/borderline myocarditis was diagnosed in 76.0% of the study group and in 21.6% of patients of the comparison group (p < 0.001). The myocardial viral genome was observed more frequently in patients of the comparison group than in the study group (65.0% and 40.2%; p < 0.01). The diagnostic value of noninvasive markers of myocarditis was evaluated. The panel of anti-heart antibodies had the greatest importance in identifying myocarditis: sensitivity was 81.5%, and the positive and negative predictive values were 75.0% and 60.5%. The diagnostic value of non-invasive markers of myocarditis was defined, and a diagnostic algorithm providing an individual assessment of the likelihood of myocarditis was developed. Conclusion: AHA have the greatest significance in the diagnosis of latent myocarditis in patients with 'idiopathic' arrhythmias and DCM. The use of a complex of noninvasive criteria allows estimating the probability of myocarditis and determining the indications for EMB.
Keywords: myocarditis, "idiopathic" arrhythmias, dilated cardiomyopathy, endomyocardial biopsy, viral genome, anti-heart antibodies
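The diagnostic indices quoted above follow from a standard 2x2 contingency table; the sketch below shows the computation, with placeholder counts chosen only to land near the reported sensitivity and predictive values, not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for an antibody panel against biopsy-proven myocarditis.
print([round(x, 3) for x in diagnostic_metrics(tp=62, fp=21, fn=14, tn=23)])
```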
Procedia PDF Downloads 173
202 Fashion Performing/Fashioning Performances: Catwalks as Communication Tools between Market, Branding and Performing Art
Authors: V. Linfante
Abstract:
Catwalks are one of the key moments in fashion: the first and most relevant display where brands stage their collections, products, ideas, and style. The garment is 'the star' of the catwalk and must show itself not just as a product but as the result of a design process that has endured for several months. All contents developed within this process become ingredients for connecting scenography, music, lights, and direction into a unique fashion narrative. According to the spirit of different ages, fashion shows have been transformed and shaped into peculiar formats: from Pandoras to the presentations organized by Parisian couturiers, through the 'marathons' typical of the beginning of the modern fashion system, up to the present structure of fashion weeks, with their complex organization and related creative and technical businesses. The paper intends to introduce the evolution of the fashion system through its unique process of seasonally staging and showing its production. The paper analyses the evolution of fashion shows from the intimacy of ballrooms at the beginning of the 20th century, passing through the enthusiastic attitude typical of the '70s and the '80s, to finally depict our present. In this last scenario, catwalks are no longer a standard collection presentation but become one of the most exciting expressions of contemporary culture (and sub-cultures), going from sophisticated performances (such as Karl Lagerfeld's Chanel shows) to real artistic happenings (such as the events of Viktor&Rolf, Alexander McQueen, OFF_WHITE, Vetements, and Martin Margiela), often involving contemporary architecture, the digital world, technology, social media, performing art and artists.
Keywords: branding, communication, fashion, new media, performing art
Procedia PDF Downloads 150
201 Study of Storms on the Javits Center Green Roof
Authors: Alexander Cho, Harsho Sanyal, Joseph Cataldo
Abstract:
A quantitative analysis of the different variables on both the South and North green roofs of the Jacob K. Javits Convention Center was carried out to find mathematical relationships between net radiation and evapotranspiration (ET), average outside temperature, and the lysimeter weight. Groups of datasets were analyzed, and the relationships were plotted on linear and semi-log graphs to find consistent relationships. Antecedent conditions for each rainstorm were also recorded and plotted against the volumetric water difference within the lysimeter. The first relation was the inverse parabolic relationship between the lysimeter weight and the net radiation and ET. The peaks and valleys of the lysimeter weight corresponded to valleys and peaks in the net radiation and ET respectively, with the 8/22/15 and 1/22/16 datasets showing this trend. The U-shaped and inverse U-shaped plots of the two variables coincided, indicating an inverse relationship between the two variables. Cross-variable relationships were examined through graphs with lysimeter weight as the dependent variable on the y-axis. 10 out of 16 of the lysimeter weight vs. outside temperature plots had R² values > 0.9. Antecedent conditions were also recorded for rainstorms, categorized by the amount of precipitation accumulating during the storm. Plotted against the change in the volumetric water weight difference within the lysimeter, a logarithmic regression was found with large R² values. The datasets were compared using the Mann-Whitney U-test, at a significance level of 5%, to see if they were statistically different; for all compared datasets, the U test statistic did not support the hypothesis that the datasets were different.
Keywords: green roof, green infrastructure, Javits Center, evapotranspiration, net radiation, lysimeter
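A minimal sketch of the dataset comparison step: SciPy's two-sided Mann-Whitney U test applied to two series, with difference judged at the 5% level. The synthetic normally distributed series stand in for the actual lysimeter datasets, which are not reproduced here.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Placeholder series standing in for two lysimeter-weight datasets being compared.
south_roof = rng.normal(loc=50.0, scale=5.0, size=40)
north_roof = rng.normal(loc=52.0, scale=5.0, size=40)

u_stat, p_value = mannwhitneyu(south_roof, north_roof, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}, different at 5%? {p_value < 0.05}")
```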
Procedia PDF Downloads 114
200 Crickets as Social Business Model for Rural Women in Colombia
Authors: Diego Cruz, Helbert Arevalo, Diana Vernot
Abstract:
In 2013, the Food and Agriculture Organization of the United Nations (FAO) said that insect production for food and feed could become an economic opportunity for rural women in developing countries. However, since then, just a few initiatives worldwide have tried to implement this kind of project in zones of tropical countries without previous experience in cricket production and insect consumption by humans, such as Colombia. In this project, the ArthroFood company and the University of La Sabana join efforts to make a holistic, multi-perspective analysis of Gryllodes sigillatus production by rural women of the municipality of La Mesa, Cundinamarca, Colombia, from the biological, economic, culinary, and social sides. From a biological and economic perspective, G. sigillatus production in a 60 m² greenhouse was evaluated considering the effect of rearing density and substrates on final weight and length, development time, survival rate, and proximate composition. Additionally, the production cost and labor hours were recorded for five months. On the other hand, from a socio-economic side, the intention of the rural women to implement cricket farms or micro-entrepreneurship around insect production was evaluated after developing ethnographies and workshops on empowerment, entrepreneurship, and cricket production. Finally, the results of the elaboration of culinary recipes with cricket powder, incorporating cultural aspects of the context of La Mesa, Cundinamarca, will be presented. This project represents Colombia's first attempt to create a social business model of cricket production involving rural women, academia, the private sector, and local authorities.
Keywords: cricket production, developing country, edible insects, entrepreneurship, insect culinary recipes
Procedia PDF Downloads 104
199 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics
Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl
Abstract:
Large Eddy Simulation (LES) numerically resolves the large energy-containing eddies of a turbulent flow, while modelling the small dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract and are also responsible for transporting the turbines’ wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could be altered artificially through numerical dissipation brought about by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause the premature recovery of turbine wakes, leading to an overprediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly, to gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands.
Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer
Procedia PDF Downloads 465
198 Methane versus Carbon Dioxide Mitigation Prospects
Authors: Alexander J. Severinsky, Allen L. Sessoms
Abstract:
Atmospheric carbon dioxide (CO₂) has dominated the discussion about the causes of climate change. This is a reflection of the 100-year time horizon that has become the norm adopted by the IPCC as the planning horizon. Recently, it has become clear that a 100-year time horizon is much too long, and yet almost all mitigation efforts, including those in the near-term horizon of 30 years, are geared toward it. In this paper, we show that, for a 30-year time horizon, methane (CH₄) is the greenhouse gas whose radiative forcing exceeds that of CO₂. In our analysis, we used the radiative forcing of greenhouse gases in the atmosphere, since it directly affects the temperature rise on Earth. In 2019, the radiative forcing of methane was ~2.5 W/m² and that of carbon dioxide ~2.1 W/m². Under a business-as-usual (BAU) scenario until 2050, such forcing would be ~2.8 W/m² and ~3.1 W/m², respectively. There is a substantial spread in the data for anthropogenic and natural methane emissions as well as for CH₄ leakages from production to consumption. We estimated the minimum and maximum effects of the reduction of these leakages. Such action may reduce the annual radiative forcing of all CH₄ emissions by between ~15% and ~30%. This translates into a reduction of the RF by 2050 from ~2.8 W/m² to ~2.5 W/m² in the case of the minimum effect, and to ~2.15 W/m² in the case of the maximum. Under BAU, we found that the RF of CO₂ would increase from ~2.1 W/m² nowadays to ~3.1 W/m² by 2050. We assumed a 50% reduction of anthropogenic emissions, applied linearly over the next 30 years. That would reduce the radiative forcing from ~3.1 W/m² to ~2.9 W/m². In the case of ‘net zero,’ the other 50% of the reduction of anthropogenic emissions would have to come either from the sources of emissions or directly from the atmosphere. The total reduction would be from ~3.1 to ~2.7 W/m², or ~0.4 W/m². To achieve the same radiative forcing as in the scenario of maximum reduction of methane leakages, ~2.15 W/m², an additional reduction of the radiative forcing of CO₂ of approximately 2.7 - 2.15 = 0.55 W/m² would be required. This is a much larger value than expected from ‘net zero.’ In total, one needs to remove from the atmosphere ~660 GT to match the maximum reduction of current methane leakages and ~270 GT to achieve ‘net zero.’ This amounts to over 900 GT in total.
Keywords: methane leakages, methane radiative forcing, methane mitigation, methane net zero
Procedia PDF Downloads 146
197 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a BANKART Lesion through Finite Element Analysis
Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero, Diego F. Villegas
Abstract:
Mechanical computation is a great tool to study the performance of complex models. An example of it is the study of the human body structure. This paper took advantage of different types of software to make a 3D model of the glenohumeral joint and apply a finite element analysis. The main objective was to study the change in the biomechanical properties of the joint when it presents an injury, specifically a BANKART lesion, which consists of the detachment of the anteroinferior labrum from the glenoid. The stress and strain distribution of the soft tissues was the focus of this study. First, a 3D model was made of a joint without any pathology, as a control sample, using segmentation software for the bones with the support of medical imagery and a cadaveric model to represent the soft tissue. The joint was built to simulate a compression and external rotation test, using CAD to prepare the model in the adequate position. When the healthy model was finished, it was submitted to a finite element analysis, and the results were validated with experimental model data. The validated model was then subjected to a mesh sensitivity analysis to obtain the best mesh size. Finally, the geometry of the 3D model was changed to imitate a BANKART lesion. The contact zone of the glenoid with the labrum was slightly separated, simulating a tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. With the data gathered, this study can be used to improve understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and the initial data was taken from an in vitro assay.
Keywords: biomechanics, computational model, finite elements, glenohumeral joint, bankart lesion, labrum
Procedia PDF Downloads 161
196 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process
Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander
Abstract:
The culture of continuous improvement in organizations is very important as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies which formed cross-functional teams and used a dynamic model for knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness" (which is defined as an obstacle to the transfer of knowledge). The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to generating continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created and thus contribute to an increase in sharing. The concept of adapting knowledge to new environments will be highlighted, as it is essential for companies to translate and modify information so that such information can fit the context of receiving organizations. Adaptation will ensure that the transfer process is carried out smoothly by preventing "stickiness". When developing the transfer process on cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. These types of cross-functional teams also generate synergy due to the array of different work backgrounds of their individuals. When synergy is established, a culture of continuous improvement is created.
Keywords: knowledge transfer, continuous improvement, teamwork, cognitive assets
Procedia PDF Downloads 324
195 Content Analysis of ‘Junk Food’ Content in Children’s TV Programmes: A Comparison of UK Broadcast TV and Video-On-Demand Services
Authors: Shreesh Sinha, Alexander B. Barker, Megan Parkin, Emma Wilson, Rachael L. Murray
Abstract:
Background and Objectives: Exposure to HFSS imagery is associated with the consumption of foods high in fat, sugar or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children's television channels in the UK and the other of a selection of children's programmes available on video-on-demand (VOD) streaming sites. Methods: Content analysis of three days' worth of programmes (including advertisements) on two popular children's television channels broadcast on UK television (CBeebies and Milkshake), as well as a sample of the 40 highest-rated children's programmes available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; 'Milkshake' had a significantly higher proportion of programmes/adverts which contained HFSS content than 'CBeebies'. On the VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programmes containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children's television channels and children's programmes on VOD services. Since previous research has shown that HFSS content in the media has an effect on HFSS consumption, children's television programmes broadcast either on TV or on VOD services are likely to have an effect on HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.
Keywords: public health, junk food, children's TV, HFSS
Procedia PDF Downloads 102
194 Compliance of Systematic Reviews in Plastic Surgery with the PRISMA Statement: A Systematic Review
Authors: Seon-Young Lee, Harkiran Sagoo, Katherine Whitehurst, Georgina Wellstead, Alexander Fowler, Riaz Agha, Dennis Orgill
Abstract:
Introduction: Systematic reviews attempt to answer research questions by synthesising the data within primary papers. They are an increasingly important tool within evidence-based medicine, guiding clinical practice, future research and healthcare policy. We sought to determine the reporting quality of recent systematic reviews in plastic surgery. Methods: This systematic review was conducted in line with the Cochrane handbook, reported in line with the PRISMA statement and registered at the ResearchRegistry (UIN: reviewregistry18). The MEDLINE and EMBASE databases were searched in 2013 and 2014 for systematic reviews published by five major plastic surgery journals. Screening, identification and data extraction were performed independently by two teams. Results: From an initial set of 163 articles, 79 met the inclusion criteria. The median PRISMA score was 16 out of 27 items (59.3%; range 6-26, 95% CI 14-17). Compliance across individual PRISMA items showed high variability. It was poorest for items related to the use of a review protocol (item 5; 5%) and the presentation of data on the risk of bias of each study (item 19; 18%), while being highest for the description of rationale (item 3; 99%), sources of funding and other support (item 27; 95%), and a structured summary in the abstract (item 2; 95%). Conclusion: The reporting quality of systematic reviews in plastic surgery requires improvement. ‘Hard-wiring’ of compliance through journal submission systems, as well as improved education, awareness and a cohesive strategy among all stakeholders, is called for.
Keywords: PRISMA, reporting quality, plastic surgery, systematic review, meta-analysis
Procedia PDF Downloads 294
193 Estimation of Endogenous Brain Noise from Brain Response to Flickering Visual Stimulation Magnetoencephalography Visual Perception Speed
Authors: Alexander N. Pisarchik, Parth Chholak
Abstract:
Intrinsic brain noise was estimated via magnetoencephalograms (MEG) recorded during the perception of flickering visual stimuli with frequencies of 6.67 and 8.57 Hz. First, we measured the mean phase difference between the flicker signal and the steady-state event-related field (SSERF) in the occipital area, where the brain response at the flicker frequencies and their harmonics appeared in the power spectrum. Then, we calculated the probability distribution of the phase fluctuations in the regions of frequency locking and computed its kurtosis. Since kurtosis is a measure of the distribution’s sharpness, we suppose that inverse kurtosis is related to intrinsic brain noise. In our experiments, the kurtosis value varied among subjects from K = 3 to K = 5 for 6.67 Hz and from 2.6 to 4 for 8.57 Hz. The majority of subjects demonstrated leptokurtic distributions (K > 3), i.e., the distribution tails approached zero more slowly than the Gaussian. In addition, we found a strong correlation between kurtosis and brain complexity measured as the correlation dimension, so that the MEGs of subjects with higher kurtosis exhibited lower complexity. The obtained results are discussed in the framework of nonlinear dynamics and complex network theories. Specifically, in a network of coupled oscillators, phase synchronization is mainly determined by two antagonistic factors, noise and the coupling strength. While noise worsens phase synchronization, the coupling improves it. If we assume that each neuron and each synapse contribute to brain noise, a larger neuronal network should have stronger noise, and therefore phase synchronization should be worse, which results in smaller kurtosis. The described method for brain noise estimation can be useful for the diagnostics of some brain pathologies associated with abnormal brain noise.
Keywords: brain, flickering, magnetoencephalography, MEG, visual perception, perception time
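A minimal sketch of the phase-fluctuation and kurtosis computation described above, using a synthetic signal in place of a band-pass-filtered occipital MEG channel. The sampling rate, noise level and use of the Hilbert transform for the instantaneous phase are assumptions for illustration; scipy.stats.kurtosis with fisher=False returns the Pearson kurtosis, which equals 3 for a Gaussian.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import kurtosis

fs, f_stim = 500.0, 6.67          # assumed sampling rate (Hz) and flicker frequency
t = np.arange(0, 20, 1 / fs)
flicker = np.sin(2 * np.pi * f_stim * t)
# Placeholder for a narrow-band occipital MEG channel: locked response plus noise.
meg = np.sin(2 * np.pi * f_stim * t + 0.3) + 0.5 * np.random.randn(t.size)

phase_diff = np.angle(hilbert(meg)) - np.angle(hilbert(flicker))
phase_diff = np.angle(np.exp(1j * phase_diff))   # wrap to (-pi, pi]

k = kurtosis(phase_diff, fisher=False)           # Pearson kurtosis of the phase fluctuations
print(f"kurtosis K = {k:.2f}")
```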
Procedia PDF Downloads 148
192 Sports Business Services Model: A Research Model Study in Regional Sport Authority of Thailand
Authors: Siriraks Khawchaimaha, Sangwian Boonto
Abstract:
The Sport Authority of Thailand (SAT) is a state enterprise that promotes and supports all kinds of sports, both professional and amateur athletes, for competitions, and is administered under government policy by government officers. Therefore, all financial support, whether cash inflows or cash outflows, is strictly committed to the government budget and limited to projects planned at least 12 to 16 months ahead of reality, resulting in ineffectiveness in sport events, administration and competitions. In order to remain competitive in sports challenges around the world, SAT needs to have its own sports business services model for each stadium, region and set of athletes' competencies. Based on the HMK model of Khawchaimaha, S. (2007), this research study is formalized for each of the 10 regional stadiums to detail the root characteristics of fans, athletes, coaches, equipment and facilities, and stadiums. The research design is, firstly, the evaluation of external factors: the hardware, i.e., competition or practice stadiums, playgrounds, facilities, and equipment. Secondly, the understanding of the software: the organization structure, staff and management, the administrative model, rules and practices, in addition to budget allocation and budget administration with the operating plan and expenditure plan. As a result, the third step identifies issues and limitations which require an action plan for further development and support, or the discontinuation of unskilled kinds of sport. In the final step, based on the HMK model and the business model canvas by Alexander O. and Yves P. (2010), templates generating a Sports Business Services Model are produced for each of the 10 SAT regional stadiums.
Keywords: HMK model, not for profit organization, sport business model, sport services model
Procedia PDF Downloads 305
191 Trading off Accuracy for Speed in Powerdrill
Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica
Abstract:
In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration of log data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, and so they should be easily applicable to other systems. For the first optimization, we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 when compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of using sampling on accuracy and propose a simple heuristic for annotating individual result values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries, this effectively brings down the 95th latency percentile from 30 to 4 seconds.
Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries
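A generic sketch of the second idea: estimating an aggregate from a uniform sample and annotating the result as accurate or not via a confidence-interval heuristic. The sampling fraction, the normal-approximation interval and the 5% relative-error threshold are assumptions for illustration; they are not PowerDrill's actual heuristic or its sketch-based field encoding.

```python
import math
import random

def estimate_sum(values, sample_fraction=0.01, z=1.96, max_rel_error=0.05):
    """Estimate a column sum from a uniform sample and flag whether it is 'accurate enough'."""
    n = max(1, int(len(values) * sample_fraction))
    sample = random.sample(values, n)
    scale = len(values) / n
    estimate = scale * sum(sample)
    # Normal-approximation confidence half-width for the scaled sample sum.
    mean = sum(sample) / n
    var = sum((v - mean) ** 2 for v in sample) / max(1, n - 1)
    half_width = z * scale * math.sqrt(n * var)
    accurate = estimate != 0 and half_width / abs(estimate) <= max_rel_error
    return estimate, accurate

random.seed(1)
data = [random.expovariate(0.001) for _ in range(1_000_000)]  # synthetic "log records"
print(estimate_sum(data))
```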
Procedia PDF Downloads 259
190 A Multicriteria Analysis of Energy Poverty Index: A Case Study of Non-interconnected Zones in Colombia
Authors: Angelica Gonzalez O, Leonardo Rivera Cadavid, Diego Fernando Manotas
Abstract:
Energy poverty refers to a population that does not have access to modern energy services. In particular, an area of a country that is not connected to the national electricity grid is known as a Non-Interconnected Zone (NIZ). Access to electricity has a significant impact on the welfare and development opportunities of the population. Different studies have shown that most health problems have an empirical cause-and-effect relationship with multidimensional energy poverty. Likewise, research has been carried out to review the consequences of not having access to electricity, and its results have found a statistically significant relationship between energy poverty and sources of drinking water, access to clean water, risks of mosquito bites, obesity, sterilization, marital status, occupation, and residence. Therefore, extensive research has been conducted on the construction of an energy poverty measure based on an index. Some of these studies introduce the Multidimensional Energy Poverty Index (MEPI), the Composite Energy Poverty Index (CEPI), and the Low Income High Costs indicator (LIHC), among others. For this purpose, this study analyzes the energy poverty index using a multicriteria analysis, determining the set of feasible alternatives - for which Colombia's NIZ will be used as a case study - to be considered in the problem and the set of relevant criteria in the characterization of the NIZ, from which the prioritization is obtained to determine the level of adjustment of each alternative with respect to its performance in each criterion. Additionally, this study considers the installation of Micro-Grids (MG). This is considered a straightforward solution to this problem because an MG is a local electrical grid able to operate in grid-connected and island modes. Drawing on those insights, this study compares an energy poverty index considering an MG installation and calculates the impacts of different criteria on an energy poverty index in the NIZ.
Keywords: multicriteria, energy poverty, rural, microgrids, non-interconnected zones
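A toy sketch of a multicriteria prioritization of the kind described above: each alternative (here a hypothetical NIZ zone) receives a weighted score over normalized criteria and the alternatives are ranked. The criteria, weights and scores are invented for illustration and do not come from the study, which derives its criteria from the characterization of the NIZ.

```python
# Criteria scores per alternative (rows) on a 0-1 scale after normalization;
# both the criteria and the weights below are assumed for illustration.
alternatives = {
    "zone_A": [0.8, 0.4, 0.6],   # e.g. access gap, MG installation cost, health burden
    "zone_B": [0.5, 0.9, 0.3],
    "zone_C": [0.7, 0.6, 0.8],
}
weights = [0.5, 0.2, 0.3]

def weighted_score(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

ranking = sorted(alternatives, key=lambda a: weighted_score(alternatives[a], weights), reverse=True)
for zone in ranking:
    print(zone, round(weighted_score(alternatives[zone], weights), 3))
```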
Procedia PDF Downloads 117
189 TutorBot+: Automatic Programming Assistant with Positive Feedback based on LLMs
Authors: Claudia Martínez-Araneda, Mariella Gutiérrez, Pedro Gómez, Diego Maldonado, Alejandra Segura, Christian Vidal-Castro
Abstract:
The purpose of this document is to showcase the preliminary work in developing an EduChatbot-type tool and measuring the effects of its use aimed at providing effective feedback to students in programming courses. This bot, hereinafter referred to as tutorBot+, was constructed based on chatGPT and is tasked with assisting and delivering timely positive feedback to students in the field of computer science at the Universidad Católica de Concepción. The proposed working method consists of four stages: (1) Immersion in the domain of Large Language Models (LLMs), (2) Development of the tutorBot+ prototype and integration, (3) Experiment design, and (4) Intervention. The first stage involves a literature review on the use of artificial intelligence in education and the evaluation of intelligent tutors, as well as research on types of feedback for learning and the domain of chatGPT. The second stage encompasses the development of tutorBot+, and the final stage involves a quasi-experimental study with students from the Programming and Database labs, where the learning outcome involves the development of computational thinking skills, enabling the use and measurement of the tool's effects. The preliminary results of this work are promising, as a functional chatBot prototype has been developed in both conversational and non-conversational versions integrated into an open-source online judge and programming contest platform system. There is also an exploration of the possibility of generating a custom model based on a pre-trained one tailored to the domain of programming. This includes the integration of the created tool and the design of the experiment to measure its utility.
Keywords: assessment, chatGPT, learning strategies, LLMs, timely feedback
Procedia PDF Downloads 68
188 A Content Analysis of ‘Junk Food’ Content in Children’s TV Programs: A Comparison of UK Broadcast TV and Video-On-Demand Services
Authors: Alexander B. Barker, Megan Parkin, Shreesh Sinha, Emma Wilson, Rachael L. Murray
Abstract:
Objectives: Exposure to HFSS imagery is associated with consumption of foods high in fat, sugar, or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children’s television channels in the UK and the other of a selection of children’s programs available on video-on-demand (VOD) streaming sites. Design: Content analysis of three days’ worth of programs (including advertisements) on two popular children’s television channels broadcast on UK television (CBeebies and Milkshake), as well as a sample of the 40 highest-rated children’s programs available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Setting: United Kingdom. Participants: None. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; ‘Milkshake’ had a significantly higher proportion of programs/adverts which contained HFSS content than ‘CBeebies’. On the VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programs containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children’s television channels and children's programs on VOD services. Since previous research has shown that HFSS content in the media has an effect on HFSS consumption, children’s television programs broadcast either on TV or on VOD services are likely to have an effect on HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.
Keywords: public health, epidemiology, obesity, content analysis
Procedia PDF Downloads 187
187 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
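A minimal sketch of the final modeling step described above: training KNN and SVM classifiers on admission features and reporting test accuracy. The synthetic feature matrix, feature count and train/test split stand in for the SCM-validated features of the Benpoly database, which are not available here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for SCM-validated admission features (e.g. scores, programme demand).
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # admission status

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
for name, model in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                    ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf")))]:
    model.fit(X_train, y_train)
    print(name, round(model.score(X_test, y_test), 3))
```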
Procedia PDF Downloads 64
186 Modified Clusterwise Regression for Pavement Management
Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella
Abstract:
Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. The prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters by including both characteristics and a performance measure. This grouping is not always possible due to limited information. It is impractical to include all the potentially significant factors because some of them are potentially unobserved or difficult to measure. The historical performance of pavement segments could be used as a proxy to incorporate the effect of the missing potentially significant factors in the clustering process. The current state of the art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used. The International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. Results illustrate the advantage of using CLR. Previous studies have used CLR along with experimental data; this study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
Keywords: clusterwise regression, pavement management system, performance model, optimization
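The study formulates CLR as a mathematical program; the sketch below shows only the basic idea behind clusterwise regression, alternately fitting one regression per cluster and reassigning each segment to the cluster whose model fits it best, on synthetic data with two pavement-like groups. It is a heuristic illustration, not the paper's optimization formulation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def clusterwise_regression(X, y, k=2, n_iter=20, seed=0):
    """Heuristic CLR: alternate between fitting one regression per cluster and
    reassigning each segment to the cluster whose model predicts it best."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=len(y))
    models = [LinearRegression() for _ in range(k)]
    for _ in range(n_iter):
        for j in range(k):
            idx = labels == j
            if idx.sum() >= 2:
                models[j].fit(X[idx], y[idx])
        residuals = np.column_stack([np.abs(y - m.predict(X)) for m in models])
        labels = residuals.argmin(axis=1)
    return labels, models

# Synthetic pavement-like data: two groups whose IRI responds differently to age.
rng = np.random.default_rng(1)
age = rng.uniform(0, 20, size=200).reshape(-1, 1)
group = rng.integers(0, 2, size=200)
iri = np.where(group == 0, 1.0 + 0.05 * age[:, 0], 1.0 + 0.15 * age[:, 0]) + rng.normal(0, 0.05, 200)
labels, models = clusterwise_regression(age, iri, k=2)
print([round(m.coef_[0], 3) for m in models])  # slopes close to 0.05 and 0.15
```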
Procedia PDF Downloads 251
185 More Precise: Patient-Reported Outcomes after Stroke
Authors: Amber Elyse Corrigan, Alexander Smith, Anna Pennington, Ben Carter, Jonathan Hewitt
Abstract:
Background and Purpose: Morbidity secondary to stroke is highly heterogeneous, but it is important to both patients and clinicians in post-stroke management and adjustment to life after stroke. Post-stroke morbidity, considered clinically and from the patient perspective, has been poorly measured. Patient-reported outcome measures (PROs) in morbidity assessment help close this knowledge gap. The primary aim of this study was to consider the association between PRO outcomes and stroke predictors. Methods: A multicenter prospective cohort study assessed 549 stroke patients at 19 hospital sites across England and Wales during 2019. Following a stroke event, demographic, clinical, and PRO measures were collected. The prevalence of morbidity within PRO measures was calculated with associated 95% confidence intervals. Predictors of domain outcome were calculated using a multilevel generalized linear model. Associated P-values and 95% confidence intervals are reported. Results: Data were collected from 549 participants, 317 men (57.7%) and 232 women (42.3%), with ages ranging from 25 to 97 (mean 72.7). PRO morbidity was high post-stroke; 93.2% of the cohort reported post-stroke PRO morbidity. Previous stroke, diabetes, and gender were associated with worse patient-reported outcomes across both the physical and cognitive domains. Conclusions: This large-scale multicenter cohort study illustrates the high proportion of morbidity in PRO measures. Further, we demonstrate key predictors of adverse outcomes (diabetes, previous stroke, and gender), congruent with clinical predictors. The PRO has been demonstrated to be informative and useful in stroke when considering patient-reported outcomes and has wider implications for the consideration of PROs in clinical management. Future longitudinal follow-up with PROs is needed to consider associations with long-term morbidity.
Keywords: morbidity, patient-reported outcome, PRO, stroke
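A small sketch of the prevalence-with-confidence-interval calculation mentioned in the methods, applied to the reported 93.2% figure. The Wilson score interval is used here as one common choice; the study does not state which interval method it applied, so this is an assumption for illustration.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Reported prevalence of any post-stroke PRO morbidity: 93.2% of 549 participants.
cases = round(0.932 * 549)
print(cases, wilson_ci(cases, 549))
```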
Procedia PDF Downloads 131
184 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms
Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann
Abstract:
Within the Space@Sea project, funded by the Horizon 2020 program, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders. The connection is critical with respect to the safety of the whole system. Therefore, fault detection systems are investigated which could detect early warning signs of a possible failure in the connection elements. Previously, a model-based method using an Extended Kalman Filter was developed to detect the reduction of rope stiffness. This method detected several types of faults reliably, but some types of faults were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise. When the wave height is low, a long time is needed to detect a fault, and the accuracy is not always satisfactory. In this sense, it is necessary to develop a more accurate and robust technique that can detect all rope faults under a wide range of operational conditions. Inspired by this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation by using the acceleration data from each platform but also estimate the contributions of the specific acceleration sensors using methods from explainable AI. In order to adapt to different operational conditions, the domain adaptation technique DANN is applied. The proposed model can accurately estimate rope degradation under a wide range of environmental conditions and help users understand the relationship between the output and the contributions of each acceleration sensor.
Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI
Procedia PDF Downloads 180
183 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing
Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill
Abstract:
In order to survive on the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company’s internal research and development department. However, a new approach has been taking hold for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept of using social media posts as an external source to support the open innovation approach in its initial phase, the ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and the authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use findings from Natural Language Processing, e.g., Named Entity Recognition, specific dictionaries, triple taggers and part-of-speech taggers. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain an improved insight into external sources such as customer needs.
Keywords: idea ontology, innovation management, semantic search, open information extraction
Procedia PDF Downloads 188
182 Visualization Tool for EEG Signal Segmentation
Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh
Abstract:
This work is about developing a tool for the visualization and segmentation of electroencephalograph (EEG) signals based on frequency domain features. Changes in the frequency domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent the changes in mental states using the different frequency band powers in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy and cognition studies have been suggested in the literature and used for data classification. However, the proposed method focuses mainly on better presentation of the signal, which is why it could be a good utilization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency domain features are used for segmentation, considering the fact that the spectral power of different frequency bands describes the mental state of the subject. Two sliding windows are further used for segmentation; one provides the time scale and the other assigns the segmentation rule. The segmented data is displayed second by second, successively, with different color codes. The segment length can be selected as per the need of the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of desired length, representing the power spectrum variation in the data. The algorithm is designed in such a way that it takes the data points with respect to the sampling frequency for each time frame, and so it can be improved for use in real-time visualization with a desired epoch length.
Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation
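A minimal sketch of the band-power segmentation idea on a single synthetic channel: band-pass filter, split into one-second epochs, compute relative band powers with Welch's method and label each epoch by its dominant band (the basis of the colour coding). The sampling rate, band definitions, filter order and alpha-dominated synthetic signal are assumptions for illustration; the tool's notch filtering, PCA and wavelet de-noising stages are not reproduced.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch

fs = 256.0                                    # assumed sampling rate in Hz
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def preprocess(x, fs, low=0.5, high=45.0):
    """Band-pass filtering approximating the tool's 0.1-45 Hz basic filtering stage."""
    sos = butter(4, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def band_powers(epoch, fs):
    """Relative power per frequency band for one epoch, via Welch's method."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), 256))
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / psd.sum()
            for name, (lo, hi) in bands.items()}

# Segment a synthetic single-channel recording into 1-second epochs and label each
# epoch by its dominant band (the colour code used in the visualization).
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)   # alpha-dominated placeholder
eeg = preprocess(eeg, fs)
epoch_len = int(fs)
for i in range(0, len(eeg) - epoch_len + 1, epoch_len):
    powers = band_powers(eeg[i:i + epoch_len], fs)
    print(i // epoch_len, max(powers, key=powers.get))
```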
Procedia PDF Downloads 397