Search results for: AnyLogic platform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1990

70 Surveillance of Artemisinin Resistance Markers and Their Impact on Treatment Outcomes in Malaria Patients in an Endemic Area of South-Western Nigeria

Authors: Abiodun Amusan, Olugbenga Akinola, Kazeem Akano, María Hernández-Castañeda, Jenna Dick, Akintunde Sowunmi, Geoffrey Hart, Grace Gbotosho

Abstract:

Introduction: Artemisinin-based combination therapies (ACTs) are the cornerstone malaria treatment option in most malaria-endemic countries. Unfortunately, the malaria control effort is constantly threatened by resistance of Plasmodium falciparum to ACTs. The recent evidence of artemisinin resistance in East Africa and the possibility of its spreading to other African regions portends an imminent health catastrophe. This study aimed to evaluate the occurrence, prevalence, and influence of artemisinin-resistance markers on treatment outcomes in Ibadan before and after the adoption of ACTs in Nigeria in 2005. Method: The study involved day-zero dried blood spots (DBS) obtained from malaria patients during retrospective (2000-2005) and prospective (2021) studies. A cohort in the prospective study received oral dihydroartemisinin-piperaquine and underwent a 42-day follow-up to observe treatment outcomes. Genomic DNA was extracted from the DBS samples using a QIAamp blood extraction kit. Fragments of the P. falciparum kelch13 (Pfkelch13), P. falciparum coronin (Pfcoronin), P. falciparum multidrug resistance 2 (PfMDR2), and P. falciparum chloroquine resistance transporter (PfCRT) genes were amplified and sequenced on a Sanger sequencing platform to identify artemisinin resistance-associated mutations. Mutations were identified by aligning sequenced data with reference sequences obtained from the National Center for Biotechnology Information. Data were analyzed using descriptive statistics and Student's t-tests. Results: Mean parasite clearance time (PCT) and fever clearance time (FCT) were 2.1 ± 0.6 days (95% CI: 1.97-2.24) and 1.3 ± 0.7 days (95% CI: 1.1-1.6), respectively. Four mutations, K189T [34/53 (64.2%)], R255K [2/53 (3.8%)], K189N [1/53 (1.9%)] and N217H [1/53 (1.9%)], were identified within the N-terminal (coiled-coil-containing) domain of Pfkelch13.
No artemisinin resistance-associated mutation usually found within the β-propeller domain of the Pfkelch13 gene was found in the analyzed samples. However, the K189T and R255K mutations showed a significant correlation with longer parasite clearance time in the patients (P<0.002). The observed Pfkelch13 gene changes did not influence the baseline mean parasitemia (P = 0.44). P76S [17/100 (17%)] and V62M [1/100 (1%)] changes were identified in the Pfcoronin gene fragment without any influence on the parasitological parameters. No change was observed in the PfMDR2 gene, while no artemisinin resistance-associated mutation was found in the PfCRT gene. Furthermore, one sample each in the retrospective study contained the Pfkelch13 K189T and Pfcoronin P76S mutations. Conclusion: The study revealed an absence of genetic-based evidence of artemisinin resistance in the study population at the time of study. The high frequency of the K189T Pfkelch13 mutation and its correlation with increased parasite clearance time may indicate geographical variation of resistance mediators and imminent artemisinin resistance, respectively. The study also revealed an inherent potential of parasites to harbour drug-resistant genotypes before the introduction of ACTs in Nigeria.
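The prevalence figures quoted above (e.g., K189T in 34/53 samples) follow from simple proportions. As an illustrative sketch (the function name and the normal-approximation confidence interval are my own, not the method stated in the abstract), they can be reproduced as:

```python
import math

def prevalence_with_ci(carriers, n, z=1.96):
    """Point prevalence and a normal-approximation 95% CI for carriers out of n samples."""
    p = carriers / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# K189T was observed in 34 of 53 prospective samples.
p, lo, hi = prevalence_with_ci(34, 53)
summary = f"K189T prevalence: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})"
```

For small denominators such as 1/53, an exact (Clopper-Pearson) or Wilson interval would be more appropriate than the normal approximation used here.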

Keywords: artemisinin resistance, Plasmodium falciparum, Pfkelch13 mutations, Pfcoronin

Procedia PDF Downloads 49
69 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector

Authors: Sanaz Moayer, Fang Huang, Scott Gardner

Abstract:

In the highly leveraged business world of today, an organisation's success depends on how it can manage and organise its traditional and intangible assets. In the knowledge-based economy, knowledge as a valuable asset gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of knowledge management (KM) is recognized in the strategy literature, and increasingly by senior decision-makers (particularly in large firms which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual property assets. Despite the growing interest in KM within professional services, there has been limited discussion in relation to multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimization with economies of scale, process efficiencies and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital-intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project continuity risk.
It may also limit the potential for process, product, and service innovation. In this paper, the strategic application of knowledge management incorporating contemporary ICT platforms and data mining practices is explored as an important enabler for knowledge discovery, reduction of risk, and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management, and information systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating within the Australian resource sector. We also propose, based on a review of the relevant literature, that more effective management of soft and hard systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks.

Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management

Procedia PDF Downloads 415
68 Global Winners versus Local Losers: Globalization, Identity and Tradition in Spanish Club Football

Authors: Jim O'Brien

Abstract:

Contemporary global representation and consumption of La Liga across a plethora of media platforms has had significant implications for the historical, political and cultural forces which shaped the development of Spanish club football. It has established and reinforced a hierarchy of a small number of teams belonging, or aspiring to belong, to a cluster of global elite clubs seeking to imitate the blueprint of the English Premier League in respect of corporate branding and marketing, in order to secure a global fan base through success and exposure in La Liga itself and through the Champions League. The synthesis between globalization, global sport and the status of high-profile clubs has created radical change within the folkloric iconography of Spanish football. The main focus of this paper is to critically evaluate the consequences of globalization on the rich tapestry at the core of the game's distinctive history in Spain. The seminal debate underpinning the study considers whether the divergent aspects of globalization have acted as a malevolent force, eroding tradition, causing financial meltdown and reducing much of the fabric of club football to the status of bystanders, or have promoted a renaissance of these traditions, securing their legacies through new fans and audiences. The study draws on extensive sources on the history, politics and culture of Spanish football, in both English and Spanish. It also uses primary and archive material derived from interviews and fieldwork undertaken with scholars, media professionals and club representatives in Spain. The paper has four main themes. Firstly, it contextualizes the key historical, political and cultural forces which shaped the landscape of Spanish football from the late nineteenth century. The seminal notions of region, locality and cultural divergence are pivotal to this discourse.
The study then considers the relationship between football, ethnicity and identity as a barometer of continuity and change, suggesting that tradition is being reinvented and re-framed to reflect the shifting demographic and societal patterns within the Spanish state. Following on from this, consideration is given to the paradoxical function of 'El Clasico' and the dominant duopoly of the FC Barcelona – Real Madrid axis in both eroding tradition in the global nexus of football's commodification and in protecting historic political rivalries. To most global consumers of La Liga, the mega-spectacle and hyperbole of 'El Clasico' is the essence of Spanish football, with cultural misrepresentation and distortion catapulting the event to the global media audience. Finally, the paper examines La Liga as a sporting phenomenon in which elite clubs, cult managers and galacticos serve as commodities on the altar of mass consumption in football's global entertainment matrix. These processes accentuate a homogenous mosaic of cultural conformity which obscures local, regional and national identities and paradoxically fuses the global with the local to maintain the distinctive hue of La Liga, as witnessed by the extraordinary successes of Atlético Madrid and SD Eibar in recent seasons.

Keywords: Spanish football, globalization, cultural identity, tradition, folklore

Procedia PDF Downloads 301
67 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning

Authors: Xingyu Gao, Qiang Wu

Abstract:

Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, and using early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects include the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (Shapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance.
Specifically, patent novelty has the greatest impact on predicting the technical impact of patents and has a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicting technical impact. In predicting the social impact of patents, the number of applicants is considered the most critical input variable, but it has a negative impact on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, and they have a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is considered the most important factor and has a positive impact on economic impact. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study primarily relies on data from the United States Patent and Trademark Office for artificial intelligence patents. Future research could consider more comprehensive data sources, including artificial intelligence patent data from a global perspective. While the study takes into account various factors, there may still be other important features not considered. In the future, factors such as patent implementation and market applications may be considered, as they could have an impact on the influence of patents.
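The SHAP metric used above assigns each feature a Shapley value: its weighted average marginal contribution over all coalitions of features. As an illustration only (the toy model and feature names below are hypothetical, not the paper's LightGBM model), the exact Shapley decomposition of a small scoring function can be computed by brute-force enumeration:

```python
from itertools import combinations
from math import factorial

# Hypothetical "patent impact" model over three illustrative features.
def model(novelty, claims, backward_cites):
    return 2.0 * novelty + 1.0 * claims + 0.5 * novelty * backward_cites

BASELINE = {"novelty": 0.0, "claims": 0.0, "backward_cites": 0.0}
INSTANCE = {"novelty": 1.0, "claims": 4.0, "backward_cites": 2.0}

def value(coalition):
    """Model output with features outside the coalition held at their baseline."""
    x = {f: (INSTANCE[f] if f in coalition else BASELINE[f]) for f in BASELINE}
    return model(**x)

def shapley(feature):
    """Exact Shapley value: weighted marginal contribution over all coalitions."""
    others = [f for f in BASELINE if f != feature]
    n = len(BASELINE)
    total = 0.0
    for r in range(len(others) + 1):
        for subset in combinations(others, r):
            weight = factorial(len(subset)) * factorial(n - len(subset) - 1) / factorial(n)
            total += weight * (value(set(subset) | {feature}) - value(set(subset)))
    return total

phi = {f: shapley(f) for f in BASELINE}
# Additivity: the Shapley values sum to the prediction minus the baseline output.
assert abs(sum(phi.values()) - (value(set(BASELINE)) - value(set()))) < 1e-9
```

In practice, the shap library approximates these values efficiently for tree ensembles such as LightGBM; the additivity check at the end mirrors the property that SHAP explanations rely on.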

Keywords: patent influence, interpretable machine learning, predictive models, SHAP

Procedia PDF Downloads 50
66 Interactive Virtual Patient Simulation Enhances Pharmacology Education and Clinical Practice

Authors: Lyndsee Baumann-Birkbeck, Sohil A. Khan, Shailendra Anoopkumar-Dukie, Gary D. Grant

Abstract:

Technology-enhanced education tools are being rapidly integrated into health programs globally. These tools provide an interactive platform for students and can be used to deliver topics in various modes, including games and simulations. Simulations are of particular interest to healthcare education, where they are employed to enhance clinical knowledge and help to bridge the gap between theory and practice. Simulations often assess competencies for practical tasks, yet limited research examines the effects of simulation on student perceptions of their learning. The aim of this study was to determine the effects of an interactive virtual patient simulation for pharmacology education and clinical practice on student knowledge, skills and confidence. Ethics approval for the study was obtained from the Griffith University Research Ethics Committee (PHM/11/14/HREC). The simulation was intended to replicate the pharmacy environment and patient interaction. The content was designed to enhance knowledge of proton-pump inhibitor pharmacology, its role in therapeutics and its safe supply to patients. The tool was deployed into a third-year clinical pharmacology and therapeutics course. A number of core practice areas were examined, including the competency domains of questioning, counselling, referral and product provision. Baseline measures of student self-reported knowledge, skills and confidence were taken prior to the simulation using a specifically designed questionnaire. A more extensive questionnaire was deployed following the virtual patient simulation, which also included measures of student engagement with the activity. A quiz assessing student factual and conceptual knowledge of proton-pump inhibitor pharmacology and related counselling information was also included in both questionnaires. Sixty-one students (response rate >95%) from two cohorts (2014 and 2015) participated in the study. Chi-square analyses were performed, and data were analysed using Fisher's exact test.
Results demonstrate that student knowledge, skills and confidence within the competency domains of questioning, counselling, referral and product provision improved following the implementation of the virtual patient simulation. Statistically significant (p<0.05) improvement occurred in ten of the possible twelve self-reported measurement areas. The greatest magnitude of improvement occurred in the area of counselling (student confidence, p<0.0001). Student confidence in all domains (questioning, counselling, referral and product provision) showed a marked increase. Student performance in the quiz also improved, demonstrating a 10% overall improvement in pharmacology knowledge and clinical practice following the simulation. Overall, 85% of students reported the simulation to be engaging and 93% of students felt the virtual patient simulation enhanced learning. The data suggest that the interactive virtual patient simulation developed for clinical pharmacology and therapeutics education enhanced students' knowledge, skills and confidence with respect to the competency domains of questioning, counselling, referral and product provision. These self-reported measures appear to translate to learning outcomes, as demonstrated by the improved student performance in the quiz assessment item. Future research on education using virtual simulation should seek to incorporate modern quantitative measures of student learning and engagement, such as eye tracking.
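The abstract reports chi-square analyses and Fisher's exact test. A minimal, self-contained sketch of a two-sided Fisher's exact test on a 2×2 table (the counts below are hypothetical, not the study's data) might look like:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p_table(x):
        # Hypergeometric probability of x in the top-left cell, margins fixed.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo_x = max(0, row1 - (n - col1))
    hi_x = min(row1, col1)
    # Sum probabilities of all tables as extreme as (no more likely than) the observed.
    return sum(p_table(x) for x in range(lo_x, hi_x + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical pre/post counts: correct vs. incorrect quiz answers.
p = fisher_exact_two_sided(45, 16, 55, 6)
```

For real analyses, scipy.stats.fisher_exact provides the same test with better numerical handling of large tables.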

Keywords: clinical simulation, education, pharmacology, simulation, virtual learning

Procedia PDF Downloads 338
65 Role of Functional Divergence in Specific Inhibitor Design: Using γ-Glutamyltranspeptidase (GGT) as a Model Protein

Authors: Ved Vrat Verma, Rani Gupta, Manisha Goel

Abstract:

γ-glutamyltranspeptidase (GGT; EC 2.3.2.2) is an N-terminal nucleophile hydrolase conserved in all three domains of life. GGT plays a key role in glutathione metabolism, where it catalyzes the breakage of γ-glutamyl bonds and the transfer of the γ-glutamyl group to water (hydrolytic activity) or to amino acids or short peptides (transpeptidase activity). GGTs from bacteria, archaea, and eukaryotes (human, rat and mouse) are homologous proteins sharing >50% sequence similarity and a conserved four-layered αββα sandwich-like three-dimensional structural fold. These proteins, though similar in structure to each other, are quite diverse in their enzyme activity: some GGTs are better at hydrolysis reactions but poor in transpeptidase activity, whereas many others show the opposite behaviour. GGT is known to be involved in various diseases such as asthma, Parkinson's disease, arthritis, and gastric cancer. Its inhibition prior to chemotherapy treatments has been shown to sensitize tumours to the treatment. Microbial GGT is also known to be a virulence factor, important for the colonization of bacteria in the host. However, all known inhibitors (mimics of its native substrate, glutamate) are highly toxic because they interfere with other enzyme pathways, although a few successful efforts to design species-specific inhibitors have been reported previously. We aim to leverage the diversity seen in the GGT family (pathogen vs. eukaryote) for designing specific inhibitors. Thus, in the present study, we have used the DIVERGE software to identify sites in GGT proteins which are crucial for the functional and structural divergence of these proteins. Since type II divergence sites vary in a clade-specific manner, they were our focus of interest throughout the study. Type II divergent sites were identified for the pathogen vs. eukaryote clusters, and the sites were marked on the clade-specific representative structures HpGGT (2QM6) and HmGGT (4ZCG) of the pathogen and eukaryote clades, respectively.
The crucial divergent sites within a 15 Å radius of the binding cavity were highlighted, and in silico mutations were performed on these sites to delineate their role in the mechanism of catalysis and protein folding. Further, amino acid network (AAN) analysis was performed with Cytoscape to delineate assortative mixing for the cavity divergent sites, which could strengthen our hypothesis. Additionally, molecular dynamics simulations were performed for wild-type and mutant complexes close to physiological conditions (pH 7.0, 0.1 M ionic strength and 1 atm pressure), and the role of the putative divergence sites and the structural integrity of the homologous proteins were analysed. The dynamics data were scrutinized in terms of RMSD, RMSF, non-native H-bonds and salt bridges. The RMSD and RMSF fluctuations of the protein complexes were compared, and the changes at the protein-ligand binding sites were highlighted. The outcomes of our study highlight some crucial divergent sites which could be used for designing novel inhibitors in a species-specific manner. Since it is challenging to design a novel drug by targeting a protein that has a close homologue in eukaryotes, this study could set up an initial platform to overcome this challenge and help to deduce more effective targets for novel drug discovery.
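The RMSD values used above to compare wild-type and mutant trajectories reduce to a simple formula. A minimal sketch follows (the coordinates are hypothetical, and production MD tools first apply an optimal Kabsch superposition before computing this value):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length coordinate sets.

    Assumes the structures are already superposed; real analysis pipelines
    align them first (e.g. a Kabsch least-squares fit).
    """
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Two hypothetical three-atom fragments (coordinates in angstroms).
reference = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
mutant = [(0.0, 0.0, 0.0), (1.5, 0.5, 0.0), (3.0, 1.0, 0.0)]
deviation = rmsd(reference, mutant)
```

RMSF follows the same idea per atom, averaged over trajectory frames rather than over atoms.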

Keywords: γ-glutamyltranspeptidase, divergence, species-specific, drug design

Procedia PDF Downloads 268
64 Comparisons of Drop Jump and Countermovement Jump Performance for Male Basketball Players with and without Low-Dye Taping Application

Authors: Chung Yan Natalia Yeung, Man Kit Indy Ho, Kin Yu Stan Chan, Ho Pui Kipper Lam, Man Wah Genie Tong, Tze Chung Jim Luk

Abstract:

Excessive foot pronation is a well-known risk factor for knee and foot injuries such as patellofemoral pain, patellar and Achilles tendinopathy, and plantar fasciitis. Low-Dye taping (LDT) is commonly applied by basketball players to control excessive foot pronation for pain control and injury prevention. The primary potential benefits of using LDT include providing additional support to the medial longitudinal arch and restricting excessive midfoot and subtalar motion in weight-bearing activities such as running and landing. Meanwhile, the restrictions provided by the rigid tape may also limit functional joint movements and sports performance. Coaches and athletes need to weigh the potential benefits and harmful effects before deciding whether applying the LDT technique is worthwhile. However, the influence of LDT on basketball-related performance, such as explosive and reactive strength, is not well understood. Therefore, the purpose of this study was to investigate the change in drop jump (DJ) and countermovement jump (CMJ) performance before and after LDT application in collegiate male basketball players. In this within-subject crossover study, 12 healthy male basketball players (age: 21.7 ± 2.5 years) with at least three years of regular basketball training experience were recruited. The navicular drop (ND) test was adopted for screening, and only those with excessive pronation (ND ≥ 10 mm) were included. Participants with a recent lower limb injury history were excluded. Recruited subjects were required to perform the ND, DJ (from a 40 cm platform) and CMJ (without arm swing) tests in series under taped and non-taped conditions in counterbalanced order. The reactive strength index (RSI) was calculated as the flight time divided by the measured ground contact time. For the DJ and CMJ tests, the best of three trials was used for analysis.
The difference between the taped and non-taped conditions for each test was further evaluated using standardized effects ± 90% confidence intervals (CI) with clinical magnitude-based inference (MBI). A paired-samples t-test showed a significant decrease in ND (-4.68 ± 1.44 mm; 95% CI: -5.60, -3.77; p < 0.05), while MBI demonstrated a most likely beneficial and large effect (standardized effect: -1.59 ± 0.27) in the LDT condition. For the DJ test, significant increases in both flight time (25.25 ± 29.96 ms; 95% CI: 6.22, 44.28; p < 0.05) and RSI (0.22 ± 0.22; 95% CI: 0.08, 0.36; p < 0.05) were observed. In the taped condition, MBI showed a very likely beneficial and moderate effect (standardized effect: 0.77 ± 0.49) on flight time, a possibly beneficial and small effect (standardized effect: -0.26 ± 0.29) on ground contact time and a very likely beneficial and moderate effect (standardized effect: 0.77 ± 0.42) on RSI. No significant difference in CMJ was observed (95% CI: -2.73, 2.08; p > 0.05). For basketball players with pes planus, applying LDT could substantially support the foot by elevating the navicular height and potentially provide acute beneficial effects on reactive strength performance. Meanwhile, no significant harmful effect on CMJ was observed. Basketball players may consider applying LDT before a game or training session to enhance reactive strength performance. However, since the observed effects in this study cannot be generalized to players without excessive foot pronation, further studies on players with a normal foot arch or navicular height are recommended.
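The reactive strength index and paired standardized effect described above are straightforward ratios. A small sketch (the timing values below are hypothetical, not the study's measurements) is:

```python
def reactive_strength_index(flight_ms, contact_ms):
    """RSI = flight time divided by ground contact time (both in milliseconds)."""
    return flight_ms / contact_ms

def paired_standardized_effect(mean_diff, sd_diff):
    """Standardized effect for a paired comparison: mean difference / SD of differences.

    Note: MBI standardization in the study may use a different denominator
    (e.g. the between-subject SD), so this is only a generic sketch.
    """
    return mean_diff / sd_diff

# Hypothetical drop-jump timings before and after taping.
rsi_untaped = reactive_strength_index(520.0, 260.0)
rsi_taped = reactive_strength_index(545.0, 250.0)
effect = paired_standardized_effect(25.0, 30.0)
```

A higher RSI thus follows from either a longer flight time, a shorter contact time, or both, which matches the direction of the DJ results reported above.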

Keywords: flight time, pes planus, pronated foot, reactive strength index

Procedia PDF Downloads 155
63 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual efforts, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual efforts. Implementing scalable CI/CD for development using cloud services like ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need.
Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, Serverless Computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions, performance issues, and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by only paying for the resources as they are used and increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
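The claim that running 500 test cases in parallel reduces execution time can be made concrete with a simple scheduling estimate. This sketch (test durations and worker count are assumptions for illustration, not figures from the paper) uses a greedy longest-first assignment of tests to parallel workers:

```python
import heapq

def parallel_makespan(durations, workers):
    """Estimated wall time with N parallel workers using a greedy
    longest-processing-time-first schedule: each test goes to the
    currently least-loaded worker."""
    loads = [0.0] * workers
    heapq.heapify(loads)
    for d in sorted(durations, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + d)
    return max(loads)

# 500 hypothetical test cases of 2-10 seconds each.
tests = [2 + (i % 9) for i in range(500)]
serial = sum(tests)                              # total serial execution time
with_50_workers = parallel_makespan(tests, 50)   # estimated parallel wall time
```

The parallel wall time approaches serial time divided by the worker count, which is the efficiency gain the abstract attributes to container-based parallel test execution.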

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 43
62 Management of the Experts in the Research Evaluation System of the University: Based on National Research University Higher School of Economics Example

Authors: Alena Nesterenko, Svetlana Petrikova

Abstract:

Research evaluation is one of the most important elements of the self-regulation and development of researchers, as it is an impartial and independent process of assessment. The method of expert evaluations, as a scientific instrument for solving complicated non-formalized problems, offers, firstly, a scientifically sound way to conduct an assessment that ensures maximum effectiveness of work at every step and, secondly, the use of quantitative methods for evaluation, assessment of expert opinion and collective processing of the results. These two features distinguish the method of expert evaluations from the long-known expertise widespread in many areas of knowledge. Different typical problems require different types of expert evaluation methods. Several issues which arise with these methods are the selection of experts, management of the assessment procedure, processing of the results and remuneration for the experts. To address these issues, an online system was created with the primary purpose of developing a versatile application for many workgroups with matching approaches to scientific work management. The online documentation assessment and statistics system allows:
- To realize within one platform the independent activities of different workgroups (e.g. expert officers, managers).
- To establish different workspaces for corresponding workgroups, where custom user databases can be created according to particular needs.
- To form the required output documents for each workgroup.
- To configure information gathering for each workgroup (forms of assessment, tests, inventories).
- To create and operate personal databases of remote users.
- To set up automatic notification through e-mail.
The next stage is the development of quantitative and qualitative criteria to form a database of experts.
The inventory was designed so that experts submit not only their personal data, place of work and scientific degree, but also keywords reflecting their expertise, academic interests, ORCID, Researcher ID, SPIN-code RSCI, Scopus AuthorID, knowledge of languages, and primary scientific publications. For each project, competition assessments are processed in accordance with the ordering party's demands in the form of appraised inventories, commentaries (50-250 characters) and an overall review (1500 characters) in which the expert states the absence of a conflict of interest. Evaluation is conducted as follows: as applications are added to the database, an expert officer selects experts, generally two persons per application. Experts are selected according to the keywords; this method proved effective, unlike the OECD classifier. At the last stage, the choice of the experts is approved by the supervisor, and e-mails are sent to the experts with an invitation to assess the project. An expert supervisor checks that the experts' reports meet all formalities (time-frame, propriety, correspondence). If the difference in assessments exceeds four points, a third evaluation is appointed. When the expert finishes work on his expert opinion, the system shows the contract marked 'new', managers process the contract, and the expert receives an e-mail that the contract is formed and ready to be signed. Once all formalities are concluded, the expert receives remuneration for his work. The specifics of the interaction of the examination officer with other experts will be presented in the report.
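The re-evaluation rule described above (a third expert is appointed when two assessments differ by more than four points) translates directly into code. This sketch is an illustration of that rule only; the aggregation function is a hypothetical addition, not the system's documented behaviour:

```python
def needs_third_evaluation(score_a, score_b, threshold=4):
    """Per the workflow above: a third expert is appointed when the two
    assessments differ by more than four points."""
    return abs(score_a - score_b) > threshold

def final_score(scores):
    """Hypothetical aggregation of all submitted expert scores (simple mean)."""
    return sum(scores) / len(scores)

# Difference of 6 points -> appoint a third expert.
assert needs_third_evaluation(3, 9)
# Difference of 3 points -> the two assessments stand.
assert not needs_third_evaluation(6, 9)
```

Keeping the threshold as a parameter makes the four-point rule a configurable policy rather than a hard-coded constant.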

Keywords: expertise, management of research evaluation, method of expert evaluations, research evaluation

Procedia PDF Downloads 207
61 Simultech - Innovative Country-Wide Ultrasound Training Center

Authors: Yael Rieder, Yael Gilboa, S. O. Adva, Efrat Halevi, Ronnie Tepper

Abstract:

Background: Operation of ultrasound equipment is a core skill for many clinical specialties. As part of the training program at Simultech, a simulation center for Ob/Gyn at the Meir Medical Center, Israel, teaching how to operate ultrasound equipment requires dealing with misunderstandings of spatial and 3D orientation, failure of the operator to hold the transducer correctly, and a limited ability to evaluate the data on the screen. We have developed a platform intended to endow physicians and sonographers with the clinical and operational skills of obstetric ultrasound. Simultech's simulations focus on medical knowledge, risk management, technology operations, and physician-patient communication, and they encompass extreme work conditions. Setup: Between eight and ten of the eight hundred and fifty physicians and sonographers of the Clalit Health Services, from seven hospitals and eight community centers across Israel, participate in individual Ob/Gyn training sessions each week. These include Ob/Gyn specialists, experts, interns, and sonographers. Innovative teaching and training methodologies: The six-hour training program includes: (1) An educational computer program that challenges trainees to deal with medical questions based upon ultrasound pictures and films. (2) Sophisticated hands-on simulators that challenge trainees to practice a correct grip of the transducer, elucidate pathology, and practice daily tasks such as biometric measurements and the analysis of sonographic data. (3) Participation in a video-taped simulation focused on physician-patient communication. In the simulation, the physician is required to diagnose the clinical condition of a hired actress based on the data she provides and by evaluating the assigned ultrasound films accordingly. Giving ‘bad news’ to the patient may put the physician in a stressful situation that must be properly managed.
(4) Feedback at the end of each phase is provided by a designated trainer, not a physician, who is specially qualified by senior Ob/Gyn specialists. (5) A group exercise in which the trainer presents a medico-legal case to encourage the participants to use their own experience and knowledge in a productive ‘brainstorming’ session. Medical cases are presented and analyzed by the participants together with the trainer's feedback. Findings: (1) The training methods and content that Simultech provides allow trainees to review their medical and communication skills. (2) Simultech training sessions expose physicians to both basic and new, up-to-date cases, refreshing and expanding the trainee's knowledge. (3) Practicing on advanced simulators enables trainees to understand the sonographic space and to implement the basic principles of ultrasound. (4) Communication simulations were found to be beneficial for trainees who were unaware of their interpersonal skills. The trainer's feedback, supported by the recorded simulation, allows trainees to draw conclusions about their performance. Conclusion: Simultech was found to contribute to physicians at all levels of clinical expertise who deal with ultrasound. A break in the daily routine, together with attendance at a neutral educational center, can vastly improve performance and outlook.

Keywords: medical training, simulations, ultrasound, Simultech

Procedia PDF Downloads 280
60 From Avatars to Humans: A Hybrid World Theory and Human Computer Interaction Experimentations with Virtual Reality Technologies

Authors: Juan Pablo Bertuzzi, Mauro Chiarella

Abstract:

Employing a communication-studies perspective and a socio-technological approach, this paper introduces a theoretical framework for understanding the concept of the hybrid world, the avatarization phenomenon, and the communicational archetype of co-hybridization. This analysis intends to contribute to the future design of experimental virtual reality applications. Ultimately, this paper presents an ongoing research project that proposes the study of human-avatar interactions in digital educational environments, as well as an innovative reflection on inner digital communication. The project analyzes human-avatar interactions through the development of an interactive experience in virtual reality. The goal is to generate an innovative communicational dimension that could reinforce the hypotheses presented throughout this paper. Designed for initial application in educational environments, the analysis and results of this research depend on the meticulous planning of: the conception of a 3D digital platform; the interactive game objects; the AI, or computer, avatars; the human representation as hybrid avatars; and, lastly, the potential for immersion, ergonomics, and control diversity offered by the chosen virtual reality system and game engine. The project is divided into two main axes. The first is structural, as it is mandatory for the construction of an original prototype. The 3D model is inspired by the physical space of an academic institution. The prototype will incorporate smart objects, avatars, game mechanics, game objects, and a dialogue system, all with the objective of gamifying the educational environment. To generate continuous participation and a large number of interactions, the digital world will be navigable both on a conventional device and in a virtual reality system.
This decision was made, practically, to facilitate communication between students and teachers and, strategically, because it will help populate the digital environment faster. The second axis concentrates on content production and further data analysis. The challenge is to offer a diversity of scenarios that compels users to interact and to question their digital embodiment. The multipath narrative content being applied is focused on the subjects covered in this paper. Furthermore, the experience with virtual reality devices asks users to experiment in a mixture of a seemingly infinite digital world and a small physical area of movement. This combination will guide the narrative content and will be crucial in restricting users' interactions. The main point is to stimulate, and to grow in users, the need for their hybrid avatar's help. By building an inner communication between the user's physicality and the user's digital extension, the interactions will serve as a self-guide through the gameworld. This is the first attempt to make the avatarization phenomenon explicit and to further analyze the communicational archetype of co-hybridization. The challenge of the upcoming years will be to take advantage of these forms of generalized avatarization in order to create awareness and establish innovative forms of hybridization.

Keywords: avatar, hybrid worlds, socio-technology, virtual reality

Procedia PDF Downloads 142
59 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets

Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe

Abstract:

Data are the primary asset of biomedical researchers and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools for analyzing such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers and GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for the research subjects residing in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high-performance computing resources and analysis techniques currently available or under development. It builds these into The Ark, an open-source web-based system designed to manage medical data.
SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e. importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.
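The document-per-subject idea behind SPARK's NoSQL approach can be illustrated with a minimal sketch. Plain Python dictionaries stand in for a document store here; all collection and field names are invented and are not SPARK's actual schema:

```python
# Hedged sketch (not SPARK's code): storing a subject's GWAS genotypes as a
# single document, as a NoSQL document store would, instead of one normalized
# relational row per (subject, SNP) pair.
subjects = [
    {"subject_id": "S001",
     "genotypes": {"rs123": "AA", "rs456": "AG", "rs789": "GG"}},
    {"subject_id": "S002",
     "genotypes": {"rs123": "AG", "rs456": "GG", "rs789": "GG"}},
]

def genotype_of(subject_id, snp):
    """One document lookup replaces a relational join over many rows."""
    for doc in subjects:
        if doc["subject_id"] == subject_id:
            return doc["genotypes"].get(snp)
    return None

def allele_counts(snp):
    """Aggregate a SNP across subjects, as a map-style query would."""
    counts = {}
    for doc in subjects:
        g = doc["genotypes"].get(snp)
        counts[g] = counts.get(g, 0) + 1
    return counts

print(genotype_of("S001", "rs456"))   # AG
print(allele_counts("rs789"))         # {'GG': 2}
```

Importing a GWAS dataset then becomes one document insert per subject rather than millions of row inserts, which is the execution-time problem the abstract describes.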

Keywords: biomedical research, genomics, information systems, software

Procedia PDF Downloads 270
58 Fostering Non-Traditional Student Success in an Online Music Appreciation Course

Authors: Linda Fellag, Arlene Caney

Abstract:

E-learning has earned an essential place in academia because it promotes learner autonomy, student engagement, and technological aptitude, and allows for flexible learning. However, despite these advantages, educators have been slower to embrace e-learning for ESL and other non-traditional students for fear that such students will not succeed without the direct faculty contact and academic support of face-to-face classrooms. This study aims to determine whether an online course designed for non-traditional students can produce student retention and performance rates that compare favorably with those of students in standard online sections of the same course aimed at traditional college-level students. One Music faculty member is currently collaborating with an English instructor to redesign an online college-level Music Appreciation course for non-traditional college students. At Community College of Philadelphia, Introduction to Music Appreciation was recently designated as one of the few college-level courses that advanced ESL and developmental English students can take while completing their language studies. Beginning in Fall 2017, the course will be critical for international students who must maintain full-time student status under visa requirements. In its current online format, however, Music Appreciation is designed for traditional college students, and faculty who teach these sections have been reluctant to revise the course to address the needs of non-traditional students. Interestingly, the presenters maintain that the online platform is the ideal place to develop language and college-readiness skills in at-risk students while maintaining the course's curricular integrity. The two faculty presenters describe how curriculum, rather than technology, drives the redesign of the digitized music course, and how self-study materials, guided assignments, and periodic assessments promote independent learning and comprehension of the material.
The 'scaffolded' modules allow ESL and developmental English students to build on prior knowledge, preview key vocabulary, discuss content, and complete graded tasks that demonstrate comprehension. Activities and assignments, in turn, enhance college success by allowing students to practice academic reading strategies, writing, speaking, and student-faculty and peer-peer communication and collaboration. The course components facilitate a comparison of student performance and retention in the redesigned and existing online sections of Music Appreciation as well as in previous sections with at-risk students. Indirect, qualitative measures include student attitudinal surveys and evaluations. Direct, quantitative measures include withdrawal rates, tests of disciplinary knowledge, and final grades. The study will compare the outcomes of three cohorts in the two versions of the online course: ESL students, at-risk developmental students, and college-level students. These data will also be compared with retention and student-outcomes data for the three cohorts in face-to-face Music Appreciation, which permitted non-traditional student enrollment from 1998 to 2005. During this eight-year period, the presenter addressed the problems of at-risk students by adding language and college-success support, which resulted in strong retention and outcomes. The presenters contend that the redesigned course will produce favorable outcomes among all three cohorts because it contains components that proved successful with at-risk learners in face-to-face sections of the course. Results of their study will be published in 2019, after the redesigned online course has met for two semesters.

Keywords: college readiness, e-learning, music appreciation, online courses

Procedia PDF Downloads 176
57 Absorptive Capabilities in the Development of Biopharmaceutical Industry: The Case of Bioprocess Development and Research Unit, National Polytechnic Institute

Authors: Ana L. Sánchez Regla, Igor A. Rivera González, María del Pilar Monserrat Pérez Hernández

Abstract:

The ability of an organization to identify and acquire useful information from external sources, assimilate it, and transform and apply it to generate products or services with added value is called absorptive capacity. Absorptive capabilities give firms market opportunities and a leading position with respect to their competitors. The Bioprocess Development and Research Unit (UDIBI) is a research and development (R&D) laboratory belonging to the National Polytechnic Institute (IPN), a higher education institute in Mexico. The UDIBI was created to carry out R&D activities for Transferon®, a biopharmaceutical product developed and patented by IPN. The evolution of its competences and of its scientific and technological platform led UDIBI to expand its scope by providing technological services (preclinical studies and bio-compatibility evaluation) to the national pharmaceutical and biopharmaceutical industries. The relevance of this study is that those industries are classified as having high scientific and technological intensity, and yet, after a review of the state of the art, there is only one study of absorptive capabilities in the biopharmaceutical industry with a scope similar to this research; in the case of Mexico, there is none. In addition, UDIBI belongs to a public university, and its operation does not depend on the federal budget but on the income generated by its external technological services. This represents a highly remarkable case in the context of Mexico's public higher education. This doctoral research (2015-2019) is framed as a case study; its main objective is to identify and analyze the absorptive capabilities that characterize the UDIBI and that have allowed it to become one of only two laboratories authorized by the sanitary authority in Mexico to conduct bio-comparability studies of biopharmaceutical products. The fieldwork is divided into two phases.
In the first phase, 15 interviews were conducted with UDIBI personnel, covering management levels, heads of services, project leaders, and laboratory personnel. These interviews were structured by a questionnaire designed to integrate open questions and, to a lesser extent, questions answered on a Likert-type rating scale. From the information obtained in this phase, a scientific article was written (in review), and presentation proposals were submitted to different academic forums. A second stage will consist of an ethnographic study within the organization, lasting about three months. In addition, interviews are planned with external actors around the UDIBI (suppliers, advisors, and IPN officials), including contact with an academic specialized in absorptive capacities who will comment on this thesis. Initial findings point in two directions: i) institutional, technological, and organizational management elements exist that encourage and/or limit the creation of absorptive capacities in this scientific and technological laboratory, and ii) UDIBI has created a set of knowledge-transfer mechanisms that have permitted it to build a large base of prior knowledge.

Keywords: absorptive capabilities, biopharmaceutical industry, high research and development intensity industries, knowledge management, transfer of knowledge

Procedia PDF Downloads 225
56 Differential Expression Analysis of Busseola fusca Larval Transcriptome in Response to Cry1Ab Toxin Challenge

Authors: Bianca Peterson, Tomasz J. Sańko, Carlos C. Bezuidenhout, Johnnie Van Den Berg

Abstract:

Busseola fusca (Fuller) (Lepidoptera: Noctuidae), the maize stem borer, is a major pest in sub-Saharan Africa. It causes economic damage to maize and sorghum crops and has evolved non-recessive resistance to genetically modified (GM) maize expressing the Cry1Ab insecticidal toxin. Since B. fusca is a non-model organism, very little genomic information is publicly available, limited to some cytochrome c oxidase I, cytochrome b, and microsatellite data. The biology of B. fusca is well described but still poorly understood. This, in combination with its larval-specific behavior, may pose problems for limiting the spread of currently resistant B. fusca populations or preventing resistance evolution in other, susceptible populations. As part of ongoing research into resistance evolution, B. fusca larvae were collected from Bt and non-Bt maize in South Africa, followed by RNA isolation (15 specimens) and sequencing on the Illumina HiSeq 2500 platform. Read quality was assessed with FastQC, after which Trimmomatic was used to trim adapters and remove low-quality, short reads. Trinity was used for the de novo assembly, whereas TransRate was used for assembly quality assessment. Transcript identification employed BLAST (BLASTn, BLASTp, and tBLASTx comparisons), for which two libraries (nucleotide and protein) were created from 3.27 million lepidopteran sequences. Several transcripts previously implicated in Cry toxin resistance were identified for B. fusca, including aminopeptidase N, cadherin, alkaline phosphatase, ATP-binding cassette transporter proteins, and mitogen-activated protein kinase. MEGA7 was used to align these transcripts to reference sequences from Lepidoptera to detect mutations that might be contributing to Cry toxin resistance in this pest. RSEM and Bioconductor were used to perform differential gene expression analysis on groups of B. fusca larvae challenged and unchallenged with the Cry1Ab toxin.
Transcripts showing at least a 16-fold expression difference at a false-discovery-corrected significance of p ≤ 0.001 were extracted from pairwise comparisons and visualized in a hierarchically clustered heatmap using R. A total of 329,194 transcripts with an N50 of 1,019 bp were generated from over 167.5 million high-quality paired-end reads. Furthermore, 110 transcripts were over 10 kbp long, of which the largest was 29,395 bp. BLAST comparisons resulted in the identification of 157,099 (47.72%) transcripts, among which only 3,718 (2.37%) were identified as Cry toxin receptors from lepidopteran insects. Transcripts were grouped into three subclusters according to the similarity of their expression patterns. Several immune-related transcripts (pathogen recognition receptors, antimicrobial peptides, and inhibitors) were up-regulated in larvae feeding on Bt maize, indicating an enhanced immune status in response to toxin exposure. Above all, extremely up-regulated arylphorin genes suggest that enhanced epithelial healing is one of the resistance mechanisms employed by B. fusca larvae against the Cry1Ab toxin. This study is the first to provide a resource base and some insights into a potential mechanism of Cry1Ab toxin resistance in B. fusca. The transcriptomic data generated in this study allow the identification of genes that can be targeted by biotechnological improvements of GM crops.
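The filtering step described above (at least a 16-fold change, i.e. |log2 fold-change| ≥ 4, at FDR-corrected p ≤ 0.001) can be sketched as follows. The toy records are invented, and this is a conceptual illustration, not the study's actual RSEM/Bioconductor code:

```python
# Sketch of the differential-expression filter described above: keep
# transcripts with at least a 16-fold expression change and a
# false-discovery-corrected p-value <= 0.001.
results = [
    {"transcript": "arylphorin-like", "log2fc": 6.2,  "fdr": 1e-5},
    {"transcript": "cadherin",        "log2fc": 1.1,  "fdr": 0.2},
    {"transcript": "lysozyme",        "log2fc": -4.5, "fdr": 5e-4},
]

def significant(rec, min_log2fc=4.0, max_fdr=0.001):
    """16-fold change corresponds to |log2 fold-change| >= 4."""
    return abs(rec["log2fc"]) >= min_log2fc and rec["fdr"] <= max_fdr

hits = [r["transcript"] for r in results if significant(r)]
print(hits)  # ['arylphorin-like', 'lysozyme']
```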

Keywords: epithelial healing, Lepidoptera, resistance, transcriptome

Procedia PDF Downloads 201
55 Introducing, Testing, and Evaluating a Unified JavaScript Framework for Professional Online Studies

Authors: Caspar Goeke, Holger Finger, Dorena Diekamp, Peter König

Abstract:

Online-based research has recently gained increasing attention from various fields of research in the cognitive sciences. Technological advances in the form of online crowdsourcing (Amazon Mechanical Turk), open data repositories (Open Science Framework), and online analysis (IPython notebook) offer rich possibilities to improve, validate, and speed up research. However, until today there has been no cross-platform integration of these subsystems. Furthermore, implementing online studies still suffers from complexity (server infrastructure, database programming, security considerations, etc.). Here we propose and test a new JavaScript framework that enables researchers to conduct any kind of behavioral research in the browser without the need to program a single line of code. In particular, our framework offers the possibility to manipulate and combine the experimental stimuli via a graphical editor, directly in the browser. Moreover, we included an action-event system that can be used to handle user interactions, interactively change stimulus properties, or store participants’ responses. Besides traditional recordings such as reaction time and mouse and keyboard presses, the tool offers webcam-based eye- and face-tracking. On top of these features, our framework also takes care of participant recruitment via crowdsourcing platforms such as Amazon Mechanical Turk. Furthermore, built-in Google Translate functionality ensures automatic text translation of the experimental content. Thereby, thousands of participants from different cultures and nationalities can be recruited literally within hours. Finally, the recorded data can be visualized and cleaned online and then exported into the desired formats (csv, xls, sav, mat) for statistical analysis. Alternatively, the data can also be analyzed online within our framework using the integrated IPython notebook.
The framework was designed so that studies can be exchanged between researchers. This not only supports the idea of open data repositories but also makes it possible to share and reuse experimental designs and analyses, improving the validity of the paradigms. In particular, sharing and integrating experimental designs and analyses will lead to increased consistency across experimental paradigms. To demonstrate the functionality of the framework, we present the results of a pilot study in the field of spatial navigation conducted with the framework. Specifically, we recruited over 2,000 subjects with various cultural backgrounds and analyzed performance differences as a function of culture, gender, and age. Overall, our results demonstrate a strong influence of cultural factors on spatial cognition. Such an influence has not been reported before and could not have been shown without the massive amount of data collected via our framework. In fact, these findings shed new light on cultural differences in spatial navigation. We therefore conclude that our new framework offers a wide range of advantages for online research and constitutes a methodological innovation by which new insights can be gained on the basis of massive data collection.
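The export step mentioned above (recorded trials written out as CSV for downstream statistical analysis) might look like the following sketch. This is not the framework's real API; the field names and record layout are assumptions for illustration:

```python
import csv
import io

# Hedged sketch of exporting recorded trials (reaction time, key presses)
# as CSV text; field names are invented, not the framework's actual schema.
trials = [
    {"participant": "p01", "trial": 1, "reaction_time_ms": 512, "key": "f"},
    {"participant": "p01", "trial": 2, "reaction_time_ms": 488, "key": "j"},
]

def export_csv(rows):
    """Serialize a list of trial records to CSV, header row first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(export_csv(trials).splitlines()[0])  # header row
```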

Keywords: cultural differences, crowdsourcing, JavaScript framework, methodological innovation, online data collection, online study, spatial cognition

Procedia PDF Downloads 257
54 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology

Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey

Abstract:

In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This calls for software capable of quickly processing and reliably visualizing diffusion data, equipped with tools for analyzing them in terms of different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The subject-domain part has been moved to separate class libraries and can be used on various platforms. The user interface uses Windows WPF (Windows Presentation Foundation), a technology for building Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is the use of a declarative markup language, XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize, and set the properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package processes diffusion magnetic resonance imaging (dMRI) data, allowing images to be loaded and viewed sorted by series. An algorithm for 'masking' dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues unrelated to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on the autocorrelation of local structure was also developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary volume of the brain, the diffusion tensor is interpreted geometrically as an ellipsoid, an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in one algorithm for segmenting white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF).
A tool for calculating the mean diffusion coefficient and fractional anisotropy has been created, on the basis of which quantitative maps can be built for solving various clinical problems. Functionality has been created that allows clustering and segmenting images to individualize the clinical treatment volume and to further assess the response (median Dice score = 0.963 ± 0.137). The white matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one based on the Hough transform. The Hough-transform algorithm tests candidate curves in each voxel, assigning to each a score computed from the diffusion data, and then selects the curves with the highest scores as potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. «MRDiffusionImaging» will improve the efficiency and accuracy of the diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis, and inclusion in the radiotherapy planning process and the evaluation of its results.
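The quantitative maps mentioned above rest on two standard DTI scalars, mean diffusivity (MD) and fractional anisotropy (FA), computed per voxel from the eigenvalues of the diffusion tensor. A minimal sketch using the standard definitions follows; the eigenvalues are invented for illustration, and this is not «MRDiffusionImaging» code:

```python
import math

# MD is the mean of the tensor's three eigenvalues; FA is the normalized
# standard deviation of the eigenvalues, scaled so that 0 means isotropic
# diffusion and values near 1 mean strongly directional diffusion.
def md_fa(eigenvalues):
    l1, l2, l3 = eigenvalues
    md = (l1 + l2 + l3) / 3.0                        # mean diffusivity
    num = math.sqrt((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    fa = math.sqrt(1.5) * num / den                  # fractional anisotropy
    return md, fa

# Isotropic diffusion gives FA = 0; one dominant direction gives FA -> 1.
print(md_fa((1.0, 1.0, 1.0))[1])   # 0.0
print(md_fa((1.0, 0.0, 0.0))[1])   # 1.0 (up to floating-point error)
```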

Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography

Procedia PDF Downloads 85
53 A Single Cell Omics Experiments as Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools

Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri

Abstract:

The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation. The establishment of benchmarking datasets is therefore necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Benchmarking bioinformatics tools for single-cell experiments can be expensive, so benchmarking datasets are typically sourced from publicly available experiments, which often lack comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools that depict heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer cell lines that display resistant clones upon treatment of EGFR-mutated tumors and are characterized by the driver genes ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using the 10XGenomics platform with Cellplex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the BioLegend TotalSeq™-B Human Universal Cocktail (CITEseq).
This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from the aforementioned experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as an output. The tool provides the cell line derivation for each cell and cell annotations for the pseudo-microenvironment based on CITEseq data by an experienced immunologist. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in matrigel. These tissues were analyzed using 10XGenomics (FFPE samples) and Curio Bioscience (fresh frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.
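The per-cell-line labeling described above implies a demultiplexing step: each cell is assigned to the line whose labeling-oligo counts dominate. A hedged sketch of that idea follows; this is not the vendor's actual algorithm, and the barcodes, counts, label names, and the 0.8 dominance threshold are all invented:

```python
# Illustrative demultiplexing: assign a cell barcode to the cell line whose
# label counts dominate, or flag it as ambiguous (e.g. a doublet) otherwise.
def assign_cell_line(label_counts, min_fraction=0.8):
    """Return the dominant label, or 'ambiguous' if no label dominates."""
    total = sum(label_counts.values())
    line, count = max(label_counts.items(), key=lambda kv: kv[1])
    if total == 0 or count / total < min_fraction:
        return "ambiguous"
    return line

cells = {
    "AAACCTG": {"ALK_line": 950, "KRAS_line": 12, "BRAF_line": 8},
    "AAACGGG": {"ALK_line": 300, "KRAS_line": 310, "BRAF_line": 290},
}
for barcode, counts in cells.items():
    print(barcode, assign_cell_line(counts))
```

The second barcode splits its counts nearly evenly across three labels, so it is flagged as ambiguous; such a step is what makes the per-cell ground-truth annotation in the benchmark possible.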

Keywords: single cell omics, benchmark, spatial transcriptomics, CITEseq

Procedia PDF Downloads 117
52 Exploring Perspectives and Complexities of E-tutoring: Insights from Students Opting out of Online Tutor Service

Authors: Prince Chukwuneme Enwereji, Annelien Van Rooyen

Abstract:

In recent years, the integration of technology in education has transformed the learning landscape, particularly in online institutions. One technological advancement that has gained popularity is e-tutoring, which offers personalised academic support to students through online platforms. While e-tutoring is well established and has been adopted to promote collaborative learning, some students still do not use these services, for various reasons, and little attention has been given to understanding their perspectives. The research objectives were to identify the benefits non-e-tutoring students believe e-tutoring could offer, such as enhanced academic support, personalised learning experiences, and improved performance, and to explore the drawbacks or concerns they associate with it, such as doubts about efficacy, the lack of face-to-face interaction, and platform accessibility. The study adopted a quantitative research approach with a descriptive design. Online questionnaires were the primary data collection method, allowing efficient collection of data from many participants, and the collected data were analysed using the Statistical Package for the Social Sciences (SPSS). Ethical principles such as informed consent, anonymity of responses, and protection of respondents from harm were upheld. Findings indicate that non-e-tutoring students perceive a sense of control over their own pace of learning, suggesting a preference for self-directed learning and for tailoring their educational experience to their individual needs and learning styles.
They also exhibit high levels of motivation, believe in their ability to participate effectively in their studies and organise their academic work, and feel comfortable studying on their own without the help of e-tutors. However, non-e-tutoring students feel that e-tutors do not sufficiently address their academic needs and lack engagement, and they perceive a lack of clarity in e-tutors' roles, leading to uncertainty about their responsibilities. In terms of communication, students feel overwhelmed by the volume of announcements and find repetitive information frustrating. Some students also face challenges with internet connectivity and its associated costs, which can hinder their participation in online activities. At the same time, non-e-tutoring students express a desire for interaction with their peers and a sense of belonging to a group or team; they value opportunities for collaboration and teamwork in their learning experience and the importance of fostering social interaction and a sense of community in online learning environments. The study recommends that students seek alternative support by reaching out to professors or academic advisors for guidance and clarification, and that they develop self-directed learning skills, taking charge of their own learning by setting objectives, creating study plans, and utilising resources. For higher education institutions (HEIs), it was recommended that they ensure a variety of support services are available to cater to the needs of all students, including non-e-tutoring students; provide easy access to online resources; promote a supportive community; and regularly evaluate and adapt their support strategies to meet students' changing requirements.

Keywords: online tutor, student support, online education, educational practices, distance education

Procedia PDF Downloads 82
51 Small Scale Mobile Robot Auto-Parking Using Deep Learning, Image Processing, and Kinematics-Based Target Prediction

Authors: Mingxin Li, Liya Ni

Abstract:

Autonomous parking is a valuable feature for many robotics applications, such as tour guide robots, UV sanitizing robots, food delivery robots, and warehouse robots. With auto-parking, a robot can park at its charging zone and charge itself without human intervention. Compared to self-driving vehicles, auto-parking is more challenging for a small-scale mobile robot equipped with only a front camera, because the camera view is limited by the robot's height and by the narrow Field of View (FOV) of an inexpensive camera. In this research, auto-parking of a small-scale mobile robot with only a front camera was achieved in a four-step process. Firstly, transfer learning was performed on AlexNet, a popular pre-trained convolutional neural network (CNN). It was trained with 150 pictures of empty parking slots and 150 pictures of occupied parking slots, taken from the view angle of a small-scale robot; the dataset was divided into 70% of the images for training and the remaining 30% for validation, and an average success rate of 95% was achieved. Secondly, the image of the detected empty parking space was processed with edge detection, followed by computation of parametric representations of the boundary lines using the Hough transform algorithm. Thirdly, the positions of the entrance point and the center of the available parking space were predicted from the robot's kinematic model as it drove closer to the parking space, because the boundary lines disappeared partially or completely from the camera view due to the height and FOV limitations; the robot used its wheel speeds to compute the position of the parking space with respect to its changing local frame as it moved. Lastly, the predicted entrance point of the parking space was used as the reference for the motion control of the robot until it was replaced by the actual center once that became visible again.
The linear and angular velocities of the chassis center were computed from the error between the current chassis center and the reference point; the left and right wheel speeds were then obtained using inverse kinematics and sent to the motor driver. All four subtasks were successfully accomplished, with the transfer learning, image processing, and target prediction performed in MATLAB, while motion control and image capture ran on a self-built small-scale differential-drive mobile robot. The robot comprises a Raspberry Pi board, a Pi camera, an L298N dual H-bridge motor driver, a USB power module, a power bank, four wheels, and a chassis. Future research includes three areas: integrating all four subsystems into one hardware/software platform, with an upgrade to an Nvidia Jetson Nano board for superior deep learning and image processing performance; further testing and validation of the identification of available parking spaces and their boundary lines; and performance improvement after the hardware/software integration is completed.
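The final step, mapping desired chassis velocities to wheel commands, follows the standard differential-drive inverse kinematics. A minimal sketch (the wheel radius and track width below are illustrative values, not the robot's actual dimensions, and the authors' implementation was in MATLAB rather than Python):

```python
def wheel_speeds(v, omega, wheel_radius, track_width):
    """Differential-drive inverse kinematics: map desired chassis
    linear velocity v (m/s) and angular velocity omega (rad/s)
    to (left, right) wheel angular speeds in rad/s."""
    w_left = (2.0 * v - omega * track_width) / (2.0 * wheel_radius)
    w_right = (2.0 * v + omega * track_width) / (2.0 * wheel_radius)
    return w_left, w_right

# Driving straight at 0.2 m/s: both wheels turn at the same speed
wl, wr = wheel_speeds(0.2, 0.0, wheel_radius=0.03, track_width=0.15)

# Turning in place: wheels spin at equal magnitude, opposite sign
tl, tr = wheel_speeds(0.0, 1.0, wheel_radius=0.03, track_width=0.15)
```

In the paper's control loop, v and omega would come from a proportional controller on the error between the chassis center and the reference point before this mapping is applied.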

Keywords: autonomous parking, convolutional neural network, image processing, kinematics-based prediction, transfer learning

Procedia PDF Downloads 132
50 An Elasto-Viscoplastic Constitutive Model for Unsaturated Soils: Numerical Implementation and Validation

Authors: Maria Lazari, Lorenzo Sanavia

Abstract:

The mechanics of unsaturated soils has been an active field of research in recent decades. Efficient constitutive models that take the partial saturation of soil into account are necessary to solve a number of engineering problems, e.g., the instability of slopes and cuts due to heavy rainfall. A large number of constitutive models in the literature address fundamental aspects of unsaturated soil behaviour, such as volume change and shear strength under suction or saturation changes. Partially saturated soils may either expand or collapse upon wetting depending on the stress level, and a soil may even experience a reversal in volumetric behaviour during wetting. Shear strength also changes dramatically with the degree of saturation; a related engineering problem is slope failure caused by rainfall. Several state-of-the-art reviews published in recent years provide thorough discussions of the stress state, the advantages and disadvantages of specific constitutive models, and the latest developments in unsaturated soil modelling. However, only a few studies have focused on the coupling between partial saturation states and time effects on the behaviour of geomaterials. Rate dependency is experimentally observed in the mechanical response of granular materials, and a viscoplastic constitutive model is capable of reproducing creep and relaxation processes. Therefore, in this work, an elasto-viscoplastic constitutive model for unsaturated soils is proposed and validated against experimental data. The model extends an existing elastoplastic strain-hardening constitutive model capable of capturing the behaviour of variably saturated soils, based on energy-conjugate stress variables in the framework of superposed continua.
The purpose was to develop a model able to deal with possible mechanical instabilities within a consistent energy framework. The model shares the conceptual structure of elastoplastic laws proposed for bonded geomaterials subject to weathering or diagenesis and can model several kinds of instability induced by the loss of hydraulic bonding contributions. The novelty of the proposed formulation lies in the incorporation of density-dependent stiffness and hardening coefficients, which allows the pycnotropic behaviour of granular materials to be modelled with a single set of material constants. The model has been implemented in the commercial FE platform PLAXIS, widely used in Europe for advanced geotechnical design. The algorithmic strategies adopted for the stress-point algorithm had to be revised to account for the approach taken by the PLAXIS developers in solving the discrete non-linear equilibrium equations. An extensive comparison against experimental data reported by different authors is presented to validate the model and illustrate its capabilities. After validation, the effectiveness of the viscoplastic model is demonstrated by numerical simulations of a partially saturated slope failure at laboratory scale, and the effects of viscosity and degree of saturation on slope stability are discussed.
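The abstract does not give the model's constitutive equations, but rate-dependent models of this family typically build on a Perzyna-type overstress rule, in which viscoplastic flow develops only when the stress state lies outside the current yield surface. A one-dimensional sketch of that generic rule, with purely illustrative parameter values, not the authors' formulation:

```python
def perzyna_strain_rate(sigma, sigma_y, fluidity, exponent):
    """1D Perzyna overstress rule:
    eps_dot_vp = fluidity * <sigma/sigma_y - 1>^exponent,
    where <x> = max(x, 0) is the Macaulay bracket, so no
    viscoplastic flow occurs below the yield stress sigma_y."""
    overstress = max(sigma / sigma_y - 1.0, 0.0)
    return fluidity * overstress ** exponent

# Below yield (80 vs. 100 kPa): purely elastic, no creep
below = perzyna_strain_rate(80.0, 100.0, fluidity=1e-4, exponent=2.0)

# Above yield (150 vs. 100 kPa): viscoplastic strain rate develops
above = perzyna_strain_rate(150.0, 100.0, fluidity=1e-4, exponent=2.0)
```

The fluidity parameter controls how quickly overstress relaxes, which is what lets such a model reproduce the creep and relaxation processes mentioned above.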

Keywords: PLAXIS software, slope, unsaturated soils, viscoplasticity

Procedia PDF Downloads 224
49 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost causes of land-surface disturbance in diverse ecosystems, producing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Fire severity is generally quantified with the Normalized Burn Ratio (NBR) index, conventionally computed manually by comparing pre-fire and post-fire images: the bitemporal difference of the preprocessed satellite images gives the dNBR, and an area is classified as unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using the classification levels proposed by USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. The objective of this study is to automate the above process in a tool named the World Wildfire Severity Assessment Tool (WWSAT), implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen as a medium-spatial-resolution sensor (10-20 m) for regular burnt-area severity mapping. The tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over a user-selected extent and period automatically. Cloud coverage is one of the biggest concerns in fire severity mapping; WWSAT therefore provides a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which includes a Graphical User Interface (GUI) to make it user-friendly.
The advantage of this tool is its ability to assess burn severity over large extents and extended temporal periods. Two case studies demonstrate its performance. The Blue Mountains National Park forest affected by the 2019-2020 Australian fire season is used to describe the WWSAT workflow. At this site, more than 7809 km2 of burnt area was detected using Sentinel-2 data, with an error below 6.5% compared with the area mapped in the field. Of the detected area, 86.77% was recognized as fully burnt, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the 2020 Cameron Peak fire, were chosen for the second case study. Around 983 km2 was found to have burnt, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These areas can also be verified by visual inspection of the cloud-free images generated by WWSAT. The tool is cost-effective for estimating burnt area, since the satellite imagery is free and the cost of field surveys is avoided.
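The NBR, dNBR, and seven-class USGS scheme the tool automates can be sketched as follows. The thresholds follow the commonly cited USGS dNBR classification table, and the reflectance values are illustrative (for Sentinel-2, NIR and SWIR are typically bands B8 and B12):

```python
def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectances."""
    return (nir - swir) / (nir + swir)

def dnbr(pre_nbr, post_nbr):
    """Bitemporal difference of pre-fire and post-fire NBR."""
    return pre_nbr - post_nbr

def usgs_severity(d):
    """Map a dNBR value to the seven USGS burn-severity classes."""
    if d < -0.25:
        return "enhanced regrowth, high"
    if d < -0.10:
        return "enhanced regrowth, low"
    if d < 0.10:
        return "unburned"
    if d < 0.27:
        return "low severity"
    if d < 0.44:
        return "moderate-low severity"
    if d < 0.66:
        return "moderate-high severity"
    return "high severity"

# Healthy vegetation pre-fire, charred surface post-fire
d = dnbr(nbr(0.60, 0.15), nbr(0.25, 0.35))
severity = usgs_severity(d)
```

In GEE the per-pixel NBR would be computed with a normalized-difference operation over image composites; the scalar version above shows the classification logic only.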

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 235
48 Mapping of Urban Micro-Climate in Lyon (France) by Integrating Complementary Predictors at Different Scales into Multiple Linear Regression Models

Authors: Lucille Alonso, Florent Renard

Abstract:

The characterization of the urban heat island (UHI) and its interactions with climate change and urban climate is a major research and public health issue, given the increasing urbanization of the population. Addressing it requires better knowledge of the UHI and of the micro-climate in urban areas, combining measurements and modelling. This study contributes to that topic by evaluating microclimatic conditions in dense urban areas of the Lyon Metropolitan Area (France) using a combination of traditionally used data, such as topography, together with LiDAR (Light Detection And Ranging) data, Landsat 8 and Sentinel satellite observations, and ground measurements collected by bike. These bicycle-based weather data collections build the database of the variable to be modelled, the air temperature, over Lyon's hyper-centre. The study models air temperature, measured during six mobile campaigns in Lyon under clear weather, using multiple linear regressions based on 33 explanatory variables. These fall into various categories: meteorological parameters from remote sensing, topographic variables, vegetation indices, the presence of water, humidity, bare soil, buildings, radiation, urban morphology, and proximity and density with respect to various land uses (water surfaces, vegetation, bare soil, etc.). The acquisition sources are multiple: the Landsat 8 and Sentinel satellites, LiDAR point clouds, and cartographic products downloaded from the Greater Lyon open data platform. For the presence of low, medium, and high vegetation, buildings, and bare ground, several buffer radii around the measurement points were tested (5, 10, 20, 25, 50, 100, 200, and 500 m). The buffers best correlated with air temperature are 5 m for ground and for low and medium vegetation, 50 m for buildings, and 100 m for high vegetation.
The explanatory model of the dependent variable is obtained by multiple linear regression of the remaining explanatory variables (retained with pairwise Pearson |r| < 0.7 and VIF < 5) using a stepwise selection algorithm. Moreover, holdout cross-validation (80% training, 20% testing) is performed for its ability to detect over-fitting of the multiple regression, even though multiple regression provides internal validation and randomization. The multiple linear regressions explained, on average, 72% of the variance over the study days, with an average RMSE of only 0.20°C. Surface temperature is the most important variable in the estimation of air temperature. Other recurrent variables include distance to subway stations, distance to water areas, NDVI, the digital elevation model, sky view factor, average vegetation density, and building density. Changing urban morphology influences the city's thermal patterns: the thermal atmosphere in dense urban areas can only be analysed at the microscale, so as to consider the local impact of trees, streets, and buildings. There is currently no network of fixed weather stations sufficiently dense in central Lyon or in most major urban areas; it is therefore necessary to use mobile measurements, followed by modelling, to characterize the city's multiple thermal environments.
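The core of the approach, fitting a regression to the mobile measurements and scoring it on held-out points with RMSE, can be sketched with a single predictor (surface temperature). The numbers below are synthetic and chosen only to illustrate the fit/holdout/RMSE workflow, not the study's data:

```python
import math

def fit_simple_ols(x, y):
    """Ordinary least squares for one predictor, e.g. surface
    temperature -> air temperature; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def rmse(y_true, y_pred):
    """Root mean square error between observations and predictions."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

# Holdout validation: fit on training points, score on held-out points
x_train, y_train = [20.0, 25.0, 30.0, 35.0], [18.0, 21.0, 24.0, 27.0]
slope, intercept = fit_simple_ols(x_train, y_train)

x_test, y_test = [28.0], [22.8]
pred = [slope * xi + intercept for xi in x_test]
error = rmse(y_test, pred)
```

The study's model extends this to 33 candidate predictors with stepwise selection after screening for collinearity (|r| < 0.7, VIF < 5); the holdout split plays the same over-fitting-detection role as in this sketch.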

Keywords: air temperature, LIDAR, multiple linear regression, surface temperature, urban heat island

Procedia PDF Downloads 137
47 Intercultural Initiatives and Canadian Bilingualism

Authors: Muna Shafiq

Abstract:

Growth in international immigration reflects increased migration patterns in Canada and in other parts of the world. Canada continues to promote itself as a bilingual country, yet its French-English bilingual population numbers do not reflect this platform. Each province's integration policies focus only on second-language learning of either English or French. Moreover, since English Canadians outnumber French Canadians, maintaining, much less increasing, English-French bilingualism appears unrealistic. One solution to increasing Canadian bilingualism is to create intercultural communication initiatives between youth in Quebec and the rest of Canada. Specifically, the focus is on active, experiential learning, where intercultural competencies develop outside traditional classroom settings. The target groups are Generation Y Millennials and Generation Z Linksters, the next generations in the career and parenthood lines. Today, Canada's education system, like many others, must continually renegotiate the lines between the programs it offers its immigrant and native communities. While some purists or right-wing nationalists would disagree, the survival of bilingualism in Canada has little to do with reducing immigration. Child and youth immigrants play a valuable role in increasing Canada's French- and English-speaking communities. For instance, a focus on immersion over core French education programs for immigrant children and youth would not only increase bilingual rates; it would develop meaningful intercultural attachments between Canadians. Moreover, a sustained increase in funding for French immersion programs is critical, as are new initiatives focused on experiential language learning for students in French and English language programs.
A favorable argument supports the premise that, besides French-speaking students in Québec and elsewhere in Canada, second- and third-generation immigrant students are excellent ambassadors for promoting bilingualism in Canada. Most already speak another language at home and understand the value of speaking more than one language in their adopted communities. Their dialogue and participation in experiential language exchange workshops are necessary. If the proposed exchanges take place inter-provincially, the momentum to increase collective regional voices grows; this regional collectivity can unite Canadians differently than nation-targeted initiatives. The results from an experiential youth exchange organized in 2017 between students at the crossroads of Generation Y and Generation Z in Vancouver and Quebec City offer a promising starting point for assessing the strength of bringing together different regional voices to promote bilingualism. Code-switching between the standard, international French that Vancouver students learn in the classroom and the more regional forms of Quebec French spoken locally created regional connectivity between students. The exchange was equally rewarding for both groups: increasing their appreciation for each other's regional differences allowed them to contribute actively to their social and emotional development. Within a sociolinguistic frame, this proposed model of experiential learning does not focus on hands-on work experience; however, the benefits of such exchanges are as valuable as the work experience initiatives developed in experiential education. Students who actively code-switch between French and English in real, not simulated, contexts appreciate bilingualism more meaningfully and experience its value in concrete terms.

Keywords: experiential learning, intercultural communication, social and emotional learning, sociolinguistic code-switching

Procedia PDF Downloads 138
46 Content Analysis of Gucci’s ‘Blackface’ Sweater Controversy across Multiple Media Platforms

Authors: John Mark King

Abstract:

Beginning on Feb. 7, 2019, the luxury brand Gucci was met with a firestorm on social media over fashion runway images of its black balaclava sweater, which covered the bottom half of the face and featured large, shiny bright-red lips surrounding the mouth cutout. Many observers on social media and in the news media noted that the garment resembled racist “blackface.” This study aimed to measure how items were framed across multiple media platforms. The unit of analysis was any headline or lead paragraph published using the search terms “Gucci” and “sweater,” “jumper,” or “balaclava” during the one-year timeframe of Feb. 7, 2019, to Feb. 6, 2020. Limitations include that only headlines and lead paragraphs published in English and indexed in the Lexis/Nexis database were considered. Independent variables were the nation in which the item was published and the platform (newspapers, blogs, web-based publications, newswires, magazines, or broadcast news). Dependent variables were tone toward Gucci (negative, neutral, or positive), frame (blackface/racism/racist; boycott/celebrity boycott; sweater/balaclava/jumper/fashion; apology/pulling the product/diversity initiatives by Gucci; or frames unrelated to the controversy but still involving Gucci sweaters), and word count. Two coders achieved 100% agreement on all variables except tone (94.2%) and frame (96.3%). The search yielded 276 items published by 155 sources in 18 nations. The tone toward Gucci during this period was predominantly negative (69.9%); items that were neutral (16.3%) or positive (13.8%) toward the brand were overwhelmingly about other Gucci sweaters worn by celebrities or fashion reviews of other Gucci sweaters. The most frequent frame was apology/pulling the product/diversity initiatives by Gucci (35.5%). The tone was most frequently negative across all regions, including the Middle East (83.3% negative), Asia (81.8%), North America (76.6%), Australia/New Zealand (66.7%), and Europe (59.8%).
Newspapers/magazines/newswires/broadcast news transcripts (72.4% negative) were more negative than blogs/web-based publications (63.6%). The most frequent frames used by newspapers/magazines/newswires/broadcast news transcripts were apology/pulling the product/diversity initiatives by Gucci (38.7%) and blackface/racism/racist (26.1%). Blogs/web-based publications most frequently used frames unrelated to the controversial garment but about other Gucci sweaters (42.9%) and apology/pulling the product/diversity initiatives by Gucci (27.3%). Sources in Western nations (34.7%) and Eastern nations (47.1%) most frequently used the apology/pulling the product/diversity initiatives frame. Mean word count was higher for negative items (583.58) than for positive items (404.76). Items framed as blackface/racism/racist or boycott/celebrity boycott had a higher mean word count (668.97) than items framed as sweater/balaclava/jumper/fashion or apology/pulling the product/diversity initiatives by Gucci (498.22). The author concluded that Gucci's image was likely damaged during the year-long period by the near-universally negative items published after the garment's release, but that the most common frames across media platforms, namely Gucci's apology, pulling of the product, and diversity initiatives, along with items about other Gucci sweaters worn by celebrities or reviewed in fashion coverage, may have mitigated the damage to the brand.
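The intercoder reliability figures reported above (94.2% on tone, 96.3% on frame) are consistent with simple percent agreement, a common baseline in content analysis; chance-corrected indices such as Krippendorff's alpha are more conservative. A minimal sketch with illustrative codings, not the study's data:

```python
def percent_agreement(coder_a, coder_b):
    """Simple (Holsti-style) intercoder agreement: the share of
    items both coders assigned to the same category."""
    assert len(coder_a) == len(coder_b)
    return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# Two coders rating tone toward the brand on four illustrative items
coder_a = ["negative", "negative", "neutral", "positive"]
coder_b = ["negative", "neutral", "neutral", "positive"]
agreement = percent_agreement(coder_a, coder_b)
```

In practice such disagreements (here on item 2) are resolved by discussion or a tie-breaking coder before the full sample is coded.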

Keywords: Blackface, branding, Gucci, media framing

Procedia PDF Downloads 148
45 Isolation of Bacterial Species with Potential Capacity for Siloxane Removal in Biogas Upgrading

Authors: Ellana Boada, Eric Santos-Clotas, Alba Cabrera-Codony, Maria Martin, Lluis Baneras, Frederic Gich

Abstract:

Volatile methylsiloxanes (VMS) are a group of man-made silicone compounds widely used in household and industrial applications that end up in the biogas produced through anaerobic digestion of organic matter in landfills and wastewater treatment plants. The presence of VMS during biogas energy conversion can damage engines, reducing the efficiency of this renewable energy source. Non-regenerative adsorption onto activated carbon is the most widely used technology to remove siloxanes from biogas, while new trends indicate that biotechnology offers a low-cost and environmentally friendly alternative to conventional technologies. The first objective of this research was to enrich, isolate, and identify bacterial species able to grow using siloxane molecules as a sole carbon source: anoxic wastewater sludge was used as the initial inoculum in anoxic liquid enrichments, with D4 (chosen as the representative siloxane compound) added pre-adsorbed on activated carbon. After several months of acclimatization, the liquid enrichments were plated onto solid media containing D4, and thirty-four bacterial isolates were obtained. 16S rRNA gene sequencing identified strains of Ciceribacter lividus, Alicycliphilus denitrificans, Pseudomonas aeruginosa, and Pseudomonas citronellolis, species described as capable of degrading toxic volatile organic compounds. Kinetic assays with eight representative strains revealed higher cell growth in the presence of D4 compared to the control. Our second objective was to characterize the composition and diversity of the microbial community in the enrichments and to elucidate whether the isolated strains were representative members of that community. DNA was extracted, the 16S rRNA gene was amplified (515F and 806R primer pair), and the microbiome was analyzed from sequences obtained on a MiSeq PE250 platform.
Results showed that the retrieved isolates represented only a minor fraction of the microorganisms present in the enrichment samples, which were dominated at the class level by Alpha-, Beta-, and Gammaproteobacteria, suggesting that other microbial species and/or consortia may be important for D4 biodegradation. These results highlight the need for additional protocols for isolating relevant D4 degraders. We are currently developing molecular tools targeting key genes involved in siloxane biodegradation to identify and quantify the capacity of the isolates to metabolize D4 in batch cultures supplied with a synthetic gas stream of air containing 60 mg m⁻³ of D4 together with other volatile organic compounds found in the biogas mixture (i.e., toluene, hexane, and limonene). The isolates were used as inoculum in a biotrickling filter packed with lava rock and activated carbon to assess their capacity for siloxane removal. Preliminary results of biotrickling filter performance showed 35% siloxane biodegradation at a contact time of 14 minutes, indicating that biological siloxane removal is a promising technology for biogas upgrading.
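The reported 35% removal at a 14-minute contact time corresponds to the standard biofiltration performance metrics of removal efficiency and elimination capacity. A minimal sketch; the inlet/outlet values are illustrative, chosen only to match the reported 60 mg m⁻³ inlet and 35% removal, and the flow and bed volume are assumed:

```python
def removal_efficiency(c_in, c_out):
    """Removal efficiency (%) across the filter bed."""
    return 100.0 * (c_in - c_out) / c_in

def elimination_capacity(c_in, c_out, flow_m3_h, bed_volume_m3):
    """Elimination capacity: pollutant mass removed per unit bed
    volume per hour (mg m^-3 h^-1 if concentrations are in mg/m^3)."""
    return (c_in - c_out) * flow_m3_h / bed_volume_m3

# 60 mg/m^3 in, 39 mg/m^3 out -> the reported 35% removal
re = removal_efficiency(60.0, 39.0)
ec = elimination_capacity(60.0, 39.0, flow_m3_h=10.0, bed_volume_m3=1.0)
```

Note that the contact time (empty-bed residence time) is bed volume divided by gas flow, so the assumed flow and volume above would give a 6-minute contact time; the study's actual operating values are not stated in the abstract.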

Keywords: bacterial cultivation, biogas upgrading, microbiome, siloxanes

Procedia PDF Downloads 258
44 Guard@Lis: Birdwatching Augmented Reality Mobile Application

Authors: Jose A. C. Venancio, Alexandrino J. M. Goncalves, Anabela Marto, Nuno C. S. Rodrigues, Rita M. T. Ascenso

Abstract:

Nowadays, it is common to find people who want to get away from the everyday routine, seeking well-being and pleasant emotions. Trying to disconnect from their usual places of work and residence, they pursue different places, such as tourist destinations, aiming to have unexpected experiences. To make this exploration easier, cities and tourism agencies seek new opportunities and solutions, creating routes with diverse cultural landmarks, including natural landscapes and historic buildings; these offers frequently aspire to preserve the local patrimony. Within nature and wildlife, birdwatching is an activity on the rise, both in cities and in the countryside. It seeks to find, observe, and identify the diversity of birds that live permanently or temporarily in these places, and it is usually supported by birdwatching guides. Leiria (Portugal) is a well-known city with several historical and natural landmarks, such as the Lis river and the castle where King D. Dinis lived in the 13th century. Along the Lis river, a conservation process was carried out and a pedestrian route was created (Polis project). This is considered an excellent spot for birdwatching, especially for the gray heron (Ardea cinerea) and the kingfisher (Alcedo atthis). There is also a route through the city, from the riverside to the castle, which hosts a characteristic variety of species, such as the barn swallow (Hirundo rustica), known for passing through in different seasons of the year. Birdwatching is sometimes a difficult task, since it is not always possible to see all the bird species that inhabit a given place. For this reason, we identified the need for a technological solution to ease this activity. This project aims to encourage people to learn about the various species of birds that live along the Lis river and to promote the preservation of nature in a conscious way.
This work is being conducted in collaboration with the Leiria Municipal Council and the Environmental Interpretation Centre. It intends to show the majesty of the Lis river, a place visited daily by many people, including children and families, who use it for didactic and recreational activities. We are developing a multi-platform mobile application (Guard@Lis) that allows bird species to be observed along a given route as representative digital 3D models, through the integration of augmented reality technologies. Guard@Lis displays a route with points of interest for birdwatching and, for each point of interest, a list of species along with scientific information, images, and sounds for every species. To ensure the observation of some birds, the user can watch them in loco, in their real and natural environment, through their mobile device by means of augmented reality, giving the sensation of the birds' presence even when they cannot be seen at that place and moment. The augmented reality feature is being developed with the Vuforia SDK, using a hybrid approach to recognition and tracking that combines marker-based and geolocation techniques. The application proposes routes and notifies users with alerts about the possibility of viewing augmented reality bird models. The final Guard@Lis prototype will be tested by volunteers in situ.
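The geolocation side of the hybrid tracking, alerting the user when they are close enough to a point of interest to attempt an AR sighting, reduces to a great-circle distance test. A minimal sketch; the coordinates below are illustrative points near Leiria and the 50 m trigger radius is an assumption, not the app's actual configuration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points,
    using the haversine formula with a mean Earth radius."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def near_poi(user, poi, radius_m=50.0):
    """True when the user is within radius_m of a birdwatching
    point of interest, i.e. when an AR alert should fire."""
    return haversine_m(user[0], user[1], poi[0], poi[1]) <= radius_m

riverside_poi = (39.7436, -8.8070)  # illustrative point on the Lis
at_poi = near_poi((39.7436, -8.8070), riverside_poi)
far_away = near_poi((39.7500, -8.8000), riverside_poi)
```

In the app this check would run against the device's GPS fix; marker-based tracking then takes over for registering the 3D bird model once the user is on site.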

Keywords: augmented reality, birdwatching route, mobile application, nature tourism, watch birds using augmented reality

Procedia PDF Downloads 177
43 Integration of Building Information Modeling Framework for 4D Constructability Review and Clash Detection Management of a Sewage Treatment Plant

Authors: Malla Vijayeta, Y. Vijaya Kumar, N. Ramakrishna Raju, K. Satyanarayana

Abstract:

The global AEC (architecture, engineering, and construction) industry has been described as one of the domains most resistant to embracing technology. Although this digital era has been inundated with software tools like CAD, STAAD, CANDY, Microsoft Project, Primavera, etc., the key stakeholders have been working in silos and processes remain fragmented. Unlike yesteryears' simpler project delivery methods, current projects are fast-track, complex, risky, multidisciplinary, influenced by many stakeholders, and statutorily regulated, posing extensive bottlenecks that prevent the timely completion of projects. At this juncture, a paradigm shift has surfaced in the construction industry, and Building Information Modeling (BIM) has been a panacea to bolster multidisciplinary teams' cooperative and collaborative work, leading to productive, sustainable and leaner project outcomes. Building Information Modeling is an integrative, stakeholder-engaging and centralized approach that provides a common platform of communication. A common misconception in the Indian construction industry is that BIM applies only to building/high-rise projects; this paper instead discusses the implementation of BIM processes and methodologies in the water and wastewater industry. It elucidates BIM 4D planning and constructability reviews of a sewage treatment plant in India. Conventional construction planning and logistics management involve a blend of experience coupled with imagination. Even though the judgments and lessons learnt from veterans might be predictive and helpful, the uncertainty factor persists. This paper presents a case study of the real-time implementation of BIM 4D planning protocols for one of the sewage treatment plants of the Dravyavati River Rejuvenation Project in India and develops a TimeLiner to identify logistics planning and clash detection. 
With these BIM processes, we find a significant reduction in the duplication of tasks and in rework. Another benefit is better visualization and workarounds during the conception stage, enabling the early involvement of stakeholders in the project life cycle of sewage treatment plant construction. Moreover, we also took an opinion poll on the benefits accrued from BIM processes versus traditional paper-based communication with 2D and 3D CAD tools. This paper thus concludes with a BIM framework for sewage treatment plant construction that achieves optimal construction coordination advantages such as 4D construction sequencing, interference checking, and clash detection and resolution through the early engagement of all key stakeholders, thereby identifying potential risks and enabling the subsequent creation of risk response strategies. However, certain hiccups, such as hesitancy in the adoption of BIM technology by naïve users and the limited availability of proficient BIM trainers in India, pose a phenomenal impediment. Hence, nurturing BIM processes from conception through construction, commissioning, operation and maintenance, and on to the deconstruction of a project's life cycle is essential for the Indian construction industry in this digital era.
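The clash detection checking mentioned above can be illustrated at its simplest: commercial BIM coordination tools run far more sophisticated geometry tests, but a hard clash between two model elements reduces, in the basic case, to an axis-aligned bounding-box overlap test over all element pairs. The element names and dimensions in this Python sketch are hypothetical, chosen only to show the idea.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box of a model element (coordinates in metres)."""
    name: str
    min_xyz: tuple
    max_xyz: tuple

def clashes(a: Box, b: Box, tolerance: float = 0.0) -> bool:
    """True if the two boxes overlap by more than `tolerance` on every axis."""
    return all(a.min_xyz[i] + tolerance < b.max_xyz[i]
               and b.min_xyz[i] + tolerance < a.max_xyz[i]
               for i in range(3))

def clash_report(elements):
    """Pairwise hard-clash check over every element pair, as a
    coordination review would run before construction sequencing."""
    return [(p.name, q.name)
            for i, p in enumerate(elements)
            for q in elements[i + 1:]
            if clashes(p, q)]

# Hypothetical elements: the pipe and beam intersect, the duct is clear.
pipe = Box("sewage pipe", (0, 0, 0), (10, 0.5, 0.5))
beam = Box("RC beam", (4, -1, -0.2), (5, 1, 0.3))
duct = Box("air duct", (0, 5, 5), (2, 6, 6))
```

Here `clash_report([pipe, beam, duct])` flags only the pipe/beam pair; each flagged pair would then feed into the clash resolution workflow with the responsible stakeholders.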

Keywords: integrated BIM workflow, 4D planning with BIM, building information modeling, clash detection and visualization, constructability reviews, project life cycle

Procedia PDF Downloads 122
42 Near-Peer Mentoring/Curriculum and Community Enterprise for Environmental Restoration Science

Authors: Lauren B. Birney

Abstract:

The BOP-CCERS (Billion Oyster Project - Curriculum and Community Enterprise for Restoration Science) Near-Peer Mentoring Program provides a long-term (five-year) support network to motivate and guide students toward restoration-science-based CTE pathways. Students are selected from middle schools with actively participating BOP-CCERS teachers. Teachers nominate students from grades 6-8 to join cohorts of between 10 and 15 students each. Cohorts are composed primarily of students from the same school in order to facilitate mentors' travel logistics as well as to sustain connections with students and their families. Each cohort is matched with an exceptional undergraduate or graduate student, either a BOP research associate or a STEM mentor recruited from collaborating City University of New York (CUNY) partner programs. In rare cases, an exceptional high school junior or senior may be matched with a cohort in addition to a research associate or graduate student. In no case is a high school student or minor placed individually with a cohort. Mentors meet with students at least once per month and provide at least one offsite field visit per month, either to a local STEM Hub or a research lab. In keeping with its five-year trajectory, the near-peer mentoring program seeks to retain students in the same cohort with the same mentor for the full duration of middle school and for at least two additional years of high school. Upon reaching the final quarter of 8th grade, the mentor develops a meeting plan for each individual mentee. The mentee and the mentor are required to meet individually or in small groups once per month. Once per quarter, individual meetings are replaced by full-cohort professional outings, in which the mentor organizes a field visit or educational workshop for the entire cohort with a museum or aquarium partner. 
In addition to the mentor-mentee relationship, each participating student is also asked to conduct and present his or her own BOP field research. This research is ideally carried out with the support of the student's regular high school STEM subject teacher; however, in cases where the teacher or school does not permit independent study, the student is asked to conduct the research on an extracurricular basis. Near-peer mentoring affects students' social identities and helps them connect to role models from similar groups, ultimately giving them a sense of belonging. Qualitative and quantitative analyses were performed throughout the study, and interviews and focus groups were also conducted. Additionally, an external evaluator was engaged to ensure project efficacy, efficiency, and effectiveness throughout the entire project. The BOP-CCERS Near-Peer Mentoring program is a peer support network in which high school students with interest or experience in BOP (Billion Oyster Project) topics and activities (such as classroom oyster tanks, STEM Hubs, or digital platform research) provide mentorship and support for middle school or high school freshman mentees. Peer mentoring not only empowers the students being taught but also increases the content knowledge and engagement of the mentors. This support provides the necessary resources, structure, and tools to assist students in finding success.

Keywords: STEM education, environmental science, citizen science, near peer mentoring

Procedia PDF Downloads 91
41 Probability Modeling and Genetic Algorithms in Small Wind Turbine Design Optimization: Mentored Interdisciplinary Undergraduate Research at LaGuardia Community College

Authors: Marina Nechayeva, Malgorzata Marciniak, Vladimir Przhebelskiy, A. Dragutan, S. Lamichhane, S. Oikawa

Abstract:

This presentation is a progress report on a faculty-student research collaboration at CUNY LaGuardia Community College (LaGCC) aimed at designing a small horizontal axis wind turbine optimized for the wind patterns on the roof of our campus. Our project combines statistical and engineering research. Our wind modeling protocol is based upon a recent wind study by a faculty-student research group at MIT, and some of our blade design methods are adapted from a senior engineering project at CUNY City College. Our use of genetic algorithms has been inspired by the work on small wind turbine design by David Wood. We combine these diverse approaches in our interdisciplinary project in a way that has not been done before and improve upon certain techniques used by our predecessors. We employ several estimation methods to determine the best-fitting parametric probability distribution model for the local wind speed data obtained by correlating short-term on-site measurements with a long-term time series at the nearby airport. The model serves as a foundation for engineering research that focuses on adapting and implementing genetic algorithms (GAs) for engineering optimization of the wind turbine design using Blade Element Momentum Theory. GAs are used to create new airfoils with desirable aerodynamic specifications. Small-scale models of the best-performing designs are 3D printed and tested in the wind tunnel to verify the accuracy of relevant calculations. Genetic algorithms are applied to selected airfoils to determine the blade design (radial chord and pitch distribution) that would optimize the coefficient of power profile of the turbine. Our approach improves upon traditional blade design methods in that it lets us dispense with the assumptions necessary to simplify the system of Blade Element Momentum Theory equations, thus resulting in more accurate aerodynamic performance calculations. 
Furthermore, it enables us to design blades optimized for a whole range of wind speeds rather than a single value. Lastly, we improve upon known GA-based methods in that our algorithms are constructed to work with XFoil-generated airfoil data, which enables us to optimize blades using our own high glide ratio airfoil designs, without having to rely upon available empirical data from existing airfoils, such as the NACA series. Beyond its immediate goal, this ongoing project serves as a training and selection platform for the CUNY Research Scholars Program (CRSP) through its annual Aerodynamics and Wind Energy Research Seminar (AWERS), an undergraduate summer research boot camp designed to introduce prospective researchers to the relevant theoretical background and methodology, get them up to speed with the current state of our research, and test their abilities and commitment to the program. Furthermore, several aspects of the research (e.g., writing code for 3D printing of airfoils) are adapted in the form of classroom research activities to enhance Calculus sequence instruction at LaGCC.
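The genetic algorithm loop at the heart of the blade optimization can be sketched compactly. The fitness function below is a toy surrogate with a made-up optimum (chord taper 0.3, pitch 5°) standing in for the Blade Element Momentum evaluation the project actually uses; the truncation selection, blend crossover, and Gaussian mutation operators shown are one common GA configuration, not necessarily the authors'.

```python
import random

random.seed(42)  # make the toy run repeatable

def fitness(genes):
    """Toy stand-in for a BEM power-coefficient evaluation:
    peaks at chord taper 0.3 and pitch 5 degrees (hypothetical optimum)."""
    taper, pitch = genes
    return -((taper - 0.3) ** 2 + 0.01 * (pitch - 5.0) ** 2)

MUT_SIGMA = (0.03, 0.5)  # per-gene mutation scale: taper, pitch (degrees)

def evolve(pop_size=30, generations=60):
    # Random initial population: [chord taper, pitch angle in degrees].
    pop = [[random.uniform(0.1, 0.9), random.uniform(0.0, 15.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # blend crossover
            child = [g + random.gauss(0, s)               # Gaussian mutation
                     for g, s in zip(child, MUT_SIGMA)]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In the actual project, `fitness` would invoke the BEM solver on an XFoil-generated airfoil, and the genome would hold the full radial chord and pitch distributions rather than two scalars.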

Keywords: engineering design optimization, genetic algorithms, horizontal axis wind turbine, wind modeling

Procedia PDF Downloads 231