Search results for: robust M-estimator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1475

635 Barriers to Public Innovation in Colombia: Case Study in Central Administrative Region

Authors: Yessenia Parrado, Ana Barbosa, Daniela Mahe, Sebastian Toro, Jhon Garcia

Abstract:

Public innovation has gained strength in recent years in response to the need to find new strategies or mechanisms for interaction between government entities and citizens. Accordingly, the Colombian government has been promoting policies aimed at strengthening innovation as a fundamental aspect of the work of public entities. However, in order to strengthen the capacities of public servants, and therefore of the institutions and organizations to which they belong, it is necessary to understand the context in which they operate in their daily work. This article compiles the work developed by LAB101, the laboratory of innovation, creativity, and new technologies of the National University of Colombia, for the National Department of Planning. A case study was developed in the central region of Colombia, made up of five departments, combining survey instruments based on quantitative techniques with qualitative analysis through semi-structured interviews, in order to understand the perceived barriers to innovation and the obstacles that have prevented faster transformation within public organizations. From the information collected, different analyses were carried out that allow a more robust explanation of the results obtained, and a set of categories was established to group the difficulties that officials perceive when innovating and that are later conceived as barriers. Finally, an indicator was proposed to measure the degree of innovation within public entities, so that a metric can be tracked in the future. The main findings of this study show three key components to be strengthened in public entities and organizations: governance, knowledge management, and the promotion of collaborative workspaces.

Keywords: barriers, enablers, management, public innovation

Procedia PDF Downloads 117
634 Salvage Reconstruction of Intraoral Dehiscence following Free Fibular Flap with a Superficial Temporal Artery Islandized Flap (STAIF)

Authors: Allyne Topaz

Abstract:

Intraoral dehiscence compromises free fibula flaps following mandibular reconstruction. Salivary contamination risks thrombosis of the microvascular anastomosis and hardware infection. The superficial temporal artery islandized flap (STAIF) offers an efficient, non-microsurgical reconstructive option for regaining intraoral competency after this time-sensitive complication. Methods: The STAIF flap is based on the superficial temporal artery coursing along the anterior hairline. The flap is mapped with the assistance of a Doppler probe. The width of the skin paddle is chosen based on the ability to close the donor site. The flap is taken down to the level of the zygomatic arch and tunneled into the mouth. Results: We present a case of a patient who underwent mandibular reconstruction with a free fibula flap after a traumatic shotgun wound. The patient developed repeated intraoral dehiscence following failed local buccal and floor-of-mouth flaps, leading to salivary contamination of the flap and hardware. The intraoral dehiscence was successfully salvaged on the third attempt with a STAIF flap. Conclusions: Intraoral dehiscence is a complication requiring urgent attention to prevent loss of the free fibula flap after mandibular reconstruction. The STAIF is a non-microsurgical option for restoring intraoral competency. This robust, axially vascularized skin paddle may be split for intra- and extra-oral coverage as needed and can be an important tool in the reconstructive armamentarium.

Keywords: free fibula flap, intraoral dehiscence, mandibular reconstruction, superficial temporal artery islandized flap

Procedia PDF Downloads 135
633 Advanced Metallic Frameworks for Development of Robust and Efficient Water Splitting Electrodes

Authors: Tam D. Nguyen, Joe Varga, Douglas MacFarlane, Alexandr Simonov

Abstract:

Development of advanced technologies for green hydrogen generation from renewables is of key strategic importance to global future energy security and economic growth. Renewable-powered water electrolysis (WE) is considered the most effective of the sustainable methods for hydrogen generation at scale. Currently, the greatest challenge of hydrogen production via water electrolysis is its insufficiently high efficiency: the energy loss associated with the conversion of water to hydrogen is approximately 40-60%, with 30-35% attributable to the electrolysis itself and 10-12% to gas compression and transportation. Hence, development of an energy-efficient water electrolyser that can generate hydrogen at high pressure will address both of these major challenges. This requires the development of advanced electrode configurations for the water electrolysis cell. Herein, we developed a highly ordered, interconnected structure of metallic inverse-opal (IO) frameworks based on low-cost materials, e.g. Cu, Ni, Fe, Co. Water electrolysis electrodes based on these frameworks can provide the excellent mechanical strength required for application under conditions of extreme pressure, as well as outstanding catalytic performance through their exceptionally high surface area and high electrical conductivity. For example, a NiFe layered double hydroxide (LDH) catalyst deposited on Cu IO is able to reach oxygen evolution reaction (OER) rates of >100 mA cm⁻² (>727 A g⁻¹ of catalyst) at an overpotential of ~0.3 V. This high performance is achieved with catalyst layers only a few microns thick, in contrast to the similar performance of 10³-fold thicker electrodes based on foams and other substrates.

Keywords: oxygen evolution reaction, support materials, mass transport, NiFe LDH

Procedia PDF Downloads 5
632 The Effect of Technology-Facilitated Lesson Study on Teachers' Computer-Assisted Language Learning Competencies

Authors: Yi-Ning Chang

Abstract:

With the rapid advancement of technology, it has become crucial for educators to integrate technology adeptly into their teaching and to develop robust Computer-Assisted Language Learning (CALL) competencies. Addressing this need, the present study adopted a technology-based Lesson Study approach to assess its impact on the CALL competency and professional capabilities of EFL teachers. Additionally, the study delved into teachers' perceptions of the benefits derived from participating in the creation of technologically integrated lesson plans. The iterative process of technology-based Lesson Study facilitated ample peer discussion, enabling teachers to flexibly design and implement lesson plans that incorporate various technological tools. This 15-week study included 10 in-service teachers from a university of science and technology in central Taiwan. The collected data included pre- and post-lesson planning scores, pre- and post-TPACK survey scores, classroom observation forms, designed lesson plans, and reflective essays. The pre- and post-lesson planning and TPACK survey scores were analyzed with a paired-sample t-test; the reflective essays were analyzed using content analysis. The findings revealed that the teachers' lesson planning ability and CALL competencies improved. Teachers perceived a better understanding of integrating technology with teaching subjects, more effective teaching skills, and a deeper understanding of technology. Pedagogical implications and future studies are also discussed.
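
As a rough illustration of the paired-sample t-test reported above, the sketch below compares hypothetical pre- and post-intervention TPACK scores for ten teachers; the numbers are invented for demonstration and are not the study's data.

```python
# Minimal sketch of the paired-sample t-test described above, using
# hypothetical pre/post TPACK survey scores (the real data are not public).
from scipy import stats

pre_scores  = [3.1, 2.8, 3.4, 3.0, 2.6, 3.2, 2.9, 3.3, 3.0, 2.7]   # before the Lesson Study cycles
post_scores = [3.8, 3.5, 3.9, 3.6, 3.1, 3.7, 3.4, 4.0, 3.5, 3.3]   # after the Lesson Study cycles

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"paired t = {t_stat:.3f}, p = {p_value:.4f}")
```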

Keywords: CALL, language learning, lesson study, lesson plan

Procedia PDF Downloads 42
631 Critical Success Factors in Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing software quality requirement change is a difficult task in the field of software engineering. Avoiding incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is often considered the primary cause of software failure, and it becomes even more challenging in global software outsourcing. Addressing success factors in quality requirement change management is needed today due to frequent change requests from end-users. In this research study, success factors are recognized and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments. These factors were then analyzed by continent, database, company size, and time period. Based on these findings, requirement changes can be managed more effectively.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 197
630 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction. As these applications are of great importance and demand more accuracy, greater robustness is expected of the face recognition system with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelet and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead for the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class scatter and maximize the inter-class scatter. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)²LDA), and weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt(2D)²LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using a smaller number of features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
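
A minimal sketch of the Gabor-plus-LDA idea follows, using standard LDA as a stand-in for the 2D-LDA variants above, simple per-filter statistics instead of full Gabor feature maps, and the Olivetti (AT&T) faces bundled with scikit-learn; filter frequencies and orientations are illustrative choices.

```python
# Illustrative sketch of the HGWLDA pipeline: Gabor filter-bank features,
# standard LDA (stand-in for the paper's 2D-LDA variants), and a 1-NN classifier.
import numpy as np
from scipy.signal import fftconvolve
from skimage.filters import gabor_kernel
from skimage.transform import resize
from sklearn.datasets import fetch_olivetti_faces
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()
# Downsample to reduce the Gabor filtering overhead, as the abstract suggests.
images = np.array([resize(img, (32, 32), anti_aliasing=True) for img in faces.images])

# Bank of Gabor kernels with a few scales (frequencies) and orientations.
kernels = [gabor_kernel(frequency=f, theta=t)
           for f in (0.15, 0.25, 0.35)
           for t in np.linspace(0, np.pi, 4, endpoint=False)]

def gabor_features(img):
    feats = []
    for k in kernels:
        resp = fftconvolve(img, np.real(k), mode="same")
        feats.extend([resp.mean(), resp.var()])   # simple per-filter statistics
    return feats

X = np.array([gabor_features(img) for img in images])
y = faces.target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

lda = LinearDiscriminantAnalysis()        # maximises between-class scatter
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(lda.fit_transform(X_tr, y_tr), y_tr)
print("recognition rate:", knn.score(lda.transform(X_te), y_te))
```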

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 467
629 Efficient Implementation of Finite Volume Multi-Resolution WENO Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. This multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated in a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds very little overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Also, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights, and such a multi-resolution WENO scheme avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented into an adaptive Cartesian grid with slight modification for high Reynolds number problems.
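
For context on how WENO nonlinear weights are built, the sketch below implements the classical fifth-order WENO-JS reconstruction on a uniform 1-D grid; it is not the multi-resolution variant on adaptive Cartesian grids described above, and the linear weights shown are the classical optimal values that the multi-resolution scheme relaxes.

```python
import numpy as np

def weno5_reconstruct(f):
    """Classical 5th-order WENO-JS reconstruction of the left-biased interface
    value f_{i+1/2} on a uniform 1-D grid (illustrative, not the multi-resolution
    scheme of the paper). f is a 1-D array of cell averages."""
    eps = 1e-6
    fm2, fm1, f0, fp1, fp2 = f[:-4], f[1:-3], f[2:-2], f[3:-1], f[4:]

    # Candidate 3rd-order reconstructions on the three sub-stencils.
    p0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    p1 = ( -fm1 + 5*f0  +  2*fp1) / 6.0
    p2 = (2*f0  + 5*fp1 -    fp2) / 6.0

    # Jiang-Shu smoothness indicators.
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2

    # Optimal linear weights of the classical scheme; the multi-resolution WENO
    # in the abstract relaxes these to any positive numbers summing to one.
    d = (0.1, 0.6, 0.3)
    a0, a1, a2 = d[0]/(eps + b0)**2, d[1]/(eps + b1)**2, d[2]/(eps + b2)**2
    return (a0*p0 + a1*p1 + a2*p2) / (a0 + a1 + a2)

# Smooth test: reconstruct sin(x) at the cell interfaces.
x = np.linspace(0, 2*np.pi, 101)
fbar = np.sin(0.5*(x[:-1] + x[1:]))      # crude stand-in for cell averages
print(weno5_reconstruct(fbar)[:3])
```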

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 150
628 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining

Authors: Mary Rose Everan, Michael McCann, Gary Cullen

Abstract:

International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining - that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers - demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with Smart Contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and even perhaps the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space – termed Blocklining in this paper - is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.

Keywords: aviation, baggage, blocklining, intermodal, interlining

Procedia PDF Downloads 147
627 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 364
626 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can be easily copied and distributed illegally, so copyright protection for authors and owners is necessary. Therefore, digital watermarking techniques play an important role as a valid solution to authority problems. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and maintain data quality. We therefore discuss in this paper two approaches for image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The Discrete Wavelet Transform (DWT) is used with the two approaches separately in the embedding process to transform the cover image. Both PSO and GA use the correlation coefficient to detect high-energy coefficients in the original image and then hide the watermark bits there. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach obtained better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach obtained a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. According to the results, we can note that a small block size can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarking image. Better PSO results were obtained when using a swarm size of 100.
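
To make the DWT-domain embedding and the PSNR/MSE quality measures concrete, here is a minimal sketch using PyWavelets; the PSO/GA search over embedding locations and strength described above is omitted, and the cover image, watermark, and embedding strength alpha are all illustrative placeholders.

```python
# Minimal sketch of DWT-domain watermark embedding and PSNR/MSE computation.
# The PSO/GA optimization of embedding positions/strength is not shown here.
import numpy as np
import pywt

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(256, 256)).astype(float)   # stand-in cover image
watermark_bits = rng.integers(0, 2, size=(128, 128)).astype(float)

LL, (LH, HL, HH) = pywt.dwt2(cover, "haar")        # one-level 2-D DWT
alpha = 8.0                                        # illustrative embedding strength
HL_marked = HL + alpha * (2 * watermark_bits - 1)  # embed +/- alpha in the HL sub-band
marked = pywt.idwt2((LL, (LH, HL_marked, HH)), "haar")

mse = np.mean((cover - marked) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)
print(f"MSE = {mse:.4f}, PSNR = {psnr:.2f} dB")

# Extraction by comparing the marked and original HL coefficients.
_, (_, HL2, _) = pywt.dwt2(marked, "haar")
recovered = (HL2 - HL > 0).astype(float)
print("bit accuracy:", np.mean(recovered == watermark_bits))
```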

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 228
625 Development of a Consult Liaison Psychology Service: A Systematic Review

Authors: Ben J. Lippe

Abstract:

Consult Liaison Psychology services are growing rapidly, given the robust empirical support for the utility of this service in hospital settings. These psychological services, including clinical assessment, applied psychotherapy, and consultation with other healthcare providers, have been shown to improve health outcomes for patients and to bolster areas of administrative interest such as decreased length of patient admission. However, there is little descriptive literature outlining the process and mechanisms of building or developing a Consult Liaison Psychology service. The main findings of this conceptual work are intended to elucidate the essential methods involved in developing consult liaison psychology programs, including a thorough review of the relevant behavioral health literature and the inclusion of experiential outcomes. The diverse range of hospital settings and healthcare systems makes a “blueprint” method of program development challenging to define, yet the structural frameworks presented here, based on the relevant literature and applied practice, can help lay critical groundwork for program development in this growing area of psychological service. This conceptual approach addresses the prominent processes, as well as common programmatic and clinical pitfalls, involved in the development of a Consult Liaison Psychology service. This paper, including a systematic review of relevant literature, is intended to serve as a key reference for the development of Consult Liaison Psychology services and other related behavioral health programs, and to help inform further research efforts.

Keywords: behavioral health, consult liaison, health psychology, psychology program development

Procedia PDF Downloads 159
624 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV-VIS spectrum, spanning a range of 200-800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, linearity, the limits of detection and quantification, and measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.
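
As a rough illustration of the linearity and detection/quantification-limit calculations commonly used in this type of validation, the sketch below fits a calibration line and applies the ICH-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S; the concentrations and peak areas are invented, not the study's data.

```python
# Illustrative calibration-curve treatment for caffeic acid: linear fit, R²,
# and ICH-style LOD/LOQ estimates. Concentrations and peak areas are made up.
import numpy as np

conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])        # µg/mL standards (hypothetical)
area = np.array([10.2, 25.8, 51.5, 101.9, 255.4, 509.0])  # HPLC-DAD peak areas (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

sigma = np.sqrt(ss_res / (len(conc) - 2))   # residual standard deviation
lod = 3.3 * sigma / slope                   # limit of detection
loq = 10.0 * sigma / slope                  # limit of quantification
print(f"slope={slope:.3f}, R²={r2:.5f}, LOD={lod:.3f} µg/mL, LOQ={loq:.3f} µg/mL")
```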

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 72
623 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining or metal cutting is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology was performed on the orthogonal metal cutting process to analyze three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulation values for flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.
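
For readers unfamiliar with one of the constitutive models named above, the sketch below evaluates the Johnson-Cook flow-stress relation σ = (A + Bεⁿ)(1 + C ln(ε̇/ε̇₀))(1 − T*ᵐ); the material constants used are illustrative placeholders, not the calibrated AISI 1045 values used in the study.

```python
# Johnson-Cook flow stress: sigma = (A + B*eps^n)*(1 + C*ln(epsdot/epsdot0))*(1 - T*^m).
# The constants below are illustrative placeholders, not the study's AISI 1045 values.
import numpy as np

A, B, n = 550e6, 600e6, 0.23              # Pa, Pa, - (placeholder values)
C, epsdot0 = 0.013, 1.0                   # -, 1/s
m, T_room, T_melt = 1.0, 293.0, 1733.0    # -, K, K

def johnson_cook(eps, epsdot, T):
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    return (A + B * eps**n) * (1 + C * np.log(epsdot / epsdot0)) * (1 - T_star**m)

strain = np.linspace(0.05, 1.0, 5)
for temp in (293.0, 600.0, 900.0):
    print(temp, "K:", np.round(johnson_cook(strain, 1e3, temp) / 1e6, 1), "MPa")
```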

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 262
622 Integrating Optuna and Synthetic Data Generation for Optimized Medical Transcript Classification Using BioBERT

Authors: Sachi Nandan Mohanty, Shreya Sinha, Sweeti Sah, Shweta Sharma

Abstract:

The advancement of natural language processing has greatly influenced the field of medical transcript classification, providing a robust framework for enhancing the accuracy of clinical data processing, with enormous potential to transform healthcare and improve people's livelihoods. This research focuses on improving the accuracy of medical transcript categorization using Bidirectional Encoder Representations from Transformers (BERT) and its specialized variants, including BioBERT, ClinicalBERT, SciBERT, and BlueBERT. The experimental work employs Optuna, an optimization framework, for hyperparameter tuning to identify the most effective variant, concluding that BioBERT yields the best performance. Furthermore, various optimizers, including Adam, RMSprop, and layer-wise adaptive large batch optimization (LAMB), were evaluated alongside BERT's default AdamW optimizer. The findings show that the LAMB optimizer achieves performance on par with AdamW. Synthetic data generation techniques from Gretel were utilized to augment the dataset, expanding it from 5,000 to 10,000 rows. Subsequent evaluations demonstrated that the model maintained its performance with synthetic data, with the LAMB optimizer showing marginally better results. The enhanced dataset and optimized model configurations improved classification accuracy, showcasing the efficacy of the BioBERT variant and the LAMB optimizer, and resulted in accuracies of up to 98.2% and 90.8% for the original and combined datasets, respectively.
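
The sketch below shows, in simplified form, how Optuna can drive a hyperparameter search around a BioBERT classifier; the checkpoint name, the tiny in-memory "transcripts", the label scheme, and the search ranges are all assumptions for illustration, and a real run would use the medical-transcript corpus, the LAMB/AdamW comparison, and proper train/validation splits described above.

```python
# Sketch of Optuna-driven hyperparameter search around a BioBERT classifier.
# Checkpoint name, toy data, and search ranges are illustrative assumptions.
import optuna
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "dmis-lab/biobert-base-cased-v1.1"   # assumed BioBERT checkpoint on the HF hub
train_texts = ["patient presents with chest pain and dyspnea",
               "mri of the lumbar spine without contrast",
               "colonoscopy performed for routine screening",
               "echocardiogram shows preserved ejection fraction"]
train_labels = [0, 1, 2, 0]                        # hypothetical specialty labels
val_texts, val_labels = train_texts, train_labels  # placeholder validation split

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

def objective(trial):
    lr = trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True)
    epochs = trial.suggest_int("epochs", 1, 3)

    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=3)
    enc = tokenizer(train_texts, padding=True, truncation=True, return_tensors="pt")
    y = torch.tensor(train_labels)
    opt = torch.optim.AdamW(model.parameters(), lr=lr)

    model.train()
    for _ in range(epochs):                 # full-batch fine-tuning on the toy data
        opt.zero_grad()
        out = model(**enc, labels=y)
        out.loss.backward()
        opt.step()

    model.eval()
    with torch.no_grad():
        val_enc = tokenizer(val_texts, padding=True, truncation=True, return_tensors="pt")
        preds = model(**val_enc).logits.argmax(dim=-1)
    return (preds == torch.tensor(val_labels)).float().mean().item()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=5)
print("best trial:", study.best_params, study.best_value)
```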

Keywords: BioBERT, clinical data, healthcare AI, transformer models

Procedia PDF Downloads 4
621 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to sufficient data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule. Redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms, and we found the Random Forest algorithm to be the better choice for this analysis. We captured nonlinear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties; we expect the DNN algorithm to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods and our implementation protocols, and discuss these techniques for their usefulness in biomedical and health informatics.
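
A minimal sketch of the SMILES-to-descriptors-to-Random-Forest idea is shown below on a handful of molecules; the SMILES strings, activity labels, and descriptor selection are placeholders, whereas the actual pipeline works on the ~150,000-compound library with iterative descriptor elimination.

```python
# Sketch of the SMILES -> descriptors -> Random Forest idea on toy data.
# Molecules and activity labels are placeholders, not the study's library.
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestClassifier

smiles = ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1", "CCN(CC)CC",
          "CC(C)Cc1ccc(cc1)C(C)C(=O)O", "O=C(O)c1ccccc1O"]
active = [0, 1, 0, 0, 1, 1]                       # hypothetical activity labels

def featurize(smi):
    mol = Chem.MolFromSmiles(smi)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumHDonors(mol),
            Descriptors.NumHAcceptors(mol), Descriptors.NumRotatableBonds(mol)]

X = [featurize(s) for s in smiles]
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, active)
print(clf.predict_proba([featurize("CC(=O)Nc1ccc(O)cc1")]))   # e.g. paracetamol
```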

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 360
620 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality; however, this monitoring entails high costs in physical and human resources. In this context, this research is carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images of Mexico City's Metropolitan Area, mainly from Landsat 8, were used. Using historical PM10 and PM2.5 measurements from RAMA (the Automatic Environmental Monitoring Network of Mexico City) and processing the available satellite images, a preliminary model was generated, in which it was possible to identify critical opportunity areas that will allow the generation of a robust model. Through the preliminary model applied to the Mexico City scenes, three zones of particular interest were identified due to the presumed high concentration of PM: those with high plant density, bodies of water, and soil without constructions or vegetation. To date, work continues on improving the preliminary model that has been proposed. In addition, a brief analysis was made of six models presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a model suitable for Mexico City. It was found that infrared bands have helped to model PM in other cities, but the effectiveness that these bands could provide for the geographic and climatic conditions of Mexico City is still being evaluated.
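
As a rough sketch of the underlying modeling step, the example below regresses ground-station PM10 values on Landsat band reflectances at co-located pixels; the reflectances and PM10 values are synthetic placeholders, not RAMA or Landsat 8 data.

```python
# Minimal sketch of regressing ground PM10 measurements on Landsat 8 band
# reflectances at co-located pixels. All numbers below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
bands = rng.uniform(0.02, 0.4, size=(60, 6))            # B2-B7 reflectances at 60 station-dates
pm10 = 20 + 180 * bands[:, 3] + rng.normal(0, 5, 60)    # synthetic PM10 (µg/m³)

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("CV R²:", cross_val_score(model, bands, pm10, cv=5, scoring="r2").mean())
```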

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 156
619 Entrepreneurship Education and Student Entrepreneurial Intention: A Comprehensive Review, Synthesis of Empirical Findings, and Strategic Insights for Future Research Advancements

Authors: Abdul Waris Jalili, Yanqing Wang, Som Suor

Abstract:

This research paper explores the relationship between entrepreneurship education and students' entrepreneurial intentions, aiming to determine whether entrepreneurship education reliably predicts students' intention to become entrepreneurs, and how and when this relationship occurs. The goal is to understand the factors that influence this relationship and to identify any mediating or moderating factors. A thorough and systematic search and review of empirical articles published between 2013 and 2023 was conducted. Three databases, Google Scholar, Science Direct, and PubMed, were explored to gather relevant studies. Criteria such as reporting empirical results, publication in English, and addressing the research questions were used to select 35 papers for analysis. The collective findings of the reviewed studies suggest a generally positive relationship between entrepreneurship education and student entrepreneurial intentions. However, recent findings indicate that this relationship may be more complex than previously thought. Mediators and moderators have been identified, highlighting instances where entrepreneurship education indirectly influences student entrepreneurial intentions. The review also emphasizes the need for more robust research designs to establish causality in this field. This research adds to the existing literature by providing a comprehensive review of the relationship between entrepreneurship education and student entrepreneurial intentions. It highlights the complexity of this relationship and the importance of considering mediators and moderators. The study also calls for future research to explore different facets of entrepreneurship education independently and to examine complex relationships more comprehensively.

Keywords: entrepreneurship, entrepreneurship education, entrepreneurial intention, entrepreneurial self-efficacy

Procedia PDF Downloads 68
618 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curve of the synthetic soils is constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils and the final profile compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion proves to be a robust procedure that is able to provide good solutions for complex soil profiles, even with scarce prior information.
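
The sketch below shows the inversion idea in miniature: a bare-bones PSO loop searching for layer shear-wave velocities that minimize the misfit to an "observed" dispersion curve. The forward model here is a crude placeholder, not the vertical flexibility coefficient method of the paper, and all velocities and PSO coefficients are illustrative.

```python
# Bare-bones PSO inverting shear-wave velocities of a 3-layer profile from a
# dispersion curve. The forward model below is a crude placeholder, not the
# vertical flexibility coefficient method used in the paper.
import numpy as np

freqs = np.linspace(5, 50, 20)                       # Hz

def forward(vs):
    # Placeholder dispersion model: phase velocity shifts from the deep to the
    # shallow layer velocity as frequency increases (shallower sampling).
    w = np.exp(-np.outer(freqs, 1.0 / np.array([30.0, 10.0, 3.0])))
    w /= w.sum(axis=1, keepdims=True)
    return w @ vs

vs_true = np.array([180.0, 320.0, 550.0])            # m/s, "unknown" profile
observed = forward(vs_true)

def misfit(vs):
    return np.sqrt(np.mean((forward(vs) - observed) ** 2))

rng = np.random.default_rng(0)
n, dim, lo_b, hi_b = 30, 3, 100.0, 800.0
x = rng.uniform(lo_b, hi_b, (n, dim))
v = np.zeros((n, dim))
pbest, pbest_val = x.copy(), np.array([misfit(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.72 * v + 1.49 * r1 * (pbest - x) + 1.49 * r2 * (gbest - x)
    x = np.clip(x + v, lo_b, hi_b)
    vals = np.array([misfit(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("recovered Vs profile (m/s):", np.round(gbest, 1))
```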

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 185
617 Development of a Web-Based Application for Intelligent Fertilizer Management in Rice Cultivation

Authors: Hao-Wei Fu, Chung-Feng Kao

Abstract:

In the era of rapid technological advancement, information technology (IT) has become integral to modern life, exerting significant influence across diverse sectors and serving as a catalyst for development in various industries. Within agriculture, the integration of IT offers substantial benefits, notably enhancing operational efficiency. Real-time monitoring systems, for instance, have been widely embraced in agriculture, effectively improving crop management practices. This study specifically addresses the management of rice panicle fertilizer, presenting the development of a web application tailored to handle data associated with rice panicle fertilizer management. Leveraging the normalized difference red edge index, this application optimizes the quantity of rice panicle fertilizer used, providing recommendations to agricultural stakeholders and service providers in the agricultural information sector. The overarching objective is to minimize costs while maximizing yields. Furthermore, a robust database system has been established to store and manage relevant data for future reference in rice cultivation management. Additionally, the study utilizes the Representational State Transfer software architectural style to construct an application programming interface (API), facilitating data creation, retrieval, updating, and deletion for users via the HyperText Transfer Protocol methods. Future plans involve integrating this API with third-party services to incorporate it into larger frameworks, thus catering to the diverse requirements of various third-party services.
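
To make the NDRE-based recommendation and the REST-style CRUD interface concrete, here is a minimal FastAPI sketch; the route names, the NDRE threshold, and the recommendation rule are illustrative assumptions, not the application's actual API or agronomic logic.

```python
# Minimal REST-style sketch (FastAPI) of an NDRE-based recommendation endpoint.
# Routes, threshold, and recommendation rule are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
records: dict[int, dict] = {}          # in-memory stand-in for the database

class FieldReading(BaseModel):
    nir: float        # near-infrared reflectance
    red_edge: float   # red-edge reflectance

def ndre(nir: float, red_edge: float) -> float:
    return (nir - red_edge) / (nir + red_edge)

@app.post("/fields/{field_id}/readings")          # create
def add_reading(field_id: int, reading: FieldReading):
    value = ndre(reading.nir, reading.red_edge)
    # Illustrative rule: recommend more panicle fertilizer when NDRE suggests low N status.
    advice = "increase panicle fertilizer" if value < 0.3 else "maintain current rate"
    records[field_id] = {"ndre": value, "advice": advice}
    return records[field_id]

@app.get("/fields/{field_id}/readings")           # retrieve
def get_reading(field_id: int):
    return records.get(field_id, {"error": "no reading stored"})

@app.delete("/fields/{field_id}/readings")        # delete
def delete_reading(field_id: int):
    return records.pop(field_id, {"error": "no reading stored"})
```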

Keywords: application programming interface, HyperText Transfer Protocol, nitrogen fertilizer intelligent management, web-based application

Procedia PDF Downloads 62
616 Grammarly: Great Writings Get Work Done Using AI

Authors: Neha Intikhab Khan, Alanoud AlBalwi, Farah Alqazlan, Tala Almadoudi

Abstract:

Background: Grammarly, a widely utilized writing assistant launched in 2009, leverages advanced artificial intelligence and natural language processing to enhance writing quality across various platforms. Methods: To collect data on user perceptions of Grammarly, a structured survey was designed and distributed via Google Forms. The survey included a series of quantitative and qualitative questions aimed at assessing various aspects of Grammarly's performance. The survey comprised multiple-choice questions, Likert scale items (ranging from "strongly disagree" to "strongly agree"), and open-ended questions to capture detailed user feedback. The target population included students, friends, and family members. The collected responses were analyzed using statistical methods to quantify user satisfaction. Participation in the survey was voluntary, and respondents were assured anonymity and confidentiality. Results: The survey of 28 respondents revealed a generally favorable perception of Grammarly's AI capabilities. A significant 39.3% strongly agreed that it effectively improves text tone, with an additional 46.4% agreeing, while 10.7% remained neutral. For clarity suggestions, 28.6% strongly agreed, and 57.1% agreed, totaling 85.7% recognition of its value. Regarding grammatical accuracy across various genres, 46.4% rated it a perfect score of 5, contributing to 78.5% who found it highly effective. Conclusion: The evolution of Grammarly from a basic grammar checker to a robust AI-driven application underscores its adaptability and commitment to helping users develop their writing skills.

Keywords: Grammarly, writing tool, user engagement, AI capabilities, effectiveness

Procedia PDF Downloads 5
615 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), mega-nucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 in sorghum improvement is particularly vital in the context of ecological, environmental and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and create new genetic variation. Existing research is targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques to fruitfully edit endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The successful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, the creation of new and improved traits, the regulation of gene expression, and sorghum functional genomics, but also in generating site-specific integration events.

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 60
614 Role of Kerala’s Diaspora Philanthropy Engagement During Economic Crises

Authors: Shibinu S, Mohamed Haseeb N

Abstract:

In times of crisis, the diaspora's role and the help it offers are vital in determining how many countries, particularly low- and middle-income nations that rely significantly on remittances, recover. Approximately 2.12 million Keralites have emigrated abroad, with 81.2 percent of these outflows going to the Gulf Cooperation Council (GCC) countries. Most of them are semi-skilled or low-skilled laborers employed in GCC nations. Additionally, a sizeable portion of migrants are employed in industrialized nations such as the UK and the US, where a highly robust Indian diaspora has developed. India's development is largely dependent on the generosity of its diaspora, and the nation has benefited greatly from the substantial contributions made by several emigrant generations. Its strength was noticeable during COVID-19 and the Kerala floods. The 2018 Kerala floods displaced millions of people, damaged millions of properties, and caused many deaths. The Malayalee diaspora played a crucial role in the reconstruction of Kerala by supporting the rescue efforts underway on the ground through its extensive worldwide network. A similar outreach was also noted during COVID-19, in which the diaspora assisted stranded migrants across the globe. Together with the work the diaspora has done for the state's development and recovery, there has also been a recent outpouring of assistance during the COVID-19 pandemic. The study focuses on the subtleties of diaspora philanthropy and how Kerala was able to recover from the COVID-19 pandemic and floods thanks to it. Semi-structured in-depth interviews were conducted with migrants, migrant organizations, and beneficiaries from the diaspora, recruited through snowball sampling, to better understand the role that diaspora philanthropy plays in times of crisis.

Keywords: crises, diaspora, remittances, COVID-19, flood, economic development of Kerala

Procedia PDF Downloads 32
613 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
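
A minimal sketch of the Diffusion Maps half of the proposed hybrid is shown below: Gaussian affinities, a row-stochastic Markov matrix, eigendecomposition, and coordinates scaled by eigenvalue powers; the diffusion-probabilistic-model half and the mapping back to the original space are omitted, and the toy data and bandwidth heuristic are illustrative choices.

```python
# Minimal Diffusion Maps embedding: Gaussian affinities -> Markov matrix ->
# eigendecomposition -> low-dimensional coordinates scaled by eigenvalue powers.
import numpy as np
from scipy.spatial.distance import cdist

def diffusion_map(X, n_components=2, epsilon=None, t=1):
    d2 = cdist(X, X, "sqeuclidean")
    if epsilon is None:
        epsilon = np.median(d2)                 # common heuristic bandwidth
    K = np.exp(-d2 / epsilon)                   # Gaussian kernel affinities
    P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    evals, evecs = evals.real[order], evecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return evecs[:, 1:n_components + 1] * evals[1:n_components + 1] ** t

# Toy manifold: a noisy circle embedded in 10-D.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta), 0.05 * rng.normal(size=(300, 8))]
Y = diffusion_map(X, n_components=2)
print(Y.shape)          # (300, 2) diffusion coordinates
```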

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 111
612 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts

Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi

Abstract:

The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand. This leads to a large water footprint and possible cross-contamination of pathogens. These can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (> 80%) at low temperatures (6-8 °C) from wash water. The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it advantageously situated as an effective bio-pretreatment for RO. A preliminary technoeconomic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicated that the proposed technology process will substantially reduce the cost barrier for adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse processes across the agricultural industry.

Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts

Procedia PDF Downloads 129
611 Cell-Based and Exosome Treatments for Hair Restoration

Authors: Armin Khaghani Boroujeni, Leila Dehghani, Parham Talebi Boroujeni, Sahar Rostamian, Ali Asilian

Abstract:

Background: Hair loss is a common complaint observed in both genders, and androgenetic alopecia is a well-known pattern of hair loss. To assess new regenerative strategies (PRP, A-SC-BT, conditioned media, and exosome-based treatments) compared with conventional therapies for hair loss or hair regeneration, we carried out this systematic review to comprehensively evaluate the efficacy of cell-based therapies for hair loss. Methods: The available online databases, including ISI Web of Science, Scopus, and PubMed, were searched systematically up to February 2022. The quality assessment of included studies was done using the Cochrane Collaboration's tool. Results: A total of 90 studies involving 2345 participants were included. The enrolled studies were conducted between 2010 and 2022, and the subjects' mean age ranged from 19 to 55.11 years. Approaches using platelet-rich plasma (PRP) provide a beneficial impact on hair regrowth. However, other cell-based therapies, including stem cell transplants, stem cell-derived conditioned medium, and stem cell-derived exosomes, revealed conflicting evidence. Conclusion: Cell-based therapies for hair loss are still in their infancy, and more robust clinical studies are needed to better evaluate their mechanisms of action, efficacy, safety, benefits, and limitations. In this review, we provide resources on the latest clinical studies and a more detailed description of those concerning cell-based therapies in hair loss.

Keywords: cell-based therapy, exosome, hair restoration, systematic review

Procedia PDF Downloads 76
610 Hexane Extract of Thymus serpyllum L.: GC-MS Profile, Antioxidant Potential and Anticancer Impact on HepG2 (Liver Carcinoma) Cell Line

Authors: Salma Baig, Bakrudeen Ali Ahmad, Ainnul Hamidah Syahadah Azizan, Hapipah Mohd Ali, Elham Rouhollahi, Mahmood Ameen Abdulla

Abstract:

Free radical damage induced by reactive oxygen species (ROS) contributes to the etiology of many chronic diseases, cancer being one of them. Recent studies have been successful in ROS-targeted antioxidant therapies using mouse models in cancer therapeutics. The present study was designed to scrutinize the anticancer and antioxidant activity of five different extracts of Thymus serpyllum in MDA-MB-231, MCF-7, HepG2, HCT-116, PC3, and A549 cells. Identification of the phytochemicals present in the most active extract of Thymus serpyllum was conducted using gas chromatography coupled with mass spectrometry, and antioxidant activity was measured using the DPPH radical scavenging and FRAP assays. The anticancer impact of the extract, in terms of IC50, was evaluated using the MTT cell viability assay. Results revealed that the hexane extract showed the best anticancer activity in HepG2 (liver carcinoma cell line) with an IC50 value of 23 ± 0.14 µg/ml, followed by 25 µg/ml in HCT-116 (colon cancer cell line), 30 µg/ml in MCF-7 (breast cancer cell line), 35 µg/ml in MDA-MB-231 (breast cancer cell line), 57 µg/ml in PC3 (prostate cancer cell line), and 60 µg/ml in A549 (lung carcinoma cell line). The GC-MS profile of the hexane extract showed the presence of 31 compounds, with carvacrol, thymol and thymoquinone being the major compounds. Phenolics such as vitamin E, terpinen-4-ol, borneol and phytol were also identified. Hence, we present the first report on the cytotoxicity of the hexane extract of Thymus serpyllum in the HepG2 cell line, with robust anticancer activity and an IC50 of 23 ± 0.14 µg/ml.

Keywords: Thymus serpyllum L., hexane extract, GC-MS profile, antioxidant activity, anticancer activity, HepG2 cell line

Procedia PDF Downloads 517
609 A Context Aware Mobile Learning System with a Cognitive Recommendation Engine

Authors: Jalal Maqbool, Gyu Myoung Lee

Abstract:

Using smart devices for context-aware mobile learning is becoming increasingly popular. This has led to mobile learning technology becoming an indispensable part of today's learning environments and platforms. However, some fundamental issues remain - namely, mobile learning still lacks the ability to truly understand human reaction and user behaviour. This is due to the fact that current mobile learning systems are passive and not aware of learners' changing contextual situations; they rely on static information about mobile learners. In addition, current mobile learning platforms lack the capability to incorporate dynamic contextual situations into learners' preferences. Thus, this thesis aims to address these issues by designing a context-aware framework that is able to sense a learner's contextual situations, handle data dynamically, and use contextual information to suggest bespoke learning content according to a learner's preferences. This is to be underpinned by a robust recommendation system, which has the capability to perform these functions, thus providing learners with a truly context-aware mobile learning experience, delivering learning content using smart devices and adapting to learning preferences as and when required. In addition, the design of the recommendation engine's algorithm has to be based on learner and application needs, personal characteristics and circumstances, and on an understanding of human cognitive processes, which would enable the technology to interact effectively and deliver mobile learning content that is relevant to the learner's contextual situations. The concept of this proposed project is to provide a new method of smart learning, based on a capable recommendation engine, for an intuitive mobile learning model driven by learner actions.

Keywords: aware, context, learning, mobile

Procedia PDF Downloads 245
608 Software-Defined Architecture and Front-End Optimization for DO-178B Compliant Distance Measuring Equipment

Authors: Farzan Farhangian, Behnam Shakibafar, Bobda Cedric, Rene Jr. Landry

Abstract:

Many air navigation technologies are capable of increasing aviation sustainability as well as improving accuracy in Alternative Positioning, Navigation, and Timing (APNT), notably avionics Distance Measuring Equipment (DME), Very high-frequency Omni-directional Range (VOR), etc. The integration of these air navigation solutions could bring robust and efficient accuracy to air mobility, air traffic management, and autonomous operations. Designing a proper RF front-end, power amplifier, and software-defined transponder could pave the way for reaching an optimized avionics navigation solution. In this article, the possibility of reaching an optimum front-end to be used with a single low-cost Software-Defined Radio (SDR) has been investigated in order to reach a software-defined DME architecture. Our software-defined approach uses firmware capabilities to design a real-time software architecture compatible with a Multiple Input Multiple Output (MIMO) BladeRF, estimating an accurate time delay between the transmission (Tx) and reception (Rx) channels using synchronous scheduled communication. We also designed a novel power amplifier for the transmission channel of the DME to meet the minimum transmission power requirement. This article also investigates the design of proper pulse pairs based on the DO-178B avionics standard. Various guidelines have been tested, and the possibility of passing the certification process for each standard term has been analyzed. Finally, the performance of the DME was tested in the laboratory environment using an IFR6000, which showed that the proposed architecture reached an accuracy of less than 0.23 Nautical mile (Nmi) with 98% probability.
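
A rough sketch of the Tx/Rx delay-estimation step is shown below: the transmitted and received pulse pairs are cross-correlated and the peak lag is converted to a slant range after removing the nominal 50 µs ground-transponder reply delay. The pulse shape, sample rate, noise level, and delays are illustrative assumptions, not the BladeRF implementation described above.

```python
# Sketch of Tx/Rx time-delay estimation by cross-correlation for a DME-like
# pulse pair. Pulse shape, sample rate, and delays are illustrative assumptions.
import numpy as np

fs = 20e6                                   # sample rate (Hz), assumed
t = np.arange(0, 200e-6, 1 / fs)

def gaussian_pulse_pair(t0, spacing=12e-6, width=3.5e-6):
    # DME-like Gaussian pulse pair (X-channel spacing ~12 us).
    return (np.exp(-((t - t0) / (width / 2.35)) ** 2 / 2) +
            np.exp(-((t - t0 - spacing) / (width / 2.35)) ** 2 / 2))

true_delay = 123.4e-6                       # total interrogation-to-reply delay
tx = gaussian_pulse_pair(10e-6)
rx = 0.2 * gaussian_pulse_pair(10e-6 + true_delay) \
     + 0.02 * np.random.default_rng(0).normal(size=t.size)

corr = np.correlate(rx, tx, mode="full")
lag = corr.argmax() - (tx.size - 1)
est_delay = lag / fs

c = 299_792_458.0                           # m/s
reply_delay = 50e-6                         # nominal DME ground-transponder delay
slant_range_nmi = c * (est_delay - reply_delay) / 2 / 1852.0
print(f"estimated delay = {est_delay*1e6:.2f} us, slant range ≈ {slant_range_nmi:.2f} NM")
```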

Keywords: avionics, DME, software defined radio, navigation

Procedia PDF Downloads 80
607 Application of the Urban Forest Credit Standard as a Tool for Compensating CO2 Emissions in the Metalworking Industry: A Case Study in Brazil

Authors: Marie Madeleine Sarzi Inacio, Ligiane Carolina Leite Dauzacker, Rodrigo Henriques Lopes Da Silva

Abstract:

Climate change resulting from human activity has increased interest in more sustainable production practices to reduce and offset pollutant emissions. Brazil, with its vast areas capable of carbon absorption, holds a significant advantage in this context. However, to optimize the country's sustainable potential, it is important to establish a robust carbon market with clear rules for the eligibility and validation of projects aimed at reducing and offsetting greenhouse gas (GHG) emissions. In this study, our objective is to conduct a feasibility analysis, through a case study, of the implementation of an urban forest credit standard in Brazil, using the Urban Forest Credits (UFC) model implemented in the United States as a reference. The city of Ribeirão Preto, Brazil, was selected to assess the availability of green areas. With the CO2 emission values of the metalworking industry, it was possible to analyze the case-study information for this activity. QGIS software, which can connect to various types of geospatial databases, was used to map potential urban forest areas. Although the chosen municipality has little vegetative coverage, the mapping identified at least eight areas within the delimited urban perimeter that fit the standard's definitions. The outlook is positive, and the implementation of projects like Urban Forest Credits (UFC) adapted to the Brazilian reality has great potential to benefit the country in the carbon market and contribute to achieving its greenhouse gas (GHG) emission reduction goals.

Keywords: carbon neutrality, metalworking industry, carbon credits, urban forestry credits

Procedia PDF Downloads 77
606 Process Development of pVAX1/lacZ Plasmid DNA Purification Using Design of Experiment

Authors: Asavasereerat K., Teacharsripaitoon T., Tungyingyong P., Charupongrat S., Noppiboon S., Hochareon L., Kitsuban P.

Abstract:

The third generation of vaccines is based on gene therapy, where DNA is introduced into patients. The antigenic or therapeutic proteins encoded by the transgene DNA trigger an immune response to counteract various diseases. Moreover, DNA vaccines offer customizable protection and treatment with high stability, so the production of DNA vaccines has become of interest. According to USFDA guidance for industry, the recommended limit for host-cell impurities is lower than 1%, and the homogeneity of the active conformation, supercoiled DNA, should be more than 80%. Thus, a purification strategy using two-step chromatography has been established and verified for its robustness. Herein, pVAX1/lacZ, a pre-approved USFDA DNA vaccine backbone, was used and transformed into E. coli strain DH5α. Three purification process parameters, including the sample-loading flow rate and the salt concentrations in the washing and eluting buffers, were studied, and the experiment was designed using the response surface method with a central composite face-centered (CCF) design as the model. The designed range of the selected parameters was a 10% variation from the optimized set point, as a safety factor. The purity, in terms of the percentage of supercoiled conformation obtained from each chromatography step (AIEX and HIC), was analyzed by HPLC. The response data were used to establish a regression model, statistically analyzed, and followed by Monte Carlo simulation using SAS JMP. The purities of the product obtained from AIEX and HIC are between 89.4-92.5% and 88.3-100.0%, respectively. Monte Carlo simulation showed that the pVAX1/lacZ purification process is robust, with 0.90 confidence intervals of 90.18-91.00% and 95.88-100.00% for AIEX and HIC, respectively.
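
As a rough illustration of the response-surface-plus-Monte-Carlo robustness check described above, the sketch below fits a quadratic model to a coded CCF design and simulates purity under factor variation; the design points, purity responses, and intervals are placeholders, and the study performed the actual analysis in SAS JMP on the real AIEX/HIC data.

```python
# Sketch of the response-surface + Monte Carlo robustness idea: fit a quadratic
# model to (coded) CCF design points, then simulate purity under factor variation.
# The design points and purity responses below are placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

# Coded factors: sample-loading flow rate, wash-buffer salt, elution-buffer salt.
X = np.array([[-1,-1,-1],[ 1,-1,-1],[-1, 1,-1],[ 1, 1,-1],
              [-1,-1, 1],[ 1,-1, 1],[-1, 1, 1],[ 1, 1, 1],
              [-1, 0, 0],[ 1, 0, 0],[ 0,-1, 0],[ 0, 1, 0],
              [ 0, 0,-1],[ 0, 0, 1],[ 0, 0, 0]], dtype=float)
purity = np.array([89.5, 90.4, 90.0, 91.2, 89.8, 90.9, 90.3, 91.5,
                   90.0, 91.0, 90.2, 90.8, 90.1, 90.7, 90.6])   # % supercoiled (hypothetical)

rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
rsm.fit(X, purity)

# Monte Carlo: factors vary uniformly within the coded +/-1 (i.e. +/-10%) design range.
rng = np.random.default_rng(0)
sims = rsm.predict(rng.uniform(-1, 1, size=(10_000, 3)))
lo, hi = np.percentile(sims, [5, 95])
print(f"90% interval of predicted purity: {lo:.2f}% - {hi:.2f}%")
```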

Keywords: AIEX, DNA vaccine, HIC, purification, response surface method, robustness

Procedia PDF Downloads 209