Search results for: artificial microRNA approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15524

14084 Exploring MTB-MLE Practices in Selected Schools in Benguet, Philippines

Authors: Jocelyn L. Alimondo, Juna O. Sabelo

Abstract:

This study explored the MTB-MLE implementation practices of teachers in one monolingual elementary school and one multilingual elementary school in Benguet, Philippines. It used a phenomenological approach employing participant observation, focus group discussion, and individual interviews. Data were gathered using a video camera, an audio recorder, and an FGD guide and were treated through triangulation and coding. From the data collected, varied ways of implementing the MTB-MLE program were noted: teaching using a hybrid first language, teaching using a foreign LOI, using translation and multilingual instruction, and using L2/L3 to unlock L1. However, these practices come with challenges such as a conflict between the mandated LOI and what pupils need, teachers' lack of proficiency in the mandated LOI, unreceptive parents, stagnation of knowledge resulting from over-familiarity of input, and zero learning resulting from incomprehensible language input. From the practices and challenges experienced by the teachers, a model of the MTB-MLE approach, the 3L-in-one approach, was created to illustrate the practice which teachers claimed to be the best way to address the challenges besetting them while satisfying the academic needs of their pupils. From the findings, this paper concludes that despite the challenges besetting the teachers, they still displayed creativity in coming up with relevant teaching practices; that the unreceptiveness of some teachers and parents sprang from a misunderstanding of the real concept of MTB-MLE; that greater challenges are faced by teachers in the multilingual school due to the diverse linguistic backgrounds of their pupils; and that the most effective way of implementing MTB-MLE is the multilingual approach, allowing the use of the pupils' mother tongue, L2 (Filipino), L3 (English), and other languages familiar to the students.

Keywords: MTB-MLE Philippines, MTB-MLE model, first language, multilingual instruction

Procedia PDF Downloads 424
14083 Design and Optimization of a Small Hydraulic Propeller Turbine

Authors: Dario Barsi, Marina Ubaldi, Pietro Zunino, Robert Fink

Abstract:

A design and optimization procedure is proposed and developed to provide the geometry of a high-efficiency compact hydraulic propeller turbine for low heads. For the preliminary design of the machine, classic design criteria are used, based on statistical correlations for the definition of the fundamental geometric parameters and blade shapes. These relationships build on the fundamental design parameters (i.e., specific speed, flow coefficient, work coefficient) in order to provide a simple yet reliable procedure. Particular attention is paid, from the initial steps, to the correct conformation of the meridional channel and the correct arrangement of the blade rows. The preliminary geometry thus obtained is used as a starting point for the hydrodynamic optimization procedure, carried out using CFD calculation software coupled with a genetic algorithm that generates and updates a large database of turbine geometries. The optimization process is performed with a commercial solver for the Reynolds-averaged Navier-Stokes (RANS) equations that exploits the axial symmetry of the machine. The geometries generated within the database are then simulated in order to determine the corresponding overall performance. To speed up the optimization, an artificial neural network (ANN) approximating the objective function is employed. The procedure was applied to the specific case of a propeller turbine with an innovative modular design, intended for applications characterized by very low heads. The procedure is tested in order to verify its validity and its ability to automatically reach the targeted net head and the maximum total-to-total internal efficiency.
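The loop described in this abstract (a genetic algorithm proposing candidate geometries that an expensive solver then scores) can be sketched in miniature. In the sketch below, the analytic `efficiency` function merely stands in for the RANS evaluation, and the two design parameters, population settings, and peak location are illustrative assumptions, not the authors' actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def efficiency(x):
    """Toy stand-in for the expensive RANS evaluation: a smooth
    efficiency surface peaked at flow coeff 0.25, work coeff 0.20."""
    phi, psi = x[..., 0], x[..., 1]
    return np.exp(-((phi - 0.25) ** 2 + (psi - 0.20) ** 2) / 0.01)

POP, GENS, ELITE = 40, 60, 10
pop = rng.uniform(0.0, 1.0, size=(POP, 2))   # candidate geometries

for _ in range(GENS):
    fit = efficiency(pop)
    parents = pop[np.argsort(fit)[-ELITE:]]      # keep the best designs
    # crossover: blend two random parents, then add Gaussian mutation
    ia = rng.integers(0, ELITE, POP)
    ib = rng.integers(0, ELITE, POP)
    w = rng.uniform(size=(POP, 1))
    pop = w * parents[ia] + (1 - w) * parents[ib]
    pop += rng.normal(0.0, 0.02, size=pop.shape)
    pop = np.clip(pop, 0.0, 1.0)

best = pop[np.argmax(efficiency(pop))]
print("best design (phi, psi):", np.round(best, 3))
```

In the real procedure, each fitness call is a CFD run, which is why the authors train an ANN surrogate; the selection-crossover-mutation skeleton stays the same.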

Keywords: renewable energy conversion, hydraulic turbines, low head hydraulic energy, optimization design

Procedia PDF Downloads 150
14082 Implementation of a PDMS Microdevice for the Improved Purification of Circulating MicroRNAs

Authors: G. C. Santini, C. Potrich, L. Lunelli, L. Vanzetti, S. Marasso, M. Cocuzza, C. Pederzolli

Abstract:

The relevance of circulating miRNAs as non-invasive biomarkers for several pathologies is nowadays undoubtedly clear, as they have been found to have both diagnostic and prognostic value, adding fundamental information to patients' clinical picture. The availability of these data, however, relies on a time-consuming process spanning from sample collection and processing to data analysis. In light of this, strategies able to ease this procedure are in high demand, and considerable effort has been made in developing lab-on-a-chip (LOC) devices able to speed up and standardise the bench work. In this context, a very promising polydimethylsiloxane (PDMS)-based microdevice which integrates the processing of the biological sample, i.e., the purification of extracellular miRNAs, and reverse transcription was previously developed in our lab. In this study, we aimed at improving the miRNA extraction performance of this microdevice by increasing the ability of its surface to adsorb extracellular miRNAs from biological samples. For this purpose, we focused on the modulation of two properties of the material: roughness and charge. PDMS surface roughness was modulated by casting with several templates (terminated with silicon oxide coated by a thin anti-adhesion aluminum layer), followed by a panel of curing conditions. Atomic force microscopy (AFM) was employed to estimate changes at the nanometric scale. To introduce modifications in surface charge, we functionalized PDMS with different mixes of positively charged 3-aminopropyltrimethoxysilane (APTMS) and neutral poly(ethylene glycol) silane (PEG). The surface chemical composition was characterized by X-ray photoelectron spectroscopy (XPS), and the number of exposed primary amines was quantified with the reagent sulfosuccinimidyl-4-o-(4,4-dimethoxytrityl) butyrate (s-SDTB).
As our final endpoint, the adsorption rate under all these different conditions was assessed by fluorescence microscopy by incubating a synthetic fluorescently labeled miRNA. Our preliminary analysis identified casting on thermally grown silicon oxide, followed by a curing step at 85°C for 1 hour, as the most efficient technique to obtain a PDMS surface roughness in the nanometric scale able to trap miRNA. In addition, functionalisation with 0.1% APTMS and 0.9% PEG was found to be a necessary step to significantly increase the amount of microRNA adsorbed on the surface and therefore available for further steps such as on-chip reverse transcription. These findings show a substantial improvement in the extraction efficiency of our PDMS microdevice, ultimately an important step forward in the development of an innovative, easy-to-use and integrated system for the direct purification of less abundant circulating microRNAs.

Keywords: circulating miRNAs, diagnostics, Lab-on-a-chip, polydimethylsiloxane (PDMS)

Procedia PDF Downloads 318
14081 A Mathematical Model Approach Regarding the Children’s Height Development with Fractional Calculus

Authors: Nisa Özge Önal, Kamil Karaçuha, Göksu Hazar Erdinç, Banu Bahar Karaçuha, Ertuğrul Karaçuha

Abstract:

The study aims to use a mathematical approach based on fractional calculus, developed to provide the ability to continuously analyze the factors related to children's height development. Tracking the development of the child is becoming ever more important and meaningful: knowing and determining the factors related to the physical development of the child at any desired time would provide better, more reliable and accurate results for childcare. In this frame, 7 groups of height percentile curves (3rd, 10th, 25th, 50th, 75th, 90th, and 97th) for Turkey are used. By using discrete height data of children aged 0-18 years and the least squares method, a continuous curve is developed that is valid for any time instant. By doing so, at any desired instant, it is possible to find the percentile and location of the child in the percentile chart. Here, with the help of fractional calculus theory, a mathematical model is developed. The outcomes of the proposed approach are quite promising compared to the linear and polynomial methods. The approach also makes it possible to predict the expected height of children.
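The least-squares step described in this abstract (fitting a continuous curve to discrete height data so that a value can be read off at any age, not just the tabulated ones) can be sketched as follows. The sample points below are invented for illustration and are not the Turkish reference data, and the fractional-calculus refinement of the model is not reproduced here.

```python
import numpy as np

# Illustrative points loosely shaped like a 50th-percentile height
# curve (age in years -> height in cm); NOT the Turkish reference data.
ages    = np.array([0, 1, 2, 4, 6, 9, 12, 15, 18], dtype=float)
heights = np.array([50, 75, 87, 103, 116, 133, 149, 170, 176], dtype=float)

# A least-squares cubic fit gives a continuous curve, so height can be
# estimated at any age between the tabulated data points.
coeffs = np.polyfit(ages, heights, deg=3)
curve = np.poly1d(coeffs)

age = 7.5                       # an age between the tabulated points
print(f"estimated height at {age} y: {curve(age):.1f} cm")
```

The same idea extends to each of the seven percentile curves, after which a child's measured height can be located between the fitted curves at any instant.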

Keywords: children growth percentile, children physical development, fractional calculus, linear and polynomial model

Procedia PDF Downloads 148
14080 Bi-Criteria Vehicle Routing Problem for Possibility Environment

Authors: Bezhan Ghvaberidze

Abstract:

A multiple criteria optimization approach for the solution of the Fuzzy Vehicle Routing Problem (FVRP) is proposed. For the possibility environment, the levels of movement between customers are calculated by a constructed interactive simulation algorithm. The first criterion of the bi-criteria optimization problem, minimization of the expectation of total fuzzy travel time on closed routes, is constructed for the FVRP. A new, second criterion, maximization of the feasibility of movement on the closed routes, is constructed by the Choquet finite averaging operator. The FVRP is reduced to a bi-criteria partitioning problem over the so-called "promising" routes, which were selected from all admissible closed routes. The convenient selection of the "promising" routes allows us to solve the reduced problem in real-time computing. For the numerical solution of the bi-criteria partitioning problem, the ε-constraint approach is used. An exact algorithm is implemented based on D. Knuth's Dancing Links technique and the algorithm DLX. The main objective was to present a new approach for the FVRP when there are difficulties while moving on the roads; this approach is called FVRP for extreme conditions (FVRP-EC). A further aim of this paper was to construct the solving model of the constructed FVRP. Results are illustrated on a numerical example where all Pareto-optimal solutions are found. An approach for the more complex FVRP model with time windows was also developed, and a numerical example is presented in which optimal routes are constructed for extreme conditions on the roads.
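The set-partitioning core of the reduction (choose "promising" routes so that every customer is served exactly once) is an exact-cover problem, which is what Knuth's DLX solves. A compact dictionary-based version of Algorithm X, without the dancing-links pointer machinery, sketches the same search; the routes and customers below are hypothetical, not taken from the paper's example.

```python
def exact_cover(universe, routes):
    """Knuth's Algorithm X (dict version): choose routes so that every
    customer in `universe` is served by exactly one chosen route."""
    def solve(cols, chosen):
        if not cols:                      # every customer covered
            yield chosen
            return
        # branch on the customer reachable by the fewest routes
        cust = min(cols, key=lambda c: len(cols[c]))
        for name in sorted(cols[cust]):
            # drop covered customers, and any route that conflicts
            # (i.e., shares a customer) with the chosen route
            rest = {c: {n for n in ns if not (routes[n] & routes[name])}
                    for c, ns in cols.items() if c not in routes[name]}
            yield from solve(rest, chosen + [name])
    cols = {c: {n for n, s in routes.items() if c in s} for c in universe}
    return solve(cols, [])

# Hypothetical "promising" closed routes over six customers
routes = {"r1": {1, 2}, "r2": {3, 4}, "r3": {5, 6},
          "r4": {1, 3, 5}, "r5": {2, 4, 6}, "r6": {2, 3}}
partitions = [sorted(s) for s in exact_cover({1, 2, 3, 4, 5, 6}, routes)]
print(partitions)
```

In the bi-criteria setting, each such feasible partition would then be scored on the two criteria (expected fuzzy travel time, movement feasibility), with the ε-constraint method bounding one criterion while optimizing the other.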

Keywords: combinatorial optimization, fuzzy vehicle routing problem, multiple objective programming, possibility theory

Procedia PDF Downloads 485
14079 Neural Network Approach for Clustering the Host Community Based on Perceptions toward Tourism, Satisfaction Level, and Demographic Attributes in Iran (Lahijan)

Authors: Nasibeh Mohammadpour, Ali Rajabzadeh, Adel Azar, Hamid Zargham Borujeni

Abstract:

Generally, the development of any industry depends on the support of its stakeholders and beneficiaries. Among the most important stakeholders in the tourism industry, which has become one of the most lucrative and employment-generating activities at the international level, are the host communities in tourist destinations, which both affect and are affected by the industry's development. Recognizing the host community and its segments can be important for securing its support for future decisions and policy making. In order to identify these segments, in this study, clustering of the residents has been done using tools that are designed to cope with human complexity and are able to model and generalize complex systems without any need for initial cluster seeds, unlike classic methods. Neural networks can meet these expectations. The research was planned to design a neural network-based mathematical model for clustering the host community effectively according to multiple criteria and to identify differences among segments. In order to achieve this goal, the residents' segmentation has been done by demographic characteristics, attitude towards tourism development, level of satisfaction, and the type of support given in this field. The applied method is self-organizing neural networks, and the results have been compared with K-means. As the results show, the Self-Organizing Map (SOM) method provides much better results when considering the cophenetic correlation and between-cluster variance coefficients. Based on these criteria, the host community is divided into five segments with unique and distinctive features, which are in the best condition (in comparison with other modes) according to a cophenetic correlation coefficient of 0.8769 and a between-cluster variance of 0.1412.
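A minimal self-organizing map of the kind this abstract applies to resident segmentation can be sketched as follows. The "resident" features, group structure, map size, and training schedule below are illustrative assumptions, not the study's survey data; a real application would use the demographic, attitude, and satisfaction variables described above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "residents": 2 features (e.g., satisfaction, attitude
# score) drawn from three separated groups -- purely illustrative.
groups = [rng.normal(m, 0.05, size=(30, 2))
          for m in ([0.1, 0.1], [0.5, 0.9], [0.9, 0.3])]
X = np.vstack(groups)

# A tiny 1-D self-organizing map with 3 units
W = rng.uniform(size=(3, 2))
for t in range(200):
    lr = 0.5 * (1 - t / 200)                      # decaying learning rate
    sigma2 = max(1e-9, 2 * (1 - t / 200) ** 2)    # shrinking neighborhood
    for x in rng.permutation(X):
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
        for j in range(3):
            h = np.exp(-((j - bmu) ** 2) / sigma2)    # neighborhood kernel
            W[j] += lr * h * (x - W[j])

# Assign each resident to its best-matching unit (its segment)
labels = np.array([np.argmin(((W - x) ** 2).sum(axis=1)) for x in X])
qe = np.mean([np.min(((W - x) ** 2).sum(axis=1)) ** 0.5 for x in X])
print("quantization error:", round(float(qe), 3))
```

The study's comparison with K-means would then score such partitions with the cophenetic correlation and between-cluster variance criteria.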

Keywords: artificial neural network, clustering, resident, SOM, tourism

Procedia PDF Downloads 183
14078 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperatures, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, more so when the effects of the participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and accuracy of the solution. Recently, solutions of complicated mathematical problems with statistical methods based on randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media (generally gases such as CO₂, CO, and H₂O) present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study.
These sequences possess better space-filling performance than uniform random number generators and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated. The history of some randomly sampled photon bundles was recorded to train an artificial neural network (ANN) back-propagation model. The flux was calculated using the standard quasi-PMC and was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed for the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) compared to the other cases. There is great scope for machine learning models to further reduce the computational cost once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the problem environment can be fully represented to the ANN model, and better results can be achieved in this as-yet-unexplored domain.
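The low-discrepancy idea can be illustrated with a hand-rolled Halton sequence: quasi-random points fill the unit square more evenly than pseudorandom ones, which shows up as a smaller quadrature error. The toy integrand below merely stands in for a spectral emission kernel; it is not the radiative-transfer problem of the paper.

```python
import numpy as np

def halton(n, base):
    """Radical-inverse (van der Corput) sequence in the given base;
    pairing bases 2 and 3 gives a 2-D Halton sequence."""
    seq = np.zeros(n)
    for i in range(n):
        f, k, x = 1.0, i + 1, 0.0
        while k > 0:
            f /= base
            x += f * (k % base)   # next digit, scaled into [0, 1)
            k //= base
        seq[i] = x
    return seq

n = 2048
pts_qmc = np.column_stack([halton(n, 2), halton(n, 3)])
pts_mc = np.random.default_rng(0).uniform(size=(n, 2))

# Toy integrand standing in for a spectral kernel; the exact integral
# of x*y over the unit square is 0.25.
f = lambda p: p[:, 0] * p[:, 1]
est_qmc, est_mc = f(pts_qmc).mean(), f(pts_mc).mean()
print(f"QMC error {abs(est_qmc - 0.25):.2e}, MC error {abs(est_mc - 0.25):.2e}")
```

In practice one would use a library generator (e.g., the Sobol and Halton engines in `scipy.stats.qmc`) rather than this minimal implementation, and scramble the sequence when variance estimates are needed.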

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 223
14077 Green Initiative and Marketing Approach: Developing a Better Marketing Approach of Green Initiatives by an Apparel Brand

Authors: Vaishali Joshi, Pallav Joshi

Abstract:

Environmental concern has become an important topic and continues to gain popularity. We are exposed daily to messages that encourage us to engage in green behavior, and factors such as global warming and climate change are creating a big buzz amongst people. Realizing this, many firms/companies are adopting the smart way of making a profit while creating a brand image: going green. These firms/companies persuade consumers to purchase eco-friendly products for the benefit of the environment and society. In such a scenario, it becomes essential for such firms/companies to approach customers in a better way; in other words, the marketing approach plays a crucial role for such firms/companies. Hence, in this research study, we have tried to create a marketing approach for firms/companies selling eco-friendly apparel. We have studied a hypothetical apparel brand which has taken the green initiative of making its products eco-friendly; we have named this hypothetical brand "Go-Green". Taking this hypothetical brand, we have studied how it can achieve a better marketing approach. In particular, we have studied four types of print advertisements for this brand: (i) a print advertisement showing only the eco-friendly apparel, (ii) a print advertisement showing the eco-friendly apparel labeled with an eco-label, (iii) a print advertisement showing the eco-friendly apparel along with information about the benefits of the featured apparel, and (iv) a print advertisement showing the eco-friendly apparel with both an eco-label and information about the benefits of the featured apparel. The conclusion of this research suggests that respondents evaluate the print advertisement of eco-friendly apparel labeled with an eco-label and accompanied by information about the benefits of the featured apparel more positively than the other three print advertisements.
Moreover, in this research study, we have examined environmental knowledge as a moderating factor affecting consumer green purchase behavior.

Keywords: eco-friendly apparel, print advertisement, eco-label, environment knowledge

Procedia PDF Downloads 286
14076 MEDICOMPILLS Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine

Authors: Adriana Haulica

Abstract:

Powered by Machine Learning, Precise Medicine is by now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise Medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept for information processing in Precise Medicine that delivers diagnoses and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine, in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known "needle in a haystack" approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers.
Also, even if the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool would. This approach deciphers the biological meaning of the input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the "common denominator" rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical "proofs". The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-condition diagnosis, comprising some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture would be highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise Medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to better design of clinical trials and speed them up.

Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics

Procedia PDF Downloads 70
14075 Exploring Visual Arts through the Blue Humanities: The Case Study of Jason deCaires Taylor's Underwater Sculptures

Authors: Mohammed Muharram

Abstract:

The Blue Humanities aims to deepen our understanding of the oceans through the integration of arts and sciences, emphasizing their cultural, historical, and ecological significance. This study explores the role of visual arts within this interdisciplinary framework, focusing on the underwater sculptures of Jason deCaires Taylor as a case study. The research employs a multidisciplinary approach, combining art history, environmental science, and cultural studies to explore the significance of Taylor's underwater installations. Methodologies include analysis of the artistic elements and themes in Taylor's work, assessment of the ecological impact of the sculptures on marine environments, and examination of the cultural narratives they evoke. Key findings highlight how Taylor's sculptures serve as artificial reefs, promoting marine biodiversity while simultaneously raising awareness about ocean conservation. The artworks act as powerful symbols, merging environmental activism with artistic expression and transforming underwater spaces into immersive art galleries that challenge traditional notions of viewing art. By bridging the gap between visual arts and environmental science, this study demonstrates how Taylor's sculptures contribute to the Blue Humanities by fostering a deeper, more holistic appreciation of the marine world. The research advocates for the continued integration of artistic perspectives into marine conservation efforts, emphasizing the role of visual arts in shaping public perceptions and promoting ecological sustainability. In conclusion, this study underscores the transformative potential of visual arts within the Blue Humanities, exemplified by Jason deCaires Taylor's underwater sculptures, which inspire both aesthetic appreciation and environmental consciousness.

Keywords: blue humanities, visual art, underwater sculptures, Jason deCaires Taylor

Procedia PDF Downloads 26
14074 Optimizing University Administration in a Globalized World: Leveraging AI and ICT for Enhanced Governance and Sustainability in Higher Education

Authors: Ikechukwu Ogeze Ukeje, Chinyere Ori Elom, Chukwudum Collins Umoke

Abstract:

This study explores the challenges in the integration of Artificial Intelligence (AI) and Information and Communication Technology (ICT) practices for enhancing governance and sustainable solution modeling in higher education, focusing on Alex Ekwueme Federal University Ndufu-Alike (AE-FUNAI), Nigeria. In the context of a developing country like Nigeria, leveraging AI and ICT tools presents a unique opportunity to improve teaching, learning, administrative processes, and governance. The research aims to evaluate how AI and ICT technologies can contribute to sustainable educational practices, enhance decision-making processes, and improve engagement among three key stakeholder groups: students, lecturers, and administrative staff. Students are involved to provide insights into their interactions with AI and ICT tools, particularly in learning and participation in governance. Lecturers' perspectives offer a view into how these technologies influence teaching, research, and curriculum development. Administrative staff provide a crucial understanding of how AI and ICT tools can streamline operations, support data-driven governance, and enhance institutional efficiency. This study uses a mixed-methods approach to collect both qualitative and quantitative data. The findings of this study are geared towards shaping the future of education in Nigeria and beyond by developing an Inclusive AI-governance Integration Framework (I-AIGiF) for enhanced performance in the system. By examining the roles of these stakeholder groups, this research could guide the development of policies for more effective AI and ICT integration, leading to sustainable educational innovation and governance.

Keywords: university administration, AI, higher education governance, education sustainability, ICT challenges

Procedia PDF Downloads 20
14073 Global Low Carbon Transitions in the Power Sector: A Machine Learning Archetypical Clustering Approach

Authors: Abdullah Alotaiq, David Wallom, Malcolm McCulloch

Abstract:

This study presents an archetype-based approach to designing effective strategies for low-carbon transitions in the power sector. To achieve global energy transition goals, a renewable energy transition is critical, and understanding diverse energy landscapes across different countries is essential to design effective renewable energy policies and strategies. Using a clustering approach, this study identifies 12 energy archetypes based on the electricity mix, socio-economic indicators, and renewable energy contribution potential of 187 UN countries. Each archetype is characterized by distinct challenges and opportunities, ranging from high dependence on fossil fuels to low electricity access, low economic growth, and insufficient contribution potential of renewables. Archetype A, for instance, consists of countries with low electricity access, high poverty rates, and limited power infrastructure, while Archetype J comprises developed countries with high electricity demand and installed renewables. The study findings have significant implications for renewable energy policymaking and investment decisions, with policymakers and investors able to use the archetype approach to identify suitable renewable energy policies and measures and assess renewable energy potential and risks. Overall, the archetype approach provides a comprehensive framework for understanding diverse energy landscapes and accelerating decarbonisation of the power sector.
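The archetype clustering step (grouping countries by electricity-mix and socio-economic indicators) can be sketched with a small k-means run. The three indicator columns and the synthetic "country" data below are illustrative assumptions, not the study's UN dataset of 187 countries or its 12-archetype result.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic country indicators (electricity access, income index,
# renewables potential), all scaled to [0, 1] -- illustrative only.
archetype_means = np.array([[0.2, 0.1, 0.6],   # low access, low income
                            [0.9, 0.5, 0.7],   # emerging, high potential
                            [1.0, 0.9, 0.4]])  # developed, high demand
countries = np.vstack([m + rng.normal(0, 0.04, size=(20, 3))
                       for m in archetype_means])

def kmeans(X, k, iters=25):
    """Lloyd's algorithm with deterministic farthest-first seeding."""
    centers = [X[0]]
    for _ in range(k - 1):   # seed each new center far from existing ones
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):   # alternate assignment and centroid update
        lab = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[lab == j].mean(0) for j in range(k)])
    return lab, centers

labels, centers = kmeans(countries, 3)
print("archetype sizes:", np.bincount(labels))
```

Each resulting centroid plays the role of an archetype profile; in the study, the characterization of each cluster (e.g., "low electricity access, high poverty") is read off from such centroid indicator values.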

Keywords: fossil fuels, power plants, energy transition, renewable energy, archetypes

Procedia PDF Downloads 51
14072 Modified Model-Based Systems Engineering Driven Approach for Defining Complex Energy Systems

Authors: Akshay S. Dalvi, Hazim El-Mounayri

Abstract:

The internal and external interactions between the complex structural and behavioral characteristics of complex energy systems result in unpredictable emergent behaviors. These emergent behaviors are not well understood, especially when modeled using the traditional top-down systems engineering approach. The intrinsic nature of current complex energy systems calls for an elegant solution that provides an integrated framework in Model-Based Systems Engineering (MBSE). This paper presents an MBSE-driven approach to define and handle the complexity that arises due to emergent behaviors. The approach provides guidelines for developing a system architecture that helps in predicting the complexity index of the system at different levels of abstraction. A framework that integrates indefinite and definite modeling aspects is developed to determine the complexity that arises during the development phase of the system. This framework provides a workflow for modeling complex systems using the Systems Modeling Language (SysML) that captures the system's requirements, behavior, structure, and analytical aspects at both the problem-definition and solution levels. A system architecture for a district cooling plant is presented, which demonstrates the ability to predict the complexity index. The result suggests that complex energy systems like the district cooling plant can be defined in an elegant manner using the modified MBSE-driven approach, which also helps in estimating development time and cost.

Keywords: district cooling plant, energy systems, framework, MBSE

Procedia PDF Downloads 130
14071 Biological Significance of Long Intergenic Noncoding RNA LINC00273 in Lung Cancer Cell Metastasis

Authors: Ipsita Biswas, Arnab Sarkar, Ashikur Rahaman, Gopeswar Mukherjee, Subhrangsu Chatterjee, Shamee Bhattacharjee, Deba Prasad Mandal

Abstract:

One of the major reasons for the high mortality rate of lung cancer is the substantial delay in disease detection until late metastatic stages. It is of utmost importance to understand the detailed molecular signaling and to detect molecular markers that can be used for the early diagnosis of cancer. Several studies have explored the emerging roles of long noncoding RNAs (lncRNAs) in various cancers, including lung cancer. The long non-coding RNA LINC00273 was recently discovered to promote cancer cell migration and invasion, and its positive correlation with the pathological stages of metastasis may make it a potential target for inhibiting cancer cell metastasis. Comparing the real-time expression of LINC00273 in various human clinical cancer tissue samples with normal tissue samples revealed significantly higher expression in cancer tissues. This long intergenic noncoding RNA was found to be highly expressed in human liver tumor-initiating cells, the human gastric adenocarcinoma AGS cell line, and the human non-small cell lung cancer A549 cell line. siRNA- and shRNA-induced knockdown of LINC00273, both in vitro and in vivo in nude mice, significantly suppressed AGS and A549 cancer cell migration and invasion. LINC00273 knockdown also reduced TGF-β-induced SNAIL, SLUG, VIMENTIN, and ZEB1 expression and metastasis in A549 cells. Plenty of reports have suggested a role for microRNAs of the miR-200 family in reversing epithelial-to-mesenchymal transition (EMT) by inhibiting ZEB transcription factors. In this study, hsa-miR-200a-3p was predicted via IntaRNA (Freiburg RNA tools) to be a potential target of LINC00273, with a negative free binding energy of −8.793 kcal/mol, and this interaction was confirmed by RNA pulldown, real-time PCR, and luciferase assay.
Mechanistically, LINC00273 accelerated TGF-β induced EMT by sponging hsa-miR-200a-3p which in turn liberated ZEB1 and promoted prometastatic functions in A549 cells in vitro as verified by real-time PCR and western blotting. The similar expression patterns of these EMT regulatory pathway molecules, viz. LINC00273, hsa-miR-200a-3p, ZEB1 and TGF-β, were also detected in various clinical samples like breast cancer tissues, oral cancer tissues, lung cancer tissues, etc. Overall, this LINC00273 mediated EMT regulatory signaling can serve as a potential therapeutic target for the prevention of lung cancer metastasis.

Keywords: epithelial to mesenchymal transition, long noncoding RNA, microRNA, non-small-cell lung carcinoma

Procedia PDF Downloads 156
14070 Generating Ideas to Improve Road Intersections Using Design with Intent Approach

Authors: Omar Faruqe Hamim, M. Shamsul Hoque, Rich C. McIlroy, Katherine L. Plant, Neville A. Stanton

Abstract:

Road safety has become an alarming issue, especially in low- and middle-income developing countries. Traditional approaches lack out-of-the-box thinking, confining engineers to the usual techniques for making roads safer. A socio-technical approach has recently been introduced to improving road intersections: Design with Intent (DWI). The DWI approach aims to give practitioners a more nuanced approach to design and behavior, working with people, their understanding, and the complexities of everyday human experience. It is a collection of design patterns, and a design and research approach, for exploring the interactions between design and people's behavior across products, services, and environments, both digital and physical. Through this approach, it can be seen how designing with people's behavior in mind can be applied to social and environmental problems, as well as commercially. It comprises 101 cards across eight lenses (architectural, error-proofing, interaction, ludic, perceptual, cognitive, Machiavellian, and security), each with its own distinct way of eliciting ideas from participants. For this research, a three-legged accident-blackspot intersection on a national highway was chosen for the DWI workshop. Participants from fields including civil engineering, naval architecture and marine engineering, urban and regional planning, and sociology actively participated in a day-long workshop. The participants were given a preamble on the accident scenario and a brief overview of the DWI approach. Design cards from the various lenses were distributed among the 10 participants, who were given an hour and a half to brainstorm and generate ideas for improving the safety of the selected intersection.
After the brainstorming session, the participants held roundtable discussions on the ideas they had come up with, and each idea was accepted or rejected by consensus of the forum. The accepted ideas were then synthesized and combined into an improvement scheme for the intersection selected in this study. The most significant ideas generated through the DWI approach were: color-coding traffic lanes for separate vehicle types, channelizing the existing bare intersection, providing advance warning signs, cautionary signs, and educational signs motivating road users to drive safely, and using textured surfaces with rumble strips on the approaches to the intersection. The motive of this approach is to draw new ideas from road users rather than depending only on traditional schemes, increasing the efficiency and safety of roads and improving road-user compliance, since the features are generated from the minds of the users themselves.

Keywords: design with intent, road safety, human experience, behavior

Procedia PDF Downloads 139
14069 Modeling the Compound Interest Dynamics Using Fractional Differential Equations

Authors: Muath Awadalla, Maen Awadallah

Abstract:

The banking sector covers different activities, including lending money to customers. It is commonly known that customers repay the money they have borrowed together with an added amount called interest, and the compound interest rate is an approach used in determining the interest to be paid. The instantaneous compounded amount owed by a debtor is obtained through a differential equation whose main parameters are the rate and the time. The rate used by banks in a country is often set by the government of that country; in Switzerland, for instance, a negative rate was once applied. In this work, a new approach to modeling compound interest is proposed using the Hadamard fractional derivative. As a result, it appears that, depending on the fractional order used in the derivative, the amount to be paid by a debtor may be either higher or lower than the amount determined by the classical approach.
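The contrast the abstract draws can be sketched numerically. The block below is an illustrative model, not the paper's exact formulation: it assumes the Hadamard-type growth equation D^α A = r A admits the Mittag-Leffler solution A(t) = A₀·E_α(r (ln t/t₀)^α), which reduces to the classical continuously compounded amount when α = 1.

```python
import math

def mittag_leffler(z, alpha, terms=60):
    """One-parameter Mittag-Leffler function E_alpha(z), via its power series."""
    return sum(z ** k / math.gamma(alpha * k + 1) for k in range(terms))

def classical_amount(principal, rate, t):
    """Continuously compounded amount A(t) = A0 * e^(r t)."""
    return principal * math.exp(rate * t)

def hadamard_amount(principal, rate, t, alpha, t0=1.0):
    """Amount under an assumed Hadamard-type fractional growth model of
    order alpha: A(t) = A0 * E_alpha(r * ln(t/t0)^alpha), for t > t0.
    With alpha = 1 this collapses to the classical formula."""
    return principal * mittag_leffler(rate * math.log(t / t0) ** alpha, alpha)
```

For a fixed rate and horizon, varying alpha above or below 1 shifts the amount owed relative to the classical result, which is the qualitative effect the abstract reports.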

Keywords: compound interest, fractional differential equation, Hadamard fractional derivative, optimization

Procedia PDF Downloads 126
14068 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning

Authors: Walid Cherif

Abstract:

Data mining has seen major advances over recent years because of the spread of the Internet, which generates a tremendous volume of data every day, and because of immense advances in the technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines the group to which each data instance belongs within a given dataset; they are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine learning based, and each type has its own limits. Current data are becoming increasingly heterogeneous, and consequently current classification techniques encounter many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. The proposed approach exceeded most common classification techniques, with an F-measure above 97% on the Iris dataset.
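The paper's measure functions are not disclosed in the abstract; the sketch below is a generic stand-in showing the shape of a similarity-vote classifier with a reliability score (here, the margin between the two best class similarities, with an illustrative inverse-distance similarity).

```python
import math

def similarity(a, b):
    # illustrative resemblance measure between two feature vectors
    return 1.0 / (1.0 + math.dist(a, b))

def classify(x, train):
    """Return (label, reliability): the label with the highest mean similarity
    to x, and the margin between the two best class scores as reliability."""
    scores = {}
    for features, label in train:
        scores.setdefault(label, []).append(similarity(x, features))
    ranked = sorted(((sum(s) / len(s), lab) for lab, s in scores.items()),
                    reverse=True)
    best = ranked[0]
    runner_up = ranked[1] if len(ranked) > 1 else (0.0, None)
    return best[1], best[0] - runner_up[0]
```

A low margin flags instances whose classification is unreliable, which is the kind of reliability computation the abstract alludes to.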

Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification

Procedia PDF Downloads 465
14067 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data

Authors: Tiee-Jian Wu, Chih-Yuan Hsu

Abstract:

Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semiparametric approach to estimate the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a non-parametric kernel density estimate. The semiparametric mode estimate improves on both the parametric and the non-parametric mode estimates: specifically, it solves the non-consistency problem of parametric mode estimates (at large sample sizes) and reduces the variability of non-parametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
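A one-dimensional sketch of the weighting idea, under simplifying assumptions: the parametric component here is a plain normal fit (whose mode is the sample mean) rather than the paper's Box-Cox-based estimate, and the kernel mode is found by grid search.

```python
import math
import statistics

def kde_mode(data, bandwidth=0.3, grid=200):
    """Mode of a Gaussian kernel density estimate, found by grid search
    over the data range (normalizing constants are irrelevant for argmax)."""
    lo, hi = min(data), max(data)
    def density(x):
        return sum(math.exp(-0.5 * ((x - d) / bandwidth) ** 2) for d in data)
    xs = [lo + (hi - lo) * i / grid for i in range(grid + 1)]
    return max(xs, key=density)

def semiparametric_mode(data, weight=0.5):
    """Weighted average of a parametric mode (normal fit, so the sample
    mean) and the non-parametric KDE mode; a sketch of the paper's idea,
    which uses a Box-Cox parametric fit instead."""
    parametric = statistics.fmean(data)
    return weight * parametric + (1 - weight) * kde_mode(data)
```

Shifting `weight` toward the parametric side mimics small-sample behavior (lower variance), while weighting the kernel side mimics large-sample behavior (consistency), mirroring the trade-off described above.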

Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method

Procedia PDF Downloads 285
14066 Employing a System of Systems Approach in the Maritime RobotX Challenge: Incorporating Information Technology Students in the Development of an Autonomous Catamaran

Authors: Adam Jenkins

Abstract:

The Maritime RobotX Challenge provides a platform for postgraduate students conducting research in autonomous robotic systems to participate in an international competition. Although targeted at postgraduate students, the problem domain lends itself to a wide range of levels of student expertise. In 2022, undergraduate Information Technology students from the University of South Australia undertook the challenge, utilizing a System of Systems approach to the project's architecture. Each student group produced an independent solution to an identified task, which was then implemented on a Single Board Computer (SBC). A Central Control System then engaged each solution when appropriate, allowing the encapsulated SBC systems to manage each task as it was encountered. This approach facilitated collaboration among the multiple independent student teams over an 18-month period, and the fundamentally system-agnostic architecture accommodated both the variance in student solutions and the limitations caused by the global electronics shortage. By adopting this approach, the Information Technology teams were able to work independently yet produce an effective solution, leveraging their expertise to develop and construct an autonomous catamaran capable of meeting the competition's demanding requirements while producing a high level of engagement. The System of Systems approach is recommended to other universities interested in competing at this level and engaging students in a real-world problem.
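A minimal sketch of the described architecture, with hypothetical task names: independent, encapsulated modules stand in for the per-team SBC solutions, and a central controller engages whichever module claims the current task.

```python
class TaskModule:
    """One team's solution, encapsulated like an independent SBC."""
    def __init__(self, name, can_handle, run):
        self.name = name          # module identifier
        self.can_handle = can_handle  # predicate: does this module own the task?
        self.run = run            # the module's own task logic

class CentralControl:
    """Engages the first module that claims the task (system of systems)."""
    def __init__(self, modules):
        self.modules = modules
    def dispatch(self, task):
        for m in self.modules:
            if m.can_handle(task):
                return m.name, m.run(task)
        return None, "no module available"

# hypothetical task set for illustration
control = CentralControl([
    TaskModule("nav", lambda t: t == "navigate_gate", lambda t: "gate passed"),
    TaskModule("dock", lambda t: t == "dock", lambda t: "docked"),
])
```

Because each module only exposes `can_handle` and `run`, a team can swap its internal implementation (or its hardware) without touching the controller, which is what made the approach resilient to varied solutions and part shortages.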

Keywords: case study, robotics, education, programming, system of systems, multi-disciplinary collaboration

Procedia PDF Downloads 76
14065 A Hybrid Feature Selection Algorithm with Neural Network for Software Fault Prediction

Authors: Khalaf Khatatneh, Nabeel Al-Milli, Amjad Hudaib, Monther Ali Tarawneh

Abstract:

Software fault prediction identifies potential faults in software modules during the development process. In this paper, we present a novel approach to software fault prediction that combines a feedforward neural network with particle swarm optimization (PSO). The PSO algorithm is employed as a feature selection technique to identify the most relevant metrics to use as inputs to the neural network, which enhances the quality of feature selection and subsequently improves the performance of the neural network model. In comprehensive experiments on software fault prediction datasets, the proposed hybrid approach achieves better results, outperforming traditional classification methods. The integration of PSO-based feature selection with the neural network enables the identification of critical metrics that provide more accurate fault prediction. The results show the effectiveness of the proposed approach and its potential for reducing development cost and effort by detecting faults early in the software development lifecycle. Further research and validation on diverse datasets will help solidify the practical applicability of the approach in real-world software engineering scenarios.
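A toy sketch of PSO used as a binary feature selector. The fitness below is a made-up relevance score standing in for the neural network's validation accuracy (training a network inside the loop is omitted for brevity); velocities pass through a sigmoid to give per-bit selection probabilities, a standard binary-PSO device.

```python
import math
import random

def fitness(mask, relevance):
    # toy objective: total relevance of selected metrics, penalized by count
    chosen = [r for r, m in zip(relevance, mask) if m]
    return sum(chosen) - 0.1 * len(chosen)

def pso_select(relevance, particles=20, iters=50, seed=1):
    """Binary PSO sketch: each particle is a feature mask."""
    rng = random.Random(seed)
    n = len(relevance)
    pos = [[rng.randint(0, 1) for _ in range(n)] for _ in range(particles)]
    vel = [[0.0] * n for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pos, key=lambda m: fitness(m, relevance))[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(n):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.4 * rng.random() * (gbest[d] - pos[i][d]))
                prob = 1.0 / (1.0 + math.exp(-vel[i][d]))  # sigmoid -> bit prob
                pos[i][d] = 1 if rng.random() < prob else 0
            if fitness(pos[i], relevance) > fitness(pbest[i], relevance):
                pbest[i] = pos[i][:]
            if fitness(pos[i], relevance) > fitness(gbest, relevance):
                gbest = pos[i][:]
    return gbest
```

In the paper's setting, `fitness` would retrain or re-evaluate the feedforward network on the masked metric set, so the swarm converges on the subset that best predicts faults.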

Keywords: feature selection, neural network, particle swarm optimization, software fault prediction

Procedia PDF Downloads 95
14064 Isolation and Culture of Keratinocytes and Fibroblasts to Develop Artificial Skin Equivalent in Cats

Authors: Lavrentiadou S. N., Angelou V., Chatzimisios K., Papazoglou L.

Abstract:

The aim of this study was the isolation and culture of keratinocytes and fibroblasts from feline skin to ultimately create an artificial engineered skin (including dermis and epidermis) useful for the effective treatment of large cutaneous deficits in cats. Epidermal keratinocytes and dermal fibroblasts were freshly isolated from skin biopsies obtained with an 8 mm biopsy punch from 8 healthy cats that had undergone ovariohysterectomy. The owners' consent was obtained. All cats had a complete blood count and a serum biochemical analysis and were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) preoperatively. The samples were cut into small pieces and incubated with collagenase (2 mg/ml) for 5-6 hours. Following digestion, cutaneous cells were filtered through a 100 μm cell strainer, washed with DMEM, and grown in DMEM supplemented with 10% FBS. The undigested epidermis was washed with DMEM and incubated with 0.05% Trypsin/0.02% EDTA (TE) solution. Keratinocytes recovered in the TE solution were filtered through a 100 μm and a 40 μm cell strainer and, following washing, were grown on a collagen type I matrix in DMEM:F12 (3:1) medium supplemented with 10% FBS, 1 μm hydrocortisone, 1 μm isoproterenol, and 0.1 μm insulin. Both fibroblasts and keratinocytes were grown in a humidified atmosphere with 5% CO2 at 37 °C. The medium was changed twice a week, and cells were cultured up to passage 4. Cells were grown to 70-85% confluency, at which point they were trypsinized and subcultured at a 1:4 dilution. The majority of the cells in each passage were transferred to a freezing medium and stored at -80 °C. Fibroblasts were frozen in DMEM supplemented with 30% FBS and 10% DMSO, whereas keratinocytes were frozen in a complete keratinocyte growth medium supplemented with 10% DMSO. Both cell types were thawed and successfully grown as described above.
Therefore, we can create a bank of fibroblasts and keratinocytes from which we can recover cells for further culture and for the generation of skin equivalents in vitro. In conclusion, cutaneous cell isolation and cell culture and expansion were successfully developed. To the authors' best knowledge, this is the first study reporting isolation and culture of keratinocytes and fibroblasts from feline skin. However, these are preliminary results, and thus the development of autologous engineered feline skin is still in progress.

Keywords: cat, fibroblasts, keratinocytes, skin equivalent, wound

Procedia PDF Downloads 108
14063 The Impact of the COVID-19 on the Cybercrimes in Hungary and the Possible Solutions for Prevention

Authors: László Schmidt

Abstract:

Technological and digital innovation is constantly and dynamically evolving, which poses an enormous challenge to both lawmaking and law enforcement. To lawmaking, because artificial intelligence permeates many areas of people's daily lives that the legislator must regulate; one can see how challenging it is to regulate, for example, self-driving cars, taxis, and trucks, not to mention cryptocurrencies and ChatGPT, whose use also requires legislative intervention. Artificial intelligence also poses an extraordinary challenge to law enforcement. In criminal cases, police and prosecutors can make great use of AI in investigations, e.g., in forensics, DNA sampling, reconstruction, and identification, and it can also be of great help in detecting crimes committed in cyberspace. Cybercrime can be viewed, on the one hand, as a new type of crime that can only be committed with the help of information systems and that has a specific protected legal object, such as an information system or data; on the other hand, it also includes traditional crimes that are much easier to commit with the help of new tools. According to Section 375(1) of the Hungarian Criminal Code, any person who, for unlawful financial gain, introduces data into an information system, or alters or deletes data processed therein, or renders data inaccessible, or otherwise interferes with the functioning of the information system, and thereby causes damage, is guilty of a felony punishable by imprisonment not exceeding three years. The COVID-19 coronavirus epidemic has had a significant impact on our daily lives, and it was no different in the world of crime. With people staying at home for months, schools, restaurants, theatres, and cinemas closed, and no travel, criminals had to change their ways and were committing crimes online in even greater numbers than before.
These crimes were very diverse, ranging from false fundraising and the collection and misuse of personal data to extortion and fraud on various online marketplaces. The most vulnerable age groups (minors and the elderly) could be made more aware and prevented from becoming victims of this type of crime through targeted programmes. The aim of the study is to present Hungarian judicial practice in relation to cybercrime and possible preventive solutions.

Keywords: cybercrime, COVID-19, Hungary, criminal law

Procedia PDF Downloads 60
14062 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved, and due to its key design principle there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient because it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. ICN introduces uniquely named data as a core Internet principle, uses a receiver-oriented rather than a sender-oriented approach, and introduces a name-based information system at the network layer. Although ICN is considered a future Internet architecture, there is much criticism of it, mainly concerning how ICN will manage the most relevant content. Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture that merges ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.
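The CCN idea of satisfying requests by content name rather than by host address can be sketched minimally; the names and API below are illustrative, not those of any concrete CCN implementation.

```python
class ContentStore:
    """Minimal named-data cache in the spirit of CCN: consumers express
    Interests for names, and the network satisfies them from cache when
    it can, independent of where the producer is located."""
    def __init__(self):
        self.cache = {}

    def publish(self, name, data):
        # a producer (or a forwarding node) stores named data
        self.cache[name] = data

    def interest(self, name):
        # cache hit: content is served by name, not by host address
        if name in self.cache:
            return self.cache[name], "cache"
        # cache miss: a real node would forward the Interest upstream
        return None, "forward upstream"
```

The data-management question the paper raises is precisely which named contents such a store should retain, which is where the agent-based Web Content Mining component is meant to help.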

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 475
14061 Machine Learning Approach for Lateralization of Temporal Lobe Epilepsy

Authors: Samira-Sadat JamaliDinan, Haidar Almohri, Mohammad-Reza Nazem-Zadeh

Abstract:

Lateralization of temporal lobe epilepsy (TLE) is very important for positive surgical outcomes. We propose a machine learning framework to identify the epileptogenic hemisphere in TLE cases using magnetoencephalography (MEG) coherence source imaging (CSI) and diffusion tensor imaging (DTI). Unlike most studies, which use classification algorithms, we propose an effective clustering approach to distinguish between normal and TLE cases. We apply the well-known Minkowski weighted K-Means (MWK-Means) technique as the clustering framework. To overcome the problem of poor initialization of K-Means, we use particle swarm optimization (PSO) to effectively select the initial centroids of the clusters prior to applying MWK-Means. We demonstrate that, compared to K-Means and MWK-Means independently, this approach improves results on a benchmark data set.
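A sketch of the clustering step only: Lloyd-style k-means under a weighted Minkowski distance with fixed weights. MWK-Means additionally learns the feature weights each round, and the paper seeds the centroids with PSO; both refinements are omitted here.

```python
def minkowski(a, b, w, p=2):
    """Weighted Minkowski distance used as the clustering metric."""
    return sum(wi * abs(x - y) ** p for wi, x, y in zip(w, a, b)) ** (1 / p)

def kmeans(points, centroids, w, p=2, iters=20):
    """Plain Lloyd iterations under the weighted Minkowski metric (sketch;
    MWK-Means also updates w each round, and PSO would pick the seeds)."""
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid
        clusters = [[] for _ in centroids]
        for pt in points:
            j = min(range(len(centroids)),
                    key=lambda k: minkowski(pt, centroids[k], w, p))
            clusters[j].append(pt)
        # update step: centroids move to cluster means (kept if empty)
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters
```

Seeding `centroids` with PSO-selected points, as the paper does, matters because Lloyd iterations only refine locally from wherever they start.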

Keywords: temporal lobe epilepsy, machine learning, clustering, magnetoencephalography

Procedia PDF Downloads 156
14060 Predictive Analysis of the Stock Price Market Trends with Deep Learning

Authors: Suraj Mehrotra

Abstract:

The stock market is a volatile, bustling marketplace that is a cornerstone of economics; it defines whether companies succeed or spiral downward. A thorough understanding of it is important: many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), especially in the stock market, is a relatively recent development. Predicting how stocks will perform, considering all external factors and previous data, has traditionally been a human task; with the help of AI, however, machine learning models can produce more complete predictions of financial trends. In the stock market specifically, predicting the next day's open, closing, high, and low prices is very hard, and machine learning makes the task much easier: a model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, such models open new doors in the business and finance worlds and let companies make better, more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural network based approaches, and provides a detailed analysis of the techniques as well as of the challenges in predictive analysis. Comparing testing-set accuracy across four models (linear regression, a neural network, a decision tree, and naïve Bayes) on the stocks of Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, JPMorgan Chase, and Johnson & Johnson, the naïve Bayes and linear regression models worked best.
On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. The training set showed similar results, except that the decision tree model was perfect, with complete accuracy in its predictions. This suggests that the decision tree model overfitted the training set, which degraded its performance on the testing set.
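The linear-regression baseline compared above can be sketched on toy data: closed-form ordinary least squares on day index versus closing price (illustrative only, not the study's actual data or feature set).

```python
def fit_line(prices):
    """Ordinary least squares fit of price against day index 0..n-1.
    Returns (slope, intercept)."""
    n = len(prices)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(prices) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, prices))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict_next(prices):
    """Extrapolate the fitted trend line one day ahead."""
    slope, intercept = fit_line(prices)
    return slope * len(prices) + intercept
```

Such a trend-line baseline captures only linear drift, which is exactly why richer models (neural networks, naïve Bayes over engineered features) are compared against it in studies like this one.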

Keywords: machine learning, testing set, artificial intelligence, stock analysis

Procedia PDF Downloads 95
14059 Object Oriented Fault Tree Analysis Methodology

Authors: Yi Xiong, Tao Kong

Abstract:

Traditional safety, risk, and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated, large systems; moreover, much repetitive work must be done when the analyzed system is composed of many similar components. An object- and function-oriented approach is urgently needed to maintain high consistency with the problem domain. A new approach is proposed to overcome these shortcomings of traditional approaches: the concepts of class, abstraction, inheritance, polymorphism, and encapsulation are introduced into FTA, and a professional class library of abstractions of the physical objects in the real world is established, with four areas of relevant information proposed as a guide to its construction. The interaction between classes is completed by internal or external methods that map attributes to basic events through a full search of the knowledge base, which yields good encapsulation. An object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, mapping directly from classes and objects to the problem domain of fault tree analysis. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the analyst's influence on the analysis results; it reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking and development for safety analysis, so that object-oriented technology is applied and developed in the safety field, which is conducive to innovation in safety theory.
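The class/inheritance/polymorphism idea applied to fault trees can be illustrated with a minimal gate hierarchy. This is a generic FTA sketch (independence of basic events assumed), not the paper's class library.

```python
class Event:
    """Abstract node of the fault tree; every subclass answers the same
    polymorphic question, so trees compose uniformly."""
    def probability(self):
        raise NotImplementedError

class BasicEvent(Event):
    """Leaf failure with a known probability (encapsulated attribute)."""
    def __init__(self, p):
        self.p = p
    def probability(self):
        return self.p

class AndGate(Event):
    """Fails only if all inputs fail (product of probabilities)."""
    def __init__(self, *children):
        self.children = children
    def probability(self):
        prob = 1.0
        for c in self.children:
            prob *= c.probability()
        return prob

class OrGate(Event):
    """Fails if at least one input fails (complement of all surviving)."""
    def __init__(self, *children):
        self.children = children
    def probability(self):
        surviving = 1.0
        for c in self.children:
            surviving *= 1.0 - c.probability()
        return 1.0 - surviving
```

Because gates and basic events share one interface, a library of reusable component classes can be assembled bottom-up into arbitrarily deep trees, which is the repetition-avoiding benefit the abstract claims.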

Keywords: FTA, knowledge base, object-oriented technology, reliability analysis

Procedia PDF Downloads 248
14058 Cost-Effective, Accuracy Preserving Scalar Characterization for mmWave Transceivers

Authors: Mohammad Salah Abdullatif, Salam Hajjar, Paul Khanna

Abstract:

The development of instrument-grade mmWave transceivers comes with many challenges. A general rule of thumb is that the performance of the instrument must be higher than that of the unit under test in terms of accuracy and stability. Calibration and characterization of mmWave transceivers are important pillars of testing commercial products. Using a Vector Network Analyzer (VNA) with a mixer option has proven to be a high-performance approach to calibrating mmWave transceivers; however, it comes at a high cost. In this work, a reduced-cost method to calibrate mmWave transceivers is proposed, and a comparison between the proposed method and the VNA technology is provided. Significant challenges are demonstrated and discussed, and an approach to meet the requirements is proposed.

Keywords: mmWave transceiver, scalar characterization, coupler connection, magic tee connection, calibration, VNA, vector network analyzer

Procedia PDF Downloads 107
14057 Worm Gearing Design Improvement by Considering Varying Mesh Stiffness

Authors: A. H. Elkholy, A. H. Falah

Abstract:

A new approach has been developed to estimate the load share and stress distribution of worm gear sets. The approach is based on the instantaneous tooth meshing stiffness: the worm gear drive is modelled as a series of spur gear slices, and each slice is analyzed separately using the well-established formulae for spur gears. By combining the results obtained for all slices, the loading and stressing of the entire involute worm gear set is obtained. The geometric modelling method presented allows the tooth elastic deformation and tooth root stresses of worm gear drives under different load conditions to be investigated. On the basis of the method introduced in this study, the instantaneous meshing stiffness and load share were obtained. In comparison with existing methods, this approach offers both good analysis accuracy and shorter computing time.
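Combining the slices is straightforward once each slice's instantaneous mesh stiffness is known. A common simplification, assumed here purely for illustration (the paper's combination may differ), is that parallel slices share the transmitted load in proportion to their stiffnesses.

```python
def load_share(stiffnesses, total_load):
    """Split the transmitted load across spur-gear slices in proportion
    to each slice's instantaneous mesh stiffness (series-of-slices idea;
    stiffer slices along the contact line carry more of the load)."""
    total_k = sum(stiffnesses)
    return [total_load * k / total_k for k in stiffnesses]
```

Recomputing the stiffness profile at each mesh position and re-splitting the load gives the varying load share over the mesh cycle that the abstract describes.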

Keywords: gear, load/stress distribution, worm, wheel, tooth stiffness, contact line

Procedia PDF Downloads 345
14056 Educational Equity in Online Art Education: The Reggio Emilia Approach in White Ant Atelier for Persian-Speaking Children

Authors: Mahsa Mohammadhosseini

Abstract:

This study investigates the effectiveness of adapting the Reggio Emilia approach to online art education, specifically through White Ant Atelier (W.A.A), a virtual art initiative for Persian-speaking children. Employing an action research framework, the study examines the implementation of Reggio Emilia principles via the "Home" art project, which spanned four months and included 16 sessions. The analysis covers 50 artworks produced by participants, including 17 pieces created collaboratively by mothers and their children. The results demonstrate that integrating the Reggio Emilia approach into online platforms significantly improves children's creative expression and engagement. This finding illustrates that virtual education, when integrated with child-centered methodologies like Reggio Emilia, can effectively address and reduce educational inequities among Persian-speaking children.

Keywords: Reggio Emilia, online education, art education, educational equity

Procedia PDF Downloads 18
14055 Analysis of Friction Stir Welding Process for Joining Aluminum Alloy

Authors: A. M. Khourshid, I. Sabry

Abstract:

Friction Stir Welding (FSW), a solid-state joining technique, is widely used for joining Al alloys in aerospace, marine, automotive, and many other applications of commercial importance. FSW was carried out using a vertical milling machine on Al 5083 alloy pipe. These pipe sections are relatively small in diameter (5 mm) and relatively thin-walled (2 mm). In this study, 5083 aluminum alloy pipes were welded as similar-alloy joints using the FSW process in order to investigate their mechanical and microstructural properties, at a rotation speed of 1400 rpm and weld speeds of 10, 40, and 70 mm/min. To investigate the effect of welding speed on mechanical properties, metallographic and mechanical tests, including Vickers hardness profiles and tensile tests of the joints, were carried out on the welded areas. As a metallurgical feasibility study of friction stir welding for joining Al 6061 aluminum alloy, welding was also performed on pipes of different thicknesses (2, 3, and 4 mm) at five rotational speeds (485, 710, 910, 1120, and 1400 rpm) and traverse speeds of 4, 8, and 10 mm/min. This work focuses on two methods, artificial neural networks using the software Pythia and response surface methodology (RSM), to predict the tensile strength, percentage elongation, and hardness of friction-stir-welded 6061 aluminum alloy. An artificial neural network (ANN) model was developed for the analysis of the friction stir welding parameters of 6061 pipe; the tensile strength, percentage elongation, and hardness of the weld joints were predicted as functions of tool rotation speed, material thickness, and travel speed, and a comparison was made between measured and predicted data. Response surface models were also developed, and the values obtained for the responses (tensile strength, percentage elongation, and hardness) were compared with measured values. The effect of the FSW process parameters on the mechanical properties of 6061 aluminum alloy has been analyzed in detail.

Keywords: friction stir welding (FSW), Al alloys, mechanical properties, microstructure

Procedia PDF Downloads 462