Search results for: real rational transfer functions
1251 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images
Authors: Elham Bagheri, Yalda Mohsenzadeh
Abstract:
Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, its distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment, inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is fine-tuned for one epoch with a batch size of one, creating a scenario similar to human memorability experiments, where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. The reconstruction error of each image, the error reduction, and its distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between the reconstruction error and distinctiveness of images and their memorability scores. This suggests that images with more distinctive features, which challenge the autoencoder's compressive capacities, are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception
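As a rough illustration of the analysis pipeline described above (not the authors' code), the sketch below computes latent-space distinctiveness as the nearest-neighbour Euclidean distance and correlates it, together with per-image reconstruction errors, with memorability scores; the array names and the choice of Spearman correlation are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.neighbors import NearestNeighbors

def memorability_correlates(latents, recon_errors, mem_scores):
    """Correlate reconstruction error and latent-space distinctiveness
    with memorability scores, as described in the abstract."""
    # Distinctiveness: Euclidean distance from each latent vector to its
    # nearest neighbour (column 0 of kneighbors is the point itself).
    nn = NearestNeighbors(n_neighbors=2).fit(latents)
    dists, _ = nn.kneighbors(latents)
    distinctiveness = dists[:, 1]
    rho_err, _ = spearmanr(recon_errors, mem_scores)
    rho_dist, _ = spearmanr(distinctiveness, mem_scores)
    return rho_err, rho_dist
```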
Procedia PDF Downloads 92
1250 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand
Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones
Abstract:
As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and delivering essential items from distribution centres to affected locations is of great importance for relief operations, since the nature of disasters is uncertain, especially in casualty figures, which are normally proportional to the quantity of supplies needed. Thus, this study proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage, and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing the relief routes of a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimization problem in which both the total travelling distance and the total transport resources used are minimized, while the demand-cost efficiency of each route is maximized in order to determine route priority. As the model is an NP-hard combinatorial optimization problem, the Clarke and Wright savings heuristic is proposed to solve it for near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to demonstrate the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem
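The Clarke and Wright savings heuristic mentioned above can be sketched compactly. The following minimal Python implementation is illustrative, not the paper's multi-objective variant: it handles only capacity and distance, merging routes at their ends in order of decreasing savings.

```python
import math

def clarke_wright(depot, pts, demand, cap):
    """Minimal Clarke & Wright savings heuristic for the CVRP:
    s(i, j) = d(0, i) + d(0, j) - d(i, j)."""
    d = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    n = len(pts)
    savings = sorted(
        ((d(depot, pts[i]) + d(depot, pts[j]) - d(pts[i], pts[j]), i, j)
         for i in range(n) for j in range(i + 1, n)),
        reverse=True)
    routes = {i: [i] for i in range(n)}   # route id -> customer sequence
    load = {i: demand[i] for i in range(n)}
    route_of = {i: i for i in range(n)}   # customer -> route id
    for s, i, j in savings:
        ri, rj = route_of[i], route_of[j]
        if ri == rj or load[ri] + load[rj] > cap:
            continue
        if routes[ri][-1] == i and routes[rj][0] == j:
            merged = routes[ri] + routes[rj]
        elif routes[rj][-1] == j and routes[ri][0] == i:
            merged = routes[rj] + routes[ri]
        else:
            continue                      # i or j is interior to its route
        for c in routes[rj]:
            route_of[c] = ri
        routes[ri], load[ri] = merged, load[ri] + load[rj]
        del routes[rj], load[rj]
    return list(routes.values())          # each route runs depot -> ... -> depot
```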
Procedia PDF Downloads 249
1249 A New Co(II) Metal Complex Template with 4-dimethylaminopyridine Organic Cation: Structural, Hirshfeld Surface, Phase Transition, Electrical Study and Dielectric Behavior
Authors: Mohamed Dammak
Abstract:
Great attention has been paid to the design and synthesis of novel organic-inorganic compounds in recent decades because of their structural variety and the large diversity of atomic arrangements. In this work, the structure of the novel dimethylaminopyridinium tetrachlorocobaltate (C₇H₁₁N₂)₂CoCl₄, prepared by the slow evaporation method at room temperature, is discussed. The X-ray diffraction results indicate that the hybrid material has a triclinic structure with a P space group and features a 0D structure containing isolated distorted [CoCl₄]²⁻ tetrahedra interposed between [C₇H₁₁N₂]⁺ cations, forming planes perpendicular to the c axis at z = 0 and z = ½. The synthesis conditions and the reactants used govern the interactions between the cationic planes and the isolated [CoCl₄]²⁻ tetrahedra, which occur through N-H⋯Cl and C-H⋯Cl hydrogen-bonding contacts. Hirshfeld surface analysis helps to assess the strength of the hydrogen bonds and to quantify the intermolecular contacts. A phase transition was discovered by thermal analysis at 390 K, and comprehensive dielectric research is reported, showing good agreement with the thermal data. Impedance spectroscopy measurements were used to study the electrical and dielectric characteristics over wide ranges of frequency and temperature, 40 Hz–10 MHz and 313–483 K, respectively. The Nyquist plot (Z″ versus Z′) from the complex impedance spectrum revealed semicircular arcs described by a Cole-Cole model; an equivalent electrical circuit consisting of linked grain and grain-boundary elements is employed. The real and imaginary parts of the dielectric permittivity, as well as tan(δ) of (C₇H₁₁N₂)₂CoCl₄ at different frequencies, reveal a distribution of relaxation times. The presence of grain and grain-boundary contributions is confirmed by the modulus investigations. Electric and dielectric analyses highlight the good protonic conduction of this material.
Keywords: organic-inorganic, phase transitions, complex impedance, protonic conduction, dielectric analysis
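For readers unfamiliar with how a Cole-Cole model describes depressed semicircular arcs in the Nyquist plot, a minimal sketch follows; the parameterisation chosen here is one common form, and the parameter values are illustrative assumptions, not the fitted values from the study.

```python
import numpy as np

def cole_cole_Z(f, R_inf, R0, tau, alpha):
    """One common Cole-Cole parameterisation of the complex impedance:
    Z*(w) = R_inf + (R0 - R_inf) / (1 + (j*w*tau)**(1 - alpha)).
    alpha = 0 recovers an ideal Debye semicircle in the Nyquist plot;
    alpha > 0 depresses the arc below the real axis."""
    w = 2 * np.pi * np.asarray(f)
    return R_inf + (R0 - R_inf) / (1 + (1j * w * tau) ** (1 - alpha))

f = np.logspace(np.log10(40), 7, 200)   # 40 Hz - 10 MHz, as in the study
Z = cole_cole_Z(f, R_inf=50.0, R0=5e4, tau=1e-4, alpha=0.1)
# Nyquist plot: real part Z' on the x axis, -imaginary part Z'' on the y axis
```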
Procedia PDF Downloads 85
1248 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas
Authors: Sahithi Yarlagadda
Abstract:
The design of an antenna is constrained by mathematical and geometrical parameters. Though there are diverse antenna structures with a wide range of feeds, there are many geometries to be tried which cannot be fitted into predefined computational methods. Antenna design and optimization qualify for an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized: we can randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations, but the antenna parameters and geometries are too varied to fit into a single function. So, the weight coefficients are obtained for all possible antenna electrical parameters and geometries; the variation is learnt by mining the data obtained for an optimized algorithm. The weight and covariant coefficients of the corresponding parameters are logged for learning and future use as datasets. This paper drafts an approach to obtain the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as a test candidate. Antenna parameters like gain and directivity are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to get maxima and minima for a given frequency band. The boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities that arise during simulations. HFSS is chosen for simulations and results. MATLAB is used to generate the computations, combinations, and data logging, and also to apply machine learning algorithms and to plot the data for designing the algorithm. Since the number of combinations is too large to test manually, the HFSS API is used to call HFSS functions from MATLAB itself, and the MATLAB Parallel Computing Toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software like HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as slot line characteristic impedance, stripline impedance, slot line width, flare aperture size, and dielectric constant; K-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data is logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and machine learning approach for automated antenna optimization for the Vivaldi antenna.
Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm
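The generic evolutionary loop described above (fitness-based selection, recombination, mutation) can be sketched as follows; in the authors' setup the fitness call would wrap an HFSS simulation via its API, whereas here the `fitness` callable and the list-of-floats genome are assumptions.

```python
import random

def evolve(fitness, init_pop, n_gen=50, cx_rate=0.8, mut_rate=0.1):
    """Generic evolutionary loop: select the fitter candidates, then
    recombine and mutate them to seed the next generation.
    Genomes are lists of floats; init_pop should hold at least 4 of them."""
    pop = [g[:] for g in init_pop]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:len(pop) // 2]             # truncation selection
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))     # one-point crossover
            child = a[:cut] + b[cut:] if random.random() < cx_rate else a[:]
            if random.random() < mut_rate:        # perturb one gene
                k = random.randrange(len(child))
                child[k] += random.gauss(0, 0.05) * child[k]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```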
Procedia PDF Downloads 111
1247 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features
Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh
Abstract:
In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Due to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers to develop the best method for detecting normal signals from abnormal ones. The data are from both genders, the recording time varies between several seconds and several minutes, and all records are labeled normal or abnormal. Due to the limited positional accuracy and time span of the ECG signal and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea is presented: in addition to using the statistical characteristics of the signal, a return map is created and nonlinear characteristics of the HRV signal are extracted, given the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUCs of the MLP neural network and the SVM were 0.893 and 0.947, respectively. The results also indicated that greater use of nonlinear characteristics in classifying normal versus patient signals yielded better performance. Today, research is aimed at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the magnitude of these properties can be used to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. The ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but its accuracy in time is limited and some of its information is hidden from the viewpoint of physicians; the intelligent system proposed in this paper can therefore help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve
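A hedged sketch of the feature stage follows: a few linear measures plus the Poincaré (return-map) descriptors SD1/SD2 as example nonlinear features, with AUC evaluation via scikit-learn; the exact feature set and classifier settings of the paper are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC                      # used in the commented lines
from sklearn.metrics import roc_auc_score

def hrv_features(rr):
    """Simple linear + nonlinear HRV features from R-R intervals (s).
    SD1/SD2 come from the Poincaré (return) map mentioned above."""
    diff = np.diff(rr)
    sdnn = np.std(rr)                            # linear: overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))          # linear: short-term variability
    sd1 = np.sqrt(np.var(diff) / 2)              # nonlinear: Poincaré width
    sd2 = np.sqrt(2 * np.var(rr) - np.var(diff) / 2)  # nonlinear: Poincaré length
    return [sdnn, rmssd, sd1, sd2]

# With X built by stacking hrv_features over labeled records (assumed data):
# clf = SVC(probability=True).fit(X_train, y_train)
# auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
```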
Procedia PDF Downloads 264
1246 Resilient Leadership in Sustainable Urban Planning: Embracing Change to Shape Future Cities
Authors: Rick Denley
Abstract:
Urban planning today faces unprecedented challenges as cities strive for sustainability in response to climate change, rapid population growth, and the increasing demand for green infrastructure. In this context, effective leadership becomes as essential as innovative design and technology. Rick Denley’s keynote, Resilient Leadership in Sustainable Urban Planning: Embracing Change to Shape Future Cities, focuses on equipping urban planners, academics, and industry leaders with the leadership tools necessary to guide their teams and projects through complex transitions. His session addresses the essential role of leadership in driving sustainable urban transformations, adapting to changing environmental demands, and fostering collaborative approaches to green infrastructure initiatives. Rick’s keynote is grounded in his Change Growth Formula, a practical framework he has developed over years of leading corporate transformations and advising on resilience and growth. His talk will focus on how urban planning professionals can cultivate adaptability, inspire innovative thinking, and lead their teams to achieve impactful urban projects that prioritize sustainable landscapes, water management, and green spaces. Attendees will gain actionable insights on building a resilient mindset, leveraging collaborative partnerships, and aligning urban planning initiatives with environmental goals. This session is aligned with the conference’s objectives to share interdisciplinary knowledge, explore innovative solutions, and address critical challenges in urban landscape and urban planning. Rick’s approach combines insights from leadership theory with real-world applications in urban planning, making his talk relevant for professionals seeking both inspiration and practical tools to lead sustainable transformations.
Keywords: resilient leadership, change management, collaborative planning, adaptive leadership, community engagement, leadership in urban design
Procedia PDF Downloads 13
1245 Contrasted Mean and Median Models in Egyptian Stock Markets
Authors: Mai A. Ibrahim, Mohammed El-Beltagy, Motaz Khorshid
Abstract:
Emerging markets' return distributions have shown significant departure from normality: they are characterized by fatter tails relative to the normal distribution and exhibit levels of skewness and kurtosis inconsistent with normality. Therefore, the classical Markowitz mean-variance framework is not applicable to emerging markets, since it assumes normally distributed returns (with zero skewness and kurtosis) and a quadratic utility function. Markowitz mean-variance analysis can be used in cases of moderate non-normality, where it still provides a good approximation of the expected utility, but it may be ineffective under large departures from normality. Higher-moment models and median models have been suggested in the literature for asset allocation in this case. Higher-moment models account for the insufficiency of describing a portfolio by only its first two moments, while the median has been introduced as a robust statistic that is less affected by outliers than the mean. Tail risk measures such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) have been introduced instead of variance to capture the effect of risk. In this research, higher-moment models, including the Mean-Variance-Skewness (MVS) and Mean-Variance-Skewness-Kurtosis (MVSK) models, are formulated as single-objective non-linear programming (NLP) problems, and median models, including the Median-Value-at-Risk (MedVaR) and Median-Mean-Absolute-Deviation (MedMAD) models, are formulated as single-objective mixed-integer linear programming (MILP) problems. The higher-moment models and median models are compared to some benchmark portfolios and tested on real financial data from the Egyptian main index, EGX30. The results show that all the median models outperform the higher-moment models, as they provide higher final wealth for the investor over the entire period of study. In addition, the results confirm the inapplicability of the classical Markowitz mean-variance framework to the Egyptian stock market, as it resulted in very low realized profits.
Keywords: Egyptian stock exchange, emerging markets, higher moment models, median models, mixed-integer linear programming, non-linear programming
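As a sketch of the single-objective higher-moment formulation, the following combines mean, variance, and skewness into one objective and solves it as an NLP with SciPy; the trade-off weights and the long-only constraint are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def mvs_portfolio(returns, lam_var=1.0, lam_skew=1.0):
    """Mean-Variance-Skewness (MVS) sketch: maximise
    mean - lam_var * variance + lam_skew * skewness of portfolio returns.
    `returns` is a (T, n) array of asset returns; weights lam_* are
    illustrative, not the paper's calibration."""
    n = returns.shape[1]

    def neg_utility(w):
        r = returns @ w
        m, v = r.mean(), r.var()
        s = ((r - m) ** 3).mean() / v ** 1.5     # sample skewness
        return -(m - lam_var * v + lam_skew * s)

    cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1},)
    bounds = [(0, 1)] * n                        # long-only portfolio
    res = minimize(neg_utility, np.full(n, 1 / n),
                   bounds=bounds, constraints=cons)
    return res.x
```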
Procedia PDF Downloads 315
1244 Machine Learning in Agriculture: A Brief Review
Authors: Aishi Kundu, Elhan Raza
Abstract:
"Necessity is the mother of invention" - Rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings is considered to be food which can be satisfied through farming. Farming is one of the major revenue generators for the Indian economy. Agriculture is not only considered a source of employment but also fulfils humans’ basic needs. So, agriculture is considered to be the source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing Machine Learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities to make their availability in the market faster and more effective. This paper includes a thorough analysis of various machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.).Due to climate changes, crop production is affected. Machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine Learning algorithms/ models (regression, support vector machines, bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes which can be vital in increasing the productivity of the Agricultural Food Industry. It is to demonstrate vividly agricultural works under machine learning to sensor data. Machine Learning is the ongoing technology benefitting farmers to improve gains in agriculture and minimize losses. This paper discusses how the irrigation and farming management systems evolve in real-time efficiently. Artificial Intelligence (AI) enabled programs to emerge with rich apprehension for the support of farmers with an immense examination of data.Keywords: machine Learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting
Procedia PDF Downloads 107
1243 The Role of ICTs in Improving the Quality of Public Spaces in Large Cities of the Third World
Authors: Ayat Ayman Abdelaziz Ibrahim Amayem, Hassan Abdel-Salam, Zeyad El-Sayad
Abstract:
Nowadays, ICTs have spread extensively into everyday life in an unprecedented way, yet great attention is paid to the technologies themselves while the social aspect is ignored. With the immersive invasion of the internet as well as smartphone applications and digital social networking, people become more socially connected through virtual spaces instead of meeting in physical public spaces. Thus, this paper aims to find ways of implementing ICTs in public spaces so that they regain their status as attractive places for people, incite meetings in real life, and create sustainable, lively city centers. One selected example of urban space in the city center of Alexandria is chosen for the study. Alexandria represents a large metropolitan city subject to rapid transformation, and improving the quality of its public spaces will have great effects on the well-being of the whole city. The major roles that ICTs can play in the public space are: culture and art, education, planning and design, games and entertainment, and information and communication. Based on this classification, various examples and proposals of ICT interventions in public spaces are presented and analyzed to encourage good old-fashioned social interaction by creating the new social public place of this digital era. The paper adopts methods such as a questionnaire for evaluating people's willingness to accept the idea of using ICTs in public spaces, their needs, and their proposals for an attractive place, and the technique of observation to understand people's behavior and their movement through the space; finally, it presents an experimental design proposal for the selected urban space. Accordingly, this study will help to find design principles that can be adopted in the design of future public spaces to meet the needs of the digital era's users with the new concepts of social life, respecting the rules of place-making.
Keywords: Alexandria sustainable city center, digital place-making, ICTs, social interaction, social networking, urban places
Procedia PDF Downloads 422
1242 An Investigation on the Sandwich Panels with Flexible and Toughened Adhesives under Flexural Loading
Authors: Emre Kara, Şura Karakuzu, Ahmet Fatih Geylan, Metehan Demir, Kadir Koç, Halil Aykul
Abstract:
Material selection in the design of sandwich structures is a crucial aspect because of the positive or negative influence of the base materials on the mechanical properties of the entire panel. It has been shown in the literature that the selection of the skin and core materials plays a very important role in the behavior of the sandwich. Besides this, the use of the correct adhesive can make the whole structure show better mechanical results and behavior. The sandwich structures realized in this study were obtained by combining an aluminum foam core and three different glass fiber reinforced polymer (GFRP) skins using two different commercial adhesives, one based on flexible polyurethane and one on toughened epoxy. Static and dynamic tests had already been applied to the sandwiches with the different types of adhesives. In the present work, static three-point bending tests were performed on sandwiches having an aluminum foam core 15 mm thick, skins made of three different types of fabrics ([0°/90°] cross-ply E-Glass biaxial stitched, [0°/90°] cross-ply E-Glass woven, and [0°/90°] cross-ply S-Glass woven, all with the same thickness of 1.75 mm), and the two commercial adhesives (flexible polyurethane and toughened epoxy based), at different support span distances (L = 55, 70, 80, 125 mm), with the aim of analysing their flexural performance. The skins used in the study were produced via the Vacuum Assisted Resin Transfer Molding (VARTM) technique and were easily bonded onto the aluminum foam core with the flexible and toughened adhesives under very low pressure, using a press machine with alignment tabs matching the total thickness of the whole panel. The main results of the flexural loading are: the force-displacement curves obtained after the bending tests, peak force values, absorbed energy, collapse mechanisms, adhesion quality, and the effect of the support span length and adhesive type. The experimental results showed that the sandwiches with the epoxy-based toughened adhesive and the skins made of S-Glass woven fabrics exhibited the best adhesion quality and mechanical properties. The sandwiches with the toughened adhesive exhibited higher peak force and energy absorption values compared to the sandwiches with the flexible adhesive. The core shear mode occurred in the sandwiches with the flexible polyurethane-based adhesive through the thickness of the core, while the same mode took place in the sandwiches with the toughened epoxy-based adhesive along the length of the core. The use of these sandwich structures can lead to a weight reduction of transport vehicles while providing adequate structural strength under operating conditions.
Keywords: adhesive and adhesion, aluminum foam, bending, collapse mechanisms
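The two headline quantities of the flexural tests, peak force and absorbed energy, can be computed from a force-displacement record as below; this is a sketch with assumed units (N and mm), not the authors' processing code.

```python
import numpy as np

def flexural_summary(force, displacement):
    """Peak force and absorbed energy from a three-point bending
    force-displacement record (trapezoidal integration). With force in N
    and displacement in mm, the energy comes out in N*mm (mJ)."""
    force = np.asarray(force)
    displacement = np.asarray(displacement)
    peak = force.max()
    # Trapezoidal area under the force-displacement curve
    energy = np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(displacement))
    return peak, energy
```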
Procedia PDF Downloads 329
1241 Developing a Geriatric Oral Health Network Is a Public Health Necessity for Older Adults
Authors: Maryam Tabrizi, Shahrzad Aarup
Abstract:
Objectives: To understand the close association between oral health and overall health for older adults and to deliver person-focused treatment at the right time and in the right place through Project ECHO telementoring. Methodology: Data from monthly ECHO telementoring sessions were collected over three years; sessions included case presentations and overall health conditions, considering medications, organ function limitations, and the level of cognition. Contributions: Providing specialist-level care to all elderly regardless of their location and other health conditions, and decreasing oral health inequity by increasing the workforce via the Project ECHO telementoring program worldwide. By 2030, the number of adults in the USA over the age of 65 will increase by more than 60% (approx. 46 million), and over 22 million (30%) of 74 million older Americans will need specialized geriatrician care. In 2025, the national shortage of medical geriatricians will be close to 27,000. Most individuals 65 and older do not receive oral health care due to lack of access, availability, or affordability. One of the main reasons is a significant shortage of oral health (OH) education and resources for the elderly, particularly in rural areas. Poor OH carries a social stigma and is a threat to the quality and safety of the overall health of the elderly with physical and cognitive decline. Poor OH conditions may be costly and sometimes life-threatening: non-traumatic dental-related emergency department use in Texas alone cost over $250 M in 2016. Most elderly over the age of 65 present with one or more chronic diseases such as arthritis, diabetes, heart diseases, and chronic obstructive pulmonary disease (COPD) and are at higher risk of developing gum (periodontal) disease, yet they are less likely to get dental care. In addition, most older adults take both prescription and over-the-counter drugs; according to scientific studies, many of these medications cause dry mouth. Reduced saliva flow due to aging and medications may increase the risk of cavities and other oral conditions. Most dental schools have already increased geriatric OH content in their educational curriculums, but worldwide growth of the aging population is outpacing the growth in geriatric dentists. Without the use of advanced technology and the creation of a network between specialists and primary care providers, it is impossible to increase the workforce and provide equitable oral health care to the elderly. Project ECHO is a guided-practice model that revolutionizes health education and increases the workforce to provide best-practice specialty care and reduce health disparities. Training oral health providers to utilize the Project ECHO model is a logical response to the shortage and increases oral health access for the elderly. Project ECHO trains general dentists and hygienists to provide specialty care services. This means more elderly can get the care they need, in the right place, at the right time, with better treatment outcomes and reduced costs.
Keywords: geriatric, oral health, Project ECHO, chronic disease
Procedia PDF Downloads 175
1240 Embracing Transculturality by Internationalising the EFL Classroom
Authors: Karen Jacob
Abstract:
Over the last decades, there has been a rise in the use of CLIL (content and language integrated learning) methodology as a way of reinforcing FL (foreign language) acquisition. CLIL techniques have also been transferred to the formal instruction-based FL classroom, where, through content-based lessons and project work, it can very often be said that teachers are ‘clilling’ in the FL classroom. When it comes to motivating students to acquire an FL, we have to take into account that English is not your run-of-the-mill FL: English is an international language (EIL). Consequently, this means that EFL students should be able to use English as an international medium of communication. This leads to the assumption that, along with FL competence, speakers of EIL will need to become competent international citizens with knowledge of other societies, both contextually and geographically, and be flexible, open-minded, respectful, and sensitive towards other world groups. Rather than ‘intercultural’ competence, we should be referring to ‘transcultural’ competence. This paper reports the implementation of a content- and task-based approach to EFL teaching which was applied to two groups of 15-year-olds from two schools on the Spanish island of Mallorca during the school year 2015-2016. Students worked on three units of work that aimed at ‘internationalising’ the classroom by introducing topics that would encourage them to become transculturally aware of the world in which they live. In this paper we discuss the feedback given by the teachers and students on various aspects of the approach in order to answer the following research questions: 1) To what extent were the students motivated by the content and activities of the classes? 2) Did this motivation have a positive effect on the students’ overall results for the subject? 3) Did the participants show any signs of becoming transculturally aware? Preliminary results from qualitative data show that the students enjoyed the move away from the more traditional EFL content and, as a result, became more competent in speaking and writing. Students also appeared to become more knowledgeable about and respectful towards the ‘other’. The EFL approach described in this paper takes a more qualitative approach to research by describing what is really going on in the EFL classroom, and it makes a conscious effort to provide real examples of the acquisition not only of linguistic competence but also of other important communication skills that are of utmost importance in today's international arena.
Keywords: CLIL, content- and task-based learning, internationalisation, transcultural competence
Procedia PDF Downloads 242
1239 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images
Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir
Abstract:
The landing phase of a UAV is very critical, as there are many uncertainties in this phase which can easily entail a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Accurate measurement sensors can be very expensive (e.g., LIDAR) or limited in operational range (e.g., ultrasonic sensors), and absolute positioning systems like GPS or IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured during the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable feature in a series of images taken during the UAV landing. Two different approaches based on the Extended Kalman Filter (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement; the second approach uses the feature's projection on the camera plane (pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of the variation of the projected point as the process model, to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed was used to compare the performance of the proposed algorithms. The case studies show that the quality of the images introduces considerable noise, which reduces the performance of the first approach. Using the projected feature position, on the other hand, is much less sensitive to this noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
Keywords: altitude estimation, drone, image processing, trajectory planning
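A minimal sketch of the second, pixel-position approach follows, assuming a simplified pinhole measurement u = f·R/h for a ground feature with lateral offset R; the state definition, noise values, and measurement model are stand-ins for the paper's formulation, not a reproduction of it.

```python
import numpy as np

# State x = [h, v]: height above ground and vertical speed (assumed).
# Measurement: pixel offset u = f*R/h of a ground feature with lateral
# offset R, a simplified pinhole stand-in for the projected-feature model.
f_px, R, dt = 800.0, 1.0, 0.05            # focal length [px], offset [m] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity kinematics
Q = np.diag([1e-4, 1e-3])                 # process noise (assumed)
Rm = np.array([[4.0]])                    # pixel measurement noise (assumed)

def ekf_step(x, P, u_meas):
    # predict with the linear kinematic model
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the nonlinear measurement u = f*R/h
    u_pred = np.array([f_px * R / x[0]])
    H = np.array([[-f_px * R / x[0] ** 2, 0.0]])  # Jacobian of u w.r.t. state
    S = H @ P @ H.T + Rm
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.atleast_1d(u_meas) - u_pred)
    P = (np.eye(2) - K @ H) @ P
    return x, P
```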
Procedia PDF Downloads 113
1238 The Influence of Thermal Radiation and Chemical Reaction on MHD Micropolar Fluid in the Presence of Heat Generation/Absorption
Authors: Binyam Teferi
Abstract:
A numerical and theoretical analysis of the mixed convection flow of a magnetohydrodynamic micropolar fluid over a stretching capillary in the presence of thermal radiation, chemical reaction, viscous dissipation, and heat generation/absorption has been carried out. The non-linear partial differential equations of momentum, angular velocity, energy, and concentration are converted into ordinary differential equations using similarity transformations, which can then be solved numerically. The dimensionless governing equations are solved using the fourth-fifth order Runge-Kutta method along with the shooting technique. The effects of physical parameters, viz. the micropolar parameter, unsteadiness parameter, thermal buoyancy parameter, concentration buoyancy parameter, Hartmann number, spin gradient viscosity parameter, microinertial density parameter, thermal radiation parameter, Prandtl number, Eckert number, heat generation or absorption parameter, Schmidt number, and chemical reaction parameter, on the flow variables, viz. the velocity of the micropolar fluid, microrotation, temperature, and concentration, are analyzed and discussed graphically. MATLAB code is used for the numerical and theoretical analysis. From the simulation study, it can be concluded that an increment in the micropolar parameter, Hartmann number, unsteadiness parameter, and thermal and concentration buoyancy parameters results in a decrement of the velocity of the micropolar fluid; the microrotation decreases with an increment in the micropolar parameter, unsteadiness parameter, microinertial density parameter, and spin gradient viscosity parameter; the temperature profile decreases with an increment in the thermal radiation parameter, Prandtl number, micropolar parameter, unsteadiness parameter, heat absorption, and viscous dissipation parameter; and the concentration decreases as the unsteadiness parameter, Schmidt number, and chemical reaction parameter increase. Furthermore, computational values of the local skin friction coefficient, local wall couple coefficient, local Nusselt number, and local Sherwood number for different values of the parameters have been investigated. In this paper, the following important results are obtained: an increment in the micropolar parameter and Hartmann number results in a decrement of the velocity of the micropolar fluid; microrotation decreases with an increment in the microinertial density parameter; temperature decreases with increasing values of the thermal radiation parameter and viscous dissipation parameter; concentration decreases as the values of the Schmidt number and chemical reaction parameter increase; the coefficient of local skin friction is enhanced with an increase in the values of both the unsteadiness parameter and the micropolar parameter; increasing values of the unsteadiness parameter and micropolar parameter result in an increment of the local couple stress; an increment in the values of the unsteadiness parameter and thermal radiation parameter results in an increment of the rate of heat transfer; and as the values of the Schmidt number and unsteadiness parameter increase, the Sherwood number decreases.
Keywords: thermal radiation, chemical reaction, viscous dissipation, heat absorption/generation, similarity transformation
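The Runge-Kutta-plus-shooting procedure can be illustrated on a toy boundary-layer problem; the sketch below uses the classical Blasius equation as a stand-in for the coupled micropolar system, whose full equations are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Shooting-method sketch on the classical Blasius boundary-layer problem
# f''' + (1/2) f f'' = 0 with f(0) = f'(0) = 0 and f'(inf) = 1, integrated
# with an adaptive 4th-5th order Runge-Kutta scheme (solve_ivp's default RK45).
def rhs(eta, y):                        # y = [f, f', f'']
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def residual(s, eta_max=10.0):
    sol = solve_ivp(rhs, [0.0, eta_max], [0.0, 0.0, s], rtol=1e-8)
    return sol.y[1, -1] - 1.0           # enforce f'(eta_max) ~ f'(inf) = 1

s_star = brentq(residual, 0.1, 1.0)     # shoot for the missing slope f''(0)
print("f''(0) =", s_star)               # classical value is about 0.332
```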
Procedia PDF Downloads 130
1237 Coronin 1C and miR-128a as Potential Diagnostic Biomarkers for Glioblastoma Multiforme
Authors: Denis Mustafov, Emmanouil Karteris, Maria Braoudaki
Abstract:
Glioblastoma multiforme (GBM) is a heterogeneous primary brain tumour that kills most affected patients. To the authors' best knowledge, despite all research efforts, there is no early diagnostic biomarker for GBM. MicroRNAs (miRNAs) are short non-coding RNA molecules that are deregulated in many cancers. The aim of this research was to determine miRNAs with a diagnostic impact and to potentially identify promising therapeutic targets for glioblastoma multiforme. In silico analysis was performed to identify deregulated miRNAs with diagnostic relevance for glioblastoma. The expression profiles of the chosen miRNAs were then validated in vitro in the human glioblastoma cell lines A172 and U-87MG. Briefly, RNA extraction was carried out using the Trizol method, whilst miRNA extraction was performed using the mirVana miRNA isolation kit. Quantitative real-time polymerase chain reaction was performed to verify their expression. The presence of five target proteins within the A172 cell line was evaluated by Western blotting. The expression of the CORO1C protein within 32 GBM cases was examined via immunohistochemistry. The miRNAs identified in silico included miR-21-5p, miR-34a, and miR-128a. These miRNAs were shown to target deregulated GBM genes such as CDK6, E2F3, BMI1, JAG1, and CORO1C. miR-34a and miR-128a showed low expression profiles in comparison to the control miR-RNU-44 in both GBM cell lines, suggesting tumour suppressor properties. In contrast, miR-21-5p demonstrated greater expression, indicating that it could potentially function as an oncomiR. Western blotting revealed expression of all five proteins within the A172 cell line. In silico analysis also suggested that CORO1C is a target of miR-128a and miR-34a. Immunohistochemistry demonstrated that 75% of the GBM cases showed moderate to high expression of the CORO1C protein. A greater understanding of the deregulated expression of miR-128a and the upregulation of CORO1C in GBM could potentially lead to the identification of a promising diagnostic biomarker signature for glioblastomas.
Keywords: non-coding RNAs, gene expression, brain tumours, immunohistochemistry
Procedia PDF Downloads 91
1236 Hand Gesture Detection via EmguCV Canny Pruning
Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae
Abstract:
Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of artificial intelligence (AI), and AI concepts are applicable in human-computer interaction (HCI), expert systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool used mostly by deaf communities and those with speech disorders. Communication barriers exist when communities with speech disorders interact with others. This research aims to build a hand recognition system for interpretation between Lesotho's Sesotho and the English language. The system will help to bridge the communication problems encountered by the mentioned communities. The system has various processing modules, consisting of a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses Haar cascade detection with Canny pruning, which applies Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and the centroid to assist in the detection process. Recognition is the process of gesture classification: template matching classifies each hand gesture in real time. The system was tested in various experiments. The results obtained show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were also considered: the higher the light intensity, the faster the detection rate. Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system that can be used for sign language interpretation.
Keywords: canny pruning, hand recognition, machine learning, skin tracking
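A hedged sketch of the detection stage (background subtraction, largest contour, convex hull, centroid) is given below in Python/OpenCV; the paper itself uses EmguCV, the C# wrapper over the same library, and the threshold values here are assumptions.

```python
import cv2
import numpy as np

def detect_hand(frame, background):
    """Hand detection sketch (OpenCV 4.x): background subtraction,
    largest contour, convex hull, and centroid."""
    diff = cv2.absdiff(frame, background)        # background subtraction
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)    # assume largest blob = hand
    hull = cv2.convexHull(hand)                  # convex hull of the hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    centroid = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return hull, centroid
```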
Procedia PDF Downloads 185
1235 Dynamic Mechanical Analysis of Supercooled Water in Nanoporous Confinement and Biological Systems
Authors: Viktor Soprunyuk, Wilfried Schranz, Patrick Huber
Abstract:
In the present work, we show that Dynamic Mechanical Analysis (DMA) with a measurement frequency range f = 0.2-100 Hz is a rather powerful technique for the study of phase transitions (freezing and melting) and glass transitions of water in geometrical confinement. Inserting water into nanoporous host matrices, e.g. Gelsil (pore sizes 2.6 nm and 5 nm) or Vycor (pore size 10 nm), allows one to study size effects occurring at the nanoscale conveniently in macroscopic bulk samples. One obtains valuable insight concerning confinement-induced changes of the dynamics by measuring the temperature and frequency dependencies of the complex Young's modulus Y* for various pore sizes. Solid-liquid transitions and glass-liquid transitions show up as a softening of the real part Y' of the complex Young's modulus, yet with completely different frequency dependencies. Analysing the frequency-dependent imaginary part of the Young's modulus in the glass transition regions for different pore sizes, we find a clear-cut 1/d-dependence of the calculated glass transition temperatures, which extrapolates to Tg(1/d = 0) = 136 K, in agreement with the traditional value for water. The results indicate that the main role of the pore diameter is to set the relative number of water molecules that are near an interface within a length scale of the order of the dynamic correlation length ξ. Thus we argue that the observed strong pore size dependence of Tg is an interfacial effect rather than a finite size effect. We obtained similar signatures of Y* near glass transitions in different biological objects (fruits, vegetables, and bread). The values of the activation energies for these biological materials in the region of the glass transition are quite similar to the values of the activation energies of supercooled water in nanoporous confinement in this region. The present work was supported by the Austrian Science Fund (FWF, project no. P 28672-N36).
Keywords: biological systems, liquids, glasses, amorphous systems, nanoporous materials, phase transition
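The 1/d extrapolation can be reproduced with a simple linear fit; the Tg values below are illustrative placeholders (not the measured data), chosen only to show how Tg(1/d → 0) is obtained from the fitted intercept.

```python
import numpy as np

# Glass-transition temperatures vs inverse pore diameter: a linear fit
# Tg(1/d) extrapolates to the bulk value at 1/d = 0.
d = np.array([2.6, 5.0, 10.0])           # pore diameters [nm]
Tg = np.array([160.0, 149.0, 142.0])     # assumed Tg per pore size [K]
slope, intercept = np.polyfit(1.0 / d, Tg, 1)
print(f"Tg(1/d -> 0) = {intercept:.1f} K")   # the abstract reports 136 K
```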
Procedia PDF Downloads 240
1234 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods
Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie
Abstract:
The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation into the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall (RF) records, representing a one-year period (2011), were utilized in the modelling process. Six hydrological scenarios were arranged as hypothetical cases of the input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods; the initial stream flow was changed for each scenario in order to cover a wide range of hydrological situations. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model was successfully employed for early warning through the advance detection of hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. From the applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence
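A minimal sketch of the early-warning logic follows: a regression model predicts hourly Q from lagged inputs, and the prediction is mapped to the three severity levels; the model choice and threshold values are assumptions, not the study's calibration.

```python
from sklearn.neural_network import MLPRegressor

def severity(q, alert=80.0, warning=150.0, danger=300.0):
    """Map predicted hourly stream flow Q (m3/s) to the study's three
    warning levels; threshold values here are assumptions."""
    if q >= danger:
        return "danger"
    if q >= warning:
        return "warning"
    if q >= alert:
        return "alert"
    return "normal"

# X: lagged rainfall / water-level / flow records; y: Q at t + Lt (assumed data)
# model = MLPRegressor(hidden_layer_sizes=(32, 16)).fit(X, y)
# level = severity(model.predict(x_now.reshape(1, -1))[0])
```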
Procedia PDF Downloads 248
1233 How to “Eat” without Actually Eating: Marking Metaphor with Spanish Se and Italian Si
Authors: Cinzia Russi, Chiyo Nishida
Abstract:
Using data from online corpora (Spanish CREA, Italian CORIS), this paper examines the relatively understudied use of Spanish se and Italian si exemplified in (1) and (2), respectively. (1) El rojo es … el que se come a los demás. ‘The red (bottle) is the one that outshines/*eats the rest.’ (2) … ebbe anche la saggezza di mangiarsi tutto il suo patrimonio. ‘… he even had the wisdom to squander/*eat all his estate.’ In these sentences, se/si accompanies the consumption verb comer/mangiare ‘to eat’, without which the sentences would not be interpreted appropriately. This se/si cannot readily be attributed to any of the multiple functions so far identified in the literature: reflexive, ergative, middle/passive, inherent, benefactive, and complete consumptive. In particular, this paper argues against the feasibility of a recent construction-based analysis of sentences like (1) and (2), which situates se/si within a prototype-based network of meanings all deriving from the central meaning of COMPLETE CONSUMPTION (e.g., Alice se comió toda la torta / Alice si è mangiata tutta la torta ‘Alice ate the whole cake’). Clearly, the empirical adequacy of such an account is undermined by the fact that the events depicted in the se/si-sentences at issue do not always entail complete consumption, because they may lack an INCREMENTAL THEME, the distinguishing property of complete consumption. Alternatively, it is proposed that the sentences under analysis represent instances of verbal METAPHORICAL EXTENSION: se/si represents an explicit marker of this cognitive process, which has developed independently from the complete consumptive se/si, and the meaning extension is captured by the general tenets of Conceptual Metaphor Theory (CMT). Two conceptual domains, source (D_S) and target (D_T), are related by similarity, assigning an appropriate metaphorical interpretation to D_T. The domains paired here are comer/mangiare (D_S) and comerse/mangiarsi (D_T). The eating event (D_S) involves (a) the physical process of x_EATER grinding y_FOOD-STUFF into pieces and swallowing it, and (b) the aspect of x_EATER savoring y_FOOD-STUFF and being nurtured by it. In the physical act of eating, x_EATER has dominance and exercises its force over y_FOOD-STUFF. This general sense of dominance and force is mapped onto D_T and is manifested in the ways exemplified in (1) and (2), and many others. According to CMT, two other properties are observed in each pair of D_S and D_T. First, D_S tends to be more physical and concrete and D_T more abstract, and systematic mappings are established between constituent elements in D_S and those in D_T: x_EATER corresponds to the element that destroys, and y_FOOD-STUFF to the element that is destroyed, in D_T, as exemplified in (1) and (2). Though the metaphorical extension marker se/si appears by far most frequently with comer/mangiare in the corpora, similar systematic mappings are observed in several other verb pairs, for example jugar/giocare ‘to play (games)’ and jugarse/giocarsi ‘to jeopardize/risk (life, reputation, etc.)’, perder/perdere ‘to lose (an object)’ and perderse/perdersi ‘to miss out on (an event)’, etc. Thus, this study provides evidence that languages may indeed formally mark metaphor using means available to them.
Keywords: complete consumption value, conceptual metaphor, Italian si/Spanish se, metaphorical extension
Procedia PDF Downloads 54
1232 Allylation of Active Methylene Compounds with Cyclic Baylis-Hillman Alcohols: Why Is It Direct and Not Conjugate?
Authors: Karim Hrrath, Khaled Essalah, Christophe Morell, Henry Chermette, Salima Boughdiri
Abstract:
Among the carbon-carbon bond formation types, the allylation of active methylene compounds with cyclic Baylis-Hillman (BH) alcohols is a reliable and widely used method. This reaction is a very attractive tool in the organic synthesis of biological and biodiesel compounds. Thus, in view of the insistent demand for an efficient and straightforward method for synthesizing the desired product, a thorough analysis of the various aspects of the reaction process is an important task. The product afforded by the reaction of an active methylene compound with a BH alcohol depends largely on the experimental conditions, notably on the catalyst properties. All experiments report that catalysis is needed for this reaction type because of the poor ability of the alcohol hydroxyl group to act as a suitable leaving group. Within the catalysts, several transition-metal-based ones have been used, such as palladium in the presence of acid or base, and have been considered reliable methods. Furthermore, acid catalysts such as BF3.OEt2, BiX3 (X = Cl, Br, I, (OTf)3), InCl3, Yb(OTf)3, FeCl3, p-TsOH, and H-montmorillonite have been employed to activate the C-C bond formation through the alkylation of active methylene compounds. Interestingly, a report recently appeared of a smooth process in which 4-dimethylaminopyridine (DMAP) catalyzes the allylation reaction of active methylene compounds with a cyclic BH alcohol. However, the reaction mechanism remains ambiguous, since the C-allylation process leads to an unexpected product (noted P1), corresponding to direct allylation instead of conjugate allylation, which would involve the most electrophilic center according to the electron-withdrawing CO group effect. The main objective of the present theoretical study is to better understand the role of the DMAP catalytic activity as well as the process leading to the end product (P1) for the catalytic reaction of a cyclic BH alcohol with active methylene compounds. For that purpose, we have carried out computations on a set of active methylene compounds, varying R1 and R2, toward the same alcohol, and we have attempted to rationalize the mechanisms using the acid-base approach and conceptual DFT tools such as the chemical potential, hardness, Fukui functions, electrophilicity index, and dual descriptor, as these approaches have shown good prediction of reaction products. The present work is organized as follows: in the first part some computational details are given, introducing the reactivity indexes used in the present work; Section 3 is dedicated to the discussion of the prediction of the selectivity and regioselectivity; and the paper ends with some concluding remarks. In this work, we have shown through the DFT method, at the B3LYP/6-311++G(d,p) level of theory, that the allylation of active methylene compounds with a cyclic BH alcohol is governed by orbital control, and hence the end product, denoted P1, is generated by direct allylation.
Keywords: DFT calculation, gas phase pKa, theoretical mechanism, orbital control, charge control, Fukui function, transition state
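The conceptual-DFT indices named above follow from the frontier-orbital energies in the usual finite-difference (Koopmans-type) approximation; the sketch below uses illustrative orbital energies, not values from the study.

```python
# Conceptual-DFT reactivity indices from frontier-orbital energies (eV).
def reactivity_indices(e_homo, e_lumo):
    mu = (e_homo + e_lumo) / 2          # chemical potential
    eta = (e_lumo - e_homo) / 2         # chemical hardness
    omega = mu ** 2 / (2 * eta)         # electrophilicity index
    return mu, eta, omega

mu, eta, omega = reactivity_indices(-6.5, -1.0)   # illustrative energies
print(f"mu = {mu:.2f} eV, eta = {eta:.2f} eV, omega = {omega:.2f} eV")
```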
Procedia PDF Downloads 307
1231 Environmental Impact of a New-Build Educational Building in England: Life-Cycle Assessment as a Method to Calculate Whole Life Carbon Emissions
Authors: Monkiz Khasreen
Abstract:
In the context of the global trend towards reducing the carbon footprint of new buildings, the design team is required to make early decisions that have a major influence on embodied and operational carbon. Sustainability strategies should be clear during the early stages of the building design process, as changes made later can be extremely costly. Life-Cycle Assessment (LCA) could be used as the vehicle to carry other tools and processes towards achieving the requested improvement. Although LCA is the 'gold standard' for evaluating buildings from cradle to grave, the lack of detail available at the concept design stage makes LCA very difficult, if not impossible, to use as an estimation tool at early stages. Issues related to transparency and accessibility of information in the building industry are affecting the credibility of LCA studies. A verified database derived from LCA case studies needs to be accessible to researchers, design professionals, and decision makers in order to offer guidance on specific areas of significant impact. This database could be built up from data from multiple sources within a pool of research held in this context. One of the most important factors affecting the reliability of such data is the temporal factor, as building materials, components, and systems are changing rapidly with the advancement of technology, making production more efficient and less environmentally harmful. Recent LCA studies on different building functions, types, and structures are always needed to update databases derived from research and to form case bases for comparison studies. There is also a need to make these studies transparent and accessible to designers. The work in this paper sets out to address this need. The paper presents a life-cycle case study of a new-build educational building in England. The building utilised very current construction methods and technologies and is rated BREEAM Excellent. The carbon emissions of the different life-cycle stages and of the different building materials and components were modelled. Scenario and sensitivity analyses were used to estimate the future of new educational buildings in England, and the study attempts to form an indicator for the early design stages of similar buildings. The carbon dioxide emissions of this case study building, when normalised by floor area, lie towards the lower end of the range of worldwide data reported in the literature. The sensitivity analysis shows that life-cycle assessment results are highly sensitive to future assumptions made at the design stage, such as future changes in the electricity generation mix over time, refurbishment processes, and recycling. The analyses also show that large savings in carbon dioxide emissions can result from very small changes at the design stage.
Keywords: architecture, building, carbon dioxide, construction, educational buildings, England, environmental impact, life-cycle assessment
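A whole-life carbon figure normalised by floor area can be sketched as below; the EN 15978-style stage labels are a standard convention, but the stage values and floor area are placeholders, not the case-study inventory.

```python
# Whole-life carbon sketch: sum stage emissions and normalise by floor
# area. All numbers are illustrative placeholders (kgCO2e).
stages_kgCO2e = {
    "product (A1-A3)": 850_000,
    "construction (A4-A5)": 90_000,
    "use / operational (B)": 2_400_000,
    "end of life (C)": 120_000,
}
floor_area_m2 = 5_000
total = sum(stages_kgCO2e.values())
print(f"whole-life: {total / floor_area_m2:.0f} kgCO2e per m2 of floor area")
```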
Procedia PDF Downloads 113
1230 Assessment of Obesity Parameters in Terms of Metabolic Age above and below Chronological Age in Adults
Authors: Orkide Donma, Mustafa M. Donma
Abstract:
The chronological age (CA) of individuals is closely related to obesity and generally affects the magnitude of obesity parameters. On the other hand, the close association between basal metabolic rate (BMR) and metabolic age (MA) is also a matter of concern. It is suggested that an MA higher than the CA indicates a need to improve the metabolic rate. In this study, the aim was to assess some commonly used obesity parameters, such as obesity degree, visceral adiposity, BMR, and BMR-to-weight ratio, in several groups with varying differences between MA and CA values. The study comprises adults whose ages vary between 18 and 79 years. Four groups were constituted. Groups 1, 2, 3, and 4 were composed of 55, 33, 76, and 47 adults, respectively. Individuals exhibiting MA-CA values of -1, 0, or +1 were placed in Group 1, which was considered the control group. Those whose MA-CA values varied between -5 and -10 participated in Group 2. Those whose MAs were above their real ages were divided into two groups [Group 3 (MA-CA from +5 to +10) and Group 4 (MA-CA from +11 to +12)]. Body mass index (BMI) values were calculated. A TANITA body composition monitor using bioelectrical impedance analysis technology was used to obtain values for obesity degree, visceral adiposity, BMR, and BMR-to-weight ratio. The compiled data were evaluated statistically using the SPSS statistical package program. Mean ± SD values were determined and correlation analyses were performed; p < 0.05 was accepted as the level of statistical significance. The increase in BMR was positively correlated with obesity degree. The MAs and CAs of the groups were 39.9 ± 16.8 vs 39.9 ± 16.7 years for Group 1, 45.0 ± 15.3 vs 51.4 ± 15.7 years for Group 2, 47.2 ± 12.7 vs 40.0 ± 12.7 years for Group 3, and 53.6 ± 14.8 vs 42.0 ± 14.8 years for Group 4. The BMI values of the groups were 24.3 ± 3.6 kg/m2, 23.2 ± 1.7 kg/m2, 30.3 ± 3.8 kg/m2, and 40.1 ± 5.1 kg/m2 for Groups 1, 2, 3, and 4, respectively. Values obtained for BMR were 1599 ± 328 kcal in Group 1, 1463 ± 198 kcal in Group 2, 1652 ± 350 kcal in Group 3, and 1890 ± 360 kcal in Group 4. A correlation was observed between BMR and MA-CA values in Group 1; no correlation was detected in the other groups. On the other hand, statistically significant correlations between MA-CA values and obesity degree, BMI, and BMR-to-weight ratio were found in Group 3 and in Group 4. It was concluded that, upon consideration of these findings in terms of MA-CA values, the BMR-to-weight ratio is a much more useful indicator of a severe increase in obesity development than BMR. Also, the lack of association between MA and BMR, as well as between MA and the BMR-to-weight ratio, emphasizes that MA-CA values should be considered rather than MA alone.
Keywords: basal metabolic rate, basal metabolic rate-to-weight ratio, chronologic age, metabolic age, obesity degree
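For readers who wish to reproduce this kind of grouping and correlation analysis, a minimal sketch follows, using pandas/scipy in place of SPSS; the records are fabricated placeholders, not study data.

```python
# Minimal sketch of the MA-CA grouping and correlation analysis described
# above. The records below are fabricated placeholders, not study data.

import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "metabolic_age": [35, 50, 47, 55, 40, 62],
    "chronologic_age": [36, 57, 40, 43, 40, 50],
    "bmr_kcal": [1600, 1450, 1700, 1900, 1580, 1850],
    "weight_kg": [70, 62, 88, 115, 69, 105],
})
df["ma_ca"] = df["metabolic_age"] - df["chronologic_age"]
df["bmr_per_kg"] = df["bmr_kcal"] / df["weight_kg"]

def assign_group(ma_ca):
    """Group definitions taken from the study's MA-CA ranges."""
    if -1 <= ma_ca <= 1:
        return 1            # control group
    if -10 <= ma_ca <= -5:
        return 2
    if 5 <= ma_ca <= 10:
        return 3
    if 11 <= ma_ca <= 12:
        return 4
    return None             # outside the study's group definitions

df["group"] = df["ma_ca"].apply(assign_group)
r, p = pearsonr(df["ma_ca"], df["bmr_per_kg"])
print(f"MA-CA vs BMR/weight: r = {r:.2f}, p = {p:.3f} (significant if p < 0.05)")
```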
Procedia PDF Downloads 98
1229 Mathematics Bridging Theory and Applications for a Data-Driven World
Authors: Zahid Ullah, Atlas Khan
Abstract:
In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.
Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models
Procedia PDF Downloads 77
1228 Hygro-Thermal Modelling of Timber Decks
Authors: Stefania Fortino, Petr Hradil, Timo Avikainen
Abstract:
Timber bridges have an excellent environmental performance, are economical and relatively easy to build, and can have a long service life. However, the durability of these bridges is the main problem because of their exposure to outdoor climate conditions. Moisture content accumulated in wood over long periods, in combination with certain temperatures, may create conditions suitable for timber decay. In addition, moisture content variations affect the structural integrity, serviceability, and loading capacity of timber bridges. Therefore, monitoring the moisture content in wood is important not only for the durability of the material but also for the whole superstructure. The measurements obtained by the usual sensor-based techniques provide hygro-thermal data only at specific locations of the wood components. In this context, monitoring can be assisted by numerical modelling to get more information on the hygro-thermal response of the bridges. This work presents a hygro-thermal model based on a multi-phase moisture transport theory to predict the distribution of moisture content, relative humidity, and temperature in wood. Below the fibre saturation point, the multi-phase theory simulates three phenomena in cellular wood during moisture transfer: the diffusion of water vapour in the pores, the sorption of bound water, and the diffusion of bound water in the cell walls. In the multi-phase model, the two water phases are separated, and the coupling between them is defined through a sorption rate. Furthermore, an average of the temperature-dependent adsorption and desorption isotherms is used. In previous works by some of the authors, this approach was found to be very suitable for studying the moisture transport in uncoated and coated stress-laminated timber decks. Compared to those works, the hygro-thermal fluxes on the external surfaces here include the influence of the absorbed solar radiation over time; consequently, the temperatures on the surfaces exposed to the sun are higher, which affects the whole hygro-thermal response of the timber component. The multi-phase model, implemented in a user subroutine of the Abaqus FEM code, provides the distribution of the moisture content, temperature, and relative humidity in a volume of the timber deck. As a case study, hygro-thermal data are collected from the ongoing monitoring of the stress-laminated timber deck of the Tapiola Bridge in Finland, based on integrated humidity-temperature sensors, and the numerical results are found to be in good agreement with the measurements. The proposed model, used to assist the monitoring, can contribute to reducing the maintenance costs of bridges as well as the cost of instrumentation, and can increase safety.
Keywords: moisture content, multi-phase models, solar radiation, timber decks, FEM
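To illustrate the coupling idea in the multi-phase theory, the sketch below solves a deliberately simplified 1D version: bound water and vapour each diffuse, and a sorption rate term transfers mass between them. All material parameters and the linear sorption relation are hypothetical placeholders; the actual model uses temperature-dependent isotherms inside an Abaqus user subroutine.

```python
# Minimal 1D explicit finite-difference sketch of two-phase moisture
# transport (bound water + water vapour) coupled by a sorption rate term.
# Parameters and the linear sorption relation are hypothetical placeholders.

import numpy as np

NX, DX, DT, STEPS = 50, 1e-3, 0.4, 20_000   # cells, m, s, time steps
D_B, D_V = 3e-11, 1e-6                      # bound-water / vapour diffusivity (m2/s)
K_SORP = 1e-4                               # sorption rate coefficient (1/s)
ALPHA = 50.0                                # hypothetical linear sorption slope

c_b = np.full(NX, 20.0)    # bound-water concentration (kg/m3)
c_v = np.full(NX, 0.2)     # vapour concentration (kg/m3)

for _ in range(STEPS):
    c_v[0] = c_v[-1] = 0.5                      # wetter outdoor air at both surfaces
    lap_b = (np.roll(c_b, 1) - 2*c_b + np.roll(c_b, -1)) / DX**2
    lap_v = (np.roll(c_v, 1) - 2*c_v + np.roll(c_v, -1)) / DX**2
    sorption = K_SORP * (ALPHA * c_v - c_b)     # vapour -> bound-water transfer
    c_b += DT * (D_B * lap_b + sorption)
    c_v += DT * (D_V * lap_v - sorption)
    c_b[0], c_b[-1] = c_b[1], c_b[-2]           # no bound-water flux at surfaces

print(f"bound water: surface {c_b[1]:.2f}, core {c_b[NX // 2]:.2f} kg/m3")
```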
Procedia PDF Downloads 176
1227 Ecological and Historical Components of the Cultural Code of the City of Florence as Part of the Edutainment Project Velonotte International
Authors: Natalia Zhabo, Sergey Nikitin, Marina Avdonina, Mariya Nikitina
Abstract:
This paper analyses one of the events of the international educational and entertainment project Velonotte: an evening bicycle tour with children around Florence. The aim of the project is to develop methods and techniques for increasing the sensitivity of the cycling participants and of listeners of the radio broadcasts to the treasures of the national heritage, in this case to the historical layers of the city and the ecology of the Renaissance epoch. The block of educational tasks is considered, and the issues of preserving the identity of the city are discussed. Methods: The Florentine event was prepared over more than a year. First, the creative team selected those events from the city's history that seemed important for revealing the specific character and spirit of the city, from antiquity to the present day, drawing also on Internet forums reflecting broad public opinion. Then a seven-kilometre route was developed and proposed to the authorities and organizations of the city. Speakers were selected according to several criteria: they should be authors of books, famous scientists, or connoisseurs of a particular sphere (toponymy, history of urban gardens, art history), able and willing to talk with participants directly at the stopping points, so that a dialogue could take place and performances could be organized with their participation. Music was chosen for each part of the itinerary to prepare the audience emotionally. Colouring cards with images of the main content of each stop were created for children. A website was created to inform the participants and to keep the photos, videos, and audio files of the speakers' talks afterwards. Results: Held in April 2017, the event was dedicated to the 640th anniversary of Filippo Brunelleschi, the Florentine architect, and to the 190th anniversary of the publication of Stendhal's guide to Florence. It was supported by the City of Florence and the Florence Bike Festival. Florence was explored to transmit traditional elements of culture, some unfairly forgotten, from ancient times through Brunelleschi and Michelangelo to Tchaikovsky and David Bowie, with lectures by university professors. Memorable art boards were installed in public spaces. Elements of the cultural code are deeply internalized in the minds of the townspeople; the perception of the city in everyday life and human communication is comparable to such fundamental concepts of the townspeople's self-awareness as mental comfort and the level of happiness. The format of a fun and playful walk with ICT support gives new opportunities for enriching each citizen's cultural code of the city with new components, associations, and connotations.
Keywords: edutainment, cultural code, cycling, sensitization, Florence
Procedia PDF Downloads 221
1226 Design of a Mhealth Therapy Management to Maintain Therapy Outcomes after Bariatric Surgery
Authors: A. Dudek, P. Tylec, G. Torbicz, P. Duda, K. Proniewska, P. Major, M. Pedziwiatr
Abstract:
Background: Conservative treatments of obesity, based only on a proper diet and physical activity without the support of an interdisciplinary team of specialists, do not bring satisfactory bariatric results. Long-term maintenance of proper metabolic results after the rapid weight loss due to bariatric surgery requires engagement from patients. Mobile health tools may offer an alternative model that enhances participant engagement in keeping up the therapy. Objective: We aimed to assess the influence of constant monitoring and subsequent motivational alerts in the perioperative period on post-operative effects in bariatric patients. The study was also designed to identify factors conducive to lifestyle change after surgery. Methods: This prospective clinical control study was based on the use of a designed prototype of a bariatric mHealth system. The application comprises central data management with a comprehensible interface dedicated to patients, and a data-transfer module serving as the physician's platform. The motivation system of the platform consists of motivational alerts, graphical presentation of outcomes, and a patient communication center. A generated list of patients requiring urgent consultation and the possibility of constant contact with a specialist provide a safety zone. 31 patients were enrolled in the continuous monitoring program over a 6-month period, along with typical follow-up visits. After one year of follow-up, all patients were examined. Results: There were 20 active users of the proposed monitoring system during the entire duration of the study. After six months, 24 patients took part in a follow-up telephone questionnaire. Among them, 75% confirmed that the application concept was an important element in the treatment. Active users of the application indicated as the most valuable features: motivation to continue treatment (11 users), graphical presentation of weight loss and other parameters (7 users), and the ability to contact a doctor (3 users). The three main drawbacks were technical errors (9 users), tedious questionnaires inside the application (5 users), and time-consuming tasks inside the system (2 users). Conclusions: Constant monitoring with successive motivational alerts to continue treatment is an appropriate tool in the treatment after bariatric surgery, mainly in the early post-operative period. Graphical presentation of data and a continuous connection with clinical staff appeared to motivate patients to continue treatment and provided a sense of security.
Keywords: bariatric surgery, mHealth, mobile health tool, obesity
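A minimal sketch of what the safety-zone logic could look like, flagging patients for the urgent-consultation list from monitored weight trends, is shown below; the threshold rule and records are hypothetical, not the prototype's actual rules.

```python
# Minimal sketch of a safety-zone rule: flag patients whose monitored
# weight trend after surgery suggests an urgent consultation is needed.
# Threshold and records are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    weekly_weights_kg: list   # most recent reading last

def needs_urgent_consultation(p: Patient, regain_threshold_kg: float = 2.0) -> bool:
    """Flag sustained weight regain over the last three weekly readings."""
    if len(p.weekly_weights_kg) < 3:
        return False          # not enough data yet
    w = p.weekly_weights_kg
    return (w[-1] - min(w[-3:])) >= regain_threshold_kg

patients = [
    Patient("A", [120, 116, 112, 109, 107]),    # steady loss: no flag
    Patient("B", [118, 113, 110, 111, 113.5]),  # regain: flagged
]
urgent = [p.name for p in patients if needs_urgent_consultation(p)]
print("urgent consultation list:", urgent)
```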
Procedia PDF Downloads 113
1225 Computation of Radiotherapy Treatment Plans Based on CT to ED Conversion Curves
Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić
Abstract:
Radiotherapy treatment planning computers use CT data of the patient. For the computation of a treatment plan, the treatment planning system must have information on the electron densities of the tissues scanned by CT. This information is given by the CT number to electron density (CT to ED) conversion curve, or simply the calibration curve. Every treatment planning system (TPS) has built-in default CT to ED conversion curves for the CTs of different manufacturers. However, it is always recommended to verify the CT to ED conversion curve before actual clinical use. The objective of this study was to check how well the default curve provided matches the curve actually measured on a specific CT, and how much the difference influences the calculations of the treatment planning computer. The examined CT scanners were from the same manufacturer, but were four different scanners from three generations. The measurements of all calibration curves were done with the dedicated CIRS 062M Electron Density Phantom. The phantom was scanned and, from the real HU values read at the CT console computer, CT to ED conversion curves were generated for the different materials at the same tube voltage of 140 kV. Another phantom, the CIRS Thorax 002 LFC, which represents an average human torso in proportion, density, and two-dimensional structure, was used for verification. Treatment planning was done on CT slices of the scanned CIRS 002 LFC phantom for selected cases. Interest points were set in the lungs and in the spinal cord, and doses were recorded in the TPS. The overall calculated treatment times for the four scanners and the default scanner did not differ by more than 0.8%. The overall interest-point dose in bone differed by at most 0.6%, while for single fields the maximum was 2.7% (lateral field). The overall interest-point dose in the lungs differed by at most 1.1%, while for single fields the maximum was 2.6% (lateral field). It is known that the user should verify the CT to ED conversion curve, but developing countries often face a lack of QA equipment and often use the default data provided. We have concluded that the CT to ED curves obtained differ at certain points of the curve, generally in the region of higher densities. The influence on the treatment planning result is not significant, but it definitely does make a difference in the calculated dose.
Keywords: computation of treatment plan, conversion curve, radiotherapy, electron density
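As an illustration of the calibration workflow, a minimal sketch follows that builds a piecewise-linear HU-to-ED curve from phantom-insert readings and compares it with a default curve; all HU/ED pairs are hypothetical placeholders, not the values measured in this study.

```python
# Minimal sketch: build a CT-to-ED calibration curve from phantom inserts
# and compare it against a default curve via piecewise-linear interpolation.
# All HU/ED pairs below are hypothetical placeholders.

import numpy as np

# (HU read at the console, relative electron density) per phantom insert
measured = np.array([
    [-1000, 0.00],   # air
    [ -780, 0.20],   # lung (inhale)
    [ -490, 0.50],   # lung (exhale)
    [  -60, 0.96],   # adipose
    [    0, 1.00],   # water
    [  230, 1.10],   # trabecular bone
    [ 1200, 1.70],   # dense bone
])
default = measured.copy()
default[-2:, 1] += 0.03          # default curve deviates in the high-density region

def ed_from_hu(hu, curve):
    """Piecewise-linear interpolation of electron density from a HU value."""
    return np.interp(hu, curve[:, 0], curve[:, 1])

for hu in (-700, 40, 900):
    ed_m, ed_d = ed_from_hu(hu, measured), ed_from_hu(hu, default)
    print(f"HU {hu:5d}: ED measured {ed_m:.3f} vs default {ed_d:.3f} "
          f"({100 * (ed_d - ed_m) / ed_m:+.1f}%)")
```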
Procedia PDF Downloads 490
1224 Enhancing Healthcare Delivery in Low-Income Markets: An Exploration of Wireless Sensor Network Applications
Authors: Innocent Uzougbo Onwuegbuzie
Abstract:
Healthcare delivery in low-income markets is fraught with numerous challenges, including limited access to essential medical resources, inadequate healthcare infrastructure, and a significant shortage of trained healthcare professionals. These constraints lead to suboptimal health outcomes and a higher incidence of preventable diseases. This paper explores the application of Wireless Sensor Networks (WSNs) as a transformative solution to enhance healthcare delivery in these underserved regions. WSNs, comprising spatially distributed sensor nodes that collect and transmit health-related data, present opportunities to address critical healthcare needs. Leveraging WSN technology facilitates real-time health monitoring and remote diagnostics, enabling continuous patient observation and early detection of medical issues, especially in areas with limited healthcare facilities and professionals. The implementation of WSNs can enhance the overall efficiency of healthcare systems by enabling timely interventions, reducing the strain on healthcare facilities, and optimizing resource allocation. This paper highlights the potential benefits of WSNs in low-income markets, such as cost-effectiveness, increased accessibility, and data-driven decision-making. However, deploying WSNs involves significant challenges, including technical barriers like limited internet connectivity and power supply, alongside concerns about data privacy and security. Moreover, robust infrastructure and adequate training for local healthcare providers are essential for successful implementation. The paper further examines future directions for WSNs, emphasizing innovation, scalable solutions, and public-private partnerships. By addressing these challenges and harnessing the potential of WSNs, it is possible to revolutionize healthcare delivery and improve health outcomes in low-income markets.
Keywords: wireless sensor networks (WSNs), healthcare delivery, low-income markets, remote patient monitoring, health data security
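As a concrete illustration of the monitoring pattern described, a minimal sketch follows in which distributed nodes report vital signs and a sink node flags out-of-range readings for a remote clinician; node behaviour, thresholds, and values are hypothetical placeholders.

```python
# Minimal sketch of a WSN-style monitoring loop: sensor nodes report vital
# signs and the sink raises an alert when a reading leaves its safe range.
# Node IDs, thresholds, and readings are hypothetical placeholders.

import random

SAFE_RANGES = {"heart_rate_bpm": (50, 110), "spo2_pct": (92, 100)}

def read_sensor(node_id: int) -> dict:
    """Stand-in for a real sensor read; returns one measurement frame."""
    return {
        "node": node_id,
        "heart_rate_bpm": random.gauss(80, 15),
        "spo2_pct": random.gauss(96, 2.5),
    }

def triage(frame: dict) -> list:
    """Return the vitals in this frame that fall outside the safe range."""
    alerts = []
    for vital, (lo, hi) in SAFE_RANGES.items():
        if not (lo <= frame[vital] <= hi):
            alerts.append(vital)
    return alerts

for node in range(1, 6):                 # five spatially distributed nodes
    frame = read_sensor(node)
    flagged = triage(frame)
    if flagged:
        print(f"node {node}: ALERT on {flagged} -> notify remote clinician")
```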
Procedia PDF Downloads 39
1223 Local Farmer’s Perception on the Role of Room for the River in Livelihoods: Case Study in An Phu District, An Giang Province, Vietnam
Authors: Hoang Vo Thi Minh, Duyen Nguyen Thi Phuong, Gerardo Van Halsema
Abstract:
As one of the deltas most vulnerable to climate change, the Mekong Delta, Vietnam, is facing many challenges that need to be addressed in strategic and holistic ways. Within the scope of this study, strategic delta planning is considered a new vision of Adaptive Delta Management for the Mekong Delta. In Adaptive Delta Management, Room for the River (RftR) has been formulated as a typical innovation, which currently needs careful consideration before implementation in the Mekong Delta's planning process. This study investigates the roles of the RftR and analyses its sociological aspects as a potential strategic 'soft' measure in the upstream reaches of the Hau River in An Phu district, An Giang province, especially in terms of its multiple functions. The study applied a social science approach combining qualitative methods, including in-depth interviews and questionnaires, with conjoint analysis as a quantitative approach. The former mainly aims at eliciting the local community's perceptions of the RftR solution; the latter elicits farmers' willingness to accept (WTA), reflecting their level of preference towards the three solutions considered as strategic plans for sustainable development of the Mekong Delta. Qualitative data analysis shows that farmers perceive the RftR as very useful for their livelihoods, owing to its multiple functions as well as its role in water management. The quantitative results show that respondents expressed a WTA for the RftR of 84.240 thousand VND per year. Among the three solutions analysed within this study (floating rice for the upper delta, Room for the River for the middle delta, and shrimp-mangrove integration for the coastal delta), the RftR was ranked as the respondents' second preference. This result does not exactly reflect the real values of the three solutions, but it shows a tendency that can serve as a reference for decision-makers in delta planning processes.
Keywords: strategic delta planning, Room for the River, farmers' perception, willingness to accept, local livelihoods
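A common way to turn conjoint part-worths into a WTA figure is to divide an attribute's utility by the marginal utility of money; the sketch below illustrates that calculation with hypothetical coefficients, not the study's estimates.

```python
# Minimal sketch: deriving willingness-to-accept (WTA) figures from
# conjoint part-worths by dividing each plan's utility by the marginal
# utility of compensation. All coefficients are hypothetical placeholders.

part_worths = {           # utilities from a fitted choice/rating model
    "floating_rice": 0.42,
    "room_for_the_river": 0.38,
    "shrimp_mangrove": 0.25,
}
utility_per_thousand_vnd = 0.0045   # marginal utility of compensation

wta = {plan: u / utility_per_thousand_vnd for plan, u in part_worths.items()}
for plan, value in sorted(wta.items(), key=lambda kv: -kv[1]):
    print(f"{plan:20s} WTA ~ {value:7.1f} thousand VND/year")
```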
Procedia PDF Downloads 228
1222 Improving Binding Selectivity in Molecularly Imprinted Polymers from Templates of Higher Biomolecular Weight: An Application in Cancer Targeting and Drug Delivery
Authors: Ben Otange, Wolfgang Parak, Florian Schulz, Michael Alexander Rubhausen
Abstract:
This research demonstrates the feasibility of extending the molecular imprinting technique to complex biomolecules. The technique is promising in diverse areas such as drug delivery, disease diagnosis, catalysis, impurity detection, and the treatment of various complications. While molecularly imprinted polymers (MIPs) are a robust way to synthesize materials with remarkable binding sites that have high affinities for specific molecules of interest, extending their use to complex biomolecules has so far proved largely unsuccessful. This work reports the successful synthesis of MIPs from complex proteins: BSA, transferrin, and MUC1. We show that, despite the heterogeneous binding sites and higher conformational flexibility of the chosen proteins, relying on their respective epitopes and motifs rather than the whole template produces highly sensitive and selective MIPs for specific molecular binding. Introduction: Proteins are vital in most biological processes, ranging from cell structure and structural integrity to complex functions such as transport and immunity in biological systems. Unlike other imprinting templates, proteins have heterogeneous binding sites in their complex long-chain structure, which makes their imprinting challenging. In addressing this challenge, our attention is directed toward targeted delivery, which uses molecular imprinting on the particle surface so that the particles recognize proteins overexpressed on target cells. Our goal is thus to make nanoparticle surfaces that specifically bind to the target cells. Results and Discussion: Using epitopes of the BSA and MUC1 proteins, and motifs with conserved receptors of transferrin, as the respective templates for the MIPs, a significant improvement in MIP sensitivity to the binding of complex protein templates was noted. Through Fluorescence Correlation Spectroscopy (FCS) measurements of the protein corona size after incubating the synthesized nanoparticles with proteins, we noted a high affinity of the MIPs for their respective complex proteins. In addition, quantitative analysis of the hard corona using SDS-PAGE showed that only the specific protein was strongly bound on the respective MIPs when incubated with similar concentrations of the protein mixture. Conclusion: Our findings have shown that the merits of MIPs can be extended to complex molecules of higher biomolecular mass. As such, the unique merits of the technique, including high sensitivity and selectivity, relative ease of synthesis, production of materials with higher physical robustness, and higher stability, can be extended to templates that were previously not suitable candidates despite their abundance and roles within the body.
Keywords: molecularly imprinted polymers, specific binding, drug delivery, high-biomolecular-mass templates
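Selectivity claims of this kind are conventionally quantified by an imprinting factor (binding on the MIP versus a non-imprinted control) and a selectivity coefficient (template versus competitor protein). A minimal sketch with hypothetical binding amounts, not measurements from this work, follows.

```python
# Minimal sketch of standard MIP figures of merit: the imprinting factor
# (IF) compares binding on the imprinted polymer to a non-imprinted
# control, and the selectivity coefficient compares the template protein
# to a competitor. Binding amounts are hypothetical placeholders.

def imprinting_factor(q_mip: float, q_nip: float) -> float:
    """IF = amount bound on MIP / amount bound on non-imprinted polymer."""
    return q_mip / q_nip

def selectivity(q_template: float, q_competitor: float) -> float:
    """Selectivity of the MIP for its template over a competitor protein."""
    return q_template / q_competitor

# Hypothetical bound amounts (mg protein / g polymer) after incubation:
q = {
    "MUC1_on_MIP": 18.0,       # template protein on the imprinted polymer
    "MUC1_on_NIP": 4.5,        # template protein on the control polymer
    "BSA_on_MUC1_MIP": 3.0,    # competitor protein on the imprinted polymer
}

print(f"imprinting factor: {imprinting_factor(q['MUC1_on_MIP'], q['MUC1_on_NIP']):.1f}")
print(f"selectivity MUC1/BSA: {selectivity(q['MUC1_on_MIP'], q['BSA_on_MUC1_MIP']):.1f}")
```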
Procedia PDF Downloads 55