Search results for: ML techniques
5386 [Keynote Talk]: Discovering Liouville-Type Problems for p-Energy Minimizing Maps in Closed Half-Ellipsoids by Calculus Variation Method
Authors: Lina Wu, Jia Liu, Ye Li
Abstract:
The goal of this project is to investigate constant properties (called the Liouville-type problem) for a p-stable map as a local or global minimum of a p-energy functional, where the domain is a Euclidean space and the target space is a closed half-ellipsoid. The first and second variation formulas for a p-energy functional have been applied in the calculus variation method as computation techniques. Stokes' Theorem, the Cauchy-Schwarz Inequality, Hardy-Sobolev type inequalities, and the Bochner Formula have been used as estimation techniques to estimate the lower and upper bounds of the derived p-harmonic stability inequality. One challenging point in this project is to construct a family of variation maps such that the images of the variation maps are guaranteed to lie in a closed half-ellipsoid. The other challenging point is to find a contradiction between the lower bound and the upper bound in an analysis of the p-harmonic stability inequality when a p-energy minimizing map is not constant. Therefore, the possibility of a non-constant p-energy minimizing map has been ruled out, and the constant property for a p-energy minimizing map has been obtained. Our research finding is the constant property for a p-stable map from a Euclidean space into a closed half-ellipsoid in a certain range of p. This range of p is determined by the dimension values of the Euclidean space (the domain) and the ellipsoid (the target space). The range of p is also bounded by the curvature values on the ellipsoid (that is, the ratio of the longest axis to the shortest axis). Regarding Liouville-type results for a p-stable map, our research finding on an ellipsoid is a generalization of mathematicians' results on a sphere.
Our result is also an extension of mathematicians' Liouville-type results from a special ellipsoid with only one parameter to any ellipsoid with (n+1) parameters in the general setting.
Keywords: Bochner formula, calculus variation method, Stokes' Theorem, Cauchy-Schwarz Inequality, first and second variation formulas, Liouville-type problem, p-harmonic map
Procedia PDF Downloads 273
5385 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM and another classic reliability method, the first-order reliability method (FORM).
The results in the present study are in good agreement with those computed with the MCS. Therefore, the alternative of mixing the reliability methods is a very valuable option to determine reliability levels when no closed form of the LSF is available, or if numerical schemes, the FEM or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation
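The two-point estimate method the abstract contrasts with MCS can be sketched in a few lines. The resistance/load numbers below are hypothetical, and for the linear LSF g = R - S used here the PEM moments coincide with the exact ones, so the FORM-style index can be checked by hand:

```python
from itertools import product
import math

def rosenblueth_pem(g, means, stds):
    """Rosenblueth's two-point estimate method: evaluate the limit state
    function g at all 2^n combinations of mean +/- one standard deviation
    and average the results to approximate the first two moments of g
    (equal weights are valid for uncorrelated, symmetric variables)."""
    n = len(means)
    points = list(product([-1.0, 1.0], repeat=n))
    w = 1.0 / len(points)
    vals = [g(*[m + s * d for m, s, d in zip(means, stds, dirs)])
            for dirs in points]
    mean_g = sum(vals) * w
    var_g = sum((v - mean_g) ** 2 for v in vals) * w
    return mean_g, math.sqrt(var_g)

# Hypothetical girder check: g = R - S (resistance minus load effect)
mu, sigma = rosenblueth_pem(lambda r, s: r - s,
                            means=[3000.0, 2000.0], stds=[300.0, 400.0])
beta = mu / sigma  # FORM-style reliability index from the PEM moments
print(round(beta, 3))  # -> 2.0 (exact for this linear LSF: 1000/500)
```

For a nonlinear LSF (e.g., one evaluated through an FEM run), `g` would simply wrap the numerical model, which is the appeal of the method: only 2^n deterministic evaluations are needed instead of thousands of simulations.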
Procedia PDF Downloads 344
5384 Adaptive Beamforming with Steering Error and Mutual Coupling between Antenna Sensors
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
Owing to the close spacing between antenna sensors within a compact space, part of the data in one antenna sensor leaks into the other antenna sensors when the sensors in an antenna array operate simultaneously. This phenomenon is called the mutual coupling effect (MCE). It has been shown that the performance of antenna array systems can be degraded when the antenna sensors are in close proximity. In particular, in systems equipped with massive numbers of antenna sensors, degradation of beamforming performance due to the MCE is inevitable. Moreover, it has been shown that even a small angle error between the true direction angle of the desired signal and the steering angle deteriorates the effectiveness of an array beamforming system. However, the true direction vector of the desired signal may not be exactly known in some applications, e.g., in land mobile-cellular wireless systems. Therefore, it is worth developing robust techniques to deal with the problems due to the MCE and steering angle error in array beamforming systems. In this paper, we present an efficient technique for performing adaptive beamforming that is robust against both the MCE and the steering angle error. Only the data vector received by the antenna array is required by the proposed technique. Using the received array data vector, a correlation matrix is constructed to replace the original correlation matrix associated with the received array data vector. Then, the mutual coupling matrix due to the MCE on the antenna array is estimated through a recursive algorithm. An appropriate estimate of the direction angle of the desired signal can also be obtained during the recursive process. Based on the estimated mutual coupling matrix, the estimated direction angle, and the reconstructed correlation matrix, the proposed technique can effectively cure the performance degradation due to steering angle error and the MCE.
The novelty of the proposed technique is that the implementation procedure is very simple and the resulting adaptive beamforming performance is satisfactory. Simulation results show that the proposed technique provides much better beamforming performance, without requiring high computational complexity, compared with existing robust techniques.
Keywords: adaptive beamforming, mutual coupling effect, recursive algorithm, steering angle error
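The correlation-matrix-based weighting such techniques build on can be illustrated with a standard MVDR (minimum-variance distortionless response) beamformer. This is a textbook baseline, not the authors' coupling-estimation algorithm; the array geometry, signal angles, and diagonal-loading level below are illustrative assumptions:

```python
import numpy as np

def steering_vector(n_sensors, theta_deg, spacing=0.5):
    # Uniform linear array response (half-wavelength spacing) for a
    # plane wave arriving from angle theta
    k = 2 * np.pi * spacing * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_sensors))

def mvdr_weights(R, a, loading=1e-3):
    # MVDR weights with diagonal loading for robustness:
    # w = R^{-1} a / (a^H R^{-1} a)
    n = len(a)
    Rl = R + loading * np.trace(R).real / n * np.eye(n)
    Ri_a = np.linalg.solve(Rl, a)
    return Ri_a / (a.conj() @ Ri_a)

rng = np.random.default_rng(0)
n, snapshots = 8, 2000
a_sig = steering_vector(n, 10.0)    # desired signal at 10 degrees
a_int = steering_vector(n, -40.0)   # strong interferer at -40 degrees
X = (a_sig[:, None] * rng.standard_normal(snapshots)
     + 5 * a_int[:, None] * rng.standard_normal(snapshots)
     + 0.1 * (rng.standard_normal((n, snapshots))
              + 1j * rng.standard_normal((n, snapshots))))
R = X @ X.conj().T / snapshots      # sample correlation matrix
w = mvdr_weights(R, a_sig)
print(round(abs(w.conj() @ a_sig), 6))  # -> 1.0 (distortionless constraint)
print(abs(w.conj() @ a_int) < 0.1)      # interferer strongly attenuated
```

The paper's contribution sits on top of this kind of formulation: when the steering angle and coupling matrix are uncertain, `a_sig` and `R` themselves must be estimated before the weights are computed.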
Procedia PDF Downloads 320
5383 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
The Philippines has long been considered a valuable producer of high value crops globally. The country's employment and economy have been dependent on agriculture, increasing its demand for efficient agricultural mechanisms. Remote sensing and geographic information technology have proven to effectively provide applications for precision agriculture through image-processing techniques, especially considering the development of aerial scanning technology in the country. Accurate information concerning the spatial correlation within the field is very important for precision farming of high value crops. The availability of height information and high spatial resolution images obtained from aerial scanning, together with the development of new image analysis methods, is influencing precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high value crops simultaneously through adaptive scaling of a support vector machine (SVM) algorithm subjected to an object-oriented approach, combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template matching algorithm procedures to demonstrate its effectiveness on a demanding tree counting, recognition and delineation problem. Since common data and image processing techniques are utilized, the algorithm can be easily implemented in production processes to cover large agricultural areas. The algorithm is tested on high value crops like palm, mango and coconut located in Misamis Oriental, Philippines, showing good performance, in particular for young adult and adult trees, significantly above 90%. The outputs can support crop inventories or database updating, allowing for the reduction of field work and manual interpretation tasks.
Keywords: high value crop, LiDAR, OBIA, precision agriculture
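The local maxima filter step of such a pipeline can be sketched as follows. The toy canopy height model and the 2 m height threshold are illustrative assumptions, not values from the study:

```python
def local_maxima(chm, min_height=2.0):
    """Detect candidate tree tops as grid cells that are higher than all
    8 neighbours and above a height threshold -- the local-maxima step
    that seeds tree counting (watershed segmentation would then grow
    crowns around each detected top)."""
    rows, cols = len(chm), len(chm[0])
    tops = []
    for i in range(rows):
        for j in range(cols):
            h = chm[i][j]
            if h < min_height:
                continue  # ignore ground and low vegetation
            neigh = [chm[a][b]
                     for a in range(max(0, i - 1), min(rows, i + 2))
                     for b in range(max(0, j - 1), min(cols, j + 2))
                     if (a, b) != (i, j)]
            if all(h > v for v in neigh):
                tops.append((i, j))
    return tops

# Toy canopy height model (metres) with two crowns over flat ground
chm = [
    [0.0, 0.5, 0.4, 0.2, 0.1],
    [0.3, 6.0, 1.0, 0.4, 0.2],
    [0.2, 1.2, 0.8, 7.5, 0.6],
    [0.1, 0.3, 0.5, 1.1, 0.4],
]
print(local_maxima(chm))  # -> [(1, 1), (2, 3)]: two trees detected
```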
Procedia PDF Downloads 398
5382 Comparison of Machine Learning-Based Models for Predicting Streptococcus pyogenes Virulence Factors and Antimicrobial Resistance
Authors: Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Diego Santibañez Oyarce, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
Streptococcus pyogenes is a gram-positive bacterium involved in a wide range of diseases and is a major human-specific bacterial pathogen. In Chile, this year the 'Ministerio de Salud' declared an alert due to the increase in strains throughout the year. This increase can be attributed to a multitude of factors, including antimicrobial resistance (AMR) and virulence factors (VF). Understanding these VF and AMR is crucial for developing effective strategies and improving public health responses. Moreover, experimental identification and characterization of these pathogenic mechanisms are labor-intensive and time-consuming. Therefore, new computational methods are required to provide robust techniques for accelerating this identification. Advances in machine learning (ML) algorithms represent an opportunity to refine and accelerate the discovery of VF associated with Streptococcus pyogenes. In this work, we evaluate the accuracy of various machine learning models in predicting the virulence factors and antimicrobial resistance of Streptococcus pyogenes, with the objective of providing new methods for identifying the pathogenic mechanisms of this organism. Our comprehensive approach involved the download of 32,798 GenBank files of S. pyogenes from the NCBI dataset, coupled with the incorporation of data from the Virulence Factor Database (VFDB) and the Comprehensive Antibiotic Resistance Database (CARD), which contain AMR gene sequences and resistance profiles. These datasets provided labeled examples of both virulent and non-virulent genes, enabling a robust foundation for feature extraction and model training. We employed preprocessing, characterization and feature extraction techniques on primary nucleotide/amino acid sequences and selected the optimal features for model training. The feature set was constructed using sequence-based descriptors (e.g., k-mers and one-hot encoding) and functional annotations based on database prediction.
The ML models compared are logistic regression, decision trees, support vector machines, and neural networks, among others. The results of this work show some differences in accuracy between the algorithms; these differences allow us to identify aspects that represent unique opportunities for a more precise and efficient characterization and identification of VF and AMR. This comparative analysis underscores the value of integrating machine learning techniques in predicting S. pyogenes virulence and AMR, offering potential pathways for more effective diagnostic and therapeutic strategies. Future work will focus on incorporating additional omics data, such as transcriptomics, and exploring advanced deep learning models to further enhance predictive capabilities.
Keywords: antibiotic resistance, Streptococcus pyogenes, virulence factors, machine learning
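The k-mer descriptors mentioned among the sequence-based features can be sketched in a few lines. The fragment below is a made-up example, not a VFDB/CARD record:

```python
from collections import Counter
from itertools import product

def kmer_features(seq, k=3, alphabet="ACGT"):
    """Fixed-length k-mer frequency vector for a nucleotide sequence --
    the kind of sequence-based descriptor fed to classifiers such as
    logistic regression or an SVM. The vector always has len(alphabet)**k
    entries, so sequences of different lengths become comparable."""
    vocab = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(1, len(seq) - k + 1)
    return [counts[kmer] / total for kmer in vocab]

# Hypothetical gene fragment; real inputs would be VFDB/CARD sequences
x = kmer_features("ATGCGATGCA", k=2)
print(len(x))            # -> 16 features for k=2 over ACGT
print(round(sum(x), 6))  # -> 1.0 (frequencies sum to one)
```

Each labeled gene becomes one such row; stacking rows gives the feature matrix on which the compared models are trained.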
Procedia PDF Downloads 29
5381 Comparative Study Performance of the Induction Motor between SMC and NLC Modes Control
Authors: A. Oukaci, R. Toufouti, D. Dib, l. Atarsia
Abstract:
This article presents alternative techniques to vector control, namely nonlinear control and sliding mode control, and the implementation of their control laws applied to a high-performance induction motor, with the objective of improving tracking control and ensuring stability, robustness to parameter variations, and disturbance rejection. Tests are performed through numerical simulations in the Matlab/Simulink environment; the results demonstrate the efficiency and dynamic performance of the proposed strategy.
Keywords: Induction Motor (IM), Non-linear Control (NLC), Sliding Mode Control (SMC), nonlinear sliding surface
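The sliding mode control law being compared can be illustrated on a scalar plant; this is a minimal sketch of the SMC principle, not the induction motor model, and the gains and disturbance below are illustrative, chosen so that the switching gain bounds the disturbance:

```python
import math

def simulate_smc(x0=2.0, x_ref=0.0, lam=4.0, eta=3.0, dt=0.001, steps=3000):
    """Sliding mode control of the scalar plant dx/dt = u + d(t).
    The switching term -eta*sign(s) dominates the bounded disturbance d
    (|d| < eta), which is the robustness property SMC is valued for."""
    x = x0
    for n in range(steps):
        s = x - x_ref                       # sliding surface: s = tracking error
        d = 1.5 * math.sin(10 * n * dt)     # bounded disturbance, |d| <= 1.5 < eta
        u = -lam * s - eta * (1 if s > 0 else -1 if s < 0 else 0)
        x += (u + d) * dt                   # forward-Euler plant update
    return x

# Despite the disturbance, the state is driven to (and held near) the surface
print(abs(simulate_smc()) < 0.01)  # -> True
```

The residual oscillation around the surface is the well-known chattering effect, on the order of (eta + |d|) * dt here; boundary-layer or higher-order variants are the usual remedies.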
Procedia PDF Downloads 570
5380 A New Paradigm to Make Cloud Computing Greener
Authors: Apurva Saxena, Sunita Gond
Abstract:
Demand for computation and large-scale data storage is rapidly increasing day by day. Cloud computing technology fulfills today's computational demand, but this leads to high power consumption in cloud data centers. Green IT initiatives try to reduce power consumption and its adverse environmental impacts. The paper also focuses on various green computing techniques, proposed models and efficient ways to make the cloud greener.
Keywords: virtualization, cloud computing, green computing, data center
Procedia PDF Downloads 550
5379 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla
Abstract:
Society demands more reliable manufacturing processes capable of producing high quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, in which Fault-Tolerant Control (FTC) plays a significant role. It is suitable for detecting, isolating and adapting a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is presented, highlighting the properties a system must ensure to be considered faultless. In addition, research to identify the main FTC techniques is presented, with a classification based on their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprises the algorithms robust enough to bypass the fault without further modifications. The mentioned re-configuration requires two stages, one focused on detection, isolation and identification of the fault source and the other in charge of re-designing the control algorithm through two approaches: fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system will respond when a fault arises in conditions similar to those a machine experiences on the factory floor. One AFTC approach has been selected as the methodology the system will follow in the fault recovery process. In a first instance, the fault will be detected, isolated and identified by means of a neural network.
In a second instance, the control algorithm will be re-configured to overcome the fault and continue working without human interaction.
Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time
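The detect-and-reconfigure loop of an AFTC can be caricatured on a toy integrator plant. This is a sketch of the idea, not the paper's neural network scheme; the threshold, gains, and fault magnitude are illustrative assumptions:

```python
def run_plant(faulty_from=None, steps=200, dt=0.05):
    """Toy active FTC loop: a residual between the measured state and a
    nominal-model prediction flags a loss-of-effectiveness actuator fault;
    the controller gain is then re-scaled (fault accommodation) and the
    model is re-aligned with the measurement."""
    x = x_model = 0.0
    ref, kp, act_gain = 1.0, 2.0, 1.0
    detected = False
    for n in range(steps):
        if faulty_from is not None and n >= faulty_from:
            act_gain = 0.4                 # actuator loses 60% effectiveness
        u = kp * (ref - x)
        x += act_gain * u * dt             # real (possibly faulty) plant
        x_model += u * dt                  # nominal model prediction
        if abs(x - x_model) > 0.1 and not detected:
            detected = True                # detection and isolation
            kp /= 0.4                      # accommodate the identified fault
            x_model = x                    # re-align the model
    return x, detected

x_ok, det_ok = run_plant()                  # healthy run: no false alarm
x_ftc, det_ftc = run_plant(faulty_from=10)  # fault injected, then accommodated
print(det_ok, det_ftc)                      # -> False True
print(abs(x_ftc - 1.0) < 0.05)              # reconfigured loop still tracks
```

In the paper's setting, the residual-thresholding line is replaced by the neural network performing detection, isolation and identification, and the gain re-scaling by the full control re-design stage.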
Procedia PDF Downloads 174
5378 Cross Site Scripting (XSS) Attack and Automatic Detection Technology Research
Authors: Tao Feng, Wei-Wei Zhang, Chang-Ming Ding
Abstract:
Cross-site scripting (XSS) is one of the most popular web attack methods at present, and also one of the riskiest web attacks. Because of the popularity of JavaScript, the scope of cross-site scripting attacks has also gradually expanded. However, since web application developers tend to focus only on functional testing and lack awareness of XSS, many online web projects contain XSS vulnerabilities. In this paper, various techniques of XSS attack are analyzed, and a method to detect them automatically is proposed. It is easy to check the results of vulnerability detection when running it as a plug-in.
Keywords: XSS, no target attack platform, automatic detection, XSS detection
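A signature-based check of the kind such detectors apply to request parameters can be sketched as follows; the pattern list is a small illustrative subset, not a complete rule set, and a real scanner would also verify whether the payload is reflected unescaped in the page output:

```python
import re

# Illustrative signature subset -- real detectors use far richer rules
XSS_PATTERNS = [
    re.compile(r"<\s*script\b", re.IGNORECASE),     # script tag injection
    re.compile(r"\bon\w+\s*=", re.IGNORECASE),      # inline event handlers
    re.compile(r"javascript\s*:", re.IGNORECASE),   # javascript: URLs
    re.compile(r"<\s*img[^>]*\bsrc\s*=", re.IGNORECASE),
]

def looks_like_xss(value):
    """Flag an input value whose content matches common XSS payload
    signatures -- the static-matching half of an automatic detector."""
    return any(p.search(value) for p in XSS_PATTERNS)

print(looks_like_xss("<script>alert(1)</script>"))        # -> True
print(looks_like_xss('<b onmouseover="alert(1)">x</b>'))  # -> True
print(looks_like_xss("plain search term"))                # -> False
```

Signature matching alone misses encoded or DOM-based variants, which is why the paper's approach injects test payloads and checks the rendered response rather than relying on static patterns only.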
Procedia PDF Downloads 400
5377 Mesoporous Material Nanofibers by Electrospinning
Authors: Sh. Sohrabnezhad, A. Jafarzadeh
Abstract:
In this paper, MCM-41 mesoporous material nanofibers were synthesized by an electrospinning technique. The nanofibers were characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), and nitrogen adsorption-desorption measurements. Tetraethyl orthosilicate (TEOS) and polyvinyl alcohol (PVA) were used as the silica source and fiber forming source, respectively. TEM and SEM images showed synthesis of MCM-41 nanofibers with a diameter of 200 nm. The pore diameter and surface area of the calcined MCM-41 nanofibers were 2.2 nm and 970 m²/g, respectively. The morphology of the MCM-41 nanofibers depended on the spinning voltages.
Keywords: electrospinning, electron microscopy, fiber technology, porous materials, X-ray techniques
Procedia PDF Downloads 246
5376 Investigation of Yard Seam Workings for the Proposed Newcastle Light Rail Project
Authors: David L. Knott, Robert Kingsland, Alistair Hitchon
Abstract:
The proposed Newcastle Light Rail is a key part of the revitalisation of Newcastle, NSW and will provide a frequent and reliable travel option throughout the city centre, running from Newcastle Interchange at Wickham to Pacific Park in Newcastle East, a total of 2.7 kilometres. Approximately one-third of the route, along Hunter and Scott Streets, is subject to potential shallow underground mine workings. The extent of mining and the seams mined are unclear. Convicts mined the Yard Seam and the overlying Dudley (Dirty) Seam in Newcastle sometime between 1800 and 1830. The Australian Agricultural Company mined the Yard Seam from about 1831 to the 1860s in the alignment area. The Yard Seam was about 3 feet (0.9 m) thick and was therefore known as the Yard Seam. Mine maps do not exist for the workings in the area of interest, and it was unclear if both seams or just one had been mined. Information from 1830s geological mapping and other data showing shaft locations was used along Scott Street, and information from the 1908 Royal Commission was used along Hunter Street, to develop an investigation program. In addition, mining had been encountered at several sites to the south of the alignment at depths of about 7 m to 25 m. Based on the anticipated depths of mining, it was considered prudent to assess the potential for sinkhole development on the proposed alignment and realigned underground utilities, and to obtain approval for the work from Subsidence Advisory NSW (SA NSW). The assessment consisted of a desktop study, followed by a subsurface investigation. Four boreholes were drilled along Scott Street and three boreholes were drilled along Hunter Street using HQ coring techniques in the rock. The placement of boreholes was complicated by the presence of utilities in the roadway and by traffic constraints. All the boreholes encountered the Yard Seam, with conditions varying from unmined coal to an open void, indicating the presence of mining.
The geotechnical information obtained from the boreholes was expanded by using various downhole techniques, including a borehole camera, borehole sonar, and downhole geophysical logging. The camera provided views of the rock and helped to explain zones of no recovery; in addition, timber props within the void were observed. Borehole sonar was performed in the void and provided an indication of room size as well as the presence of timber props within the room. Downhole geophysical logging was performed in the boreholes to measure density, natural gamma, and borehole deviation. The data helped confirm that all the mining was in the Yard Seam and that the overlying Dudley Seam had been eroded in the past over much of the alignment. In summary, the assessment allowed the potential for sinkhole subsidence to be assessed and a mitigation approach to be developed, allowing conditional approval by SA NSW. It also confirmed the presence of mining in the Yard Seam, the depth to the seam and the mining conditions, and indicated that subsidence did not appear to have occurred in the past.
Keywords: downhole investigation techniques, drilling, mine subsidence, yard seam
Procedia PDF Downloads 313
5375 Elements of Creativity and Innovation
Authors: Fadwa Al Bawardi
Abstract:
In March 2021, the Saudi Arabian Council of Ministers issued a decision to form a committee called the "Higher Committee for Research, Development and Innovation," a committee linked to the Council of Economic and Development Affairs, chaired by the Chairman of the Council of Economic and Development Affairs, and concerned with the development of the research, development and innovation sector in the Kingdom. In order to discuss the dimensions of this step, let us first try to answer the following questions. Is there a difference between creativity and innovation? What are the factors of creativity in the individual? Are they genetic mental factors, or are they factors that an individual acquires through learning? The methodology included surveys conducted on more than 500 individuals, males and females, between the ages of 18 and 60. And the answers are as follows. "Creativity" is the creation of a new idea, while "innovation" is the development of an already existing idea in a new, successful way. They are two sides of the same coin, as the "creative idea" needs to be developed and transformed into an "innovation" in order to achieve either strategic achievements at the level of countries and institutions, to enhance organizational intelligence, or achievements at the level of individuals. For example, the beginning of smart phones was just a creative idea from IBM in 1994, but the actual successful innovation in the manufacture, development and marketing of these phones came through Apple later. Nor does creativity have to be hereditary. There are three basic factors for creativity. The first factor is "the presence of a challenge or an obstacle" that the individual faces and seeks to overcome by thinking and finding solutions, even if thinking requires a long time.
The second factor is the "surrounding environment" of the individual, which includes science, training, experience gained, the ability to use techniques, as well as the ability to assess whether an idea is feasible or otherwise. To achieve this factor, individuals must be aware of their own skills, strengths, hobbies, and the aspects in which they can be creative, and they must also be self-confident and courageous enough to suggest new ideas. The third factor is "experience and the ability to accept risk and lack of initial success," and then to learn from mistakes and try again tirelessly. There are some tools and techniques that help the individual to reach creative and innovative ideas, such as the Mind Maps tool, through which the available information is drawn by writing a short word for each piece of information and arranging all other relevant information through clear lines, which helps in logical thinking and correct vision. There is also a tool called Flow Charts, which are graphics that show the sequence of data and expected results according to an ordered scenario of events and workflow steps, giving clarity to the ideas, their sequence, and what is expected of them. There are also other tools, such as the Six Hats tool, a useful tool to be applied by a group of people for effective planning and detailed logical thinking, and the Snowball tool. All of them are tools that greatly help in organizing and arranging mental thoughts and making the right decisions. It is also easy to learn, apply and use all those tools and techniques to reach creative and innovative solutions. The detailed figures and results of the conducted surveys are available upon request, with charts showing the percentages based on gender, age groups, and job categories.
Keywords: innovation, creativity, factors, tools
Procedia PDF Downloads 54
5374 Analysis of the Evolution of Techniques and Review in Cleft Surgery
Authors: Tomaz Oliveira, Rui Medeiros, André Lacerda
Abstract:
Introduction: Cleft lip and/or palate are the most frequent forms of congenital craniofacial anomalies, affecting mainly the middle third of the face and manifesting in functional and aesthetic changes. Bilateral cleft lip represents a reconstructive surgical challenge, not only for the labial component but also for the associated nasal deformation. Recently, the paradigm of the approach to this pathology has changed, placing the focus on muscle reconstruction and anatomical repositioning of the nasal cartilages in order to obtain the best aesthetic and functional results. The aim of this study is to carry out a systematic review of the surgical approach to bilateral cleft lip, retrospectively analyzing the case series of the Plastic Surgery Service at Hospital Santa Maria (Lisbon, Portugal) regarding this pathology, a global assessment of the characteristics of the operated patients, and a study of the different surgical approaches and their complications over the last 20 years. Methods: The present work is a retrospective and descriptive study of patients who underwent at least one reconstructive surgery for cleft lip and/or palate in the CPRE service of the HSM between January 1, 1997 and December 31, 2017. Data relating to 361 individuals were analyzed; after applying the exclusion criteria, these constituted a sample of 212 participants. The variables analyzed were the year of the first surgery, gender, age, type of orofacial cleft, surgical approach, and its complications. Results: There was a higher overall prevalence in males; cleft lip and cleft lip and palate occurred in greater proportion in males, while cleft palate was more common in females. The most frequently recorded malformation was cleft lip and palate, complete in most cases. Regarding laterality, alterations with a unilateral labial component were the most commonly observed, with the left side described as the most affected.
It was found that the vast majority of patients underwent primary intervention by 12 months of age. The surgical techniques used in the approach to this pathology showed an important chronological variation over the years. Discussion: Cleft lip and/or palate is a medical condition associated with high aesthetic and functional morbidity, which requires early treatment in order to optimize the long-term outcome. The existence of a nasolabial component and its respective surgical correction plays a central role in the treatment of this pathology. High rates of post-surgical complications and unconvincing aesthetic results have motivated an evolution of the surgical technique, increasingly evident in recent years, which today allows satisfactory aesthetic results even in bilateral cleft lip of high deformation complexity. The introduction of techniques that favor nasolabial reconstruction based on anatomical principles has been producing increasingly convincing results. The analyzed sample shows that most of the results obtained in this study are, in general, compatible with the results published in the literature. Conclusion: This work showed that small variations in the surgical technique can bring significant improvements in the functional and aesthetic results in the treatment of bilateral cleft lip.
Keywords: cleft lip, cleft palate, congenital abnormalities, craniofacial malformations
Procedia PDF Downloads 108
5373 Stabilization of Expansive Soils by Addition of Binders: Hydraulic Lime and Cement
Authors: Kherafa Abdennasser
Abstract:
A literature review was conducted to gather as much information as possible concerning the phenomenon of swelling clays, together with a presentation of some bibliographic findings on the factors affecting the swelling potential, the various techniques for the stabilization of clays, and some literature results on the stabilization of swelling. A characterization of the materials was then carried out based on the bibliographic study, using standard mechanical geotechnical tests that are simple, practical, economical and efficient for minimizing the swelling phenomenon.
Keywords: stabilization, expansive soils, cement, lime, oedometer
Procedia PDF Downloads 312
5372 Examining French Teachers' Teaching and Learning Approaches in Some Selected Junior High Schools in Ghana
Authors: Paul Koffitse Agobia
Abstract:
In 2020, the Ministry of Education in Ghana and the National Council for Curriculum and Assessment (NaCCA) rolled out a new curriculum, the Common Core Programme (CCP) for Basic 7 to 10, that lays emphasis on character building and the values which are important to Ghanaian society, by providing education that will produce character-minded learners with problem solving skills who can play active roles in dealing with the increasing challenges facing Ghana and the global society. Therefore, learning and teaching approaches that prioritise the use of digital learning resources and active learning are recommended. The new challenge facing Ghanaian teachers is the ability to use new technologies, together with the appropriate pedagogical content knowledge, to help learners develop, aside from communication skills in French, the essential 21st century skills recommended in the new curriculum. This article focuses on the pedagogical approaches recommended by NaCCA. The study seeks to examine French language teachers' understanding of the recommended pedagogical approaches and how they use digital learning resources in class to foster the development of these essential skills and values. 54 respondents, comprising 30 teachers and 24 head teachers, were selected from 6 Junior High Schools in rural districts (both private and public) and 6 Junior High Schools in an urban setting. The schools were selected in three regions: Volta, Central and Western. A class observation checklist and an interview guide were used to collect data for the study. The study reveals that some teachers adopt teaching techniques that do not promote active learning. They demonstrate little understanding of the core competences and values and, therefore, fail to integrate them in their lessons. However, some other teachers, despite their lack of understanding of learning and teaching philosophies, adopted techniques that can help learners develop some of the core competences and values.
In most schools, digital learning resources are not utilized, even though teachers have smartphones or laptops.
Keywords: active learning, core competences, digital learning resources, pedagogical approach, values
Procedia PDF Downloads 73
5371 Cu₂(ZnSn)(S)₄ Electrodeposition from a Single Bath for Photovoltaic Applications
Authors: Mahfouz Saeed
Abstract:
Cu₂(ZnSn)(S)₄ (CTZS) offers potential advantages over CuInGaSe₂ (CIGS) as a solar thin film due to its higher band gap. Preparing such photovoltaic materials by electrochemical techniques is particularly attractive because of the lower processing cost and the high throughput of such techniques. Several recent publications report CTZS electroplating; however, the electrochemical process still faces serious challenges, such as achieving a sulfur atomic ratio of about 50% of the total alloy. In this work, we introduce an improved electrolyte composition that enables the direct electrodeposition of CTZS from a single bath. The electrolyte is significantly more dilute than common baths described in the literature. The bath composition we introduce is: 0.0032 M CuSO₄, 0.0021 M ZnSO₄, 0.0303 M SnCl₂, 0.0038 M Na₂S₂O₃, and 0.3 mM Na₂S₂O₃. pHydrion is applied to buffer the electrolyte to pH = 2, and 0.7 M LiCl is applied as the supporting electrolyte. The electrochemical process was carried out at a rotating disk electrode, which provides quantitative characterization of the flow (at room temperature). A comprehensive study of the electrochemical behavior at different electrode rotation rates is provided. The effects of agitation on the atomic composition of the deposit and its adhesion to the molybdenum back contact are discussed. Post-treatment annealing was conducted under a sulfur atmosphere with no need for metal additions from the gas phase during annealing. The potential that produced the desired atomic ratio of CTZS was -0.82 V/NHE. A smooth deposit, with uniform composition across the sample surface and depth, was obtained at a 500 rpm rotation speed. The final sulfur atomic ratio was adjusted to 50.2% in order to obtain the desired atomic ratio. The final composition was investigated using energy-dispersive X-ray spectroscopy (EDS). XRD was used to analyze the CTZS crystallography and thickness.
Complete and functional CTZS PV devices were fabricated by depositing all the required layers in the correct order and with the desired optical properties. Acknowledgments: Case Western Reserve University for the technical help and for the use of their instruments.
Keywords: photovoltaic, CTZS, thin film, electrochemical
Procedia PDF Downloads 239
5370 Laser Additive Manufacturing: A Literature Review
Authors: Pranav Mohan Parki, C. Mallika Parveen, Tahseen Ahmad Khan, Mihika Shivkumar
Abstract:
Additive manufacturing (AM) is one of the several manufacturing processes in use today. AM comprises techniques such as ‘Selective Laser Sintering’ and ‘Selective Laser Melting’, which, together with the associated equipment and materials, were developed back in the 1980s, although major use of these methods has risen during the last decade. AM appears to be the most efficient approach when compared to traditional machining procedures. Still, many problems continue to hinder its progress towards becoming the most widely used of all. This paper contributes to a better understanding of AM and also aims at providing viable solutions to these problems, which may further help in enabling AM to become the most flaw-free production method.
Keywords: additive manufacturing (AM), 3D printing, prototype, laser sintering
Procedia PDF Downloads 378
5369 Parasitological Tracking of Wild Passerines in Group for the Rehabilitation of Native Fauna and Its Habitat
Authors: Catarina Ferreira Rebelo, Luis Madeira de Carvalho, Fernando González González
Abstract:
The order Passeriformes corresponds to the richest and most abundant group of birds, with approximately 6500 species, making it possible to assert that two out of every three bird species are passerines. They are globally distributed and exhibit remarkable morphological and ecological variability. While numerous species of parasites have been identified and described in wild birds, there has been little focus on passeriformes. Seventeen passerines admitted to GREFA, a wildlife rehabilitation center, throughout the months of October, November and December 2022 were analyzed. The species included Aegithalos caudatus, Anthus pratensis, Carduelis chloris, Certhia brachydactyla, Erithacus rubecula, Fringilla coelebs, Parus ater, Passer domesticus, Sturnus unicolor, Sylvia atricapilla, Turdus merula and Turdus philomelos. Data regarding past history were collected, and necropsies were conducted to identify the cause of death, assess body condition and determine the presence of parasites. Additionally, samples of intestinal content were collected for direct/fecal smear, flotation and sedimentation techniques. Sixteen (94.1%) passerines were considered positive for the presence of parasitic forms in at least one of the techniques used, including parasites detected at necropsy. Adult specimens of both sexes and tritonymphs of Monojoubertia microhylla and ectoparasites of the genus Ornithonyssus were identified. Macroscopic adult endoparasitic forms were also found during necropsies, including Diplotriaena sp., Serratospiculum sp. and Porrocaecum sp. Parasitism by coccidia was observed, with no sporulation. Additionally, eggs of nematodes from various genera were detected, such as Diplotriaena sp., Capillaria sp., Porrocaecum sp., Syngamus sp. and Strongyloides sp., eggs of trematodes, specifically of the genus Brachylecithum, and cestode oncospheres, whose genera were not identified. To our knowledge, the respiratory nematode Serratospiculum sp. found in this study is being reported for the first time in passerines in the Iberian Peninsula, along with the application of common coprological techniques for the identification of eggs in the intestinal content. The majority of the parasites identified utilize intermediate hosts present in the diet of the passerines sampled. Furthermore, the discovery of certain parasites with a direct life cycle could potentially exert greater influence, particularly in specific scenarios such as within nests or during the rehabilitation process in wildlife centers. These parasites may affect intraspecific competition, increase susceptibility to predators or lead to death. However, their cost to wild birds is often not clear, as individuals can endure various parasites without significant harm. Furthermore, wild birds serve as important sources of parasites for different animal groups, including humans and other mammals. This study provides invaluable insights into the parasitic fauna of these birds, not only serving as a cornerstone for future epidemiological investigations but also enhancing our comprehension of these avian species.
Keywords: birds, parasites, passerines, wild, Spain
Procedia PDF Downloads 40
5368 Enhancing Teaching of Engineering Mathematics
Authors: Tajinder Pal Singh
Abstract:
Teaching of mathematics to engineering students is an open-ended problem in education. The main goal of mathematics learning for engineering students is the ability to apply a wide range of mathematical techniques and skills in their engineering classes and later in their professional work. Most undergraduate engineering students and faculty feel that no effort is made to demonstrate the applicability of the various topics of mathematics that are taught, thus making mathematics unappealing to some engineering faculty and their students. The lack of understanding of concepts in engineering mathematics may hinder the understanding of other concepts or even subjects. However, for most undergraduate engineering students, mathematics is one of the most difficult courses in their field of study. Many engineering students never understood mathematics, or they never liked it because it was too abstract for them and they could never relate to it. Only a right balance of application-based and concept-based teaching can fulfill the objectives of teaching mathematics to engineering students. It will surely improve and enhance their problem-solving and creative-thinking skills. In this paper, some practical (informal) ways of making mathematics teaching application-based for engineering students are discussed. An attempt is made to understand the present state of teaching mathematics in engineering colleges. The weaknesses and strengths of the current teaching approach are elaborated. Some of the causes of the unpopularity of the mathematics subject are analyzed, and a few pragmatic suggestions are made. Faculty in mathematics courses should spend more time discussing applications as well as the conceptual underpinnings rather than focusing solely on strategies and techniques to solve problems. They should also introduce more ‘word’ problems, as these problems are commonly encountered in engineering courses.
Overspecialization in engineering education should not occur at the expense of (or by diluting) mathematics and the basic sciences. The role of engineering education is to provide the fundamental (basic) knowledge and to teach students a simple methodology of self-learning and self-development. All these issues would be better addressed if mathematics and engineering faculty joined hands to plan and design the learning experiences for the students who take their classes. When faculty members stop competing against each other and start competing against the situation, they will perform better. Without creating any administrative hassles, these suggestions can be used by any young, inexperienced faculty member of mathematics to inspire engineering students to learn engineering mathematics effectively.
Keywords: application based learning, conceptual learning, engineering mathematics, word problem
Procedia PDF Downloads 230
5367 Analysis for Shear Spinning of Tubes with Hard-To-Work Materials
Authors: Sukhwinder Singh Jolly
Abstract:
Metal spinning is a process in which the stresses are localized to a small area and the material is made to flow or move over the mandrel with the help of a spinning tool. Spinning of tubular products can be performed by two techniques: forward spinning and backward spinning. Many researchers have studied the process both experimentally and analytically. An effort has been made to apply the process to the spinning of thin-walled, high-precision, small-bore long tubes in hard-to-work materials such as titanium.
Keywords: metal spinning, hard-to-work materials, roller diameter, power consumption
Procedia PDF Downloads 386
5366 Development of Thermal Regulating Textile Material Consisted of Macrocapsulated Phase Change Material
Authors: Surini Duthika Fernandopulle, Kalamba Arachchige Pramodya Wijesinghe
Abstract:
Macrocapsules containing the phase change material (PCM) PEG 4000 as the core and calcium alginate as the shell were synthesized by an in-situ polymerization process, and their suitability for textile applications was studied. The PCM macrocapsules were sandwiched between two polyurethane foams at regular intervals, and the sandwiched foams were subsequently covered with 100% cotton woven fabrics. According to the mathematical modelling and calculations, 46 capsules were required to provide cooling for a period of 2 hours at 56 °C, so a panel of 10 cm x 10 cm area divided into 25 parts (with 5 capsules in each of 9 parts and the remaining 16 parts left empty for air permeability) was effectively merged into one textile material without changing the textile's original properties. First, the available cooling techniques related to textiles were considered, and the cooling techniques best suited to Sri Lankan climatic conditions were selected using a survey of the Sri Lankan public based on the ASHRAE 55-2010 standard; the survey consisted of 19 questions under 3 sections categorized as general information, thermal comfort sensation and the requirement for personal cooling garments (PCG). The results indicated that during the daytime the majority of respondents feel warm, and during the nighttime the majority also responded as slightly warm. The survey also revealed that around 85% of the respondents are willing to accept a PCG. The developed panels were characterized using Fourier-transform infrared spectroscopy (FTIR) and thermogravimetric analysis (TGA). The FTIR findings showed that the macrocapsules consisted of PEG 4000 as the core material and calcium alginate as the shell material, and the TGA findings showed that the capsules had average weight percentages of 61.9% for the core and 34.7% for the shell.
After heating both the control samples and the samples incorporating PCM panels, it was discovered that the temperature of the control sample kept increasing beyond 56 °C, whereas the sample incorporating PCM panels began to regulate the temperature at 56 °C, preventing a temperature increase beyond 56 °C.
Keywords: phase change materials, thermal regulation, textiles, macrocapsules
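As an illustration of the kind of sizing calculation behind the "46 capsules for 2 hours" figure, the sketch below derives a capsule count from an assumed heat load and latent heat. Every numeric value here (2.95 W load, 2.5 g of PCM per capsule, ~185 J/g latent heat for PEG 4000) is an assumption for illustration, not the authors' data.

```python
import math

# Hedged back-of-envelope sizing for a PCM cooling panel.
# All parameter values are illustrative assumptions, not the study's inputs.

def capsules_needed(heat_load_w, duration_s, capsule_mass_g, latent_heat_j_per_g):
    """Number of macrocapsules required to absorb a heat load by melting alone."""
    total_energy_j = heat_load_w * duration_s            # energy to absorb
    energy_per_capsule_j = capsule_mass_g * latent_heat_j_per_g
    return math.ceil(total_energy_j / energy_per_capsule_j)

# Assumed: 2.95 W sustained load over 2 h, 2.5 g PCM per capsule,
# latent heat of PEG 4000 taken as ~185 J/g (literature ballpark).
n = capsules_needed(2.95, 2 * 3600, 2.5, 185.0)
print(n)  # → 46
```

The panel size then follows from how many capsules fit per part, as in the 25-part layout described above.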
Procedia PDF Downloads 127
5365 Applying Unmanned Aerial Vehicle on Agricultural Damage: A Case Study of the Meteorological Disaster on Taiwan Paddy Rice
Authors: Chiling Chen, Chiaoying Chou, Siyang Wu
Abstract:
Taiwan is located in the western Pacific Ocean, at the intersection of continental and marine climates. Typhoons frequently strike Taiwan and bring meteorological disasters, i.e., heavy flooding, landslides, loss of life and property, etc. Global climate change brings more extreme meteorological disasters, so developing techniques to improve disaster prevention and mitigation is needed, and improving rescue processes and rehabilitation is important as well. In this study, UAVs (unmanned aerial vehicles) are applied to take instant images for improving disaster investigation and rescue processes. The study area is paddy rice fields in central Taiwan, which were attacked by heavy rain during the monsoon season in June 2016. The UAV images provide high ground resolution (3.5 cm) with 3D point clouds, used to develop image discrimination techniques and a digital surface model (DSM) of rice lodging. Firstly, supervised image classification with the maximum likelihood method (MLD) is used to delineate the area of rice lodging. Secondly, 3D point clouds generated by Pix4D Mapper are used to develop a DSM for classifying the lodging levels of the paddy rice. As a result, the discrimination accuracy for rice lodging is 85% by supervised image classification, and the classification accuracy for lodging level is 87% by DSM. Therefore, UAVs not only provide instant images of agricultural damage after a meteorological disaster, but the image discrimination of rice lodging also reaches acceptable accuracy (>85%). In the future, UAV and image discrimination technologies will be applied to different crop fields. The results of image discrimination will be overlaid with the administrative boundaries of paddy rice to establish a GIS-based assistance system for agricultural damage discrimination. The time and labor needed for damage detection and monitoring would thereby be greatly reduced.
Keywords: monsoon, supervised classification, Pix4D, 3D point clouds, discriminate accuracy
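The core of maximum likelihood classification is assigning each pixel to the class whose fitted distribution makes the observed value most probable. A minimal single-band Gaussian sketch follows; the class names and the per-class statistics are invented toy numbers, not the study's trained parameters.

```python
import math

# Minimal sketch of maximum-likelihood classification, the supervised method
# applied to the UAV imagery. Classes and pixel values are illustrative only.

def gaussian_log_likelihood(x, mean, var):
    """Log-likelihood of value x under a univariate Gaussian N(mean, var)."""
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def classify(pixel, class_stats):
    """Assign the class whose Gaussian model gives the highest likelihood."""
    return max(class_stats, key=lambda c: gaussian_log_likelihood(pixel, *class_stats[c]))

# Assumed per-class (mean, variance) of a single spectral band:
stats = {"lodged_rice": (0.30, 0.01), "upright_rice": (0.60, 0.02)}
print(classify(0.35, stats))  # → lodged_rice (closer to the lodged-rice model)
```

In practice the method is multivariate (one mean vector and covariance matrix per class, over all bands), but the decision rule is the same.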
Procedia PDF Downloads 299
5364 Using Textual Pre-Processing and Text Mining to Create Semantic Links
Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo
Abstract:
This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, generate new concepts and new semantic links. Even using more specific vocabularies within the oil domain, our approach has achieved satisfactory results, suggesting that the proposal can be applied in other domains and languages, requiring only minor adjustments.
Keywords: semantic links, data mining, linked data, SKOS
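One simple form of the "local patterns" idea is mining term co-occurrence within sentences as candidate semantic links. The sketch below is a hedged illustration with an invented three-sentence corpus and an arbitrary frequency threshold; the paper's actual pipeline is more elaborate.

```python
from collections import Counter
from itertools import combinations

# Hedged sketch of mining sentence-level term co-occurrence as candidate
# semantic links. Corpus, stopword list, and threshold are illustrative only.

corpus = [
    "drilling rig operates in the offshore field",
    "offshore field production depends on the drilling rig",
    "seismic survey maps the offshore field",
]

stopwords = {"the", "in", "on", "of"}
pairs = Counter()
for sentence in corpus:
    terms = sorted(set(sentence.split()) - stopwords)
    pairs.update(combinations(terms, 2))   # every unordered term pair

# Keep pairs seen in at least two sentences as candidate links.
links = {p for p, n in pairs.items() if n >= 2}
print(("field", "offshore") in links)  # → True
```

A real system would replace the raw pairs with weighted associations and map the surviving links into a vocabulary such as SKOS, as the keywords suggest.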
Procedia PDF Downloads 178
5363 Hydraulic Characteristics of Mine Tailings by Metaheuristics Approach
Authors: Akhila Vasudev, Himanshu Kaushik, Tadikonda Venkata Bharat
Abstract:
A large number of mine tailings are produced every year as part of the extraction of phosphates, gold, copper, and other materials. Mine tailings are high in water content and exhibit very slow dewatering behavior. The efficient design of tailings dams and the economical disposal of these slurries require knowledge of the tailings' consolidation behavior. Large-strain consolidation theory closely predicts the self-weight consolidation of these slurries, as the theory accounts for conservation of mass and momentum and treats the hydraulic conductivity as a function of the void ratio. Classical laboratory techniques, such as the settling column test, the seepage consolidation test, etc., are expensive and time-consuming for estimating the variation of hydraulic conductivity with void ratio. Inverse estimation of the constitutive relationships from the measured settlement-versus-time curves is explored. In this work, inverse analysis based on metaheuristic techniques is explored for predicting the hydraulic conductivity parameters of mine tailings from the base excess pore water pressure dissipation curve and the initial conditions of the mine tailings. The proposed inverse model uses the particle swarm optimization (PSO) algorithm, which is based on the social behavior of animals searching for food sources. The finite-difference numerical solution of the forward analytical model is integrated with the PSO algorithm to solve the inverse problem. The method is tested on synthetic data: base excess pore pressure dissipation curves generated using the finite difference method. The effectiveness of the method is verified using a base excess pore pressure dissipation curve obtained from a settling column experiment and further ensured through comparison with available predicted hydraulic conductivity parameters.
Keywords: base excess pore pressure, hydraulic conductivity, large strain consolidation, mine tailings
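The inverse workflow pairs a forward model with PSO: each particle is a candidate parameter set, its cost is the mismatch with the measured curve, and the swarm update pulls particles toward personal and global bests. The sketch below applies standard PSO to a toy two-parameter decay curve standing in for the dissipation curve; the forward model, data, and PSO coefficients are all assumptions, not the paper's formulation.

```python
import random

# Minimal PSO sketch for inverse parameter estimation. A toy decay model
# stands in for the finite-difference forward solver; everything numeric
# here is illustrative.

random.seed(1)

def model(a, b, t):
    return a * (1.0 - b) ** t

observed = [(t, model(2.0, 0.3, t)) for t in range(6)]  # synthetic "measured" data

def cost(p):
    a, b = p
    return sum((model(a, b, t) - y) ** 2 for t, y in observed)

# Standard PSO: inertia 0.7, cognitive/social coefficients 1.5 (assumed).
particles = [[random.uniform(0, 5), random.uniform(0, 1)] for _ in range(20)]
velocity = [[0.0, 0.0] for _ in particles]
pbest = [p[:] for p in particles]
gbest = min(pbest, key=cost)

for _ in range(200):
    for i, p in enumerate(particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            velocity[i][d] = (0.7 * velocity[i][d]
                              + 1.5 * r1 * (pbest[i][d] - p[d])
                              + 1.5 * r2 * (gbest[d] - p[d]))
            p[d] += velocity[i][d]
        if cost(p) < cost(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=cost)

print(round(gbest[0], 2), round(gbest[1], 2))  # recovered (a, b) near (2.0, 0.3)
```

In the paper's setting, `model` would be replaced by the finite-difference solution for base excess pore pressure, and the particles would encode the hydraulic conductivity parameters.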
Procedia PDF Downloads 130
5362 Monocrystalline Silicon Surface Passivation by Porous Silicon
Authors: Mohamed Ben Rabha
Abstract:
In this paper, we report on the effect of porous silicon (PS) treatment on the surface passivation of monocrystalline silicon (c-Si). A PS film with a thickness of 80 nm was deposited by stain etching. It was demonstrated that a PS coating is a very interesting solution for surface passivation. The level of surface passivation is determined by techniques based on photoconductance and FTIR. As a result, the effective minority carrier lifetime increases from 2 µs to 7 µs at ∆n = 10¹⁵ cm⁻³, and the reflectivity is reduced from 28% to about 7% after PS coating.
Keywords: porous silicon, effective minority carrier lifetime, reflectivity
Procedia PDF Downloads 443
5361 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem.
The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance.
The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures.
The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times.
Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows.
Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy.
The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance.
Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 66
5360 Non-Invasive Evaluation of Patients After Percutaneous Coronary Revascularization. The Role of Cardiac Imaging
Authors: Abdou Elhendy
Abstract:
Numerous studies have shown the efficacy of percutaneous coronary intervention (PCI) and coronary stenting in improving left ventricular function and relieving exertional angina. Furthermore, PCI remains the main line of therapy in acute myocardial infarction. Improvements in procedural techniques and new devices have resulted in an increased number of PCI procedures in patients with difficult and extensive lesions, multivessel disease as well as total occlusion. Immediate and late outcomes may be compromised by acute thrombosis or the development of fibro-intimal hyperplasia. In addition, progression of coronary artery disease proximal or distal to the stent, as well as in non-stented arteries, is not uncommon. As a result, complications can occur, such as acute myocardial infarction, worsened heart failure or recurrence of angina. In-stent restenosis can occur without symptoms or with atypical complaints, rendering the clinical diagnosis difficult. Routine invasive angiography is not appropriate as a follow-up tool due to the associated risk and cost and the limited functional assessment. Exercise and pharmacologic stress testing are increasingly used to evaluate myocardial function, perfusion and the adequacy of revascularization. Information obtained by these techniques provides important clues regarding the presence and severity of compromise in myocardial blood flow. Stress echocardiography can be performed in conjunction with exercise or dobutamine infusion. Its diagnostic accuracy has been moderate, but the results provide excellent prognostic stratification. Adding myocardial contrast agents can improve imaging quality and allows assessment of both function and perfusion. Stress radionuclide myocardial perfusion imaging is an alternative for evaluating these patients. The extent and severity of wall motion and perfusion abnormalities observed during exercise or pharmacologic stress are predictors of survival and the risk of cardiac events.
According to current guidelines, stress echocardiography and radionuclide imaging are considered appropriately indicated in patients after PCI who have cardiac symptoms and in those who underwent incomplete revascularization. Stress testing is not recommended in asymptomatic patients, particularly early after revascularization. Coronary CT angiography is increasingly used and provides high sensitivity for the diagnosis of coronary artery stenosis. The average sensitivity and specificity for the diagnosis of in-stent stenosis in pooled data are 79% and 81%, respectively. Limitations include blooming artifacts and low feasibility in patients with small stents or thick struts. Anatomical and functional cardiac imaging modalities are the cornerstone of the assessment of patients after PCI and provide salient diagnostic and prognostic information. Current imaging techniques can serve as gatekeepers for coronary angiography, thus limiting the risk of invasive procedures to those who are likely to benefit from subsequent revascularization. Determining which modality to apply requires careful identification of the merits and limitations of each technique, as well as the unique characteristics of each individual patient.
Keywords: coronary artery disease, stress testing, cardiac imaging, restenosis
Procedia PDF Downloads 165
5359 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It defines whether companies are successful or in a downward spiral. A thorough understanding of it is important: many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance and artificial intelligence (AI), especially the stock market, has been a relatively recent development. Predicting how stocks will do, considering all external factors and previous data, has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do; machine learning makes this task a lot easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, among other methods. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. Regarding accuracy on the testing set, comparing four different models (linear regression, neural network, decision tree, and naïve Bayes) on the different stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan & Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. For the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model.
The training set had similar results, except that the decision tree model was perfect, with complete accuracy in its predictions, which makes sense: the decision tree model likely overfitted the training set, explaining its weaker performance on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
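The simplest of the four compared models, linear regression, can be sketched in closed form. The price series below is invented toy data, not the paper's ticker data, and real pipelines would regress on many features rather than the day index alone.

```python
# Hedged sketch of ordinary least squares, the linear regression baseline
# the paper compares. Prices are invented toy numbers.

closes = [100.0, 101.5, 103.0, 102.0, 104.5, 106.0, 107.5]
days = list(range(len(closes)))

# Closed-form OLS fit of y = a*x + b.
n = len(days)
mean_x = sum(days) / n
mean_y = sum(closes) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, closes))
den = sum((x - mean_x) ** 2 for x in days)
a = num / den
b = mean_y - a * mean_x

next_day_forecast = a * n + b   # extrapolate one step ahead
print(round(next_day_forecast, 2))  # → 108.21
```

A naïve Bayes or decision tree comparison would frame the same data as up/down classification rather than price regression, which is one reason their accuracies are not directly interchangeable.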
Procedia PDF Downloads 94
5358 A Survey of Domain Name System Tunneling Attacks: Detection and Prevention
Authors: Lawrence Williams
Abstract:
As the mechanism that converts domain names to Internet Protocol (IP) addresses, the Domain Name System (DNS) is an essential part of internet usage. It was not designed with security in mind and can be subject to attacks. DNS attacks have become more frequent and sophisticated, and the need to detect and prevent them has become more important for the modern network. DNS tunneling attacks are one type of attack, primarily used for distributed denial-of-service (DDoS) attacks and data exfiltration. Different techniques to detect and prevent DNS tunneling attacks are discussed. The methods, models, experiments, and data for each technique are reviewed. A proposal about feasibility is made. Future research on these topics is proposed.
Keywords: DNS, tunneling, exfiltration, botnet
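One widely used detection heuristic (offered here as an illustration; the survey's specific techniques are not detailed in the abstract) exploits the fact that tunneled queries encode data in domain labels, giving them abnormally high character entropy. The threshold and example hostnames below are invented.

```python
import math

# Hedged sketch: flag DNS query names whose labels have high Shannon entropy,
# a common heuristic for spotting data encoded into subdomains.

def shannon_entropy(label):
    """Character entropy of a label, in bits per character."""
    counts = {c: label.count(c) for c in set(label)}
    n = len(label)
    return -sum(k / n * math.log2(k / n) for k in counts.values())

def looks_tunneled(qname, threshold=3.8):
    """Flag a query whose most entropic label exceeds the threshold."""
    return max(shannon_entropy(l) for l in qname.split(".") if l) > threshold

print(looks_tunneled("www.example.com"))                       # → False
print(looks_tunneled("a9f3k2qzx7b1m4w8ce5d0unrt6.badhost.net"))  # → True
```

Production detectors combine such per-query scores with traffic-level features (query rate, response sizes, record types) to reduce false positives on legitimately random-looking names such as CDN hostnames.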
Procedia PDF Downloads 74
5357 Numerical Solutions of Generalized Burger-Fisher Equation by Modified Variational Iteration Method
Authors: M. O. Olayiwola
Abstract:
Numerical solutions of the generalized Burger-Fisher equation are obtained using a modified variational iteration method (MVIM) with minimal computational effort. The results computed with this technique have been compared with other results. The present method is seen to be a very reliable alternative to some existing techniques for such nonlinear problems.
Keywords: Burger-Fisher, modified variational iteration method, Lagrange multiplier, Taylor's series, partial differential equation
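For context, a sketch of the standard variational iteration setup for this equation follows; the abstract does not spell out the authors' modification, so only the unmodified correction functional (with the usual Lagrange multiplier for a first-order-in-time problem) is shown.

```latex
% Generalized Burger-Fisher equation and the standard VIM correction
% functional. The modified scheme additionally uses a Taylor-series
% treatment, which is not reproduced here.
\[
u_t + \alpha\, u^{\delta} u_x - u_{xx} = \beta\, u \left(1 - u^{\delta}\right),
\]
\[
u_{n+1}(x,t) = u_n(x,t) + \int_0^t \lambda(\tau)
  \left[ \frac{\partial u_n}{\partial \tau}
       + \alpha\, \tilde{u}_n^{\delta} \frac{\partial \tilde{u}_n}{\partial x}
       - \frac{\partial^2 \tilde{u}_n}{\partial x^2}
       - \beta\, \tilde{u}_n \left(1 - \tilde{u}_n^{\delta}\right)
  \right] d\tau,
\qquad \lambda(\tau) = -1,
\]
% where the tilde marks restricted variations, so that variation of the
% correction functional with respect to u_n determines the multiplier.
```

Iterating from a suitable initial approximation u₀(x, t) then generates successive corrections that converge toward the solution.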
Procedia PDF Downloads 428