Search results for: virtual crack closure technique
7095 Numerical Buckling of Composite Cylindrical Shells under Axial Compression Using Asymmetric Meshing Technique (AMT)
Authors: Zia R. Tahir, P. Mandal
Abstract:
This paper presents the details of a numerical study of the buckling and post-buckling behaviour of a laminated carbon fiber reinforced plastic (CFRP) thin-walled cylindrical shell under axial compression using an asymmetric meshing technique (AMT) in ABAQUS. AMT is considered to be a new perturbation method to introduce disturbance without changing geometry, boundary conditions, or loading conditions. Asymmetric meshing affects both the predicted buckling load and the buckling mode shapes. A cylindrical shell with lay-up orientation [0°/+45°/-45°/0°], radius-to-thickness ratio (R/t) equal to 265, and length-to-radius ratio (L/R) equal to 1.5 is analysed numerically. A series of numerical simulations (experiments) is carried out with symmetric and asymmetric meshing to study the effect of asymmetric meshing on the predicted buckling behaviour. The asymmetric meshing technique is applied in the axial direction and in the circumferential direction separately, using two different methods: first, by changing the shell element size and thereby varying the total number of elements; and second, by varying the shell element size while keeping the total number of elements constant. The results of the linear analysis (eigenvalue analysis) and the non-linear analysis (Riks analysis) using symmetric meshing agree well with analytical results. The results of the numerical analysis are presented in the form of a non-dimensional load factor, which is the ratio of the buckling load using the asymmetric meshing technique to the buckling load using the symmetric meshing technique. Using AMT, the load factor varies by about 2% for the linear eigenvalue analysis and by about 2% for the non-linear Riks analysis. The behaviour of the load versus end-shortening curve in pre-buckling is the same for both symmetric and asymmetric meshing, but for asymmetric meshing the curve behaviour in post-buckling becomes extraordinarily complex.
The major conclusions are: different methods of AMT have a small influence on the predicted buckling load but a significant influence on the load-displacement curve behaviour in post-buckling; AMT in the axial direction and AMT in the circumferential direction have different influences on the buckling load and on the load-displacement curve in post-buckling. Keywords: CFRP composite cylindrical shell, asymmetric meshing technique, primary buckling, secondary buckling, linear eigenvalue analysis, non-linear Riks analysis
Procedia PDF Downloads 353
7094 Coating of Polyelectrolyte Multilayer Thin Films on Poly(S/EGDMA) HIPE Loaded with Hydroxyapatite as a Scaffold for Tissue Engineering Application
Authors: Kornkanok Noulta, Pornsri Pakeyangkoon, Stephen T. Dubas, Pomthong Malakul, Manit Nithithanakul
Abstract:
In recent years, interest in the development of materials for tissue engineering applications has increased considerably. Poly(High Internal Phase Emulsion) (PolyHIPE) foam is a good candidate material for use in tissue engineering applications due to its 3D structure and high porosity with interconnected pores. The PolyHIPE was prepared from poly(styrene/ethylene glycol dimethacrylate) through the high internal phase emulsion polymerization technique and loaded with hydroxyapatite (HA) to improve biocompatibility. To further increase the hydrophilicity of the obtained polyHIPE, the layer-by-layer polyelectrolyte multilayer (PEM) technique was used. The surface properties of the polyHIPE were characterized by contact angle measurement. Morphology and pore size were observed by scanning electron microscopy (SEM). Cell viability was assessed by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay technique. Keywords: polyelectrolyte multilayer thin film, high internal phase emulsion, polyhipe foam, scaffold, tissue engineering
Procedia PDF Downloads 351
7093 New Model of Immersive Experiential Branding for International Universities
Authors: Kakhaber Djakeli
Abstract:
For market leadership, iconic brands have already started to establish their unique digital avatars in the Metaverse and to offer Non-Fungible Tokens to their fans. The Metaverse can be defined as an evolutionary step of Internet development. So if companies and brands use the internet, logically, they can find new solutions for themselves and their customers in the Metaverse. Marketing and management today must learn how to combine physical-world activities with those described as digital, virtual, and immersive. A "phygital" solution uniting the physical and digital competitive activities of a company, and covering the questions of how to use virtual worlds for brand development and Non-Fungible Tokens for greater attractiveness, will soon be the most relevant question for branding. Thinking comprehensively, we can call this type of branding immersive. As we see, immersive brands give customers more mesmerizing feelings than traditional ones. Accordingly, branding can be divided by a company, in its own understanding, into two models: traditional and immersive. Immersive branding, being more directed at the sensorial challenges of humans, will be a big job for international universities in the near future because they target Generation Z. To help those international universities open the door to mesmerizing, immersive branding, marketing research has been undertaken. The main goal of the study was to establish a model for immersive branding at international universities and to answer the many questions that logically arise in university life. A Delphi survey of the type known as an expert study was undertaken with one great mission: to help international universities open up opportunities for phygital activities, with reliable knowledge, through a model of immersive branding.
The questionnaire sent to education experts covered professional questions ranging from education to the segmentation of customers, branding, attitudes toward students, and knowledge of immersive marketing. The research results were interesting and encouraging enough for the author to establish the new model of immersive experiential branding for international universities. Keywords: branding, immersive marketing, students, university
Procedia PDF Downloads 81
7092 The Excess Loop Delay Calibration in Bandpass Continuous-Time Delta-Sigma Modulators Based on Q-Enhanced LC Filter
Authors: Sorore Benabid
Abstract:
Q-enhanced LC filters are the most widely used architecture in bandpass (BP) continuous-time (CT) delta-sigma (ΣΔ) modulators due to their high-frequency operation, higher linearity than active filters, and the high quality factor obtained by the Q-enhancement technique. This technique consists of using a negative resistance that compensates for the ohmic losses in the on-chip inductor. However, it introduces a zero in the filter transfer function, which affects the modulator performance in terms of dynamic range (DR), stability, and in-band noise (signal-to-noise ratio, SNR). In this paper, we study the effect of this zero and demonstrate that a calibration of the excess loop delay (ELD) is required to ensure the best performance of the modulator. System-level simulations are performed for a 2nd-order BP CT ΣΔ modulator at a center frequency of 300 MHz. Simulation results indicate that the optimal ELD should be reduced by 13% to achieve the maximum SNR and DR compared to the ideal LC-based ΣΔ modulator. Keywords: continuous-time bandpass delta-sigma modulators, excess loop delay, on-chip inductor, Q-enhanced LC filter
Procedia PDF Downloads 329
7091 Optimal Allocation of Distributed Generation Sources for Loss Reduction and Voltage Profile Improvement by Using Particle Swarm Optimization
Authors: Muhammad Zaheer Babar, Amer Kashif, Muhammad Rizwan Javed
Abstract:
Nowadays, distributed generation integration is the best way to meet increasing load demand. Optimal allocation of distributed generation plays a vital role in reducing system losses and improving the voltage profile. In this paper, a metaheuristic technique is proposed for the allocation of DG in order to reduce power losses and improve the voltage profile. The proposed technique is based on Multi-Objective Particle Swarm Optimization, which needs fewer control parameters; a modification is made in the search space of PSO. The effectiveness of the proposed technique is tested on the IEEE 33-bus test system, and both single-DG and multiple-DG scenarios are considered. The proposed method is more effective than other metaheuristic techniques and gives better results regarding system losses and the voltage profile. Keywords: distributed generation (DG), Multi-Objective Particle Swarm Optimization (MOPSO), particle swarm optimization (PSO), IEEE standard test system
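The abstract does not give implementation details, but the particle swarm update at the heart of such an approach can be sketched as follows. This is a minimal single-objective PSO in one dimension; the loss function, bounds, and all parameter values are hypothetical stand-ins for illustration, not the authors' actual multi-objective formulation or bus-system model.

```python
import random

def pso_minimize(loss, lo, hi, n_particles=20, n_iter=100, seed=1):
    """Minimal particle swarm optimization over a 1-D decision variable.

    loss   -- objective to minimize (here a proxy for system power loss)
    lo, hi -- search-space bounds (e.g., allowable DG capacity in MW)
    """
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                         # personal best positions
    gbest = min(xs, key=loss)             # global best position
    for _ in range(n_iter):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))   # clamp to the search space
            if loss(xs[i]) < loss(pbest[i]):
                pbest[i] = xs[i]
                if loss(xs[i]) < loss(gbest):
                    gbest = xs[i]
    return gbest

# Toy convex loss with its minimum at a DG size of 2.5 (arbitrary units).
best = pso_minimize(lambda x: (x - 2.5) ** 2, 0.0, 10.0)
```

A real loss-reduction study would replace the toy objective with a power-flow computation on the 33-bus network and extend the position vector to DG sizes and bus locations.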
Procedia PDF Downloads 454
7090 Research on Carbon Fiber Tow Spreading Technique with Multi-Rolls
Authors: Soon Ok Jo, Han Kyu Jeung, Si Woo Park
Abstract:
Consistently expanding carbon fiber tows in width (the carbon fiber tow spreading technique) can be expected to enhance both the production of carbon fiber reinforced composite materials and the quality of the product. In this research, a method of mechanically expanding carbon fiber and increasing its width was investigated using rolls of various geometries. In addition, an experimental carbon fiber spreading device was developed and tested using 12K carbon fiber. As a result, the effects of fiber spreading under optimized operating conditions and the geometric structure of an elliptical roll were analyzed. Keywords: carbon fiber, tow spreading fiber, pre-preg, roll structure
Procedia PDF Downloads 349
7089 Scale Effects on the Wake Airflow of a Heavy Truck
Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière
Abstract:
Air quality in urban areas is deteriorated by pollution, mainly due to the constant increase in traffic of different types of ground vehicles. In particular, particulate matter pollution, with high concentrations in urban areas, can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations to improve air quality in urban areas. To analyze the effects of turbulence on particulate pollutant dispersion, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of a full-scale and a reduced-scale heavy truck. The Reynolds-Averaged Navier-Stokes (RANS) approach with the Reynolds Stress Model (RSM) as the turbulence closure was used. The simulations highlight the appearance of a large vortex originating from under the trailer. This vortex belongs to the recirculation region located in the near wake of the heavy truck. These vortical structures are expected to have a strong influence on the dynamics of particles emitted by the truck. Keywords: CFD, heavy truck, recirculation region, reduced scale
Procedia PDF Downloads 218
7088 The Library as a Metaphor: Perceptions, Evolution, and the Shifting Role in Society Through a Librarian's Lens
Authors: Nihar Kanta Patra, Akhtar Hussain
Abstract:
This comprehensive study, through the perspective of librarians, explores the library as a metaphor and its profound significance in representing knowledge and learning. It delves into how librarians perceive the library as a metaphor and the ways in which it symbolizes the acquisition, preservation, and dissemination of knowledge. The research investigates the most common metaphors used to describe libraries, as witnessed by librarians, and analyzes how these metaphors reflect the evolving role of libraries in society. Furthermore, the study examines how the library metaphor influences the perception of librarians regarding academic libraries as physical places and academic library websites as virtual spaces, exploring their potential for learning and exploration. It investigates the evolving nature of the library as a metaphor over time, as seen by librarians, considering the changing landscape of information and technology. The research explores the ways in which the library metaphor has expanded beyond its traditional representation, encompassing digital resources, online connectivity, and virtual realms, and provides insights into its potential evolution in the future. Drawing on the experiences of librarians in their interactions with library users, the study uncovers any specific cultural or generational differences in how people interpret or relate to the library as a metaphor. It sheds light on the diverse perspectives and interpretations of the metaphor based on cultural backgrounds, educational experiences, and technological familiarity. Lastly, the study investigates the evolving roles of libraries as observed by librarians and explores how these changing roles can influence the metaphors we use to represent them. It examines the dynamic nature of libraries as they adapt to societal needs, technological advancements, and new modes of information dissemination. 
By analyzing these various dimensions, this research provides a comprehensive understanding of the library as a metaphor through the lens of librarians, illuminating its significance, evolution, and its transformative impact on knowledge, learning, and the changing role of libraries in society. Keywords: library, librarians, metaphor, perception
Procedia PDF Downloads 95
7087 Efficient Modeling Technique for Microstrip Discontinuities
Authors: Nassim Ourabia, Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The technique obtains closed-form expressions for the equivalent circuits used to model these discontinuities. It then becomes easy to handle and characterize complicated structures such as T and Y junctions, truncated junctions, arbitrarily shaped junctions, cascaded junctions, and, more generally, planar multiport junctions. Another advantage of this method is that the edge-line concept for arbitrarily shaped junctions operates with real-parameter circuits. The validity of the method was further confirmed by comparing our results for various discontinuities (bends, filters) with those from HFSS as well as from other published sources. Keywords: CAD analysis, contour integral approach, microwave circuits, s-parameters
Procedia PDF Downloads 516
7086 Investigation on the Stability of Rock Slopes Subjected to Tension Cracks via Limit Analysis
Authors: Weigao Wu, Stefano Utili
Abstract:
Based on the kinematic approach of limit analysis, a full set of upper-bound solutions for the stability of homogeneous rock slopes subjected to tension cracks is obtained. The generalized Hoek-Brown failure criterion is employed to describe the non-linear strength envelope of rocks. In this paper, critical failure mechanisms are determined for cracks of known depth but unspecified location, cracks of known location but unknown depth, and cracks of unspecified location and depth. It is shown that there is a drop of up to nearly 50% in the stability factors for rock slopes intersected by a tension crack compared with intact ones. Tables and charts of solutions in dimensionless form are presented for ease of use by practitioners. Keywords: Hoek-Brown failure criterion, limit analysis, rock slope, tension cracks
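For readers unfamiliar with the strength envelope the abstract mentions, the generalized Hoek-Brown criterion gives the major principal stress at failure as σ1 = σ3 + σci (mb σ3/σci + s)^a. The sketch below evaluates this relation directly; the numerical parameter values are purely illustrative, not taken from the paper.

```python
def hoek_brown_sigma1(sigma3, sigma_ci, mb, s, a):
    """Major principal stress at failure under the generalized Hoek-Brown
    criterion:  sigma1 = sigma3 + sigma_ci * (mb*sigma3/sigma_ci + s)**a

    sigma3   -- minor principal (confining) stress
    sigma_ci -- uniaxial compressive strength of the intact rock
    mb, s, a -- Hoek-Brown material constants for the rock mass
    """
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# For intact rock (s = 1, a = 0.5) at zero confinement, the criterion
# reduces to the uniaxial compressive strength sigma_ci.
ucs = hoek_brown_sigma1(0.0, 100.0, 10.0, 1.0, 0.5)
```

In the paper's setting, this envelope replaces the linear Mohr-Coulomb one inside the upper-bound energy balance for each candidate failure mechanism.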
Procedia PDF Downloads 344
7085 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is based on Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, a type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or other financial institution). This paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique, and thus predicts the occurrence of non-performing assets (NPAs). It also helps in asset building and in restructuring quality. Liability management is very important for carrying out banking business. To know and analyze the depth of a bank's liabilities, a suitable technique is required; here, a data mining technique is used to predict the dormant behaviour of various deposit bank customers. Various models are implemented, and the results for savings bank deposit customers are analyzed. All these data are cleaned using a data cleansing approach from the bank's data warehouse. Keywords: data mining, asset liability management, Basel III, banking
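The abstract does not specify the dormancy model it applies to deposit customers; as a rough stand-in, a rule-based sketch flagging accounts with no recent transactions is shown below. The account identifiers, dates, and the one-year threshold are all hypothetical and are not Basel III figures or the authors' actual model.

```python
from datetime import date

DORMANCY_THRESHOLD_DAYS = 365   # illustrative cutoff, not a regulatory figure

def is_dormant(last_txn, today):
    """Flag a deposit account as dormant if its last transaction is older
    than the threshold -- a simple rule-based stand-in for the data-mining
    model the paper applies to deposit customers."""
    return (today - last_txn).days > DORMANCY_THRESHOLD_DAYS

# Hypothetical cleaned records: account id -> date of last transaction.
accounts = {
    "A001": date(2022, 1, 10),
    "A002": date(2023, 3, 5),
}
today = date(2023, 6, 1)
dormant = [acct for acct, last in accounts.items() if is_dormant(last, today)]
```

A trained classifier would replace the fixed threshold with a model fitted on historical transaction features, but the input (cleaned warehouse records) and output (a dormancy flag per account) have the same shape.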
Procedia PDF Downloads 552
7084 A Study of Electrowetting-Assisted Mold Filling in Nanoimprint Lithography
Authors: Wei-Hsuan Hsu, Yi-Xuan Huang
Abstract:
Nanoimprint lithography (NIL) possesses the advantages of sub-10 nm features and low cost. NIL patterns the resist by physical deformation using a mold, which can easily reproduce the required nano-scale pattern. However, variations in process parameters and environmental conditions seriously affect reproduction quality, so ensuring the quality of the imprinted pattern is essential for industry. In this study, the authors used electrowetting technology to assist mold filling in the NIL process. A special mold structure was designed to produce electrowetting. During the imprinting process, when a voltage was applied between the mold and the substrate, the hydrophilicity/hydrophobicity of the mold surface could be switched. Both simulation and experiment confirmed that electrowetting technology can assist mold filling and avoid incomplete filling. The proposed method can also reduce crack formation during the de-molding process. Therefore, electrowetting technology can improve the process quality of NIL. Keywords: electrowetting, mold filling, nano-imprint, surface modification
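The voltage-driven switch between hydrophobic and hydrophilic behaviour that the abstract describes is conventionally modeled by the Young-Lippmann relation, cos θ(V) = cos θ0 + εr ε0 V² / (2 d γ). The sketch below evaluates that relation; the dielectric thickness, permittivity, surface tension, and voltage are illustrative numbers, not the paper's mold parameters, and contact-angle saturation is ignored.

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m

def lippmann_contact_angle(theta0_deg, voltage, eps_r, d, gamma):
    """Contact angle under an applied voltage per the Young-Lippmann relation.

    theta0_deg -- zero-voltage contact angle (degrees)
    eps_r, d   -- relative permittivity and thickness (m) of the dielectric
    gamma      -- liquid surface tension (N/m)
    """
    cos_v = (math.cos(math.radians(theta0_deg))
             + eps_r * EPS0 * voltage ** 2 / (2 * d * gamma))
    cos_v = min(cos_v, 1.0)   # physical bound (saturation effects ignored)
    return math.degrees(math.acos(cos_v))

# Illustrative numbers: a 110 deg resist contact angle, 1 um dielectric,
# eps_r = 3, gamma = 0.03 N/m, 40 V applied between mold and substrate.
angle = lippmann_contact_angle(110.0, 40.0, 3.0, 1e-6, 0.03)
```

With these assumed values the surface swings from non-wetting (above 90°) to wetting (well below 90°), which is the mechanism that helps the resist fill the mold cavities.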
Procedia PDF Downloads 172
7083 TeleEmergency Medicine: Transforming Acute Care through Virtual Technology
Authors: Ashley L. Freeman, Jessica D. Watkins
Abstract:
TeleEmergency Medicine (TeleEM) is an innovative approach leveraging virtual technology to deliver specialized emergency medical care across diverse healthcare settings, including internal acute care and critical access hospitals, remote patient monitoring, and nurse triage escalation, in addition to external emergency departments, skilled nursing facilities, and community health centers. TeleEM represents a significant advancement in the delivery of emergency medical care, giving healthcare professionals the capability to deliver expertise that closely mirrors in-person emergency medicine, transcending geographical boundaries. Through qualitative research, the extension of timely, high-quality care has been shown to address the critical needs of patients in remote and underserved areas. TeleEM's service design allows for the expansion of existing services and the establishment of new ones in diverse geographic locations. This ensures that healthcare institutions can readily scale and adapt services to evolving community requirements by leveraging on-demand (non-scheduled) telemedicine visits through the deployment of multiple video solutions. In terms of financial management, TeleEM currently employs billing suppression and subscription models to enhance accessibility for a wide range of healthcare facilities. Plans are in motion to transition to a billing system routing charges through a third-party vendor, further enhancing financial management flexibility. To address state licensure concerns, a patient location verification process has been integrated under the guidance of legal counsel and compliance authorities. The TeleEM workflow is designed to terminate if the patient is not physically located within licensed regions at the time of the virtual connection, alleviating legal uncertainties. A distinctive and pivotal feature of TeleEM is the introduction of the TeleEmergency Medicine Care Team Assistant (TeleCTA) role.
TeleCTAs collaborate closely with TeleEM physicians, leading to enhanced service activation, streamlined coordination, and workflow and data efficiencies. In the last year, more than 800 TeleEM sessions have been conducted, of which 680 were initiated by internal acute care and critical access hospitals, as evidenced by quantitative research. Without this service, many of these cases would have necessitated patient transfers. Barriers to success were examined through thorough medical record review and data analysis, which identified inaccuracies in documentation leading to activation delays, limitations in billing capabilities, and data distortion, as well as the intricacies of managing varying workflows and device setups. TeleEM represents a transformative advancement in emergency medical care that nurtures collaboration and innovation. Not only has TeleEM advanced the delivery of emergency medicine care through virtual technology, focus group participation with key stakeholders, rigorous attention to legal and financial considerations, and the implementation of robust documentation tools and the TeleCTA role, but it has also set the stage for overcoming geographic limitations. TeleEM assumes a notable position in the field of telemedicine by enhancing patient outcomes and expanding access to emergency medical care while mitigating licensure risks and ensuring compliant billing. Keywords: emergency medicine, TeleEM, rural healthcare, telemedicine
Procedia PDF Downloads 82
7082 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes specific three-level data rights. This paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Limited, Campbell v. MGN, and Imerman v. Tchenguiz. This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information. Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
7081 Circle Work as a Relational Praxis to Facilitate Collaborative Learning within Higher Education: A Decolonial Pedagogical Framework for Teaching and Learning in the Virtual Classroom
Authors: Jennifer Nutton, Gayle Ployer, Ky Scott, Jenny Morgan
Abstract:
Working in a circle within higher education creates a decolonial space of mutual respect, responsibility, and reciprocity that facilitates collaborative learning and deep connections among learners and instructors. This approach goes beyond simply facilitating a group in a circle: it opens the door to creating a sacred space connecting each member to the land, to the Indigenous peoples who have taken care of the lands since time immemorial, to one another, and to one's own positionality. These deep connections not only center human knowledges and relationships but also acknowledge responsibilities to the land. Working in a circle as a relational pedagogical praxis also disrupts institutional power dynamics by creating a space of collaborative learning and deep connections in the classroom. Inherent within circle work is the facilitation of connections that are not just academic but also emotional, physical, cultural, and spiritual. Recent literature supports the use of online talking circles, finding that they can offer a more relational and experiential learning environment, which is often absent in the virtual world and has become more evidently necessary since the pandemic. These deeper experiences of learning and connection, rooted in both knowledge and the land, can then be shared with openness and vulnerability with one another, facilitating growth and change. This process of beginning with the land is critical to ensure we have the grounding to obstruct the ongoing realities of colonialism. The authors, who identify as both Indigenous and non-Indigenous, and as both educators and learners, reflect on their teaching and learning experiences in circle. They share a relational pedagogical praxis framework that has been successful in educating future social workers, environmental activists, and leaders in social and human services, health, legal, and political fields. Keywords: circle work, relational pedagogies, decolonization, distance education
Procedia PDF Downloads 76
7080 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge for bioinformaticians due to the complexity of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection. Keywords: DNA microarray, feature selection, missing data, bioinformatics
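The abstract does not describe the authors' specific imputation or selection method; as a minimal illustration of the two steps it combines, the sketch below performs per-gene mean imputation followed by a simple variance-based filter. The toy matrix and the choice of variance as the ranking score are assumptions for illustration only.

```python
def impute_mean(matrix):
    """Replace missing entries (None) in each gene column with that
    column's mean over the observed samples."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    out = [row[:] for row in matrix]
    for j in range(n_cols):
        observed = [row[j] for row in matrix if row[j] is not None]
        mean = sum(observed) / len(observed)
        for i in range(n_rows):
            if out[i][j] is None:
                out[i][j] = mean
    return out

def top_k_by_variance(matrix, k):
    """Rank gene columns by variance and keep the indices of the top k --
    a simple filter-style stand-in for feature selection."""
    n_cols = len(matrix[0])
    def var(j):
        col = [row[j] for row in matrix]
        m = sum(col) / len(col)
        return sum((x - m) ** 2 for x in col) / len(col)
    return sorted(range(n_cols), key=var, reverse=True)[:k]

# Rows = samples, columns = genes; None marks a missing expression value.
data = [[1.0, 5.0, 2.0],
        [None, 9.0, 2.1],
        [3.0, 1.0, 1.9]]
filled = impute_mean(data)            # missing entry becomes (1.0 + 3.0) / 2
selected = top_k_by_variance(filled, 1)
```

In practice the ranking score would be a disease-relevance statistic rather than raw variance, but the pipeline shape (impute, then rank, then keep the top genes) is the same.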
Procedia PDF Downloads 574
7079 Application of XRF and Other Principal Component Analysis for Counterfeited Gold Coin Characterization in Forensic Science
Authors: Somayeh Khanjani, Hamideh Abolghasemi, Hadi Shirzad, Samaneh Nabavi
Abstract:
A wide range of gemological objects can currently be encountered on the world market that are incorrectly declared or treated, or that are completely different materials attempting, more or less successfully, to copy precious objects. Counterfeiting of precious commodities is a problem faced by governments in most countries. Police have seized many counterfeit coins that looked like real coins; because the feel to the touch and the weight were very similar to those of real coins, most people were fooled and believed that the counterfeit coins were genuine. These counterfeit coins may have been made by large criminal organizations. To elucidate the manufacturing process, not only quantitative analysis of the coins but also comparison of their morphological characteristics was necessary. Several modern techniques have been applied to prevent the counterfeiting of coins. The objective of this study was to demonstrate the potential of the X-ray fluorescence (XRF) technique and other analytical techniques, for example, SEM/EDX/WDX, FT-IR/ATR, and Raman spectroscopy. Using four elements (Cu, Ag, Au, and Zn) and obtaining XRF spectra for several samples, the coins could be discriminated. The XRF technique and SEM/EDX/WDX are used for the study of chemical composition. XRF analyzers provide a fast, accurate, non-destructive method to test the purity and chemistry of all precious metals. XRF is a very promising technique for rapid and non-destructive counterfeit coin identification in forensic science. Keywords: counterfeit coins, X-ray fluorescence, forensic, FT-IR
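The four-element discrimination the abstract describes can be caricatured as a distance test in composition space. The sketch below flags a coin whose measured Cu/Ag/Au/Zn fractions lie far from a genuine reference; the reference composition, tolerance, and sample readings are all hypothetical stand-ins, and a real study would use a multivariate (PCA-style) model built from measured standards rather than a single Euclidean threshold.

```python
import math

# Hypothetical reference composition (wt%) for a genuine coin.
REFERENCE = {"Au": 91.7, "Ag": 4.2, "Cu": 4.1, "Zn": 0.0}
THRESHOLD = 2.0   # hypothetical tolerance in composition space

def is_suspect(xrf_reading):
    """Flag a coin whose XRF elemental composition lies far from the
    genuine reference in Euclidean distance -- a simplified stand-in for
    the multivariate discrimination described in the paper."""
    dist = math.sqrt(sum((xrf_reading[el] - REFERENCE[el]) ** 2
                         for el in REFERENCE))
    return dist > THRESHOLD

# Two hypothetical readings: one near the reference, one far from it.
genuine = {"Au": 91.5, "Ag": 4.3, "Cu": 4.2, "Zn": 0.0}
fake = {"Au": 60.0, "Ag": 2.0, "Cu": 30.0, "Zn": 8.0}
```

The point of the example is the decision structure: a non-destructive XRF reading is reduced to a composition vector and compared against a reference population, with the threshold calibrated on known genuine coins.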
Procedia PDF Downloads 494
7078 A Pragmatic Approach of Memes Created in Relation to the COVID-19 Pandemic
Authors: Alexandra-Monica Toma
Abstract:
Internet memes are an element of computer-mediated communication and an important part of online culture that combines text and image in order to generate meaning. This term, coined by Richard Dawkins, refers to more than a mere way to briefly communicate ideas or emotions, naming instead a complex and intensely perpetuated phenomenon in the virtual environment. This paper approaches memes as a cultural artefact and a virtual trope that mirrors societal concerns and issues, and analyses the pragmatics of their use. Memes have to be analysed in series, usually relating to certain image macros, which is proof of the interplay between imitation and creativity in the meme-writing process. We believe that their potential to become viral relates to three key elements: adaptation to context, reference to a successful meme series, and humour (jokes, irony, sarcasm), with various pragmatic functions. The study also uses the concept of multimodality and stresses how the memes' text interacts with the image, discussing three types of relations: symmetry, amplification, and contradiction. Moreover, the paper proves that memes can be employed as speech acts with illocutionary force when the interaction between text and image is enriched through the connection to a specific situation. The features mentioned above are analysed in a corpus that consists of memes related to the COVID-19 pandemic. This corpus shows them to be highly adaptable to context, which helps build a feeling of connection and belonging in an otherwise tremendously fragmented world. Some of them are created based on well-known image macros, and their humour results from an intricate dialogue between texts and contexts. Memes created in relation to the COVID-19 pandemic can be considered speech acts and are often used as such, as proven in the paper.
Consequently, this paper tackles the key features of memes, offers a thorough analysis of the memes’ sociocultural, linguistic, and situational context, and emphasizes their intertextuality, with special emphasis on their illocutionary potential.
Keywords: context, memes, multimodality, speech acts
Procedia PDF Downloads 200
7077 Influence of Titanium Oxide on Crystallization, Microstructure and Mechanical Behavior of Barium Fluormica Glass-Ceramics
Authors: Amit Mallik, Anil K. Barik, Biswajit Pal
Abstract:
The rapid advancement of research on glass-ceramics stems from their wide applications in the electronics industry and, to some extent, in medical dentistry. Even in low concentrations, TiO2 has been found to strongly influence the physical and mechanical properties of glasses. Glass-ceramics are polycrystalline ceramic materials produced through controlled crystallization of glasses. Crystallization is accomplished by subjecting suitable parent glasses to a regulated heat treatment involving the nucleation and growth of crystal phases in the glass. Mica glass-ceramics are a class of glass-ceramics based on the system SiO2•MgO•K2O•F, in which the predominant crystalline phase is a synthetic fluormica named fluorophlogopite. Mica-containing glass-ceramics exhibit exceptional machinability in addition to their unique thermal and chemical properties. Machinability arises from randomly oriented mica crystals with a 'house of cards' microstructure that allows cracks to propagate readily along the mica plane but hinders crack propagation across the layers. In the present study, we systematically investigated the crystallization, microstructure, and mechanical behavior of barium fluorophlogopite mica-containing glass-ceramics of composition BaO•4MgO•Al2O3•6SiO2•2MgF2, nucleated by the addition of 2, 4, 6, and 8 wt% TiO2. The glass samples were prepared by the melting technique. After annealing, the batches were fired for 2 h at 730°C (2 wt% TiO2), 720°C (4 wt% TiO2), 710°C (6 wt% TiO2), and 700°C (8 wt% TiO2), respectively, and then heated to the corresponding crystallization temperatures. The batches were analyzed by differential thermal analysis (DTA), X-ray diffraction (XRD), scanning electron microscopy (SEM), and microhardness indentation. The DTA study shows that the fluorophlogopite mica crystallization exotherm appears in the temperature range 886–903°C.
The glass transition temperature (Tg) and crystallization peak temperature (Tp) increase with increasing TiO2 content up to 4 wt%; beyond this, both begin to decrease as the TiO2 content rises to 8 wt%. Scanning electron microscopy confirms the development of an interconnected ‘house of cards’ microstructure promoted by TiO2 as a nucleating agent. Increasing the TiO2 content decreases the Vickers hardness of the glass-ceramics.
Keywords: crystallization, fluormica glass, ‘house of cards’ microstructure, hardness
Procedia PDF Downloads 240
7076 An Efficient Clustering Technique for Copy-Paste Attack Detection
Authors: N. Chaitawittanun, M. Munlin
Abstract:
Due to the rapid advancement of powerful image processing software, digital images are easy for ordinary people to manipulate and modify. Many digital images are edited for a specific purpose, making them difficult to distinguish from their originals. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG files. The process starts with reducing the color depth of the photos. Then, the clustering technique groups image blocks by the Hausdorff distance between them. The results show that the proposed method is capable of inspecting an image file and correctly identifying the forgery.
Keywords: image detection, forgery image, copy-paste, attack detection
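The pipeline outlined above (colour reduction, then block comparison via the Hausdorff distance) can be sketched roughly as follows. The block size, step, quantization levels, and the way a block is turned into a point set are all illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def hausdorff(a, b):
    # symmetric Hausdorff distance between point sets of shape (n, d) and (m, d)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def quantize(img, levels=8):
    # colour reduction: collapse 0-255 intensities down to a few levels
    return np.asarray(img).astype(np.int64) * levels // 256

def copy_move_pairs(img, block=8, step=4, tol=0.0, min_shift=8):
    # slide a window over the quantized image, represent each block as a point
    # set (row, col, level) of its brighter-than-average pixels, and flag pairs
    # of well-separated blocks whose Hausdorff distance falls within tol
    q = quantize(img)
    h, w = q.shape
    blocks = []
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            patch = q[y:y + block, x:x + block]
            ys, xs = np.nonzero(patch >= patch.mean())
            pts = np.stack([ys, xs, patch[ys, xs]], axis=1).astype(float)
            blocks.append(((y, x), pts))
    pairs = []
    for i in range(len(blocks)):
        for j in range(i + 1, len(blocks)):
            (p1, a), (p2, b) = blocks[i], blocks[j]
            if abs(p1[0] - p2[0]) + abs(p1[1] - p2[1]) < min_shift:
                continue  # skip overlapping / adjacent windows
            if hausdorff(a, b) <= tol:
                pairs.append((p1, p2))
    return pairs
```

With an exact duplicate the distance is zero, so `tol=0` suffices; for a lossy format such as JPEG, a small positive tolerance would be needed to absorb compression noise.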
Procedia PDF Downloads 338
7075 Training for Digital Manufacturing: A Multilevel Teaching Model
Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia
Abstract:
The changes observed in recent years in the field of manufacturing and production engineering, popularly known as the "Fourth Industrial Revolution", draw on achievements in different areas of computer science, introducing new solutions at almost every stage of the production process: mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, and virtual models of measuring systems, to name a few. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting the individual stages of the production process and to raise employees' awareness and knowledge of the nature and specifics of work in the other stages. Discovering and developing an education method suited to the specifics of each stage of the production process is therefore a crucial issue in properly exploiting the potential of the Fourth Industrial Revolution. To this end, the project “Train4Dim” (T4D) is developing comprehensive training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. The main objective of T4D is to develop a multi-degree educational approach (from apprenticeship up to master’s degree studies) that can be adapted to different teaching levels. The process of creating the underlying methodology is also described. The paper shares the steps taken to achieve the aims of the project (a training model for digital manufacturing): 1) surveying the stakeholders, 2) defining the learning aims, 3) producing all contents and curricula, 4) training the tutors, and 5) running pilot courses, testing, and improvement.
Keywords: learning, Industry 4.0, active learning, digital manufacturing
Procedia PDF Downloads 97
7074 GABARAPL1 (GEC1) mRNA Expression Levels in Patients with Alzheimer's Disease
Authors: Ali Bayram, Burak Uz, Ilhan Dolasik, Remzi Yiğiter
Abstract:
The GABARAP (GABAA-receptor-associated protein) family consists of GABARAP, GABARAPL1 (GABARAP-like 1), and GABARAPL2 (GABARAP-like 2). GABARAPL1, like GABARAP, has been described to interact with both the GABAA receptor and tubulin, and to be involved in intracellular GABAA receptor trafficking and in promoting tubulin polymerization. In addition, GABARAPL1 is thought to be involved in various physiological (autophagosome closure, regulation of circadian rhythms) and/or pathological mechanisms (cancer, neurodegeneration). Alzheimer’s disease (AD) is a progressive neurodegenerative disorder characterized by impaired cognitive function. Disruption of GABAergic neurotransmission, as well as of cholinergic and glutamatergic interactions, may also be involved in the pathogenesis of AD. GABARAPL1 shows regulated tissue expression and is the most highly expressed gene among the GABARAP family members in the central nervous system. We therefore conducted a study to investigate GABARAPL1 mRNA expression levels in patients with AD. Fifty patients with AD and 49 control patients were enrolled in the present study. Messenger RNA expression levels of GABARAPL1 were measured by real-time polymerase chain reaction. The GABARAPL1 mRNA expression ratio in AD versus control patients was 0.495 (95% confidence interval: 0.404-0.607), p = 0.00000002646. Reduced activity of the GABARAPL1 gene might thus play a role, at least in part, in the pathophysiology of AD.
Keywords: Alzheimer’s disease, GABARAPL1, mRNA expression, RT-PCR
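The abstract does not state its quantification method, but a relative expression ratio like the reported 0.495 is commonly derived from real-time PCR cycle thresholds with the Livak 2^-ΔΔCt method. The sketch below illustrates that standard calculation with hypothetical Ct values; it is not a reconstruction of this study's analysis:

```python
def relative_expression(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Livak 2^-ddCt fold change of a target gene in cases vs. controls,
    each normalised against a reference (housekeeping) gene."""
    ddct = (ct_target_case - ct_ref_case) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)
```

A one-cycle delay of the target in case samples relative to controls corresponds to a roughly halved expression, the same order as the ratio reported above.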
Procedia PDF Downloads 458
7073 Calculation of Stress Intensity Factors in Rotating Disks Containing 3D Semi-Elliptical Cracks
Authors: Mahdi Fakoor, Seyed Mohammad Navid Ghoreishi
Abstract:
Initiation and propagation of cracks may cause catastrophic failures in rotating disks, and hence the determination of fracture parameters in rotating disks under different working conditions is a very important issue. In this paper, a comprehensive study of stress intensity factors in rotating disks containing 3D semi-elliptical cracks under different working conditions is presented. After verification of the modeling and analytical procedure, the effects of mechanical properties, rotational velocity, and crack orientation on the stress intensity factors (SIF) in rotating disks under centrifugal loading are investigated. The effects of composite patches in reducing the SIF in rotating disks are also studied; the patch design variables, such as mechanical properties, thickness, and ply angle, are investigated individually.
Keywords: stress intensity factor, semi-elliptical crack, rotating disk, finite element analysis (FEA)
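As a sanity check on FEA results of this kind, a closed-form first-order estimate can be coded in a few lines. The sketch below combines thin-disk theory for the peak hoop stress of a solid rotating disk with the standard crack-shape factor Q = 1 + 1.464(a/c)^1.65 for a surface semi-elliptical crack; the free-surface factor Y = 1.12 and the whole setup are illustrative assumptions, not the paper's model:

```python
import math

def hoop_stress_center(rho, omega, R, nu=0.3):
    # maximum hoop stress at the centre of a solid rotating thin disk:
    # sigma = (3 + nu) / 8 * rho * omega^2 * R^2
    return (3 + nu) / 8 * rho * omega**2 * R**2

def sif_semi_elliptical(sigma, a, c, Y=1.12):
    # first-order mode-I SIF at the deepest point of a surface
    # semi-elliptical crack (depth a, half-length c, a/c <= 1):
    # K_I = Y * sigma * sqrt(pi * a / Q), Q = 1 + 1.464 * (a/c)^1.65
    Q = 1 + 1.464 * (a / c) ** 1.65
    return Y * sigma * math.sqrt(math.pi * a / Q)
```

Such an estimate only captures the uncracked-body stress and a generic geometry factor; the crack orientation, patch, and 3D effects studied in the paper require the full finite element model.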
Procedia PDF Downloads 364
7072 Computation of Thermal Stress Intensity Factor for Bonded Composite Repairs in Aircraft Structures
Authors: Fayçal Benyahia, Abdelmohsen Albedah, Bel Abbes Bachir Bouiadjra
Abstract:
In this study, the finite element method is used to analyse the effect of thermal residual stresses resulting from adhesive curing on the performance of bonded composite repairs in aircraft structures. The stress intensity factor at the crack tip is chosen as the fracture criterion in order to estimate the repair performance. The obtained results show that the presence of thermal residual stresses considerably reduces the repair performance and consequently decreases the fatigue life of cracked structures. The effects of the curing temperature, the adhesive properties, and the adhesive thickness on the variation of the stress intensity factor (SIF) with thermal stresses are also analysed.
Keywords: bonded composite repair, residual stress, adhesion, stress transfer, finite element analysis
Procedia PDF Downloads 417
7071 Application of the Bionic Wavelet Transform and Psycho-Acoustic Model for Speech Compression
Authors: Chafik Barnoussi, Mourad Talbi, Adnane Cherif
Abstract:
In this paper, we propose a new speech compression system based on the Bionic Wavelet Transform (BWT) combined with a psychoacoustic model. The system is a modified version of a compression scheme using MDCT (Modified Discrete Cosine Transform) filter banks of 32 filters each together with the psychoacoustic model. The modification consists in replacing the MDCT filter-bank outputs with the bionic wavelet coefficients obtained by applying the BWT to the speech signal to be compressed. The two methods are evaluated and compared by computing the number of bits before and after compression. Tested on different speech signals, the simulation results show that the proposed technique outperforms the MDCT-based technique in terms of compressed file size. In terms of SNR, PSNR, and NRMSE, the output speech signals of the proposed compression system are of acceptable quality, and in terms of PESQ and speech intelligibility, the proposed technique yields reconstructed speech signals of good quality.
Keywords: speech compression, bionic wavelet transform, filter banks, psychoacoustic model
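The quality measures named above (SNR, PSNR, NRMSE) have compact textbook definitions; one common set of conventions is sketched below. The peak reference and the range normalisation in particular vary between authors and are assumptions here:

```python
import numpy as np

def snr_db(x, xhat):
    # signal-to-noise ratio: signal energy over reconstruction-error energy
    noise = x - xhat
    return 10 * np.log10(np.sum(x**2) / np.sum(noise**2))

def psnr_db(x, xhat):
    # peak SNR: peak signal amplitude (one convention) over mean squared error
    mse = np.mean((x - xhat)**2)
    return 10 * np.log10(np.max(np.abs(x))**2 / mse)

def nrmse(x, xhat):
    # RMSE normalised by the signal's dynamic range (one common convention)
    rmse = np.sqrt(np.mean((x - xhat)**2))
    return rmse / (np.max(x) - np.min(x))
```

Higher SNR/PSNR and lower NRMSE indicate a reconstruction closer to the original; PESQ, by contrast, is a standardized perceptual measure and is not reproduced here.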
Procedia PDF Downloads 384
7070 QF-PCR as a Rapid Technique for Routine Prenatal Diagnosis of Fetal Aneuploidies
Authors: S. H. Atef
Abstract:
Background: The most common chromosomal abnormalities identified at birth are aneuploidies of chromosomes 21, 18, 13, X, and Y. Prenatal diagnosis of fetal aneuploidies is routinely done by traditional cytogenetic culture; a major drawback of this technique is the long period of time required to reach a diagnosis. In this study, we evaluated QF-PCR as a rapid technique for the prenatal diagnosis of common aneuploidies. Method: This work was carried out on sixty amniotic fluid samples taken from patients with one or more of the following indications: advanced maternal age (3 cases), abnormal biochemical markers (6 cases), abnormal ultrasound (12 cases), or a previous history of an abnormal child (39 cases). Each sample was tested by QF-PCR and traditional cytogenetics. Aneuploidy screening was performed by amplifying four STRs on chromosomes 21, 18, and 13, two pseudoautosomal markers, and one X-linked marker, as well as AMXY and SRY; the markers were distributed in two multiplex QF-PCR assays (S1 and S2) in order to reduce the risk of sample mishandling. Results: All the QF-PCR results were successful, while there were two culture failures, only one of which was repeated. No discrepancy was seen between the results of the two techniques. Fifty-six samples showed normal patterns; three samples showed trisomy 21, successfully detected by both techniques; and one sample showed a normal pattern by QF-PCR but could not be compared to cytogenetics because of culture failure; the pregnancy outcome of this case was a normal baby. Conclusion: Our study concluded that QF-PCR is a reliable technique for prenatal diagnosis of the common chromosomal aneuploidies. It has the advantages over cytogenetic culture of being faster, with results within 24-48 hours, simpler, not requiring highly qualified staff, less prone to failure, and more cost-effective.
Keywords: QF-PCR, traditional cytogenetics, fetal aneuploidies, trisomy 21, prenatal diagnosis
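The marker interpretation underlying QF-PCR screening can be sketched as a rule on allele peak areas: two peaks in a roughly 1:1 ratio indicate a normal disomic pattern, while three peaks, or two peaks in a roughly 2:1 ratio, are consistent with trisomy. The exact ratio thresholds below are illustrative assumptions, not values from the study:

```python
def classify_str(peak_areas, ratio_lo=0.8, ratio_hi=1.4, tri_lo=1.8, tri_hi=2.4):
    """Interpret one STR marker from its allele peak areas.
    Two peaks ~1:1 -> normal; three peaks, or two peaks ~2:1 ->
    consistent with trisomy; one peak -> uninformative (homozygous).
    Threshold values here are assumed for illustration."""
    peaks = sorted(peak_areas, reverse=True)
    if len(peaks) == 1:
        return "uninformative"
    if len(peaks) == 3:
        return "trisomic"
    r = peaks[0] / peaks[1]  # larger over smaller, so r >= 1 for 1:1 patterns
    if ratio_lo <= r <= ratio_hi:
        return "normal"
    if tri_lo <= r <= tri_hi:
        return "trisomic"
    return "inconclusive"
```

A diagnosis is never made on a single marker; a trisomy call requires a consistent abnormal pattern across at least two informative markers on the same chromosome, which is why four STRs per chromosome are amplified.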
Procedia PDF Downloads 417
7069 The Effect of Positional Release Technique versus Kinesio Tape on Iliocostalis lumborum in Back Myofascial Pain Syndrome
Authors: Shams Khaled Abdelrahman Abdallah Elbaz, Alaa Aldeen Abd Al Hakeem Balbaa
Abstract:
Purpose: The purpose of this study was to compare the effects of the positional release technique and Kinesio taping on pain level, pressure pain threshold, and functional disability in patients with back myofascial pain syndrome at the iliocostalis lumborum. Background/significance: Myofascial pain syndrome is a common muscular pain syndrome that arises from trigger points, which are hyperirritable, painful, and tender points within a taut band of skeletal muscle. In the recent literature, about 75% of patients with musculoskeletal pain presenting to community medical centres suffer from myofascial pain syndrome. The iliocostalis lumborum is among the muscles most likely to develop active trigger points. Subjects: Thirty patients diagnosed with back myofascial pain syndrome with active trigger points in the iliocostalis lumborum muscle bilaterally participated in this study. Methods and materials: Patients were randomly distributed into two groups. The first group consisted of 15 patients (8 males and 7 females) with a mean age of 30.6 (±3.08) years; they received the positional release technique, applied 3 times per session, 3 sessions per week on alternate days for 2 weeks. The second group consisted of 15 patients (5 males, 10 females) with a mean age of 30.4 (±3.35) years; they received Kinesio tape, applied and changed every 3 days with one day off, for a total of 3 applications in 2 weeks. Both techniques were applied over the trigger points of the iliocostalis lumborum bilaterally. Patients were evaluated before and after the treatment program for pain intensity (visual analogue scale), pressure pain threshold (digital pressure algometry), and functional disability (the Oswestry Disability Index). Analyses: Repeated-measures MANOVA was used to detect differences within and between groups pre- and post-treatment, followed by univariate ANOVA tests for each dependent variable. All statistical analyses were done using SPSS, with the significance level set at p < 0.05 throughout. Results: There was no significant difference between the positional release technique and the Kinesio taping technique in pain level, pressure pain threshold, or functional activities (p > 0.05). Both groups showed significant improvement in all measured variables (p < 0.05), evident in a significant reduction of pain intensity and functional disability as well as a significant increase in pressure pain threshold. Conclusions: Both the positional release technique and the Kinesio taping technique are effective in reducing pain, improving pressure pain threshold, and improving function in patients suffering from back myofascial pain syndrome at the iliocostalis lumborum, with no statistically significant difference between them.
Keywords: positional release technique, kinesio tape, myofascial pain syndrome, Iliocostalis lumborum
Procedia PDF Downloads 231
7068 A Validation Technique for Integrated Ontologies
Authors: Neli P. Zlatareva
Abstract:
Ontology validation is an important part of web applications’ development, where knowledge integration and ontological reasoning play a fundamental role. It aims to ensure the consistency and correctness of ontological knowledge and to guarantee that ontological reasoning is carried out in a meaningful way. Existing approaches to ontology validation address more or less specific validation issues, but the overall process of validating web ontologies has not been formally established yet. As the size and the number of web ontologies continue to grow, the necessity to validate and ensure their consistency and interoperability is becoming increasingly important. This paper presents a validation technique intended to test the consistency of independent ontologies utilized by a common application.
Keywords: knowledge engineering, ontological reasoning, ontology validation, semantic web
Procedia PDF Downloads 322
7067 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion
Authors: Doyoung Kim, Hyo Seon Park
Abstract:
Studies of system identification (SI) based on structural health monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have developed rapidly within the output-only SI paradigm for estimating modal parameters. Representative output-only SI methods, such as frequency domain decomposition (FDD) and stochastic subspace identification (SSI), rely on algorithms based on orthogonal decompositions such as the singular value decomposition (SVD). However, the SVD entails a high level of computational complexity when estimating modal parameters. This paper proposes a technique to estimate mode shapes at lower computational cost. The technique obtains pseudo-modal operating deflection shapes (ODS) through bandpass filtering and introduces a time-history modal assurance criterion (MAC); the mode shape is then estimated from the pseudo-modal ODS and the time-history MAC. Analytical simulations of vibration measurements were performed, and the resulting mode shapes and computation times were compared between a representative SI method and the proposed method.
Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification
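The modal assurance criterion used in the proposed method is a normalised correlation between shape vectors. A minimal sketch of the MAC and of its evaluation over a time history of ODS snapshots (function names and data layout are assumed here, not taken from the paper) is:

```python
import numpy as np

def mac(phi1, phi2):
    """Modal assurance criterion between two real mode-shape vectors:
    MAC = |phi1^T phi2|^2 / ((phi1^T phi1)(phi2^T phi2)).
    1 means identical shapes up to scale; 0 means orthogonal shapes."""
    num = abs(np.dot(phi1, phi2)) ** 2
    return num / (np.dot(phi1, phi1) * np.dot(phi2, phi2))

def time_history_mac(ods_snapshots, ref_shape):
    # correlate each pseudo-modal ODS snapshot against a reference shape;
    # snapshots with MAC near 1 are dominated by that mode
    ref = np.asarray(ref_shape, dtype=float)
    return np.array([mac(np.asarray(s, dtype=float), ref) for s in ods_snapshots])
```

Because the MAC is scale-invariant, the bandpass-filtered ODS need not be mass-normalised before comparison, which is part of what keeps the procedure cheaper than SVD-based identification.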
Procedia PDF Downloads 410
7066 Efficient Energy Management: A Novel Technique for Prolonged and Persistent Automotive Engine
Authors: Chakshu Baweja, Ishaan Prakash, Deepak Giri, Prithwish Mukherjee, Herambraj Ashok Nalawade
Abstract:
The need to prevent and control the rampant and indiscriminate usage of energy has motivated active research aimed at understanding the mechanisms that control sustained energy use. Although much has been done, the complexity of the problem has prevented a complete understanding, owing to the nonlinear interaction between flow, heat, and mass transfer in the terrestrial environment. There is therefore a need for a systematic study to clearly understand the mechanisms controlling energy-spreading phenomena and to increase a system’s efficiency. The present work addresses the issue of sustaining energy and proposes a dedicated technique for optimizing energy use in the automotive domain. The proposed method focuses on utilizing the mechanical and thermal energy of an automotive IC engine by converting the energy due to the motion of a piston into electrical energy and storing it. The suggested technique uses the piston motion of the engine to generate a high potential difference capable of serving as a secondary power source. This is achieved by the use of a gear mechanism and a flywheel.
Keywords: internal combustion engine, energy, electromagnetic induction, efficiency, gear ratio, hybrid vehicle, engine shaft
Procedia PDF Downloads 474