Search results for: measurement accuracy
3757 Exploring Digital Media’s Impact on Sports Sponsorship: A Global Perspective
Authors: Sylvia Chan-Olmsted, Lisa-Charlotte Wolter
Abstract:
With the continuous proliferation of media platforms, there have been tremendous changes in media consumption behaviors. From the perspective of sports sponsorship, while there is now a multitude of platforms to create brand associations, the changing media landscape and the shift of message control also mean that sports sponsors have to take into account the nature of, and consumer responses toward, these emerging digital media to devise effective marketing strategies. Utilizing a personal interview methodology, this study is qualitative and exploratory in nature. A total of 18 experts from European and American academia, the sports marketing industry, and sports leagues/teams were interviewed to address three main research questions: 1) What are the major changes in digital technologies that are relevant to sports sponsorship? 2) How have digital media influenced the channels and platforms of sports sponsorship? 3) How have these technologies affected the goals, strategies, and measurement of sports sponsorship? The study found that sports sponsorship has moved from consumer engagement, engagement measurement, and consequences of engagement on brand behaviors toward one-on-one micro-targeting, engagement by context, time, and space, and activation and leveraging based on tracking and databases. From the perspective of platforms and channels, the use of mobile devices is prominent during sports content consumption. Increasing multiscreen media consumption means that sports sponsors need to optimize their investment decisions in leagues, teams, or game-related content sources, as they need to go where the fans are most engaged. The study observed an imbalanced strategic leveraging of technology and digital infrastructure.
While sports leagues have placed less emphasis on brand value management via technology, sports sponsors have been much more active in utilizing technologies like mobile/LBS tools, big data/user information, real-time and programmatic marketing, and social media activation. Regardless of the new media/platforms, the study found that integration and contextualization are the two essential means of improving sports sponsorship effectiveness through technology. That is, sponsors must effectively integrate social media/mobile/second screens into their existing legacy media sponsorship plans so that technology works for the experience/message instead of distracting fans. Additionally, technological advancement and the attention economy amplify the importance of consumer data gathering, but sports consumer data does not in itself mean loyalty or engagement. This study also affirms the benefit of digital media, as they offer viral and pre-event activations through storytelling well before the actual event, which is critical for leveraging brand associations before and after. That is, sponsors now have multiple opportunities and platforms to tell stories about their brands over a longer time period. In summary, digital media facilitate fan experience, access to the brand message, multiplatform/channel presentations, storytelling, and content sharing. Nevertheless, rather than focusing on technology and media, today's sponsors need to define what they want to focus on in terms of content themes that connect with their brands and then identify the channels/platforms. The big challenge for sponsors is to play to each venue's/medium's specificity and its fit with the target audience, and not uniformly deliver the same message in the same format on different platforms/channels.
Keywords: digital media, mobile media, social media, technology, sports sponsorship
Procedia PDF Downloads 295
3756 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In Extreme Value Theory, inference for the parameters of the limit distribution is made using only a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations in which the data do not fit pure distributions because of perturbations (noise).
Keywords: bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
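For readers unfamiliar with the block-maxima setup the authors start from, the following sketch fits a Gumbel distribution to block maxima of an Exponential baseline via maximum likelihood. It is an illustrative stand-in, not the authors' Bayesian model: the block size, number of blocks, and Exponential(1) baseline are assumptions made here for demonstration.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# 200 blocks of 1000 Exponential(1) draws; keep only each block's maximum,
# discarding the rest of the data (the waste the authors' model avoids)
blocks = rng.exponential(scale=1.0, size=(200, 1000))
maxima = blocks.max(axis=1)

# Maximum-likelihood fit of the Gumbel limit distribution to the maxima.
# For an Exp(1) baseline with block size n, theory gives loc ~ ln(n), scale ~ 1.
loc, scale = gumbel_r.fit(maxima)
print(round(loc, 2), round(scale, 2))
```

A Bayesian version with informative priors on (loc, scale), as the abstract describes, would replace the MLE step while keeping the same block-maxima input.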
Procedia PDF Downloads 199
3755 Measurement of Thermal Protrusion Profile in Magnetic Recording Heads via Wyko Interferometry
Authors: Joseph Christopher R. Ragasa, Paolo Gabriel P. Casas, Nemesio S. Mangila, Maria Emma C. Villamin, Myra G. Bungag
Abstract:
A procedure for measuring the thermal protrusion profiles of magnetic recording heads was developed using a Wyko HD-8100 optical interference-based instrument. The protrusions in the heads were made by applying a constant power through the thermal flying height control pads. It was found that the thermally-induced bubble is confined to the same head locations, primarily the reader and writer regions, regardless of the direction from which the temperature is approached. Applying power to the thermal flying height control pads in the range of 0 to 50 milliwatts showed that the protrusions exhibit a linear dependence on the supplied power. The efficiencies calculated using this method were compared to those obtained through Guzik and found to be 19.57% greater, due to the static environment used during testing.
Keywords: thermal protrusion profile, magnetic recording heads, wyko interferometry, thermal flying height control
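The reported linear protrusion-versus-power dependence reduces to an ordinary least-squares fit; the sketch below uses hypothetical protrusion values (the actual measurements are not given in the abstract) over the stated 0-50 mW range.

```python
import numpy as np

# Hypothetical protrusion measurements (nm) versus TFC power (mW);
# the 0-50 mW range matches the abstract, but the values are illustrative only.
power_mw = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
protrusion_nm = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.0])

# Linear fit: the slope (nm/mW) is an actuation-efficiency estimate
slope, intercept = np.polyfit(power_mw, protrusion_nm, 1)
print(round(slope, 3), round(intercept, 3))
```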
Procedia PDF Downloads 470
3754 Fault Detection and Isolation of a Three-Tank System using Analytical Temporal Redundancy, Parity Space/Relation Based Residual Generation
Authors: A. T. Kuda, J. J. Dayya, A. Jimoh
Abstract:
This paper investigates a fault detection and isolation technique for measurement data sets from a three-tank system using analytical model-based temporal redundancy, with residual generation based on the parity equations/space approach. It also briefly outlines other approaches to model-based residual generation. The basic idea of parity-space residual generation under temporal redundancy is the dynamic relationship between sensor outputs and actuator inputs (input-output model). These residuals were then used to detect whether or not the system is faulty and to indicate the location of the fault when it is. The method obtains good results, detecting and isolating faults in the measurement data sets generated from the system.
Keywords: fault detection, fault isolation, disturbing influences, system failure, parity equation/relation, structured parity equations
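The input-output parity idea can be illustrated on a toy first-order model: the residual is (near) zero whenever the measurements satisfy the nominal difference equation and jumps when a sensor fault appears. The model coefficients, fault size, and threshold below are illustrative assumptions, not the authors' three-tank equations.

```python
import numpy as np

# Toy input-output model y(k) = a*y(k-1) + b*u(k-1)
a, b = 0.9, 0.5
rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, 100)
y = np.zeros(100)
for k in range(1, 100):
    y[k] = a * y[k - 1] + b * u[k - 1]

# Inject a sensor bias fault from sample k = 60 onward
y_meas = y.copy()
y_meas[60:] += 0.8

# Parity (temporal-redundancy) residual: zero under the nominal model,
# so any sizeable value flags an inconsistency between sensors and inputs
r = y_meas[1:] - a * y_meas[:-1] - b * u[:-1]
fault_detected = np.abs(r) > 0.1
print(int(np.argmax(fault_detected)) + 1)  # first sample flagged
```

A bias fault produces a residual spike at the fault onset; structured parity equations extend the same idea to isolate which of several sensors or actuators is at fault.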
Procedia PDF Downloads 303
3753 Dynamic Modeling of Orthotropic Cracked Materials by X-FEM
Authors: S. Houcine Habib, B. Elkhalil Hachi, Mohamed Guesmi, Mohamed Haboussi
Abstract:
In this paper, the dynamic fracture behavior of cracked orthotropic structures is modeled using the extended finite element method (X-FEM). In this approach, a finite element model is first created and then enriched with special orthotropic crack-tip enrichments and Heaviside functions in the framework of the partition of unity. The mixed-mode stress intensity factor (SIF) is computed using the interaction integral technique based on the J-integral in order to predict the cracking behavior of the structure. These procedures were programmed and integrated into an in-house software platform. To assess the accuracy of the developed code, results obtained by the proposed method are compared with those of the literature.
Keywords: X-FEM, composites, stress intensity factor, crack, dynamic orthotropic behavior
Procedia PDF Downloads 572
3752 Tongue Image Retrieval Based Using Machine Learning
Authors: Ahmad FAROOQ, Xinfeng Zhang, Fahad Sabah, Raheem Sarwar
Abstract:
In Traditional Chinese Medicine (TCM), tongue diagnosis is a vital inspection tool. In this study, we explore the potential of machine learning in tongue diagnosis. The study begins with the cataloguing of the various classifications and characteristics of the human tongue. We infer 24 kinds of tongues from the material and coating of the tongue, and we identify 21 attributes of the tongue. The next step is to apply machine learning methods to the tongue dataset. We use the Weka machine learning platform to conduct the experiment for performance analysis. The 457 instances of the tongue dataset are used to test the performance of five different machine learning methods, including SVM, Random Forests, Decision Trees, and Naive Bayes. Based on accuracy and Area under the ROC Curve (AUC), the Support Vector Machine algorithm was shown to be the most effective for tongue diagnosis.
Keywords: medical imaging, image retrieval, machine learning, tongue
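The study runs its comparison in Weka; an equivalent experiment in scikit-learn might look like the sketch below, where synthetic data stands in for the 457-instance, 21-attribute tongue dataset (which is not reproduced here), and the SVM is scored by AUC as in the abstract.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

# Synthetic stand-in: 457 instances with 21 attributes, as in the abstract;
# the number of informative features is an arbitrary assumption.
X, y = make_classification(n_samples=457, n_features=21, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM with probability outputs so AUC can be computed
clf = SVC(probability=True, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(round(auc, 3))
```

Swapping `SVC` for `RandomForestClassifier`, `DecisionTreeClassifier`, or `GaussianNB` reproduces the kind of method comparison the authors performed.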
Procedia PDF Downloads 84
3751 Mobile Microscope for the Detection of Pathogenic Cells Using Image Processing
Authors: P. S. Surya Meghana, K. Lingeshwaran, C. Kannan, V. Raghavendran, C. Priya
Abstract:
One of the most basic and powerful tools in all of science and medicine is the light microscope, the fundamental device for laboratory as well as research purposes. With improving technology, the need for portable, economical, and user-friendly instruments is in high demand, and the conventional microscope fails to live up to this emerging trend. Also, adequate access to healthcare is not widely available, especially in developing countries. The most basic step towards curing a malady is the diagnosis of the disease itself. The main aim of this paper is to diagnose malaria with the most common device, the cell phone, which proves to be an immediate solution for many modern-day needs, as the development of wireless infrastructure allows computing and communicating on the move. This has opened up the opportunity to develop novel imaging, sensing, and diagnostics platforms using mobile phones as an underlying platform, addressing the global demand for accurate, sensitive, cost-effective, and field-portable measurement devices for use in remote and resource-limited settings around the world.
Keywords: cellular, hand-held, health care, image processing, malarial parasites, microscope
Procedia PDF Downloads 268
3750 Warning about the Risk of Blood Flow Stagnation after Transcatheter Aortic Valve Implantation
Authors: Aymen Laadhari, Gábor Székely
Abstract:
In this work, the hemodynamics in the sinuses of Valsalva after Transcatheter Aortic Valve Implantation is numerically examined. We focus on the physical results in the two-dimensional case. We use a finite element methodology based on a Lagrange multiplier technique that enables coupling the dynamics of blood flow and the leaflets' movement. A massively parallel implementation of a monolithic and fully implicit solver allows higher accuracy and significant computational savings. The elastic properties of the aortic valve are disregarded, and the numerical computations are performed under physiologically correct pressure loads. Computational results show that blood flow may be subject to stagnation in the lower domain of the sinuses of Valsalva after Transcatheter Aortic Valve Implantation.
Keywords: hemodynamics, simulations, stagnation, valve
Procedia PDF Downloads 293
3749 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers
Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal
Abstract:
Background: We aim to develop an integrated device comprising a single-probe EEG and a CCD-based motion sensor for a more objective measure of Attention-Deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green and blue), the CCD sensor captures the movement pattern of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e. whether symptoms are better explained by another condition). Methods: We used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the subjects (3-5 year old pre-schoolers). During the painting of three segments of a circle using three distinct colors (red, green, and blue), the absolute power of the delta and beta EEG waves of the subjects was found to be correlated with relaxation and attention/cognitive load conditions. While the relaxation condition of the subject hints at hyperactivity, a more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a continuous performance task (CPT), i.e., separating variously colored balls from one table to another. We used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD. We also compared our scale with clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we found a significant correlation between the objective assessment of the ADHD subjects and the clinician's conventional evaluation.
Conclusion: MAHD, the integrated device, is intended as an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test
Procedia PDF Downloads 100
3748 Size-Reduction Strategies for Iris Codes
Authors: Jutta Hämmerle-Uhl, Georg Penn, Gerhard Pötzelsberger, Andreas Uhl
Abstract:
Iris codes contain bits with different entropy. This work investigates different strategies to reduce the size of iris code templates, with the aim of reducing storage requirements and computational demand in the matching process. Besides simple sub-sampling schemes, a binary multi-resolution representation, as used in the JBIG hierarchical coding mode, is also assessed. We find that iris code template size can be reduced significantly while maintaining recognition accuracy. In addition, we propose a two-stage identification approach, using small-sized iris code templates in a pre-selection stage and full-resolution templates for final identification, which shows promising recognition behaviour.
Keywords: iris recognition, compact iris code, fast matching, best bits, pre-selection identification, two-stage identification
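The sub-sampling and matching ideas can be sketched as follows. Random bit strings stand in for real iris codes; the 2048-bit template length, the 4:1 sub-sampling ratio, and the ~10% intra-class bit-flip rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
full = rng.integers(0, 2, 2048, dtype=np.uint8)   # enrolled iris code
probe = full.copy()
flips = rng.choice(2048, 200, replace=False)      # ~10% intra-class noise
probe[flips] ^= 1

def hamming(a, b):
    """Fractional Hamming distance, the standard iris-code match score."""
    return np.count_nonzero(a != b) / a.size

# Simple 4:1 sub-sampling: keep every 4th bit for the compact template
small_full, small_probe = full[::4], probe[::4]

d_full = hamming(full, probe)                 # exactly 200/2048
d_small = hamming(small_full, small_probe)    # noisier estimate, 4x cheaper
print(round(d_full, 3), round(d_small, 3))
```

In a two-stage identification scheme, the cheap sub-sampled distance would prune the gallery in the pre-selection stage before full-resolution matching decides among the surviving candidates.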
Procedia PDF Downloads 442
3747 Thulium Laser Design and Experimental Verification for NIR and MIR Nonlinear Applications in Specialty Optical Fibers
Authors: Matej Komanec, Tomas Nemecek, Dmytro Suslov, Petr Chvojka, Stanislav Zvanovec
Abstract:
Nonlinear phenomena in the near- and mid-infrared region are attracting scientific attention, mainly due to the possibilities of supercontinuum generation and its subsequent use in ultra-wideband applications such as absorption spectroscopy or optical coherence tomography. Thulium-based fiber lasers provide access to high-power ultrashort pump pulses in the vicinity of 2000 nm, which can be readily exploited for various nonlinear applications. The paper presents a simulation and experimental study of a pulsed thulium laser for near-infrared (NIR) and mid-infrared (MIR) nonlinear applications in specialty optical fibers. In the first part of the paper, the thulium laser is discussed. The laser is based on a gain-switched seed laser and a series of amplification stages for obtaining output peak powers in the order of kilowatts for pulses shorter than 200 ps at full-width at half-maximum. The pulsed thulium laser is first studied in simulation software, focusing on seed-laser properties. Afterward, a thulium-based pre-amplification stage is discussed, with a focus on low-noise signal amplification, high signal gain, and the elimination of pulse distortions during pulse propagation in the gain medium. Following the pre-amplification stage, a second gain stage incorporating a shorter thulium fiber with an increased rare-earth dopant ratio is evaluated. Last, a power-booster stage is analyzed, where peak powers of kilowatts should be achieved. The results of the analytical study are further validated by an experimental campaign, and the simulation model is corrected based on real components: parameters such as real insertion losses, cross-talk, and polarization dependencies are included. The second part of the paper evaluates the utilization of nonlinear phenomena and their specific features in the vicinity of 2000 nm, compared to e.g. 1550 nm, and presents supercontinuum modelling based on the pulsed output of the thulium laser.
Supercontinuum generation simulation provides reasonably accurate results once the fiber dispersion profile is precisely defined and the fiber nonlinearity is known; furthermore, the input pulse shape and peak power must be known, which is assured thanks to the experimental measurement of the studied thulium pulsed laser. The supercontinuum simulation model is put in relation to the designed and characterized specialty optical fibers, which are discussed in the third part of the paper. The focus is placed on silica and mainly non-silica fibers (fluoride, chalcogenide, lead-silicate) in their conventional, microstructured, or tapered variants. Parameters such as the dispersion profile and nonlinearity of the exploited fibers were characterized either with an accurate model developed in COMSOL software or by direct experimental measurement to achieve even higher precision. The paper then combines all three studied topics and presents a possible application of such a thulium pulsed laser system working with specialty optical fibers.
Keywords: nonlinear phenomena, specialty optical fibers, supercontinuum generation, thulium laser
Procedia PDF Downloads 322
3746 Comparison of Wet and Microwave Digestion Methods for the Al, Cu, Fe, Mn, Ni, Pb and Zn Determination in Some Honey Samples by ICPOES in Turkey
Authors: Huseyin Altundag, Emel Bina, Esra Altıntıg
Abstract:
The aim of this study is to determine the amounts of Al, Cu, Fe, Mn, Ni, Pb and Zn in honey samples gathered from the Sakarya and Istanbul regions of Turkey. Sample preparation was performed via a wet decomposition method and a microwave digestion system. The accuracy of the method was verified with the standard reference materials Tea Leaves (INCT-TL-1) and NIST SRM 1515 Apple Leaves. A comparison between the gathered data and literature values was made, and possible sources of contamination of the honey samples were considered. The obtained results will be presented at ICCIS 2015: XIII International Conference on Chemical Industry and Science.
Keywords: wet decomposition, microwave digestion, trace element, honey, ICP-OES
Procedia PDF Downloads 466
3745 The Monitor for Neutron Dose in Hadrontherapy Project: Secondary Neutron Measurement in Particle Therapy
Authors: V. Giacometti, R. Mirabelli, V. Patera, D. Pinci, A. Sarti, A. Sciubba, G. Traini, M. Marafini
Abstract:
Particle therapy (PT) is a very modern technique of non-invasive radiotherapy, mainly devoted to the treatment of tumours untreatable with surgery or conventional radiotherapy because they are localised close to organs at risk (OaR). Nowadays, PT is available in about 55 centres in the world, and only about 20% of them are able to treat with carbon ion beams. However, the efficiency of ion-beam treatments is so impressive that many new centres are under construction. The interest in this powerful technology lies in the main characteristic of PT: the high irradiation precision and conformity of the dose released to the tumour, with the simultaneous preservation of the adjacent healthy tissue. However, the beam interactions with the patient produce a large component of secondary particles whose additional dose has to be taken into account during the definition of the treatment planning. Although the largest fraction of the dose is released to the tumour volume, a non-negligible amount is deposited in other body regions, mainly due to the scattering and nuclear interactions of neutrons within the patient's body. One of the main concerns in PT treatments is the possible occurrence of secondary malignant neoplasms (SMN). While SMNs can develop up to decades after the treatment, their incidence directly impacts the life quality of cancer survivors, in particular paediatric patients. Dedicated Treatment Planning Systems (TPS) are used to predict normal tissue toxicity, including the risk of late complications induced by the additional dose released by secondary neutrons. However, no precise measurement of the secondary neutron flux is available, nor of its energy and angular distributions: an accurate characterization is needed in order to improve TPS and reduce safety margins.
The MONDO project (MOnitor for Neutron Dose in hadrOntherapy) is devoted to the construction of a secondary neutron tracker tailored to the characterization of that secondary neutron component. The detector, based on the tracking of the recoil protons produced in double elastic scattering interactions, is a matrix of thin scintillating fibres arranged in alternating x-y oriented layers. The final size of the object is 10 x 10 x 20 cm3 (square 250 µm scintillating fibres, double cladding). The readout of the fibres is carried out with a dedicated SPAD array sensor (SBAM) realised in CMOS technology by FBK (Fondazione Bruno Kessler). Both the detector and the SBAM sensor are under development, and full construction is expected by the end of the year. MONDO will carry out data-taking campaigns at the TIFPA Proton Therapy Center in Trento, at CNAO (Pavia), and at HIT (Heidelberg) with carbon ions, in order to characterize the neutron component, predict the additional dose delivered to patients with much more precision, and drastically reduce the current safety margins. Preliminary measurements with charged-particle beams and Monte Carlo FLUKA simulations will be presented.
Keywords: secondary neutrons, particle therapy, tracking detector, elastic scattering
Procedia PDF Downloads 226
3744 Municipal-Level Gender Norms: Measurement and Effects on Women in Politics
Authors: Luisa Carrer, Lorenzo De Masi
Abstract:
In this paper, we exploit the massive amount of information from Facebook to build a measure of gender attitudes in Italy at a previously impossible resolution: the municipal level. We construct our index via a machine learning method trained to replicate a benchmark region-level measure. Interestingly, we find that most of the variation in our Gender Norms Index (GNI) is across towns within narrowly defined geographical areas, rather than across regions or provinces. In a second step, we show how this local variation in norms can be leveraged for identification purposes. In particular, we use our index to investigate whether these differences in norms carry over to the policy activity of politicians elected to the Italian Parliament. We document that females are more likely to sit on parliamentary committees focused on gender-sensitive matters, labor, and social issues, but not if they come from a relatively conservative town. These effects are robust to conditioning on legislative term and electoral district, suggesting the importance of social norms in shaping legislators' policy activity.
Keywords: gender equality, gender norms index, Facebook, machine learning, politics
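The downscaling step (train on coarse units where a benchmark measure exists, then predict on fine units) can be sketched as follows. The feature set, model choice, and unit counts are illustrative assumptions, not the authors' specification.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Hypothetical setup: 20 regions with a survey-based gender-norms benchmark,
# each described by aggregated Facebook-derived features (5 interest shares)
X_region = rng.uniform(0.0, 1.0, (20, 5))
true_beta = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
y_region = X_region @ true_beta + rng.normal(0.0, 0.1, 20)

# Fit at the coarse (region) level, where the benchmark measure exists
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_region, y_region)

# Downscale: score each municipality from its own Facebook-derived features
X_muni = rng.uniform(0.0, 1.0, (7900, 5))  # ~7,900 Italian municipalities
gni = model.predict(X_muni)
print(gni.shape)
```

The resulting municipal-level predictions play the role of the Gender Norms Index; the key assumption is that the feature-to-norms mapping learned at the region level transfers to the municipal level.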
Procedia PDF Downloads 80
3743 Applicability of the Rapid Estimate of Adult Health Literacy in Medicine (Short Form) among Patients in Dakshina Kannada District, Karnataka, India
Authors: U. P. Rathnakar, Medha Urval, K. Ashok Shenoy
Abstract:
Introduction: There are many tools available for the measurement of health literacy (HL). REALM (Rapid Estimate of Adult Literacy in Medicine) is a very commonly used tool in advanced countries. It comes in two forms: one with 66 words and a shorter version (REALM-SF) with seven words. We decided to test the applicability of the shorter version of the REALM test among our patients. Methodology: REALM-SF was tested among 200 patients in a tertiary hospital. Discussion and conclusion: Analysis of the results shows that while the pronunciation scores indicate adequate levels of HL skills, the comprehension analysis shows that relying on reading skills alone is likely to be misleading. It is therefore proposed that, in an Indian population with adequate reading skills but without adequate comprehension, the REALM-SF test in its present form may not be an ideal tool for assessing HL.
Keywords: health literacy, REALM, short form, India
Procedia PDF Downloads 469
3742 Kannudi - A Reference Editor for Kannada (Based on OPOK! and OHOK! Principles, and Domain Knowledge)
Authors: Vishweshwar V. Dixit
Abstract:
Kannudi is a reference editor introducing a method of input for Kannada, called OHOK!, that is, Ottu Hāku Ottu Koḍu!. This is especially suited for pressure-sensitive input devices, though the current online implementation uses the regular mechanical keyboard. OHOK! has three possible modes, namely, sva-ottu (self-conjunct), kandante (as you see), and andante (as you say). It may be noted that kandante mode does not follow the phonetic order. However, this model may work well for those who are inclined to visualize as they type rather than vocalize the sounds. Kannudi also demonstrates how domain knowledge can be effectively used to potentially increase speed, accuracy, and user-friendliness. For example, selection of a default vowel, automatic shunyification, and arkification. Also implemented are four types of Deletes that are necessary for phono-syllabic languages like Kannada.
Keywords: kannada, conjunct, reference editor, pressure input
Procedia PDF Downloads 95
3741 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit
Authors: Davit Mirzoyan, Ararat Khachatryan
Abstract:
A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and for compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy along with decreased power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e. its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.
Keywords: detection, monitoring, process corner, process variation
Procedia PDF Downloads 526
3740 The Traditional Ceramics Value in the Middle East
Authors: Abdelmessih Malak Sadek Labib
Abstract:
Ceramic materials are known for their stability in harsh environments and excellent electrical, mechanical, and thermal properties. They have been widely used in various applications despite the emergence of new materials such as plastics and composites. However, ceramics are often brittle, which can lead to catastrophic failure. The fragility of ceramics and the mechanisms behind their failure have been a topic of extensive research, particularly in load-bearing applications like veneers. Porcelain, a type of traditional pottery, is commonly used in such applications. Traditional pottery consists of clay, silica, and feldspar, and the presence of quartz in the ceramic body can lead to microcracks and stress concentrations. The mullite hypothesis suggests that the strength of porcelain can be improved by increasing the interlocking of mullite needles in the ceramic body. However, there is a lack of reports on Young's moduli in the literature, leading to erroneous conclusions about the mechanical behavior of porcelain. This project aims to investigate the role of quartz and mullite in the mechanical strength of various porcelains, while considering factors such as particle size, flexural strength, and fractographic analysis. Research Aim: The aim of this research project is to assess the role of quartz and mullite in enhancing the mechanical strength of different porcelains. The project will also explore the effect of reducing particle size on the properties of porcelain, as well as investigate flexural strength and fractographic techniques. Methodology: The methodology involves the measurement of Young's modulus and the evaluation of the mechanical behavior of porcelains through various experimental techniques.
Findings: The findings of this study will provide a realistic assessment of the role of quartz and mullite in strengthening and reducing the fragility of porcelain. The research will also contribute to a better understanding of the mechanical behavior of ceramics, specifically in load-bearing applications. Theoretical Importance: The theoretical importance of this research lies in its contribution to understanding the factors that influence the mechanical strength and fragility of ceramics, particularly porcelain. By investigating the interplay between quartz, mullite, and other variables, this study will enhance our knowledge of the properties and behavior of traditional ceramics. Data Collection and Analysis Procedures: Data for this research will be collected through experiments involving the measurement of Young's modulus and other mechanical properties of porcelains. The effects of quartz, mullite, particle size, and flexural strength will be examined and analyzed using appropriate statistical techniques and fractographic analysis. Questions Addressed: This research project aims to address the following questions: (1) How does the presence of quartz and mullite affect the mechanical strength of porcelain? (2) What is the impact of reducing particle size on the properties of porcelain? (3) What do flexural strength tests and fractographic analysis reveal about the behavior of porcelains? Conclusion: In conclusion, this research project aims to enhance the understanding of the role of quartz and mullite in strengthening and reducing the fragility of porcelain. By investigating the mechanical properties of porcelains and considering factors such as particle size, flexural strength, and fractography, this study will contribute to the knowledge of traditional ceramics and their potential applications.
The findings will have practical implications for the use of ceramics in various fields.
Keywords: stability, harsh environments, electrical, techniques, mechanical disadvantages, materials
Procedia PDF Downloads 69
3739 A Comprehensive Finite Element Model for Incremental Launching of Bridges: Optimizing Construction and Design
Authors: Mohammad Bagher Anvari, Arman Shojaei
Abstract:
Incremental launching, a widely adopted bridge erection technique, offers numerous advantages for bridge designers. However, accurately simulating and modeling the dynamic behavior of the bridge during each step of the launching process proves to be tedious and time-consuming. The perpetual variation of internal forces within the deck during construction stages adds complexity, exacerbated further by considerations of other load cases, such as support settlements and temperature effects. As a result, there is an urgent need for a reliable, simple, economical, and fast algorithmic solution to model bridge construction stages effectively. This paper presents a novel Finite Element (FE) model that focuses on studying the static behavior of bridges during the launching process. Additionally, a simple method is introduced to normalize all quantities in the problem. The new FE model overcomes the limitations of previous models, enabling the simulation of all stages of launching, which conventional models fail to achieve due to underlying assumptions. By leveraging the results obtained from the new FE model, this study proposes solutions to improve the accuracy of conventional models, particularly for the initial stages of bridge construction that have been neglected in previous research. The research highlights the critical role played by the first span of the bridge during the initial stages, a factor often overlooked in existing studies. Furthermore, a new, simplified model, termed the "semi-infinite beam" model, is developed to address this oversight. By utilizing this model alongside a simple optimization approach, optimal values for launching nose specifications are derived. The practical applications of this study extend to optimizing the nose-deck system of incrementally launched bridges, providing valuable insights for practical usage.
In conclusion, this paper introduces a comprehensive Finite Element model for studying the static behavior of bridges during incremental launching. The proposed model addresses limitations found in previous approaches and offers practical solutions to enhance accuracy. The study emphasizes the importance of considering the initial stages and introduces the "semi-infinite beam" model. Through the developed model and optimization approach, optimal specifications for launching nose configurations are determined. This research holds significant practical implications and contributes to the optimization of incrementally launched bridges, benefiting both the construction industry and bridge designers.
Keywords: incremental launching, bridge construction, finite element model, optimization
Procedia PDF Downloads 106
3738 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, aid thermoregulation, and impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables quantitative analysis. However, this is not always possible in the clinical routine, and when possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Currently, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. The study was developed with ethical approval from the authors’ institutions and national review panels, and involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, and shape.
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses, the linear regression showed a strong association and low dispersion between variables, the Bland-Altman analyses showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to quantify MS volume proved to be robust, fast, and efficient compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
Keywords: maxillary sinus, support vector machine, region growing, volume quantification
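The agreement statistics used to validate the tool are straightforward to compute. The sketch below is illustrative only (hypothetical masks and volumes, not the authors' code): the Jaccard coefficient compares two binary segmentation masks voxel by voxel, and the Bland-Altman analysis summarizes the bias and 95% limits of agreement between paired volume measurements.

```python
import math

def jaccard(mask_a, mask_b):
    """Jaccard similarity of two binary segmentation masks (flat 0/1 lists)."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 1.0

def bland_altman(auto_vols, manual_vols):
    """Mean difference (bias) and 95% limits of agreement between two raters."""
    diffs = [a - m for a, m in zip(auto_vols, manual_vols)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d, (mean_d - 1.96 * sd, mean_d + 1.96 * sd)

auto   = [0, 1, 1, 1, 0, 0, 1, 0]   # hypothetical automatic mask
manual = [0, 1, 1, 0, 0, 0, 1, 1]   # hypothetical manual mask
overlap = jaccard(auto, manual)     # 3 shared voxels / 5 in the union = 0.6
```

A coefficient above 0.90, as reported in the abstract, would indicate that the automatic and manual masks overlap almost completely.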
Procedia PDF Downloads 504
3737 Study on Constitutive Model of Particle Filling Material Considering Volume Expansion
Authors: Xu Jinsheng, Tong Xin, Zheng Jian, Zhou Changsheng
Abstract:
The NEPE (nitrate ester plasticized polyether) propellant is a kind of particle filling material with a relatively high filling fraction. The experimental results show that microcracks, microvoids, and dewetting can cause stress softening of the material. In this paper, a series of mechanical tests, in conjunction with a CCD imaging technique, was conducted to analyze the evolution of internal defects in the propellant. The volume expansion function of the particle filling material was established by measuring longitudinal and transverse strain with an optical deformation measurement system. By analyzing the defects and internal damage of the material, a visco-hyperelastic constitutive model based on free energy theory, incorporating a damage function, was proposed. The proposed constitutive model accurately predicts the mechanical properties measured in uniaxial tensile tests and tensile-relaxation tests.
Keywords: dewetting, constitutive model, uniaxial tensile tests, visco-hyperelastic, nonlinear
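The volume expansion inferred from paired strain measurements can be written with a standard kinematic relation (not stated explicitly in the abstract; it assumes the specimen deforms with transverse isotropy, so both transverse directions carry the same strain):

```latex
\frac{\Delta V}{V_0} = \left(1+\varepsilon_l\right)\left(1+\varepsilon_t\right)^2 - 1
\;\approx\; \varepsilon_l + 2\,\varepsilon_t \quad \text{(small strains)}
```

where $\varepsilon_l$ is the longitudinal and $\varepsilon_t$ the transverse strain. Growth of $\Delta V / V_0$ beyond the incompressible prediction ($\varepsilon_t \approx -\varepsilon_l/2$) is the signature of dewetting and microvoid nucleation that the optical measurement is designed to capture.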
Procedia PDF Downloads 304
3736 Getting to Know the Enemy: Utilization of Phone Record Analysis Simulations to Uncover a Target’s Personal Life Attributes
Authors: David S. Byrne
Abstract:
The purpose of this paper is to understand how phone record analysis can enable identification of subjects in communication with the target of a terrorist plot. This study also sought to understand the advantages of implementing simulations to develop the skills of future intelligence analysts and enhance national security. Through the examination of phone reports, which in essence consist of the call traffic of incoming and outgoing numbers (and not by listening to calls or reading the content of text messages), patterns can be uncovered that point toward members of a criminal group and planned activities. Through temporal and frequency analysis, conclusions were drawn to offer insights into the identity of participants and the potential scheme being undertaken. The challenge lies in the accurate identification of the users of the phones in contact with the target. Often, investigators rely on proprietary databases and open sources to accomplish this task; however, it is difficult to ascertain the accuracy of the information found. Thus, this paper poses two research questions: first, how effective are freely available web sources of information at determining the actual identity of callers? Secondly, does the identity of the callers enable an understanding of the lifestyle and habits of the target? The methodology for this research consisted of the analysis of the call detail records of the author’s personal phone activity spanning the period of a year, combined with the hypothetical scenario that the owner of said phone was the leader of a terrorist cell. The goal was to reveal the identity of his accomplices and understand how his personal attributes can further paint a picture of the target’s intentions. The results of the study were interesting: nearly 80% of the calls were identified with over a 75% accuracy rating via data mining of open sources.
The suspected terrorist’s inner circle was recognized, including relatives and potential collaborators, as well as financial institutions [money laundering], restaurants [meetings], a sporting goods store [purchase of supplies], and airlines and hotels [travel itinerary]. The outcome of this research showed the benefits of cellphone analysis without more intrusive and time-consuming methodologies, though it may be instrumental for potential surveillance, interviews, and developing probable cause for wiretaps. Furthermore, this research highlights the importance of building the skills of future intelligence analysts through phone record analysis via simulations; hands-on learning in this case study emphasizes the development of the competencies necessary to improve investigations overall.
Keywords: hands-on learning, intelligence analysis, intelligence education, phone record analysis, simulations
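The temporal and frequency analysis described above can be sketched in a few lines. The records, numbers, and field layout below are entirely hypothetical (real call detail records carry many more columns); the point is only that counting contacts and binning calls by hour already surfaces the patterns the study relies on.

```python
from collections import Counter
from datetime import datetime

# hypothetical call detail records: (other party's number, timestamp, direction)
cdr = [
    ("555-0101", "2023-03-01 08:15", "out"),
    ("555-0101", "2023-03-01 21:40", "in"),
    ("555-0199", "2023-03-02 02:05", "out"),
    ("555-0101", "2023-03-03 08:20", "out"),
    ("555-0142", "2023-03-03 12:00", "in"),
    ("555-0199", "2023-03-04 02:10", "out"),
]

# frequency analysis: which numbers dominate the target's traffic
freq = Counter(num for num, _, _ in cdr)

# temporal analysis: at which hours activity clusters (late-night calls stand out)
by_hour = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
                  for _, ts, _ in cdr)

top_contact, top_count = freq.most_common(1)[0]
```

Identifying who actually owns `top_contact` is then the open-source step whose accuracy the paper measures.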
Procedia PDF Downloads 17
3735 A Firefly Based Optimization Technique for Optimal Planning of Voltage Controlled Distributed Generators
Authors: M. M. Othman, Walid El-Khattam, Y. G. Hegazy, A. Y. Abdelaziz
Abstract:
This paper presents a method for finding the optimal location and capacity of dispatchable DGs connected to distribution feeders, planning for a specified power loss without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results, validated by comparison with those obtained from other competing methods, show the effectiveness, accuracy, and speed of the proposed method.
Keywords: distributed generators, firefly technique, optimization, power loss
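The abstract does not reproduce the firefly formulation itself, so the following is only a generic sketch of the technique (hypothetical parameter values, applied to a toy continuous objective rather than to DG siting and sizing): each firefly is attracted to brighter (better) ones with strength decaying in distance, plus a shrinking random step.

```python
import math
import random

def firefly(objective, dim=2, n=20, iters=120, lo=-5.0, hi=5.0,
            beta0=1.0, gamma=0.01, alpha=0.2, seed=7):
    """Minimal firefly algorithm minimizing `objective` over [lo, hi]^dim."""
    rnd = random.Random(seed)
    X = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    F = [objective(x) for x in X]
    for t in range(iters):
        a = alpha * (0.97 ** t)                  # cool the random step
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:                  # j is brighter: move i toward j
                    r2 = sum((X[i][d] - X[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        X[i][d] += beta * (X[j][d] - X[i][d]) + a * (rnd.random() - 0.5)
                    X[i] = [min(hi, max(lo, v)) for v in X[i]]
                    F[i] = objective(X[i])
    b = min(range(n), key=lambda i: F[i])
    return X[b], F[b]

# toy objective: sphere function, minimum 0 at the origin
best, val = firefly(lambda x: sum(v * v for v in x))
```

In the DG-planning setting, the objective would instead evaluate a power-flow solution and penalize constraint violations; the search loop is unchanged.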
Procedia PDF Downloads 535
3734 Wind Speed Prediction Using Passive Aggregation Artificial Intelligence Model
Authors: Tarek Aboueldahab, Amin Mohamed Nassar
Abstract:
Wind energy is a fluctuating energy source, unlike conventional power plants; it is therefore necessary to accurately predict short-term wind speed in order to integrate wind energy into the electricity supply structure. To do so, we present a hybrid artificial intelligence model for short-term wind speed prediction based on passive aggregation of particle swarm optimization and neural networks. As a result, a clear improvement in prediction accuracy is obtained compared to the standard artificial intelligence method.
Keywords: artificial intelligence, neural networks, particle swarm optimization, passive aggregation, wind speed prediction
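The passive-aggregation scheme itself is not detailed in the abstract. As a generic sketch of the particle swarm component only, the snippet below tunes a toy one-step linear predictor on a synthetic series (all names, data, and parameter values are hypothetical); in the paper's hybrid model the same search would instead tune neural network weights.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Minimal global-best particle swarm optimization."""
    rnd = random.Random(seed)
    pos = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# synthetic "wind speed" series generated by next = 0.8*current + 2.0
series = [3.0]
for _ in range(50):
    series.append(0.8 * series[-1] + 2.0)
pairs = list(zip(series[:-1], series[1:]))

def mse(params):
    a, b = params                                   # one-step predictor y = a*x + b
    return sum((a * x + b - y) ** 2 for x, y in pairs) / len(pairs)

best, err = pso(mse, dim=2)
```

The swarm recovers the generating coefficients closely, driving the mean squared prediction error toward zero.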
Procedia PDF Downloads 453
3733 Evaluation of Mango Seed Extract as Surfactant for Enhanced Oil Recovery
Authors: Ezzaddin Rashid Hussein
Abstract:
This research investigates the viability of mango seed extract (MSE) as a surfactant for enhanced oil recovery (EOR). It examines MSE-based surfactant solutions and compares them to more traditional synthetic surfactants in terms of phase behaviour and interfacial tension. The phase behaviour and interfacial tension of five samples of surfactant solutions with different concentrations were measured. Samples 1 (2.0 g) and 1 (1.5 g) performed closest to the critical micelle concentration (CMC) and displayed the greatest decrease in surface tension, according to the results. In addition, the measurement of IFT, contact angle, and pH, as well as comparison with prior research, highlights the potential environmental benefits of MSE as an eco-friendly alternative. It is recommended that additional research be conducted to assess its stability and behaviour under reservoir conditions. Overall, mango seed extract demonstrates promise as a natural and sustainable surfactant for enhancing oil recovery, paving the way for eco-friendly enhanced oil recovery techniques.
Keywords: oil and gas, mango seed powder, surfactants, enhanced oil recovery, interfacial tension (IFT), wettability, contact angle, phase behavior, pH
Procedia PDF Downloads 83
3732 Open Jet Testing for Buoyant and Hybrid Buoyant Aerial Vehicles
Authors: A. U. Haque, W. Asrar, A. A. Omar, E. Sulaeman, J. S Mohamed Ali
Abstract:
Open jet testing is a valuable testing technique which provides the desired results with reasonable accuracy. It has been used in the past for airships and has recently been applied to hybrid ones, which derive more non-buoyant force from the wings, empennage, and fuselage. In the present review work, an effort has been made to review the challenges involved in open jet testing. To shed light on the application of this technique, the experimental results of two different configurations are presented. Although the aerodynamic results of such vehicles are unique to each design, they provide a starting point for planning any future testing. A few important testing areas which need more attention are also highlighted. Most hybrid buoyant aerial vehicles are unconventional in shape, and the experimental data generated are unique to each design.
Keywords: open jet testing, aerodynamics, hybrid buoyant aerial vehicles, airships
Procedia PDF Downloads 573
3731 Thermal, Chemical, and Mineralogical Properties of Soil Building Blocks Reinforced with Cement
Authors: Abdelmalek Ammari
Abstract:
This paper presents an experimental study of the relationship between the thermal conductivity of cement-stabilized Compressed Earth Blocks (CEBs) and the mineralogical and chemical composition of the soil. All CEB samples were tested in the dry state with different cement contents; the samples were made from soil stabilized with Portland cement. The soil used was collected from the city of Fez, Morocco. Determination of the thermal conductivity of CEBs plays an important role when considering their suitability for energy-saving insulation. The measurement technique used to determine thermal conductivity is called the hot ring method; the thermal conductivity of the tested samples is strongly affected by the quantity of cement added. The soil of Fez, mainly composed of calcite, quartz, and dolomite, improved the behaviour of the material with the addition of cement. The findings suggest that, to manufacture lightweight samples with high thermal insulation properties, it is advisable to use clays that contain quartz. In addition, quartz has high thermal conductivity.
Keywords: compressed earth blocks, thermal conductivity, mineralogical, chemical, temperature
Procedia PDF Downloads 156
3730 Empirical Correlation for Measurement of Thermal Diffusivity of Spherical Shaped Food Products under Forced Convection Environment
Authors: M. Riaz, Inamur Rehman, Abhishek Sharma
Abstract:
The present work develops an experimental method for determining the variation of thermal diffusivity with temperature for selected regular-shaped solid fruits and vegetables subjected to forced convection cooling. Experimental investigations were carried out on the chosen samples (potato and brinjal), which are approximately spherical in geometry. The variation of temperature within the food product is measured at several locations from centre to skin, under a forced convection environment using a deep freezer maintained at -10°C. This method uses the one-dimensional Fourier equation applied to regular shapes. For this, the experimental temperature data obtained from cylindrical and spherical shaped products during pre-cooling were utilised. Such temperature and thermal diffusivity profiles can be readily used with other information, such as degradation rate, to evaluate thermal treatments based on cold-air cooling methods for storage of perishable food products.
Keywords: thermal diffusivity, skin temperature, precooling, forced convection, regular shaped
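The one-dimensional Fourier conduction model for a sphere can be sketched numerically. This is a generic explicit finite-difference scheme, not the paper's procedure, and the property values below (diffusivity, radius, temperatures) are illustrative assumptions only; in practice the diffusivity would be the unknown fitted to the measured centre-to-skin temperature histories.

```python
def cool_sphere(alpha, radius, t_init, t_amb, t_end, n=30):
    """Explicit FD solution of dT/dt = alpha*(T'' + (2/r)*T') in a sphere
    with the skin held at ambient; returns the radial temperature profile."""
    dr = radius / n
    dt = 0.1 * dr * dr / alpha              # satisfies the stability bound dr^2/(6*alpha)
    T = [t_init] * (n + 1)
    T[n] = t_amb                            # skin temperature fixed at ambient
    for _ in range(int(t_end / dt)):
        new = T[:]
        # symmetry at the centre (r -> 0): dT/dt = 6*alpha*(T1 - T0)/dr^2
        new[0] = T[0] + 6 * alpha * dt * (T[1] - T[0]) / dr ** 2
        for i in range(1, n):
            r = i * dr
            d2 = (T[i + 1] - 2 * T[i] + T[i - 1]) / dr ** 2
            d1 = (T[i + 1] - T[i - 1]) / (2 * dr)
            new[i] = T[i] + alpha * dt * (d2 + (2.0 / r) * d1)
        T = new
        T[n] = t_amb
    return T

# potato-like sample (assumed values): alpha ~ 1.4e-7 m^2/s, 3 cm radius,
# initially 20 degC, cooled for one hour in -10 degC air
profile = cool_sphere(1.4e-7, 0.03, 20.0, -10.0, t_end=3600.0)
```

Matching such simulated centre and skin histories against the thermocouple data is one way to back out the diffusivity at each temperature level.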
Procedia PDF Downloads 461
3729 Myanmar Consonants Recognition System Based on Lip Movements Using Active Contour Model
Authors: T. Thein, S. Kalyar Myo
Abstract:
Humans use visual information for understanding speech content in noisy conditions or in situations where the audio signal is not available. The primary advantage of visual information is that it is not affected by acoustic noise and cross-talk among speakers. Using visual information from lip movements can improve the accuracy and robustness of automatic speech recognition. However, a major challenge for most automatic lip reading systems is to find a robust and efficient method for extracting the linguistically relevant speech information from a lip image sequence. This is a difficult task due to variation caused by different speakers, illumination, camera settings, and the inherently low luminance and chrominance contrast between lip and non-lip regions. Several researchers have been developing methods to overcome these problems; one of them is lip reading. Moreover, it is well known that visual information about speech obtained through lip reading is very useful for human speech recognition. Therefore, a lip reading system is one of several supportive technologies for hearing-impaired or elderly people, and it is an active research area. The need for lip reading systems is ever increasing for every language. This research aims to develop a visual teaching system for hearing-impaired persons in Myanmar, showing how to pronounce words precisely by identifying the features of lip movement. The proposed research develops a lip reading system for Myanmar consonants: one-syllable consonants (င (Nga)၊ ည (Nya)၊ မ (Ma)၊ လ (La)၊ ၀ (Wa)၊ သ (Tha)၊ ဟ (Ha)၊ အ (Ah) ) and two-syllable consonants ( က(Ka Gyi)၊ ခ (Kha Gway)၊ ဂ (Ga Nge)၊ ဃ (Ga Gyi)၊ စ (Sa Lone)၊ ဆ (Sa Lain)၊ ဇ (Za Gwe) ၊ ဒ (Da Dway)၊ ဏ (Na Gyi)၊ န (Na Nge)၊ ပ (Pa Saug)၊ ဘ (Ba Gone)၊ ရ (Ya Gaug)၊ ဠ (La Gyi) ).
In the proposed system, there are three subsystems: the first is the lip localization system, which localizes the lips in the digital inputs; the next is the feature extraction system, which extracts features of lip movement suitable for visual speech recognition; and the final one is the classification system. In the proposed research, the Two-Dimensional Discrete Cosine Transform (2D-DCT) and Linear Discriminant Analysis (LDA) with an Active Contour Model (ACM) will be used for lip movement feature extraction. A Support Vector Machine (SVM) classifier is used to find the class parameters and class numbers in the training and testing sets. Then, experiments will be carried out on the recognition accuracy of Myanmar consonants using only the visual information on lip movements, which is useful for visual speech in Myanmar languages. The results will show the effectiveness of lip movement recognition for Myanmar consonants. This system will help hearing-impaired persons as a language learning application. It can also be useful for normal-hearing persons in noisy environments or in conditions where they need to find out what was said by other people without hearing their voices.
Keywords: feature extraction, lip reading, lip localization, Active Contour Model (ACM), Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), Two Dimensional Discrete Cosine Transform (2D-DCT)
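The 2D-DCT feature-extraction step can be illustrated directly. The sketch below is a naive orthonormal DCT-II on a tiny hypothetical lip region of interest (the image values and block size are made up); in a real pipeline the low-frequency coefficients in the top-left corner of the transform, which capture coarse lip shape, would form the feature vector passed to LDA and then to the SVM.

```python
import math

def dct2(block):
    """Naive orthonormal 2-D DCT-II of a square block (list of lists)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos(math.pi * (2 * x + 1) * u / (2 * n))
                          * math.cos(math.pi * (2 * y + 1) * v / (2 * n)))
            cu = math.sqrt(1.0 / n) if u == 0 else math.sqrt(2.0 / n)
            cv = math.sqrt(1.0 / n) if v == 0 else math.sqrt(2.0 / n)
            out[u][v] = cu * cv * s
    return out

# hypothetical 4x4 lip region of interest (bright mouth area on darker skin)
lip_roi = [[10, 10, 10, 10],
           [10, 50, 50, 10],
           [10, 50, 50, 10],
           [10, 10, 10, 10]]
coeffs = dct2(lip_roi)

# keep only the low-frequency corner as the lip-shape feature vector
features = [coeffs[u][v] for u in range(2) for v in range(2)]
```

Energy compaction is the reason this works: for smooth image patches almost all signal energy lands in those few low-frequency coefficients.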
Procedia PDF Downloads 286
3728 A Variable Structural Control for a Flexible Lamina
Authors: Xuezhang Hou
Abstract:
A control problem for a flexible lamina, formulated by partial differential equations with viscoelastic boundary conditions, is studied in this paper. The problem is written in the standard form of a linear infinite-dimensional system in an appropriate energy Hilbert space. The semigroup approach of linear operators is adopted in investigating the well-posedness of the closed-loop system. A variable structural control for the system is proposed, and an equivalent control method is applied to the thin plate system. A significant result in control theory is obtained: in terms of the semigroup approach, the thin plate can be approximated by an ideal sliding mode to any accuracy.
Keywords: partial differential equations, flexible lamina, variable structural control, semigroup of linear operators
Procedia PDF Downloads 88