Search results for: analytical description
2793 Experimental Modal Analysis of Kursuncular Minaret
Authors: Yunus Dere
Abstract:
Minarets are tower-like structures from which the call to prayer of Muslims is performed. They have a symbolic meaning and a sacred place among Muslims. Being tall and slender, they are prone to damage under earthquakes and strong winds. The Kursuncular stone minaret was built around thirty years ago in Konya, Turkey. Its core and helical stairs are made of reinforced concrete. Its stone spire was damaged during a light earthquake and was later replaced with a lightweight material covered with lead sheets. In this study, the natural frequencies and mode shapes of the Kursuncular minaret are obtained experimentally and analytically. First, an ambient vibration test is carried out using a data acquisition system with accelerometers located at four locations along the height of the minaret. The collected vibration data are evaluated by operational modal analysis techniques. For the analytical part of the study, the dimensions of the minaret are accurately measured, and a detailed 3D solid finite element model of the minaret is generated. The moduli of elasticity of the stone and concrete are approximated using the compressive strengths obtained by Windsor Pin tests. Finite element modal analysis of the minaret is carried out to obtain the modal parameters. Experimental and analytical results are then compared and found to be in good agreement.
Keywords: experimental modal analysis, stone minaret, finite element modal analysis, minarets
Procedia PDF Downloads 326
2792 Tagging a Corpus of Media Interviews with Diplomats: Challenges and Solutions
Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri
Abstract:
Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human- and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus will be taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus will be illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention will be dedicated to the structural mark-up, part-of-speech annotation, and tagging of discursive traits, which are the innovative parts of the project, being the result of a thorough study to find the best solution to suit the analytical needs of the data. Several aspects will be addressed, with special attention to the tagging of the speakers’ identity, the communicative events, and anthropophagic. Prominence will be given to the annotation of question/answer exchanges to investigate the interlocutors’ choices and how such choices impact communication. Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements will be given using the InterDiplo-Covid19 pilot corpus. The results of our preliminary analysis will highlight the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.
Keywords: spoken corpus, diplomats’ interviews, tagging system, discursive-pragmatic annotation, English linguistics
Procedia PDF Downloads 185
2791 Characterization of the Physicochemical Properties of Raw and Calcined Kaolinitic Clays Using Analytical Techniques
Authors: Alireza Khaloo, Asghar Gholizadeh-Vayghan
Abstract:
The present work focuses on the characterization of the physicochemical properties of kaolinitic clays in both raw and calcined (i.e., dehydroxylated) states. The properties investigated included the dehydroxylation temperature, chemical composition and crystalline phases, band types, kaolinite content, vitreous phase, and reactive and unreactive silica and alumina. The thermogravimetric analysis, X-ray diffractometry, and infrared spectroscopy results suggest that full dehydroxylation takes place at 639 °C, converting kaolinite to reactive metakaolinite (Si₂Al₂O₇). Application of higher temperatures up to 800 °C leads to complete decarbonation of the calcite phase, and the kaolinite converts to mullite at temperatures exceeding 957 °C. Calcination at 639 °C was found to cause a 50% increase in the vitreous content of kaolin. Statistically meaningful increases in the reactivity of silica, alumina, calcite, and sodium carbonate in kaolin were detected as a result of such thermal treatment. Such increases were found to be 11%, 47%, 240%, and 10%, respectively. The ferrite phase, however, showed a 36% decline in reactivity. The proposed approach can be used as an analytical method to determine the viability of the source of kaolinite and the proper physical and chemical modifications needed to enhance its suitability for geopolymer production.
Keywords: physicochemical properties, dehydroxylation, kaolinitic clays, kaolinite content, vitreous phase, reactivity
Procedia PDF Downloads 162
2790 Solid Waste Disposal Site Selection in Thiruvananthapuram Corporation Area by Data Analysis Using GIS and Remote Sensing Tools
Authors: C. Asha Poorna, P. G. Vinod, A. R. R. Menon
Abstract:
Currently, increasing population and activities such as urbanization and industrialization are generating one of the greatest environmental issues: waste. A major problem in waste management is the selection of an appropriate site for waste disposal. The selection of a suitable site is subject to constraints such as environmental, economic, and political considerations. In this paper, we discuss the strategies to be followed while selecting a site for a decentralized solid waste disposal system in the Thiruvananthapuram Corporation area, using a Geographic Information System (GIS), the Analytical Hierarchy Process (AHP), and remote sensing methods. The area is located on the west coast of India near the extreme south of the mainland and lies on the shores of the Killiyar and Karamana rivers. Being situated on this basin, waste management must be regulated with respect to the water bodies. The different criteria considered for waste disposal site selection (lithology, surface water, aquifers, groundwater, land use, contours, aspect, elevation, slope, distance to roads, and distance from settlements) are examined in relation to landfill site selection. Each criterion was identified and weighted by its AHP score and mapped using GIS techniques, and a suitability map was prepared by overlay analysis.
Keywords: waste disposal, solid waste management, Geographic Information System (GIS), Analytical Hierarchy Process (AHP)
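As a rough illustration of the AHP weighting step described above, the short sketch below derives criterion weights from a pairwise comparison matrix and checks Saaty's consistency ratio; the four criteria and all judgement values are illustrative assumptions, not those used in the study.

```python
import numpy as np

# Hypothetical criteria and pairwise comparison judgements (Saaty's 1-9 scale)
criteria = ["slope", "land use", "distance to road", "distance from settlement"]
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

# Priority weights = normalised principal eigenvector of the comparison matrix
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.90 for a 4x4 matrix)
lam_max = eigvals.real[k]
CI = (lam_max - len(A)) / (len(A) - 1)
CR = CI / 0.90
for name, w in zip(criteria, weights):
    print(f"{name:>26s}: {w:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.1)")
```

In a GIS workflow, the resulting weights would then multiply the reclassified criterion rasters during the overlay analysis to produce the suitability map.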
Procedia PDF Downloads 396
2789 The Use of Religious Symbols in the Workplace: Remarks on the Latest Case Law
Authors: Susana Sousa Machado
Abstract:
The debate on the use of religious symbols has been highlighted in modern societies, especially in the field of labour relationships. As litigiousness appears to be growing, the matter requires careful study from a legal perspective. In this context, a description and critical analysis of the most recent case law on the use of symbols by employees in the workplace is conducted, covering decisions delivered both by the European Court of Human Rights and by the Court of Justice of the European Union. From this comparative analysis, we highlight the most relevant aspects in order to seek a common core regarding the juridical-argumentative approach of the case law.
Keywords: religion, religious symbols, workplace, discrimination
Procedia PDF Downloads 420
2788 Materials and Techniques of Anonymous Egyptian Polychrome Cartonnage Mummy Mask: A Multiple Analytical Study
Authors: Hanaa A. Al-Gaoudi, Hassan Ebeid
Abstract:
The research investigates the materials and processes used in the manufacturing of an Egyptian polychrome cartonnage mummy mask, with the aim of dating this object and establishing trade patterns of certain materials that were used and available at the time of ancient Egypt. This anonymous-source object was held in the basement storage of the Egyptian Museum in Cairo (EMC) and has never been on display. Furthermore, there is no information available regarding its owner, provenance, date, or even the time of its acquisition by the museum. Moreover, the object is in very poor condition: almost two-thirds of the mask is bent, and it has never received any previous conservation treatment. This research utilized well-established multi-analytical methods to identify the considerable diversity of materials used in the manufacturing of this object. These methods include computed tomography (CT) scanning to acquire detailed pictures of the internal physical structure and the condition of the bent layers; a Dino-Lite portable digital microscope, scanning electron microscopy with an energy-dispersive X-ray spectrometer (SEM-EDX), and the non-invasive imaging technique of multispectral imaging (MSI) to obtain information about the physical characteristics and condition of the painted layers and to examine the microstructure of the materials; a portable XRF spectrometer (PXRF) and X-ray powder diffraction (XRD) to identify mineral phases and the bulk element composition in the gilded layer, ground, and pigments; Fourier-transform infrared spectroscopy (FTIR) to identify organic compounds and their molecular characterization; and accelerator mass spectrometry (AMS 14C) to date the object. Preliminary results suggest that there are no human remains inside the object, and that the textile support is linen fibres with a 1/1 tabby weave, these fibres being in very poor condition. Several pigments have been identified, such as Egyptian blue, magnetite, Egyptian green frit, hematite, calcite, and cinnabar; moreover, the gilded layers are pure gold, and the binding medium is gum arabic in the pigments and animal glue in the textile support layer.
Keywords: analytical methods, Egyptian Museum, mummy mask, pigments, textile
Procedia PDF Downloads 124
2787 An Implementation of a Dual-Spin Spacecraft Attitude Reorientation Using Properties of Its Chaotic Motion
Authors: Anton V. Doroshin
Abstract:
This article contains a description of the main ideas for the attitude reorientation of spacecraft (small dual-spin spacecraft, nanosatellites) using properties of their chaotic attitude motion under the action of internal perturbations. The considered method is based on the intentional initiation of chaotic modes of attitude motion with large amplitudes of the nutation oscillations, and also on the redistribution of the angular momentum between the coaxial bodies of the dual-spin spacecraft (DSSC), which is performed for the purpose of changing the system’s phase space.
Keywords: spacecraft, attitude dynamics, control, chaos
Procedia PDF Downloads 397
2786 150 kVA Multifunction Laboratory Test Unit Based on Power-Frequency Converter
Authors: Bartosz Kedra, Robert Malkowski
Abstract:
This paper provides a description and presentation of a laboratory test unit built around a 150 kVA power-frequency converter and the Simulink Real-Time platform. Assumptions, based on criteria defining which load and generator types may be simulated using the discussed device, are presented, as well as the control algorithm structure. As the laboratory setup contains a transformer with a thyristor-controlled tap changer, a wider scope of setup capabilities is presented. Information about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features used is presented. A list and description of all measurements are provided. The potential for laboratory setup modifications is evaluated. For the purposes of Rapid Control Prototyping, a dedicated environment, Simulink Real-Time, was used. Therefore, the load model Functional Unit Controller is based on a PC with I/O cards and Simulink Real-Time software. Simulink Real-Time was used to create real-time applications directly from Simulink models. In the next step, applications were loaded on a target computer connected to physical devices, which provided the opportunity to perform Hardware-in-the-Loop (HIL) tests as well as the mentioned Rapid Control Prototyping process. With Simulink Real-Time, Simulink models were extended with I/O card driver blocks that made possible the automatic generation of real-time applications and the running of interactive or automated tests on a dedicated target computer equipped with a real-time kernel, multicore CPU, and I/O cards. Results of the performed laboratory tests are presented. Different load configurations are described, and experimental results are presented. These include the simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, time characteristics of a group of different load units in a chosen area, and arbitrary active and reactive power regulation based on a defined schedule.
Keywords: MATLAB, power converter, Simulink Real-Time, thyristor-controlled tap changer
Procedia PDF Downloads 323
2785 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data
Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.
Abstract:
The research investigated the arrangement of some geological features underlying Ilesa employing aeromagnetic data. The obtained data were subjected to various filtering and processing techniques, namely Total Horizontal Derivative (THD), Depth Continuation, and Analytical Signal Amplitude, using Geosoft Oasis Montaj 6.4.2 software. The Reduced-to-the-Equator Total Magnetic Intensity (TRE-TMI) outcomes reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwest half of the area. Intermediate magnetic susceptibility, ranging between 6.0 and 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit varying lengths ranging from 2.5 to 16.0 km. Analysis of the rose diagram and the analytical signal amplitude indicates structural styles mainly of E-W and NE-SW orientations, particularly evident in the western, SW, and NE regions, with an amplitude of 0.0318 nT/m. The identified faults in the area demonstrate orientations of NNW-SSE, NNE-SSW, and WNW-ESE, situated at depths ranging from 500 to 750 m. Considering the divergence in magnetic susceptibility, the structural style or orientation of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
Keywords: lineament, aeromagnetic, anomaly, fault, magnetic
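To make the enhancement filters mentioned above concrete, the following sketch computes a total horizontal derivative and an analytic signal amplitude on a synthetic total-magnetic-intensity grid; the grid size, spacing, and anomaly are invented for illustration and merely stand in for the gridded aeromagnetic data processed in Oasis Montaj.

```python
import numpy as np

# Synthetic TMI anomaly grid (nT) on a regular mesh -- assumed stand-in data
nx, ny, dx, dy = 128, 128, 100.0, 100.0           # grid dimensions and spacing (m)
x = np.arange(nx) * dx
y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y, indexing="ij")
tmi = 80.0 * np.exp(-(((X - 6000.0) ** 2 + (Y - 7000.0) ** 2) / 2.0e6))

# Horizontal derivatives by finite differences
dT_dx, dT_dy = np.gradient(tmi, dx, dy)

# Vertical derivative via the wavenumber domain (multiplication by |k|)
kx = np.fft.fftfreq(nx, d=dx) * 2.0 * np.pi
ky = np.fft.fftfreq(ny, d=dy) * 2.0 * np.pi
KX, KY = np.meshgrid(kx, ky, indexing="ij")
dT_dz = np.real(np.fft.ifft2(np.fft.fft2(tmi) * np.sqrt(KX ** 2 + KY ** 2)))

# Total horizontal derivative and analytic signal amplitude (nT/m)
thd = np.hypot(dT_dx, dT_dy)
asa = np.sqrt(dT_dx ** 2 + dT_dy ** 2 + dT_dz ** 2)
print(f"max THD: {thd.max():.4f} nT/m, max ASA: {asa.max():.4f} nT/m")
```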
Procedia PDF Downloads 75
2784 Universality as Opportunity Domain behind the Threats and Challenges of Natural Disasters
Authors: Kunto Wibowo Agung Prodjonoto
Abstract:
Occasionally, opportunities occur not due to chance but due to threats. This, however, is often not realized, because a threat is perceived as anything that threatens, endangers, or harms, resulting in bad impacts that are also part of the risk and its consequences. As a result, more focus tends to be directed towards the bad impacts. Risk, in this case, should rather be seen as something challenging, which can turn out to be an opportunity to tackle an obstacle. Therefore, it does not seem an exaggeration if risk is later considered as a challenge that presents an opportunity. The same applies in the context of the threat of natural disasters, which gives the idea that opportunities exist. Referring to nature in terms of 'natural disasters' captures the 'threat' aspect, which instructively implies a chance of opportunity. This is quite logical, as SWOT (strengths, weaknesses, opportunities, threats) analysis can evaluate the situation at hand by analyzing various factors when formulating strategies to deal with natural disaster situations. The analytical method created by Albert Humphrey is indeed not an analytical tool that provides solutions, but 'opportunities and challenges' are certainly discussed therein on a vertical line, where opportunities are posited on the positive axis and threats are posed on the negative axis. Observing this dynamic, the challenges and threats of disasters have a relevance to opportunity that, in terms of quality, exhibits universalist populist characteristics, universalist characteristics, and regional characteristics. Here, universalism appears as an opportunity domain underneath the threats and challenges of natural disasters.
Keywords: universality, opportunities, threats, challenges of natural disasters
Procedia PDF Downloads 151
2783 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance
Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan
Abstract:
Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders’ performance requires a pre-assessment of the healthcare organisation’s capabilities. The model will be developed using a rule-based approach to KB systems. Thus, the KB system embeds Gauging Absence of Pre-requisites (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review will be covered for the main contents of the model, with a typical output of GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation by the KB/GAP/AHP system of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB approach combining GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance. This approach assists healthcare leaders’ decision making to reach performance improvement against a best-practice benchmark.
Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)
Procedia PDF Downloads 166
2782 A Mathematical Analysis of Behavioural Epidemiology: Drugs Users Transmission Dynamics Based on Level Education for Susceptible Population
Authors: Firman Riyudha, Endrik Mifta Shaiful
Abstract:
The spread of drug users is one kind of behavioral epidemic that has become a threat to every country in the world. This problem causes various crises simultaneously, including financial or economic, social, health, and even human crises. Most drug users are teenagers of school age. A new deterministic model is constructed to determine the dynamics of the spread of drug users by considering the level of education in the susceptible population. Based on the analytical model, two equilibrium points were obtained: E₀ (zero users) and E₁ (endemic equilibrium). The existence of the equilibria and their local stability depend on the Basic Reproduction Ratio (R₀). This parameter is defined as the expected number of secondary cases generated by a primary case in a virgin population. The zero-user equilibrium is locally asymptotically stable if R₀ < 1, while if R₀ > 1 the endemic equilibrium is locally asymptotically stable. The results show that R₀ is proportional to the rate of interaction of each susceptible population, stratified by educational level, with the user population. It is concluded that controls on these interactions are needed so that the drug-user population can be minimized. Numerical simulations are also provided to support the analytical results.
Keywords: drug users, level of education, mathematical model, stability
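As a hedged illustration of this kind of compartmental model, the sketch below integrates a simple analogue with two education-stratified susceptible classes and one user class and evaluates a next-generation-style R₀; the equations, parameter values, and initial conditions are assumptions for demonstration and are not the system analysed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters: interaction rates for two education-level groups,
# recruitment, natural exit rate, and quitting rate (all illustrative).
beta1, beta2 = 0.30, 0.15
Lambda, mu, gamma = 0.02, 0.02, 0.10

def rhs(t, y):
    S1, S2, D = y                      # susceptibles by education level, drug users
    new1 = beta1 * S1 * D
    new2 = beta2 * S2 * D
    dS1 = 0.5 * Lambda - new1 - mu * S1
    dS2 = 0.5 * Lambda - new2 - mu * S2
    dD = new1 + new2 - (mu + gamma) * D
    return [dS1, dS2, dD]

# Basic reproduction ratio for this sketch (sum over susceptible groups)
S1_0, S2_0 = 0.5, 0.5
R0 = (beta1 * S1_0 + beta2 * S2_0) / (mu + gamma)
print(f"R0 = {R0:.2f}")                # > 1 here, so the user-free state is unstable

sol = solve_ivp(rhs, (0.0, 200.0), [S1_0, S2_0, 0.01])
print("drug-user fraction at t=200:", round(sol.y[2, -1], 4))
```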
Procedia PDF Downloads 475
2781 Deep Learning Based Text to Image Synthesis for Accurate Facial Composites in Criminal Investigations
Authors: Zhao Gao, Eran Edirisinghe
Abstract:
The production of an accurate sketch of a suspect based on a verbal description obtained from a witness is an essential task in most criminal investigations. The criminal investigation system employs specifically trained professional artists to manually draw a facial image of the suspect according to the descriptions of an eyewitness for subsequent identification. With the advancement of Deep Learning, Recurrent Neural Networks (RNN) have shown great promise in Natural Language Processing (NLP) tasks. Additionally, Generative Adversarial Networks (GAN) have also proven to be very effective in image generation. In this study, a trained GAN conditioned on textual features, such as keywords automatically encoded from a verbal description of a human face using an RNN, is used to generate photo-realistic facial images for criminal investigations. The intention of the proposed system is to map corresponding features onto text generated from verbal descriptions. With this, it becomes possible to generate many reasonably accurate alternatives which the witness can use to hopefully identify a suspect. This reduces subjectivity in decision making both by the eyewitness and the artist, while giving the witness an opportunity to evaluate and reconsider decisions. Furthermore, the proposed approach benefits law enforcement agencies by reducing the time taken to physically draw each potential sketch, thus improving response times and mitigating potentially malicious human intervention. With the publicly available 'CelebFaces Attributes Dataset' (CelebA), additionally provided with verbal descriptions as training data, the proposed architecture is able to effectively produce facial structures from given text. Word embeddings are learnt by applying the RNN architecture in order to perform semantic parsing, the output of which is fed into the GAN for synthesizing photo-realistic images. Rather than the grid search method, a metaheuristic search based on genetic algorithms is applied to evolve the network, with the intent of achieving optimal hyperparameters in a fraction of the time of a typical brute-force approach. In addition to the 'CelebA' training database, further novel test cases are supplied to the network for evaluation. Witness reports detailing criminals from Interpol or other law enforcement agencies are sampled on the network. Using the descriptions provided, samples are generated and compared with the ground-truth images of a criminal in order to calculate the similarities. Two metrics are used for performance evaluation: the Structural Similarity Index (SSIM) and the Peak Signal-to-Noise Ratio (PSNR). A high score on these performance metrics should demonstrate the accuracy of the approach, in the hope of proving that it can be an effective tool for law enforcement agencies. The proposed approach to criminal facial image generation has the potential to increase the ratio of criminal cases that can ultimately be resolved using eyewitness information gathering.
Keywords: RNN, GAN, NLP, facial composition, criminal investigation
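The two evaluation metrics mentioned above can be computed as in the sketch below; the arrays are random stand-ins for a generated composite and a ground-truth photograph, so only the metric calls themselves reflect the described evaluation.

```python
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

# Stand-in grayscale images in place of a real photograph and a GAN output
rng = np.random.default_rng(0)
ground_truth = rng.random((128, 128))
generated = np.clip(ground_truth + 0.05 * rng.standard_normal((128, 128)), 0.0, 1.0)

ssim = structural_similarity(ground_truth, generated, data_range=1.0)
psnr = peak_signal_noise_ratio(ground_truth, generated, data_range=1.0)
print(f"SSIM = {ssim:.3f}, PSNR = {psnr:.1f} dB")
```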
Procedia PDF Downloads 159
2780 Electro Spinning in Nanotechnology
Authors: Mahoud Alfama, Meloud Yones, Abdelbaset Zroga, Abdelati Elalem
Abstract:
Electrospinning has been recognized as an efficient technique for the fabrication of polymer nanofibers. Various polymers have been successfully electrospun into ultrafine fibers in recent years, mostly from solvent solution and some in melt form. Potential applications based on such fibers, specifically their use as reinforcement in nanocomposite development, have been realized. In this paper, we examine electrospinning by providing a brief description of the theory behind the process, examining the effect of changing the process parameters on fiber morphology, and discussing the potential applications and impacts of electrospinning on the field of tissue engineering.
Keywords: nanotechnology, electrospinning, reinforced materials
Procedia PDF Downloads 289
2779 A Stochastic Analytic Hierarchy Process Based Weighting Model for Sustainability Measurement in an Organization
Authors: Faramarz Khosravi, Gokhan Izbirak
Abstract:
A weighted, statistically grounded stochastic Analytical Hierarchy Process (AHP) model for modeling the potential barriers and enablers of sustainability and for measuring and assessing the sustainability level is proposed. For context-dependent potential barriers and enablers, the proposed model takes as its basis the properties of the variables describing the sustainability functions and was developed into a realistic analytical model for the sustainable behavior of an organization. It thus serves as a means for measuring the sustainability of the organization. The main focus of this paper was the application of the AHP tool in a statistically based model for measuring sustainability. Hence, a strongly weighted stochastic AHP-based procedure was achieved. A case study scenario of a widely reported major Canadian electric utility was adopted to demonstrate the applicability of the developed model, and its results were comparatively examined against those of an equal-weighted model. Variations in the sustainability of the company over time were identified as fluctuations. In the results obtained, the sustainability index for successive years changed from 73.12%, 79.02%, 74.31%, 76.65%, 80.49%, 79.81%, and 79.83% to the more exact values of 73.32%, 77.72%, 76.76%, 79.41%, 81.93%, 79.72%, and 80.45%, according to the priorities of factors found from expert views, respectively. By obtaining the relatively necessary informative measurement indicators, the model can practically and effectively evaluate the sustainability extent of any organization and also determine fluctuations in the organization over time.
Keywords: AHP, sustainability fluctuation, environmental indicators, performance measurement
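A minimal sketch of the weighted-index idea is given below: AHP-style priorities are perturbed stochastically and used to aggregate indicator scores into a yearly sustainability index with an uncertainty band. The indicator scores, base priorities, and Dirichlet concentration are hypothetical and do not reproduce the case-study figures.

```python
import numpy as np

# Hypothetical indicator scores (rows = years, columns = indicators, 0-1 scale)
scores = np.array([
    [0.70, 0.75, 0.72, 0.76],
    [0.78, 0.80, 0.74, 0.79],
    [0.75, 0.77, 0.76, 0.78],
])

# Expert-derived AHP priorities, perturbed stochastically; a Dirichlet draw keeps
# each weight vector non-negative and summing to one (concentration is assumed).
rng = np.random.default_rng(42)
base_priorities = np.array([0.40, 0.30, 0.20, 0.10])
weight_samples = rng.dirichlet(base_priorities * 200, size=5000)

# Distribution of the weighted sustainability index for each year
index_samples = scores @ weight_samples.T          # shape (years, samples)
for year, row in enumerate(index_samples, start=1):
    print(f"year {year}: mean {row.mean():.2%}, 95% interval "
          f"({np.percentile(row, 2.5):.2%}, {np.percentile(row, 97.5):.2%})")
```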
Procedia PDF Downloads 121
2778 Experimental and Analytical Studies for the Effect of Thickness and Axial Load on Load-Bearing Capacity of Fire-Damaged Concrete Walls
Authors: Yeo Kyeong Lee, Ji Yeon Kang, Eun Mi Ryu, Hee Sun Kim, Yeong Soo Shin
Abstract:
The objective of this paper is an investigation of the effects of the thickness and of the axial loading applied during a fire test on the load-bearing capacity of a fire-damaged normal-strength concrete wall. These two factors are related to the temperature distributions in the concrete members, which are mainly obtained through numerous experiments. Toward this goal, three wall specimens of different thicknesses were heated for 2 h according to the ISO standard heating curve, and the temperature distributions through their thicknesses were measured using thermocouples. In addition, two wall specimens were heated for 2 h while simultaneously being subjected to a constant axial load at their top sections. The test results show that the temperature distribution during the fire test depends on the wall thickness and on the axial load applied during the fire test. After the fire tests, the specimens were cured for one month, followed by load testing. The heated specimens were compared with three unheated specimens to investigate the residual load-bearing capacities. The fire-damaged walls show a minor difference in load-bearing capacity with regard to the axial loading, whereas a significant difference became evident with regard to the wall thickness. To validate the experimental results, finite element models were generated in which the material properties obtained from the experiments were subjected to elevated temperatures, and the analytical results show sound agreement with the experimental results. The analytical method, validated against the experimental results, was then applied to model fire-damaged walls 2,800 mm high, a typical story height of residential buildings in Korea, considering the buckling effect. The models for the structural analyses were generated from the deformed shape obtained after the thermal analysis. The load-bearing capacity of the fire-damaged walls with pin supports at both ends does not significantly depend on the wall thickness; the reason for this is the restraint provided by the pinned ends. The difference in the load-bearing capacity of the fire-damaged walls with respect to the axial load applied during the fire is within approximately 5%.
Keywords: normal-strength concrete wall, wall thickness, axial-load ratio, slenderness ratio, fire test, residual strength, finite element analysis
Procedia PDF Downloads 215
2777 Hybrid Rocket Motor Performance Parameters: Theoretical and Experimental Evaluation
Authors: A. El-S. Makled, M. K. Al-Tamimi
Abstract:
A mathematical model to predict the performance parameters (thrusts, chamber pressures, fuel mass flow rates, mixture ratios, and regression rates during the firing time) of a hybrid rocket motor (HRM) is evaluated. The internal ballistic (IB) hybrid combustion model assumes that the solid fuel surface regression rate is controlled only by heat transfer (convective and radiative) from the flame zone to the solid fuel burning surface. A laboratory HRM is designed, manufactured, and tested for low-thrust-profile space missions (10-15 N) and for validating the mathematical model (computer program). The polymer materials and gaseous oxidizer selected for this experimental work are polymethyl methacrylate (PMMA) and polyethylene (PE) as solid fuel grains and gaseous oxygen (GO2) as the oxidizer. The variation of the various operational parameters with time is determined systematically and experimentally in firings of up to 20 seconds, and an average combustion efficiency of 95% of theory is achieved, which was the goal of these experiments. The comparison between the recorded firing data and the predicted analytical parameters shows good agreement, with an error that does not exceed 4.5% over the entire firing time. The current mathematical (computer) code can be used as a powerful tool for the analytical design of HRM parameters.
Keywords: hybrid combustion, internal ballistics, hybrid rocket motor, performance parameters
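For orientation, the sketch below steps a port-burning calculation through a 20-second firing using the widely quoted empirical regression-rate law r = a·Gox^n rather than the paper's heat-transfer-based model; all coefficients, geometry, and flow values are illustrative assumptions chosen only to land roughly in the stated 10-15 N thrust class.

```python
import numpy as np

# Assumed values: empirical regression coefficients for PMMA, fuel density,
# characteristic velocity, thrust coefficient, grain geometry, and GO2 flow.
a, n = 2.0e-5, 0.62            # regression law r = a * Gox**n (SI units)
rho_f = 1180.0                 # PMMA density, kg/m^3
c_star, Cf = 1500.0, 1.3       # characteristic velocity (m/s), thrust coefficient
L_grain = 0.25                 # grain length, m
mdot_ox = 0.005                # oxidizer mass flow, kg/s
r_port = 0.008                 # initial port radius, m
dt, t_burn = 0.1, 20.0         # time step and firing duration, s

for _ in range(int(t_burn / dt)):
    A_port = np.pi * r_port ** 2
    Gox = mdot_ox / A_port                         # oxidizer mass flux, kg/m^2/s
    rdot = a * Gox ** n                            # regression rate, m/s
    mdot_f = rho_f * rdot * 2.0 * np.pi * r_port * L_grain
    r_port += rdot * dt                            # port opens up as fuel burns

OF = mdot_ox / mdot_f
thrust = (mdot_ox + mdot_f) * c_star * Cf          # F = m_dot * c* * C_F
print(f"end-of-burn regression {rdot * 1000:.3f} mm/s, O/F {OF:.1f}, thrust {thrust:.1f} N")
```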
Procedia PDF Downloads 311
2776 Aflatoxins Characterization in Remedial Plant-Delphinium denudatum by High-Performance Liquid Chromatography–Tandem Mass Spectrometry
Authors: Nadeem A. Siddique, Mohd Mujeeb, Kahkashan
Abstract:
Introduction: The objective of the presented work is to study the occurrence of the aflatoxins B1, B2, G1, and G2 in remedial plants, specifically in Delphinium denudatum. The aflatoxins were analysed by high-performance liquid chromatography with tandem quadrupole mass spectrometry and electrospray ionization (HPLC-MS/MS), and immunoaffinity column chromatography was used for the extraction and purification of the aflatoxins. PDA medium was selected for the fungal count. Results: A good linear relationship was obtained for AFB1, AFB2, AFG1, and AFG2 at 1-10 ppb (r > 0.9995). The analyte precision at three different spiking levels was 88.7-109.1%, with low percent relative standard deviations in each case. The aflatoxins can be separated within 5 to 7 min using an Agilent XDB C18 column. We found that AFB1 and AFB2 were not present in D. denudatum. This was consistent with the exceptionally low numbers of fungal colonies observed after 6 h of incubation. The developed analytical method is straightforward and can be successfully used to determine the aflatoxins. Conclusion: The developed analytical method is straightforward, simple, accurate, and economical, and it can be successfully used to determine aflatoxins in remedial plants and consequently to control the quality of products. The presence of aflatoxin in the plant extracts was correlated with the minimal fungal load in the remedial plants examined.
Keywords: aflatoxins, Delphinium denudatum, liquid chromatography, mass spectrometry
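The calibration and recovery arithmetic behind such a method can be sketched as follows; the peak areas and the spiked-sample values are invented stand-ins, with only the 1-10 ppb working range taken from the abstract.

```python
import numpy as np

# Hypothetical calibration data over the reported 1-10 ppb range; the peak areas
# are made-up numbers standing in for HPLC-MS/MS responses.
conc_ppb = np.array([1.0, 2.0, 5.0, 8.0, 10.0])
peak_area = np.array([1020.0, 2075.0, 5110.0, 8190.0, 10150.0])

slope, intercept = np.polyfit(conc_ppb, peak_area, 1)
r = np.corrcoef(conc_ppb, peak_area)[0, 1]
print(f"calibration: area = {slope:.1f}*c + {intercept:.1f}, r = {r:.4f}")

# Back-calculate a hypothetical spiked sample and express recovery (%)
spiked_level, spiked_area = 5.0, 4880.0
found = (spiked_area - intercept) / slope
print(f"found {found:.2f} ppb, recovery {100 * found / spiked_level:.1f}%")
```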
Procedia PDF Downloads 212
2775 Calculation of the Thermal Stresses in an Elastoplastic Plate Heated by Local Heat Source
Authors: M. Khaing, A. V. Tkacheva
Abstract:
The work is devoted to solving the problem of the temperature stresses caused by the point heating of a round plate. The plate is made of an elastoplastic material, so the Prandtl-Reuss model is used. A piecewise-linear condition of Ishlinsky-Ivlev flow is taken as the loading surface, in which the yield stress depends on the temperature. Piecewise-linear conditions (Tresca or Ishlinsky-Ivlev), in contrast to the Mises condition, make it possible to obtain solutions of the equilibrium equation in analytical form. In the problem under consideration, it is impossible to obtain a solution using the Tresca conditions. This is due to the fact that the equilibrium equation ceases to be satisfied when two Tresca conditions are fulfilled at once. Using the Ishlinsky-Ivlev plastic flow conditions allows one to solve the problem. At the same time, there is also no solution on an edge of the Ishlinsky-Ivlev hexagon in the plane-stress state; therefore, the authors of the article propose to jump from one edge to another, which makes it possible to obtain an analytical solution. The paper compares solutions of the problem of plate thermal deformation. One of the solutions was obtained under the condition that the elastic moduli (Young's modulus, Poisson's ratio) depend on temperature. The yield point is assumed to be parabolically temperature dependent. The main results of the comparison are that the region of irreversible deformation is larger in the calculations obtained for the problem with constant elastic moduli. There is no repeated plastic flow in the solution of the problem with temperature-dependent elastic moduli. The absolute value of the irreversible deformations is higher for the solution of the problem in which the elastic moduli are constant; there are also insignificant differences in the distribution of the residual stresses.
Keywords: temperature stresses, elasticity, plasticity, Ishlinsky-Ivlev condition, plate, annular heating, elastic moduli
Procedia PDF Downloads 142
2774 Quality by Design in the Optimization of a Fast HPLC Method for Quantification of Hydroxychloroquine Sulfate
Authors: Pedro J. Rolim-Neto, Leslie R. M. Ferraz, Fabiana L. A. Santos, Pablo A. Ferreira, Ricardo T. L. Maia-Jr., Magaly A. M. Lyra, Danilo A F. Fonte, Salvana P. M. Costa, Amanda C. Q. M. Vieira, Larissa A. Rolim
Abstract:
Initially developed as an antimalarial agent, hydroxychloroquine (HCQ) sulfate is often used as a slow-acting antirheumatic drug in the treatment of disorders of connective tissue. The United States Pharmacopeia (USP) 37 provides a reversed-phase HPLC method for the quantification of HCQ. However, this method was not reproducible, producing asymmetric peaks in a long analysis time. The asymmetry of the peak may cause an incorrect calculation of the concentration of the sample. Furthermore, the analysis time is unacceptable, especially regarding the routine of a pharmaceutical industry. The aim of this study was to develop a fast, easy, and efficient method for the quantification of HCQ sulfate by High Performance Liquid Chromatography (HPLC) based on the Quality by Design (QbD) methodology. This method was optimized in terms of peak symmetry using the surface area graphic as the Design of Experiments (DoE) and the tailing factor (TF) as an indicator for the Design Space (DS). The reference method used was that described in USP 37 for the quantification of the drug. For the optimized method, a 3³ factorial design was proposed, based on the QbD concepts. The DS was created with the TF (in a range between 0.98 and 1.2) in order to demonstrate the ideal analytical conditions. Changes were made in the composition of the USP mobile phase (USP-MP): USP-MP : methanol (90:10 v/v, 80:20 v/v, and 70:30 v/v), in the flow rate (0.8, 1.0, and 1.2 mL.min-1), and in the oven temperature (30, 35, and 40 ºC). The USP method allowed the quantification of the drug only over a long run time (40-50 minutes). In addition, the method uses a high flow rate (1.5 mL.min-1), which increases the consumption of expensive HPLC-grade solvents. The main problem observed was the TF value (1.8), which would be acceptable if the drug were not a racemic mixture, since co-elution of the isomers can lead to unreliable peak integration. Therefore, optimization was proposed in order to reduce the analysis time, aiming at a better peak resolution and TF. For the optimized method, analysis of the surface-response plot made it possible to confirm the ideal analytical conditions: 45 ºC, 0.8 mL.min-1, and 80:20 USP-MP : methanol. The optimized HPLC method enabled the quantification of HCQ sulfate with a peak of high resolution, showing a TF value of 1.17. This promotes good co-elution of the isomers of HCQ, ensuring an accurate quantification of the raw material as a racemic mixture. This method also proved to be approximately 18 times faster than the reference method, using a lower flow rate, further reducing the consumption of solvents and, consequently, the analysis cost. Thus, an analytical method for the quantification of HCQ sulfate was optimized using the QbD methodology. This method proved to be faster and more efficient than the USP method regarding the retention time and, especially, the peak resolution. The higher resolution of the chromatogram peaks supports the implementation of the method for the quantification of the drug as a racemic mixture, not requiring the separation of the isomers.
Keywords: analytical method, hydroxychloroquine sulfate, quality by design, surface area graphic
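The 3³ factorial design named above can be enumerated directly from the factor levels given in the abstract; the tailing-factor responses would come from the actual chromatographic runs, so none are invented here, and the flow-rate unit is an assumption.

```python
from itertools import product

# Factor levels as described in the abstract
mobile_phase = ["90:10", "80:20", "70:30"]   # USP-MP : methanol (v/v)
flow_rate = [0.8, 1.0, 1.2]                  # mL/min (assumed unit)
oven_temp = [30, 35, 40]                     # degrees C

runs = list(product(mobile_phase, flow_rate, oven_temp))
print(f"{len(runs)} runs in the full 3^3 factorial design")
for i, (mp, q, t) in enumerate(runs[:5], start=1):
    print(f"run {i:2d}: USP-MP:MeOH {mp}, {q} mL/min, {t} C")
```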
Procedia PDF Downloads 639
2773 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms
Authors: Seulki Lee, Seoung Bum Kim
Abstract:
Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling's T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern, complicated manufacturing systems, appropriate control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms and multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest-neighbors-based charts, have proven their improved performance in nonnormal situations compared to that of the T2 chart. Besides nonnormality, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart that is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined the updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process
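A loose analogue of the time-adaptive idea is sketched below using scikit-learn's one-class SVM (whose RBF-kernel boundary is equivalent to SVDD) fitted with exponentially decaying sample weights so that recent observations dominate; the data, decay factor, and kernel settings are assumptions and do not reproduce the weighting factor or updating region defined in the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic in-control data with a slow mean drift to mimic a time-varying process
rng = np.random.default_rng(1)
n_train = 300
drift = np.linspace(0.0, 1.5, n_train)[:, None]
X_train = rng.standard_normal((n_train, 2)) + drift

# Exponentially decaying weights: newest observation gets weight 1
decay = 0.99
weights = decay ** np.arange(n_train - 1, -1, -1)

chart = OneClassSVM(kernel="rbf", nu=0.05, gamma=0.5)
chart.fit(X_train, sample_weight=weights)

# Monitoring: negative decision values fall outside the fitted data description
X_new = np.array([[1.4, 1.6], [6.0, -5.0]])
scores = chart.decision_function(X_new)
for x, s in zip(X_new, scores):
    status = "in control" if s >= 0 else "OUT OF CONTROL"
    print(f"point {x}: {status} (score {s:+.3f})")
```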
Procedia PDF Downloads 299
2772 Analytical and Numerical Investigation of Friction-Restricted Growth and Buckling of Elastic Fibers
Authors: Peter L. Varkonyi, Andras A. Sipos
Abstract:
The quasi-static growth of elastic fibers is studied in the presence of distributed contact with an immobile surface, subject to isotropic dry or viscous friction. Unlike classical problems of elastic stability modelled by autonomous dynamical systems with multiple time scales (slowly varying bifurcation parameter, and fast system dynamics), this problem can only be formulated as a non-autonomous system without time scale separation. It is found that the fibers initially converge to a trivial, straight configuration, which is later replaced by divergence reminiscent of buckling phenomena. In order to capture the loss of stability, a new definition of exponential stability against infinitesimal perturbations for systems defined over finite time intervals is developed. A semi-analytical method for the determination of the critical length based on eigenvalue analysis is proposed. The post-critical behavior of the fibers is studied numerically by using variational methods. The emerging post-critical shapes and the asymptotic behavior as length goes to infinity are identified for simple spatial distributions of growth. Comparison with physical experiments indicates reasonable accuracy of the theoretical model. Some applications, from modeling plant root growth to the design of soft manipulators in robotics, are briefly discussed.
Keywords: buckling, elastica, friction, growth
Procedia PDF Downloads 190
2771 A Higher Order Shear and Normal Deformation Theory for Functionally Graded Sandwich Beam
Authors: R. Bennai, H. Ait Atmane, Jr., A. Tounsi
Abstract:
In this work, a new analytical approach using a refined theory of hyperbolic shear deformation of a beam was developed to study the free vibration of graded sandwich beams under different boundary conditions. The effects of transverse shear strains and of the transverse normal deformation are considered. The constituent materials of the beam are assumed to vary gradually in the height direction according to a simple power-law distribution in terms of the volume fractions of the constituents; the two materials considered are metal and ceramic. The core layer is taken to be homogeneous and made of an isotropic material, while the face layers consist of FGM materials with a homogeneous fraction compared to the middle layer. The equations of motion are obtained by the energy minimization principle. Analytical solutions for free vibration and buckling are obtained for sandwich beams under different support conditions; these conditions are taken into account by incorporating new shape functions. In the end, illustrative examples are presented to show the effects of changes in different parameters, such as the material gradation, the thickness-stretching effect, the boundary conditions, and the thickness-to-length ratio, on the free vibration and buckling of FGM sandwich beams.
Keywords: functionally graded sandwich beam, refined shear deformation theory, stretching effect, free vibration
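One commonly used form of the power-law gradation referred to above is written out below for a sandwich with FGM faces and a homogeneous (here ceramic-rich) core; the layer coordinates, the exponent k, and the rule-of-mixtures property estimate are assumed notation, not taken from the paper.

```latex
% Commonly assumed ceramic volume fraction through the thickness, with layer
% interfaces at h_0 < h_1 < h_2 < h_3 and gradation exponent k:
V_c(z) =
\begin{cases}
\left(\dfrac{z - h_0}{h_1 - h_0}\right)^{k}, & h_0 \le z \le h_1 \quad \text{(bottom face)}\\[2mm]
1, & h_1 \le z \le h_2 \quad \text{(homogeneous core)}\\[2mm]
\left(\dfrac{z - h_3}{h_2 - h_3}\right)^{k}, & h_2 \le z \le h_3 \quad \text{(top face)}
\end{cases}
\qquad
P(z) = P_m + \left(P_c - P_m\right) V_c(z)
```

Here P stands for any effective property (Young's modulus, density), with subscripts m and c for the metal and ceramic constituents.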
Procedia PDF Downloads 246
2770 Heat Transfer and Entropy Generation in a Partial Porous Channel Using LTNE and Exothermicity/Endothermicity Features
Authors: Mohsen Torabi, Nader Karimi, Kaili Zhang
Abstract:
This work aims to provide a comprehensive study of the heat transfer and entropy generation rates in a horizontal channel partially filled with a porous medium which experiences internal heat generation or consumption due to an exothermic or endothermic chemical reaction. The focus is on the local thermal non-equilibrium (LTNE) model. The LTNE approach helps to deliver more accurate data regarding the temperature distribution within the system and, accordingly, to provide more accurate Nusselt numbers and entropy generation rates. The Darcy-Brinkman model is used for the momentum equations, and a constant heat flux is assumed as the boundary condition on both the upper and lower surfaces. Analytical solutions are provided for both the velocity and temperature fields. By incorporating the obtained velocity and temperature formulas into the fundamental equations for entropy generation, both local and total entropy generation rates are plotted for a number of cases. Bifurcation phenomena regarding the temperature distribution and the interface heat flux ratio are observed. It has been found that the exothermic or endothermic character of the channel has a considerable impact on the temperature fields and entropy generation rates.
Keywords: entropy generation, exothermicity or endothermicity, forced convection, local thermal non-equilibrium, analytical modelling
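For readers unfamiliar with the modelling ingredients named here, the standard forms often used for this class of problem read as follows; the notation and the source terms are assumptions and do not necessarily match the paper's non-dimensional formulation.

```latex
% Fully developed Darcy-Brinkman momentum balance in the porous region:
\mu_{\mathrm{eff}}\,\frac{d^{2}u}{dy^{2}} \;-\; \frac{\mu}{K}\,u \;=\; \frac{dp}{dx},
% Two-equation (LTNE) energy model with internal generation/consumption S_f, S_s:
k_{f,\mathrm{eff}}\,\frac{\partial^{2}T_{f}}{\partial y^{2}}
  + h_{sf}a_{sf}\left(T_{s}-T_{f}\right) + S_{f}
  = \rho c_{p}\,u\,\frac{\partial T_{f}}{\partial x},
\qquad
k_{s,\mathrm{eff}}\,\frac{\partial^{2}T_{s}}{\partial y^{2}}
  - h_{sf}a_{sf}\left(T_{s}-T_{f}\right) + S_{s} = 0.
```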
Procedia PDF Downloads 415
2769 The Innovative Use of the EPOSTL Descriptors Related to the Language Portfolio for Master Course Student-Teachers of Yerevan Brusov State University of Languages and Social Sciences
Authors: Susanna Asatryan
Abstract:
The author will introduce the Language Portfolio for master course student-teachers of Yerevan Brusov State University of Languages and Social Sciences. The overall aim of the Portfolio is to serve as a visual didactic tool for the pedagogical internship of master students in the specialization “A Foreign Language Teacher of High Schools and Professional Educational Institutions”, based on the principles and fundamentals of the EPOSTL. The author will present the parts of the Portfolio, including the programme, goal, and objectives of the student-teacher’s internship, its content and organization, the expected outputs, and the principles of the student’s self-assessment, based on the can-do philosophy suggested by the EPOSTL. The Language Portfolio for master course student-teachers outlines the distinctive stages of their scientific-pedagogical internship. In the Lesson Observation and Teaching section, student-teachers present the thematic planning of the course syllabus, including individual lesson plans with descriptions and analyses of the lessons. In the Realization of the Scientific-Pedagogical Research section, student-teachers introduce the plan of their research work, its goal, objectives, steps of procedure, and outcomes. In the Educational Activity section, student-teachers analyze the educational aspects of the lesson, introduce the plan of an extracurricular activity, provide a psycho-pedagogical description of the group or the whole class, and outline extracurricular entertainments. In the Dossier, the student-teachers store the entire instructional “product” created during their pedagogical internship: e.g., samples of surveys, tests, recordings, videos, posters, postcards, pupils’ poems, photos, pictures, etc. The author’s presentation will also cover the Self-Assessment Checklist, which highlights the main didactic competences of student-teachers, extracted from the EPOSTL. The Self-Assessment Checklist is introduced with some innovations, taking into consideration the local educational objectives that Armenian students encounter. The students’ feedback on the use of the Portfolio will also be presented.
Keywords: internship, lesson observation, can-do philosophy, self-assessment
Procedia PDF Downloads 242
2768 The Exercise of Deliberative Democracy on Public Administrations Agencies' Decisions
Authors: Mauricio Filho, Carina Castro
Abstract:
The object of this project is to analyze long-serving public agents who have passed through several governments and find themselves in the position of having to deliberate with new agents recently installed in the public administration. For theoretical purposes, internal deliberation is understood as that practiced within public administration agencies, without any direct participation of the general public in the process. The assumption is that agents with longer periods of public service tend to step away from the momentary political discussions that guide the current administration and seek to concentrate on institutionalized routines and procedures, leading the individuals most politically aligned with the current government to deliberate with less "passion" and more exchange of knowledge and information. The theoretical framework of this research is institutionalism, which is guided by a more pragmatic view, facing the fluidity of reality in ways that show the multiple relations between agents and their respective institutions. The critical aspirations of this project rest on the works of professors Cass Sunstein, Adrian Vermeule, and Philip Pettit, and on literature from both institutional theory and the economic analysis of law, greatly influenced by the Chicago Law School. Methodologically, the paper is a theoretical review and is intended to be developed, at a future stage, into empirical tests for verification. This work has as its main analytical tool the appeal to theoretical and doctrinal areas of the juridical sciences, adopting the deductive and analytical method.
Keywords: institutions, state, law, agencies
Procedia PDF Downloads 264
2767 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs
Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili
Abstract:
OBJECTIVE/SCOPE: Caliper logging data provide critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multi-arm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform a statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data from eleven wells, with a total of more than 80,000 data points, were obtained and imported into data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk densities, and azimuthal density. The data from the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which were relatively high when compared to the correlation coefficients of the caliper data with the other parameters. Other parameters, such as porosity, exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data were also produced to gain further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance of, and correlation between, different LWD measurements and wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling using LWD data.
Keywords: LWD measurements, caliper log, correlations, analysis
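The core statistic used above can be reproduced in a few lines; the depth-aligned density and caliper series below are synthetic stand-ins for the well data, so only the calculation itself mirrors the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic, depth-aligned LWD bulk density and wireline maximum caliper series
rng = np.random.default_rng(7)
depth = np.arange(2000.0, 2500.0, 0.5)                      # m, assumed sampling
bulk_density = 2.45 + 0.05 * np.sin(depth / 40.0) + 0.02 * rng.standard_normal(depth.size)
max_caliper = 8.5 - 2.0 * (bulk_density - 2.45) + 0.05 * rng.standard_normal(depth.size)

r, p_value = pearsonr(bulk_density, max_caliper)
print(f"Pearson r = {r:.2f} (p = {p_value:.2g}) over {depth.size} samples")
```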
Procedia PDF Downloads 121
2766 Machine Learning Algorithms for Rocket Propulsion
Authors: Rômulo Eustáquio Martins de Souza, Paulo Alexandre Rodrigues de Vasconcelos Figueiredo
Abstract:
In recent years, there has been a surge of interest in applying artificial intelligence techniques, particularly machine learning algorithms. Machine learning is a data-analysis technique that automates the creation of analytical models, making it especially useful for dealing with complex situations. As a result, this technology aids in reducing human intervention while producing accurate results. This methodology is also extensively used in aerospace engineering, since this is a field that encompasses several high-complexity operations, such as rocket propulsion. Rocket propulsion is a high-risk operation in which an engine failure could result in the loss of life. As a result, it is critical to use computational methods capable of precisely representing the spacecraft's analytical model to guarantee its safety and operation. Thus, this paper describes the use of machine learning algorithms for rocket propulsion to support the view that this technique is an efficient way to deal with challenging and restrictive aerospace engineering activities. The paper focuses on three machine-learning-aided rocket propulsion applications: set-point control of an expander-bleed rocket engine, supersonic retro-propulsion of a small-scale rocket, and leak detection and isolation on rocket engine data. The paper describes the data-driven methods used for each implementation in depth and presents the obtained results.
Keywords: data analysis, modeling, machine learning, aerospace, rocket propulsion
Procedia PDF Downloads 115
2765 Passive Aeration of Wastewater: Analytical Model
Authors: Ayman M. El-Zahaby, Ahmed S. El-Gendy
Abstract:
Aeration of wastewater is essential for the proper operation of aerobic treatment units, where the wastewater normally has zero dissolved oxygen. This is due to the need of the aerobic microorganisms for oxygen to grow and survive. Typical aeration units for wastewater treatment, such as mechanical aerators or diffused aerators, require electric energy for their operation. Passive units are units that operate without the need for electric energy, such as cascade aerators, spray aerators, and tray aerators. In contrast to cascade aerators and spray aerators, tray aerators require a much smaller area footprint for their installation, as the treatment stages are arranged vertically. To the extent of the authors' knowledge, the design of tray aerators for the aeration purpose has not been presented in the literature. The current research concerns an analytical study of the design of tray aerators for the purpose of increasing the dissolved oxygen in wastewater treatment systems, including an investigation of different design parameters and their impact on the aeration efficiency. The studied aerator shall act as an intermediate stage between an anaerobic primary treatment unit and an aerobic treatment unit in small-scale treatment systems. Different free-falling flow regimes were investigated, and the thresholds for transition between regimes were obtained from the literature. The study focused on the jetting flow regime between trays. Starting from the two-film theory, an equation relating the dissolved oxygen concentration of the effluent from the system to the flow rate, the number of trays, the tray area, the spacing between trays, the number and diameter of holes, and the water temperature was derived. A MATLAB model was developed for the derived equation. The expected aeration efficiency under different tray configurations and operating conditions was illustrated by running the model while varying the design parameters. The impact of each parameter was illustrated. The overall system efficiency was found to increase with decreasing hole diameter. On the other hand, increasing the number of trays, the tray area, the flow rate per hole, or the tray spacing had a positive effect on the system efficiency.
Keywords: aeration, analytical, passive, wastewater
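The two-film starting point mentioned above is commonly written as below; the symbols and the per-stage integration are assumed standard forms, and the paper's final design equation additionally involves tray area, spacing, and hole geometry.

```latex
% Two-film oxygen transfer rate and its per-stage integration over n trays
% (assumed notation, not the paper's final design equation):
\frac{dC}{dt} = K_L a\,\bigl(C_s - C\bigr)
\quad\Longrightarrow\quad
C_n = C_s - \bigl(C_s - C_0\bigr)\,e^{-\,n\,K_L a\,t_c}
```

Here C_0 is the influent dissolved oxygen, C_n the concentration after n trays, C_s the saturation concentration at the water temperature, K_L a the overall transfer coefficient, and t_c the contact time per fall, assumed equal for every stage.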
Procedia PDF Downloads 209
2764 Renewable Energy Trends Analysis: A Patents Study
Authors: Sepulveda Juan
Abstract:
This article explains the elements and considerations taken into account when implementing and applying patent evaluation and scientometric studies in the identification of technology trends, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped recognize the technological leaders in the field of energy and paved the way for a multivariate analysis of this sample, which allowed for a graphical description of the techniques of mature technologies as well as the detection of emerging technologies. This article ends with a validation of the methodology as applied to the case of fuel cells.
Keywords: patents, scientometrics, renewable energy, technology maps
Procedia PDF Downloads 307