Search results for: type I error
5384 Regional Adjustment to the Analytical Attenuation Coefficient in the GMPM BSSA 14 for the Region of Spain
Authors: Gonzalez Carlos, Martinez Fransisco
Abstract:
Various types of analysis allow us to account for seismic phenomena that impose severe demands on the structures society designs; one of them is probabilistic analysis, which works from prediction equations developed from seismic metadata compiled in different regions. These equations form models that describe the 5% damped pseudo-response spectrum for the various zones from a few easily obtained input parameters. The main difficulty is that building such models requires large, statistically robust data sets that support the results, and in many places this kind of information is not available; in those cases, alternative methodologies help to adjust existing seismic prediction models to the region of interest. Keywords: GMPM, 5% damped pseudo-response spectra, models of seismic prediction, PSHA
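A ground-motion prediction equation of the kind described above typically maps magnitude, distance, and site condition to 5% damped pseudo-spectral acceleration through event, path, and site terms. A minimal sketch of such a functional form, with purely illustrative coefficients (this is not the BSSA14 model itself, and the regional adjustment would re-estimate the anelastic attenuation term from local data):

```python
import numpy as np

def ln_psa(mag, rjb_km, vs30, c):
    """Generic GMPE form: event, path, and site terms (illustrative only)."""
    event = c["e0"] + c["e1"] * (mag - 6.0)
    path = (c["c1"] + c["c2"] * (mag - 6.0)) * np.log(np.sqrt(rjb_km**2 + c["h"]**2)) + c["c3"] * rjb_km
    site = c["s1"] * np.log(vs30 / 760.0)
    return event + path + site  # natural log of 5% damped PSA (g)

# Hypothetical coefficients for one spectral period; c3 is the anelastic
# attenuation term that a regional adjustment would target.
coeffs = {"e0": 0.2, "e1": 1.1, "c1": -1.0, "c2": 0.15, "c3": -0.006, "h": 5.0, "s1": -0.5}
print(np.exp(ln_psa(6.5, 30.0, 450.0, coeffs)))
```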
Procedia PDF Downloads 76
5383 Comics as an Intermediary for Media Literacy Education
Authors: Ryan C. Zlomek
Abstract:
The value of using comics in the literacy classroom has been explored since the 1930s. At that point in time researchers had begun to implement comics into daily lesson plans and, in some instances, had started the development process for comics-supported curriculum. In the mid-1950s, this type of research was cut short due to the work of psychiatrist Frederic Wertham whose research seemingly discovered a correlation between comic readership and juvenile delinquency. Since Wertham’s allegations the comics medium has had a hard time finding its way back to education. Now, over fifty years later, the definition of literacy is in mid-transition as the world has become more visually-oriented and students require the ability to interpret images as often as words. Through this transition, comics has found a place in the field of literacy education research as the shift focuses from traditional print to multimodal and media literacies. Comics are now believed to be an effective resource in bridging the gap between these different types of literacies. This paper seeks to better understand what students learn from the process of reading comics and how those skills line up with the core principles of media literacy education in the United States. In the first section, comics are defined to determine the exact medium that is being examined. The different conventions that the medium utilizes are also discussed. In the second section, the comics reading process is explored through a dissection of the ways a reader interacts with the page, panel, gutter, and different comic conventions found within a traditional graphic narrative. The concepts of intersubjective acts and visualization are attributed to the comics reading process as readers draw in real world knowledge to decode meaning. In the next section, the learning processes that comics encourage are explored parallel to the core principles of media literacy education. Each principle is explained and the extent to which comics can act as an intermediary for this type of education is theorized. In the final section, the author examines comics use in his computer science and technology classroom. He lays out different theories he utilizes from Scott McCloud’s text Understanding Comics and how he uses them to break down media literacy strategies with his students. The article concludes with examples of how comics has positively impacted classrooms around the United States. It is stated that integrating comics into the classroom will not solve all issues related to literacy education but, rather, that comics can be a powerful multimodal resource for educators looking for new mediums to explore with their students.Keywords: comics, graphics novels, mass communication, media literacy, metacognition
Procedia PDF Downloads 298
5382 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon
Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn
Abstract:
The dynamic Land Use and Land Cover (LULC) changes in Yangon over the last twenty years have generally brought improvements in human welfare and economic development. Mapping LULC is crucially important for the sustainable development of the environment. However, exact data on how environmental factors influence the LULC situation at various scales are difficult to obtain, because the natural environment is composed of non-homogeneous surface features, so the satellite data also contain mixed pixels. The main objective of this study is to calculate the accuracy of change detection of LULC changes using Support Vector Machines (SVMs). The main data for this work were satellite images from 1996, 2006 and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between two classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to reach higher accuracy. The results showed that vegetation and cultivated area decreased (by an average total of 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (average total 30% from 1996 to 2015). The error matrix and confidence limits supported the validation of the resulting LULC mapping. Keywords: land use and land cover change, change detection, image processing, support vector machines
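The change-detection statistics mentioned above amount to a class-transition (cross-tabulation) matrix between two classified images. A minimal sketch of that tabulation, assuming two small integer-labelled classification rasters with made-up class codes:

```python
import numpy as np

def change_matrix(before, after, n_classes):
    """Cross-tabulate per-pixel class transitions between two classification maps."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for b, a in zip(before.ravel(), after.ravel()):
        m[b, a] += 1
    return m  # m[i, j] = number of pixels that moved from class i to class j

before = np.array([[0, 0, 1], [1, 2, 2], [2, 2, 0]])  # e.g. 0=vegetation, 1=cultivated, 2=built-up
after  = np.array([[2, 0, 1], [2, 2, 2], [2, 2, 0]])
m = change_matrix(before, after, 3)
print(m)
print("share of map converted to built-up:", m[:2, 2].sum() / before.size)
```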
Procedia PDF Downloads 139
5381 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, digital IIR filters are generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm, but in some cases it searches the solution space insufficiently, resulting in a weak exchange of information, and is hence not able to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize satisfaction of the minimum-magnitude-error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results. Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
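The opposition-based learning step referred to above evaluates, for each candidate coefficient vector, its "opposite" point inside the search bounds and keeps the better of the two, which strengthens exploration. A minimal sketch of that step under assumed bounds and a placeholder cost function (not the full OABC filter-design procedure):

```python
import numpy as np

def opposition_step(population, lower, upper, cost):
    """For each member, form the opposite point and retain the fitter half."""
    opposite = lower + upper - population            # element-wise opposite within [lower, upper]
    both = np.vstack([population, opposite])
    costs = np.apply_along_axis(cost, 1, both)
    best = np.argsort(costs)[: len(population)]
    return both[best]

# Hypothetical 5-coefficient design with a placeholder magnitude-error cost.
rng = np.random.default_rng(0)
lower, upper = -2.0 * np.ones(5), 2.0 * np.ones(5)
pop = rng.uniform(lower, upper, size=(10, 5))
pop = opposition_step(pop, lower, upper, cost=lambda x: np.sum(x**2))
print(pop.shape)
```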
Procedia PDF Downloads 477
5380 Conceptual Solution and Thermal Analysis of the Final Cooling Process of Biscuits in One Confectionary Factory in Serbia
Authors: Duško Salemović, Aleksandar Dedić, Matilda Lazić, Dragan Halas
Abstract:
The paper presents a conceptual solution for the final cooling of the chocolate dressing on biscuits in a confectionery factory in Serbia. The proposed solution was derived from the desired technological process of final cooling of biscuits and the required process parameters to be achieved, which were an integral part of the project task. The desired process parameters for achieving proper hardening and coating formation are the amount of heat exchanged per unit time between the two media (air and chocolate dressing), the air speed inside the tunnel cooler, and the surface of all biscuits in contact with the air. These parameters were calculated in the paper. The final cooling of the chocolate dressing on biscuits could be optimized by changing the process parameters and dimensions of the tunnel cooler and looking for their appropriate values. Accurate temperature predictions and fluid flow analysis could be conducted using heat balance and flow balance equations, bearing in mind the theory of similarity. Furthermore, some parameters, such as the inlet temperature of the biscuits and the input air temperature, were adopted from previous technological processes. A thermal calculation was carried out, and it was demonstrated that the percentage error between the air-chocolate-topping contact surface obtained from the heat balance and that obtained geometrically from the proposed conceptual solution does not exceed 0.67%, which is very good agreement. This ensures the quality of the cooling process of the chocolate dressing applied to the biscuit and the hardness of its coating. Keywords: chocolate dressing, air, cooling, heat balance
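The 0.67% figure quoted above is a comparison between the contact area implied by the heat balance and the area measured from the proposed tunnel geometry. A minimal sketch of that check, with every number purely illustrative (not the values from the actual project):

```python
# Heat balance: Q = m_dot * cp * dT (dressing side) = h * A * dT_lm (air side)
m_dot, cp, dT = 0.05, 1600.0, 12.0   # kg/s, J/(kg K), K  -- assumed dressing-side values
h, dT_lm = 25.0, 18.0                # W/(m^2 K), K       -- assumed air-side values

Q = m_dot * cp * dT                  # required heat removal, W
A_balance = Q / (h * dT_lm)          # contact area implied by the heat balance, m^2
A_geometry = 2.15                    # contact area from the proposed tunnel layout, m^2 (assumed)

error = abs(A_balance - A_geometry) / A_geometry * 100
print(f"area from balance {A_balance:.2f} m^2, percentage error {error:.2f}%")
```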
Procedia PDF Downloads 79
5379 Turbulent Flow in Corrugated Pipes with Helical Grooves
Authors: P. Mendes, H. Stel, R. E. M. Morales
Abstract:
This article presents a numerical and experimental study of turbulent flow in corrugated pipes with helically “d-type” grooves, for Reynolds numbers between 7500 and 100,000. The ANSYS-CFX software is used to solve the RANS equations with the BSL two-equation turbulence model, through the element-based finite-volume method approach. Different groove widths and helix angles are considered. Numerical results are validated with experimental pressure drop measurements for the friction factor. A correlation for the friction factor is also proposed considering the geometric parameters and Reynolds numbers evaluated. Keywords: turbulent flow, corrugated pipe, helical, numerical, experimental, friction factor, correlation
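The experimental friction factor used for the validation above follows directly from the measured pressure drop via the Darcy-Weisbach relation. A minimal sketch of that conversion, with all pipe and flow values assumed for illustration:

```python
def darcy_friction_factor(dp, length, diameter, density, velocity):
    """f = dp * D / (L * rho * V^2 / 2), from a measured pressure drop."""
    return dp * diameter / (length * 0.5 * density * velocity**2)

rho, mu, D, L = 998.0, 1.0e-3, 0.05, 2.0   # water, 50 mm pipe, 2 m test section (assumed)
V, dp = 1.5, 2.4e3                         # bulk velocity (m/s) and pressure drop (Pa), assumed

Re = rho * V * D / mu
print(f"Re = {Re:.0f}, f = {darcy_friction_factor(dp, L, D, rho, V):.4f}")
```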
Procedia PDF Downloads 484
5378 Energy Usage in Isolated Areas of Honduras
Authors: Bryan Jefry Sabillon, Arlex Molina Cedillo
Abstract:
Currently, the rise in the demand for electrical energy as a consequence of technological development and population growth, as well as projections made by ‘La Agencia Internacional de la Energía’ (AIE) and research institutes, reveals alarming figures for the expected growth in the next few decades. Because of this, awareness should be raised about the rational and efficient use of this resource. Because of the global concern with providing electrical energy to isolated areas, projects for energy generation from renewable resources are commonly carried out. From a socioeconomic and cultural point of view, a positive impact can be foreseen for a society that gains access to this resource. This article focuses on the great potential that Honduras shows as a country looking to produce renewable energy, given the crisis it is living through nowadays. We therefore present detailed research describing the main necessities that rural communities face today, in order to mitigate the negative effects of the scarcity of electrical energy. We also discuss which type of electrical generation method should be used, according to the disposition, geography, climate, and of course the accessibility of each area. Honduras is currently developing new methods for the generation of energy; we therefore discuss renewable energy, whose exploitation is a global trend. Right now the country's main generation methods are hydroelectric, thermal, wind, biomass and photovoltaic (the latter being one of the main sources of clean electrical generation). The use of these resources was made possible partly by the studies of organizations that focus on electrical energy and its demand, such as ‘La Cooperación Alemana’ (GIZ), ‘La Secretaria de Energía y Recursos Naturales’ (SERNA), and ‘El Banco Centroamericano de Integración Económica’ (BCIE), which produced the complete guide for the protocol to be followed in the three stages of this type of project: 1) licences and permits, 2) financial aspects, and 3) inscription under the Kyoto Protocol. This article aims to take the reader through the necessary information (according to the difficult accessibility that each zone might present) about the best option for electrical generation in zones that are totally isolated from the grid, using renewable resources to generate electrical energy. We conclude that the use of hybrid energy generation systems for small remote communities brings a positive impact, not only because electrical energy is provided but also because of the improvements in education, health, sustainable agriculture and livestock, and of course the advances in energy generation, which is the main concern of this article. Keywords: energy, isolated, renewable, accessibility
Procedia PDF Downloads 229
5377 Full Potential Calculation of Structural and Electronic Properties of Perovskite BiAlO3 and BiGaO3
Authors: M. Harmel, H. Khachai
Abstract:
First-principles calculations within the full-potential linearized augmented plane wave (FP-LAPW) method were applied to study the structural and electronic properties of the cubic perovskite-type compounds BiAlO3 and BiGaO3. The lattice constant, bulk modulus, its pressure derivative, band structure and density of states were obtained. The results show that BiGaO3 should exhibit higher hardness and stiffness than BiAlO3. The Al–O and Ga–O bonds are typically covalent with strong hybridization, as are the Bi–O bonds, which nevertheless have a significant ionic character. Both materials are weakly ionic and exhibit wide, indirect band gaps, which are typical of insulators. Keywords: DFT, ab initio, electronic structure, perovskite structure, ferroelectrics
Procedia PDF Downloads 397
5376 Probiotics in Anxiety and Depression
Authors: Pilar Giffenig, Avanna Kotlarz, Taylor Dehring
Abstract:
Anxiety and depression are common mental illnesses in the U.S. today. While there are various treatments for these mental health disorders, many of the medications come with a wide variety of side effects that decrease medication compliance. Recent studies have looked at the impact of probiotics on anxiety and depression. Our goal was to determine whether probiotics could help relieve symptoms of anxiety and/or depression. We conducted a literature search of three databases, focusing on systematic reviews and RCTs, and found 25 articles, 8 of which were used for our analysis. Seven of the eight articles showed that probiotics have the potential to significantly reduce symptoms of anxiety and depression. However, larger sample sizes and clarity on the type of probiotic and correct dosage are required in future research to determine the role of probiotics in the treatment of anxiety and depression. Keywords: probiotics, anxiety, depression, treatment, psychology, nutrition
Procedia PDF Downloads 270
5375 A Theoretical Approach of Tesla Pump
Authors: Cristian Sirbu-Dragomir, Stefan-Mihai Sofian, Adrian Predescu
Abstract:
This paper aims to study Tesla pumps for circulating biofluids; the goal is to make a small pump for the circulation of biofluids. This type of pump will be studied because it has the following characteristics: it has no blades, which results in very low friction; reduced friction forces; low production cost; increased adaptability to different types of fluids; low cavitation (close to zero); low shocks due to the lack of blades; rare maintenance due to the low cavitation; very small turbulence in the fluid; a low number of changes in the direction of the fluid (compared to bladed rotors); increased efficiency at low powers; fast acceleration; the need for only a low torque; and no blade shocks at sudden starts and stops. All these elements are necessary in order to make a small pump that could be inserted into the thoracic cavity. The pump will be designed to combat myocardial infarction. Because the pump must be inserted in the thoracic cavity, elements such as low friction forces, shocks as low as possible, low cavitation and as little maintenance as possible are very important. The operation should be performed once, without having to change the rotor after a certain time. Given the very small size of the pump, the blades of a classic rotor would be very thin, and sudden starts and stops could cause considerable damage or require a very expensive material. At the same time, being a medical device, low cost is important so that it remains easily accessible to the population. The lack of turbulence or vortices caused by a classic rotor is again a key element because, when it comes to blood circulation, the flow must be laminar and not turbulent; turbulent flow can even cause a heart attack. Due to these aspects, Tesla's model could be ideal for this work. Usually, the pump is considered to reach an efficiency of 40% when used at very high powers; however, the inventor of this type of pump claimed that the maximum efficiency it can achieve is 98%. The key element that could help to achieve this efficiency, or one as close to it as possible, is the fact that the pump will be used for low volumes and pressures. The key parameters for obtaining the best efficiency with this model are the number of rotor discs placed in parallel and the distance between them; this distance must be small, which also helps to obtain a pump as small as possible. The principle of operation of such a rotor is to stack several parallel discs with central openings; the space between the discs creates a suction effect, pulling the liquid through the holes in the rotor and throwing it outwards. The viscosity of the liquid is also a very important element: it dictates the distance between the discs required to achieve power transmission with minimal losses. Keywords: lubrication, temperature, tesla-pump, viscosity
Procedia PDF Downloads 179
5374 Fatigue-Induced Debonding Propagation in FM300 Adhesive
Authors: Reza Hedayati, Meysam Jahanbakhshi
Abstract:
Fracture mechanics is used to predict debonding propagation in an adhesive joint between aluminum and composite plates. Three types of loading and two glass-epoxy composite stacking sequences, [0/90]2s and [0/45/-45/90]s, are considered for the composite plate and their results are compared. It was seen that, generally, the cases with the [0/45/-45/90]s stacking sequence have much shorter lives than the cases with [0/90]2s. It was also seen that in cases with λ=0 the ends of the debonding front propagate further than its middle, while in cases with λ=0.5 or λ=1 the opposite holds. Moreover, regardless of the value of λ, the difference between the debonding propagation at the ends and at the middle of the debonding front is very similar for λ=0.5 and λ=1. Another main conclusion was that the non-dimensionalized debonding front profile is almost independent of the sequence type or the applied load value. Keywords: adhesive joint, debonding, fracture, LEFM, APDL
Procedia PDF Downloads 362
5373 Compact, Lightweight, Low Cost, Rectangular Core Power Transformers
Authors: Abidin Tortum, Kubra Kocabey
Abstract:
One of the sectors where competition is experienced at the highest level in the world is the transformer sector, and sales can be made only with a limited profit margin. For this reason, manufacturers must develop cost-cutting designs to achieve higher profits. The use of rectangular cores and coils in transformer design is one of the methods that can be used to reduce costs. To the best of our knowledge, we are the first company producing rectangular core power transformers in our country. In this project, in order to reduce costs, obtain more compact products and increase the competitiveness of the product, BETA will carry out the design and production of rectangular core-and-coil power transformers for the first time in Turkey. The transformer to be designed shall be 16 MVA, at the 33/11 kV voltage level. With the rectangular design of the transformer core and windings, no-load losses can be reduced; the rectangular transformer type is also the least costly. However, short-circuit forces on rectangular windings do not affect every point of the windings in the same way: more force is applied inwards at the mid-points of the low-voltage winding, while the opposite occurs in the high-voltage winding. Therefore, the windings tend to deteriorate in the event of a short circuit. While trying to reach the project objectives, these design difficulties must be overcome. The rectangular core transformers to be produced in our country offer a more compact structure than conventional transformers; in other words, both height and width are smaller, so the transformer takes up less space in the substation. Because the transformer tank is smaller, less oil is used and the weight is lower. Biotemp natural ester fluid is used in the rectangular transformer, and the cooling performance of this oil is analyzed. The cost is also reduced by the reduction of dimensions, and the decrease in the amount of oil used increases the environmental friendliness of the developed product. Transportation costs are reduced by lowering the total weight, and the amount of carbon emissions generated during transportation is reduced. Since the low-voltage winding is wound with a foil winding technique, a structure more resistant to short-circuit forces is obtained. No-load losses are lower due to the use of a rectangular core. The project was handled in three phases: in the first stage, preliminary research and designs were carried out; in the second stage, prototype manufacturing of the designed transformer was started; and in the last stage, the developed prototype was subjected to routine, type and special tests. Keywords: rectangular core, power transformer, transformer, productivity
Procedia PDF Downloads 121
5372 Automated Video Surveillance System for Detection of Suspicious Activities during Academic Offline Examination
Authors: G. Sandhya Devi, G. Suvarna Kumar, S. Chandini
Abstract:
This research work aims to develop a system that will analyze and identify students who indulge in malpractices/suspicious activities during the course of an academic offline examination. Automated Video Surveillance provides an optimal solution which helps in monitoring the students and identifying the malpractice event immediately. This work is organized into three modules. The first module deals with performing an impersonation check using a PCA-based face recognition method which is done by cross checking his profile with the database. The presence or absence of the student is even determined in this module by implementing an image registration technique wherein a grid is formed by considering all the images registered using the frontal camera at the determined positions. Second, detecting such facial malpractices in which a student gets involved in conversation with another, trying to obtain unauthorized information etc., based on the threshold range evaluated by considering his/her mouth state whether open or closed. The third module deals with identification of unauthorized material or gadgets used in the examination hall by training the positive samples of the object through various stages. Here, a top view camera feed is analyzed to detect the suspicious activities. The system automatically alerts the administration when any suspicious activities are identified, thereby reducing the error rate caused due to manual monitoring. This work is an improvement over our previous work published in identifying suspicious activities done by examinees in an offline examination.Keywords: impersonation, image registration, incrimination, object detection, threshold evaluation
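The impersonation check described above rests on a PCA (eigenface-style) projection: enrolled images are projected onto the leading principal components and a probe face is matched to the nearest stored profile. A minimal sketch of that idea, assuming faces are already detected, cropped, and flattened to equal-length vectors (placeholder data, not the system's actual pipeline):

```python
import numpy as np

def fit_pca(gallery, n_components):
    """gallery: (n_faces, n_pixels). Returns the mean face and top principal axes."""
    mean = gallery.mean(axis=0)
    _, _, vt = np.linalg.svd(gallery - mean, full_matrices=False)
    return mean, vt[:n_components]

def match(probe, gallery, mean, axes):
    """Project probe and gallery into PCA space; return index of the nearest enrolled face."""
    g = (gallery - mean) @ axes.T
    p = (probe - mean) @ axes.T
    return int(np.argmin(np.linalg.norm(g - p, axis=1)))

rng = np.random.default_rng(1)
gallery = rng.random((20, 64 * 64))            # 20 enrolled students (placeholder images)
mean, axes = fit_pca(gallery, n_components=10)
probe = gallery[7] + 0.01 * rng.random(64 * 64)
print(match(probe, gallery, mean, axes))        # expected: 7
```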
Procedia PDF Downloads 230
5371 Iron Yoke Dipole with High Quality Field for Collector Ring FAIR
Authors: Tatyana Rybitskaya, Alexandr Starostenko, Kseniya Ryabchenko
Abstract:
The collector ring (CR) of the FAIR project is a large-acceptance storage ring, and field quality plays a major role in the magnet design. The CR will use normal-conducting dipole magnets: 24 H-type sector magnets with a maximum field of 1.6 T. The field quality integrated over the length of the magnet, as a function of radius, is ∆B·l/B·l = ±1×10⁻⁴. Below 1.6 T the value of ∆B·l/B·l can be higher, increasing approximately linearly up to ±2.5×10⁻⁴ at a field level of 0.8 T. An iron-dominated magnet with the required field quality is produced with standard technology, as the quality is dominated by the yoke geometry. Keywords: conventional magnet, iron yoke dipole, harmonic terms, particle accelerators
Procedia PDF Downloads 146
5370 A Mixed Finite Element Formulation for Functionally Graded Micro-Beam Resting on Two-Parameter Elastic Foundation
Authors: Cagri Mollamahmutoglu, Aykut Levent, Ali Mercan
Abstract:
Micro-beams are one of the most common components of Nano-Electromechanical Systems (NEMS) and Micro Electromechanical Systems (MEMS). For this reason, static bending, buckling, and free vibration analysis of micro-beams have been the subject of many studies. In addition, micro-beams restrained with elastic type foundations have been of particular interest. In the analysis of microstructures, closed-form solutions are proposed when available, but most of the time solutions are based on numerical methods due to the complex nature of the resulting differential equations. Thus, a robust and efficient solution method has great importance. In this study, a mixed finite element formulation is obtained for a functionally graded Timoshenko micro-beam resting on two-parameter elastic foundation. In the formulation modified couple stress theory is utilized for the micro-scale effects. The equation of motion and boundary conditions are derived according to Hamilton’s principle. A functional, derived through a scientific procedure based on Gateaux Differential, is proposed for the bending and buckling analysis which is equivalent to the governing equations and boundary conditions. Most important advantage of the formulation is that the mixed finite element formulation allows usage of C₀ type continuous shape functions. Thus shear-locking is avoided in a built-in manner. Also, element matrices are sparsely populated and can be easily calculated with closed-form integration. In this framework results concerning the effects of micro-scale length parameter, power-law parameter, aspect ratio and coefficients of partially or fully continuous elastic foundation over the static bending, buckling, and free vibration response of FG-micro-beam under various boundary conditions are presented and compared with existing literature. Performance characteristics of the presented formulation were evaluated concerning other numerical methods such as generalized differential quadrature method (GDQM). It is found that with less computational burden similar convergence characteristics were obtained. Moreover, formulation also includes a direct calculation of the micro-scale related contributions to the structural response as well.Keywords: micro-beam, functionally graded materials, two-paramater elastic foundation, mixed finite element method
Procedia PDF Downloads 162
5369 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be of irregular, digit, or character shape. Objects and internal objects are difficult to identify and extract when the structure of the image contains a large number of clusters. Reliable estimates are obtained by identifying the sub-regional objects with the SASK algorithm, which focuses mainly on recognizing the number of internal objects present in a given image in a shadow-free and error-free way. Hard clustering and density clustering of the rough set obtained from the image are used to recognize the distinct internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction and, finally, hull detection. Detecting the sub-regional hulls can increase the machine-learning capability in the detection of characters, and the approach can also be extended to hull recognition in irregularly shaped objects, for instance black holes in space exploration, based on their intensities. Layered hulls are those having structured layers inside; they are useful in military and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can support decision processes (clearing traffic, or identifying the number of opposing persons in a conflict). Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
Procedia PDF Downloads 349
5368 Intensity Modulated Radiotherapy of Nasopharyngeal Carcinomas: Patterns of Loco Regional Relapse
Authors: Omar Nouri, Wafa Mnejja, Nejla Fourati, Fatma Dhouib, Wicem Siala, Ilhem Charfeddine, Afef Khanfir, Jamel Daoud
Abstract:
Background and objective: Induction chemotherapy (IC) followed by concomitant chemoradiotherapy with the intensity modulated radiation (IMRT) technique is currently the recommended treatment modality for locally advanced nasopharyngeal carcinomas (NPC). The aim of this study was to evaluate the prognostic factors predicting loco regional relapse with this new treatment protocol. Patients and methods: A retrospective study of 52 patients with NPC treated between June 2016 and July 2019. All patients received IC according to the protocol of the Head and Neck Radiotherapy Oncology Group (Gortec) NPC 2006 (3 TPF courses) followed by concomitant chemoradiotherapy with weekly cisplatin (40 mg/m²). Patients received IMRT with a simultaneous integrated boost (SIB) of 33 daily fractions at a dose of 69.96 Gy for the high-risk volume, 60 Gy for the intermediate-risk volume and 54 Gy for the low-risk volume. The median age was 49 years (19–69) with a sex ratio of 3.3. Forty-five tumors (86.5%) were classified as stage III–IV according to the 2017 UICC TNM classification. Loco regional relapse (LRR) was defined as a local and/or regional progression that occurs at least 6 months after the end of treatment. Survival analysis was performed according to the Kaplan-Meier method, and the log-rank test was used to compare anatomo-clinical and therapeutic factors that may influence loco regional relapse-free survival (LRFS). Results: After a median follow-up of 42 months, 6 patients (11.5%) experienced LRR. A metastatic relapse was also noted for 3 of these patients (50%). Target volume coverage was optimal for all patients with LRR. Four relapses (66.6%) were in the high-risk target volume and two (33.3%) were borderline. Three-year LRFS was 85.9%. Four factors predicted loco regional relapses: histologic type other than undifferentiated (UCNT) (p=0.027), a macroscopic pre-chemotherapy tumor volume exceeding 100 cm³ (p=0.005), a reduction in IC doses exceeding 20% (p=0.016) and a total cumulative cisplatin dose less than 380 mg/m² (p=0.034). TNM classification and response to IC did not impact loco regional relapses. Conclusion: For nasopharyngeal carcinoma, tumors with a high initial volume and/or a histologic type other than UCNT have a higher risk of loco regional relapse. Therefore, they require a more aggressive therapeutic approach and a suitable monitoring protocol. Keywords: loco regional relapse, intensity modulated radiotherapy, nasopharyngeal carcinoma, prognostic factors
Procedia PDF Downloads 128
5367 “Voiceless Memory” and Holodomor (Great Famine): The Power of Oral History to Challenge Official Historical Discourse
Authors: Tetiana Boriak
Abstract:
The study sets out to test the correlation between official sources preserved in the archives and “unofficial” oral history regarding the Great Famine of 1932–1933 in Ukraine. The research shows poor preservation of the sources, which were deliberately destroyed by the totalitarian regime. It involves an analysis of five stages in the development of Holodomor oral history. It is oral history that reveals the mechanism of the mass killing. The research proves that using only one type of historical source leads to a particular reading of the history of the Holodomor, while the use of both types provides in-depth insight into the history of the famine. Keywords: the Holodomor (the Great Famine), oral history, historical source, historical memory, totalitarianism
Procedia PDF Downloads 108
5366 Determination of the Stability of Haloperidol Tablets and Phenytoin Capsules Stored in the Inpatient Dispensary System (Swisslog) by the Respective HPLC and Raman Spectroscopy Assay
Authors: Carol Yue-En Ong, Angelina Hui-Min Tan, Quan Liu, Paul Chi-Lui Ho
Abstract:
A public general hospital in Singapore has recently implemented an automated unit-dose machine, Swisslog, in its inpatient dispensary, with the objective of reducing human error and improving patient safety. However, a concern about stability arises as tablets are removed from their original packaging (bottled loose tablets/capsules) and repackaged into individual, clear plastic wrappers as unit doses in the system. Drugs that are light-sensitive or hygroscopic would be more susceptible to degradation, as the wrapper does not offer full protection. Hence, this study was carried out to examine the stability of haloperidol tablets, which are light-sensitive, and phenytoin capsules, which are hygroscopic. Validated HPLC-UV assays were first established for quantification of these two compounds. The medications involved were placed in the Swisslog and sampled every week for one month. The collected data were analysed and showed no degradation over time. This study also explored an alternative approach for drug stability determination: Raman spectroscopy. The advantage of Raman spectroscopy is its high time efficiency and non-destructive nature. The results suggest that drug degradation can indeed be detected using Raman microscopy, but further research is needed to establish this approach for quantification or qualification of compounds. NanoRam®, a portable Raman spectroscope, was also used alongside Raman microscopy but was unsuccessful in detecting degradation in this study. Keywords: drug stability, haloperidol, HPLC, phenytoin, Raman spectroscopy, Swisslog
Procedia PDF Downloads 347
5365 Design and Performance Improvement of Three-Dimensional Optical Code Division Multiple Access Networks with NAND Detection Technique
Authors: Satyasen Panda, Urmila Bhanja
Abstract:
In this paper, we have presented and analyzed three-dimensional (3-D) matrices of wavelength/time/space code for optical code division multiple access (OCDMA) networks with NAND subtraction detection technique. The 3-D codes are constructed by integrating a two-dimensional modified quadratic congruence (MQC) code with one-dimensional modified prime (MP) code. The respective encoders and decoders were designed using fiber Bragg gratings and optical delay lines to minimize the bit error rate (BER). The performance analysis of the 3D-OCDMA system is based on measurement of signal to noise ratio (SNR), BER and eye diagram for a different number of simultaneous users. Also, in the analysis, various types of noises and multiple access interference (MAI) effects were considered. The results obtained with NAND detection technique were compared with those obtained with OR and AND subtraction techniques. The comparison results proved that the NAND detection technique with 3-D MQC\MP code can accommodate more number of simultaneous users for longer distances of fiber with minimum BER as compared to OR and AND subtraction techniques. The received optical power is also measured at various levels of BER to analyze the effect of attenuation.Keywords: Cross Correlation (CC), Three dimensional Optical Code Division Multiple Access (3-D OCDMA), Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA), Multiple Access Interference (MAI), Phase Induced Intensity Noise (PIIN), Three Dimensional Modified Quadratic Congruence/Modified Prime (3-D MQC/MP) code
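In SAC-OCDMA performance analyses of this kind, the bit error rate is commonly obtained from the signal-to-noise ratio through the Gaussian approximation BER = ½·erfc(√(SNR/8)). A minimal sketch of how BER versus the number of simultaneous users might be tabulated; the SNR model here is a placeholder, whereas in the paper the SNR combines shot, thermal and PIIN noise terms:

```python
import numpy as np
from scipy.special import erfc

def ber_from_snr(snr):
    """Gaussian approximation widely used in SAC-OCDMA performance analysis."""
    return 0.5 * erfc(np.sqrt(snr / 8.0))

def snr_model(n_users):
    """Placeholder scaling for illustration only; not the paper's noise model."""
    return 1.0e4 / n_users

for k in (5, 10, 20, 40):
    print(k, "users -> BER =", f"{ber_from_snr(snr_model(k)):.2e}")
```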
Procedia PDF Downloads 412
5364 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding as Industry 4.0 is originally a German strategy with supporting strong policy instruments being utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing Institution of the research papers with three papers published. The literature suggested future research directions and highlighted one specific gap in the area. There exists an unresolved gap between the data science experts and the manufacturing process experts in the industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 of the papers contributed a framework, another 12 of the papers were based on a case study, and 11 of the papers focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.Keywords: analytics, digitization, industry 4.0, manufacturing
Procedia PDF Downloads 111
5363 Embedded System of Signal Processing on FPGA: Underwater Application Architecture
Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad
Abstract:
The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (Fast Fourier Transform (FFT), inverse Fast Fourier Transform (iFFT) and Bessel functions) is widely applied to obtain information with high precision and accuracy. Signal processing is most commonly implemented on general-purpose processors, but general-purpose processors are not efficient for signal processing. Our interest was therefore focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to reduce the computational burden of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latency of the Odroid XU4 and the FPGA is 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than with the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in terms of real-time operation and energy efficiency. Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing
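The processing chain described above (FFT of the backscattered echo, spectral manipulation, inverse FFT) can be prototyped on a host computer before being ported to the FPGA. A minimal NumPy sketch of that chain, with a synthetic echo and an assumed band of interest standing in for the measured signal and the form-function step:

```python
import numpy as np

fs = 5.0e6                                     # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)
# Synthetic backscattered echo: Gaussian burst at 0.5 ms, 400 kHz carrier.
echo = np.exp(-((t - 5e-4) ** 2) / (2 * (2e-5) ** 2)) * np.cos(2 * np.pi * 4.0e5 * t)

spectrum = np.fft.rfft(echo)                   # forward FFT (done in hardware on the FPGA)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 1e5) & (freqs < 8e5)
spectrum[~band] = 0                            # keep only the band of interest (illustrative)
processed = np.fft.irfft(spectrum, n=t.size)   # inverse FFT back to the time domain

print("peak of processed echo:", processed.max())
```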
Procedia PDF Downloads 79
5362 Diet and Exercise Intervention and Bio–Atherogenic Markers for Obesity Classes of Black South Africans with Type 2 Diabetes Mellitus Using Discriminant Analysis
Authors: Oladele V. Adeniyi, B. Longo-Mbenza, Daniel T. Goon
Abstract:
Background: Lipids are often low or in the normal ranges and controversial in the atherogenesis among Black Africans. The effect of the severity of obesity on some traditional and novel cardiovascular disease risk factors is unclear before and after a diet and exercise maintenance programme among obese black South Africans with type 2 diabetes mellitus (T2DM). Therefore, this study aimed to identify the risk factors to discriminate obesity classes among patients with T2DM before and after a diet and exercise programme. Methods: This interventional cohort of Black South Africans with T2DM was followed by a very – low calorie diet and exercise programme in Mthatha, between August and November 2013. Gender, age, and the levels of body mass index (BMI), blood pressure, monthly income, daily frequency of meals, blood random plasma glucose (RPG), serum creatinine, total cholesterol (TC), triglycerides (TG), LDL –C, HDL – C, Non-HDL, ratios of TC/HDL, TG/HDL, and LDL/HDL were recorded. Univariate analysis (ANOVA) and multivariate discriminant analysis were performed to separate obesity classes: normal weight (BMI = 18.5 – 24.9 kg/m2), overweight (BMI = 25 – 29.9 kg/m2), obesity Class 1 (BMI = 30 – 34.9 kg/m2), obesity Class 2 (BMI = 35 – 39.9 kg/m2), and obesity Class 3 (BMI ≥ 40 kg/m2). Results: At the baseline (1st Month September), all 327 patients were overweight/obese: 19.6% overweight, 42.8% obese class 1, 22.3% obese class 2, and 15.3% obese class 3. In discriminant analysis, only systolic blood pressure (SBP with positive association) and LDL/HDL ratio (negative association) significantly separated increasing obesity classes. At the post – evaluation (3rd Month November), out of all 327 patients, 19.9%, 19.3%, 37.6%, 15%, and 8.3% had normal weight, overweight, obesity class 1, obesity class 2, and obesity class 3, respectively. There was a significant negative association between serum creatinine and increase in BMI. In discriminant analysis, only age (positive association), SBP (U – shaped relationship), monthly income (inverted U – shaped association), daily frequency of meals (positive association), and LDL/HDL ratio (positive association) classified significantly increasing obesity classes. Conclusion: There is an epidemic of diabesity (Obesity + T2DM) in this Black South Africans with some weight loss. Further studies are needed to understand positive or negative linear correlations and paradoxical curvilinear correlations between these markers and increase in BMI among black South African T2DM patients.Keywords: atherogenic dyslipidaemia, dietary interventions, obesity, south africans
Procedia PDF Downloads 367
5361 Modeling of Landslide-Generated Tsunamis in Georgia Strait, Southern British Columbia
Authors: Fatemeh Nemati, Lucinda Leonard, Gwyn Lintern, Richard Thomson
Abstract:
In this study, we will use modern numerical modeling approaches to estimate tsunami risks to the southern coast of British Columbia from landslides. Wave generation is to be simulated using the NHWAVE model, which solves the Navier-Stokes equations due to the more complex behavior of flow near the landslide source; far-field wave propagation will be simulated using the simpler model FUNWAVE_TVD with high-order Boussinesq-type wave equations, with a focus on the accurate simulation of wave propagation and regional- or coastal-scale inundation predictions.Keywords: FUNWAVE-TVD, landslide-generated tsunami, NHWAVE, tsunami risk
Procedia PDF Downloads 155
5360 CPPI Method with Conditional Floor: The Discrete Time Case
Authors: Hachmi Ben Ameur, Jean Luc Prigent
Abstract:
We propose an extension of the CPPI method based on conditional floors. In this framework, we examine in particular the TIPP and margin-based strategies. These methods allow the investor to keep part of the past gains and to protect the portfolio value against future large drawdowns of the financial market. However, as with the standard CPPI method, the investor can still benefit from potential market rises. To control the risk of such strategies, we introduce both the Value-at-Risk (VaR) and Expected Shortfall (ES) risk measures. For each of these criteria, we show that the conditional floor must be higher than a lower bound. We illustrate these results for a quite general ARCH-type model, including the EGARCH(1,1) as a special case. Keywords: CPPI, conditional floor, ARCH, VaR, expected shortfall
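A TIPP-style conditional floor "ratchets" upward with the portfolio value, so part of past gains is locked in, while the risky exposure stays proportional to the cushion. A minimal discrete-time sketch of that mechanism under an assumed return path (illustrative parameters only, not the ARCH model or the VaR/ES bounds of the paper):

```python
def tipp_path(returns, v0=100.0, floor_ratio=0.8, multiplier=4.0, riskless=0.0005):
    """Discrete-time CPPI with a conditional (ratcheting) floor."""
    value, floor = v0, floor_ratio * v0
    for r in returns:
        floor = max(floor, floor_ratio * value)        # conditional floor keeps part of past gains
        cushion = max(value - floor, 0.0)
        exposure = min(multiplier * cushion, value)    # amount invested in the risky asset
        value = exposure * (1 + r) + (value - exposure) * (1 + riskless)
    return value, floor

returns = [0.01, 0.02, -0.03, 0.015, -0.06, 0.01]      # illustrative risky-asset returns
print(tipp_path(returns))
```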
Procedia PDF Downloads 305
5359 Using Locus Equations for Berber Consonants Labiovellarization
Authors: Ali Benali Djouher Leila
Abstract:
Labiovelarization of velar consonants and labials is a very widespread phenomenon. It is attested in all the major northern Berber dialects; only Tuareg lacks it entirely. But even within the large Berber-speaking regions of the north it is very unstable: it may be completely absent in certain dialects (such as the Bougie region in Kabylie), and its extension and frequency can vary appreciably between the dialects which have it. Some dialects of Great Kabylia or of the Chleuh domain, for example, labiovelarize more than others from the same region. Thus, in Great Kabylia, the adjective "large" will be pronounced amqqwran by the At Yiraten and amqqran by the At Yanni, a few kilometers away. One of the problems these segments raise is deciding whether each is one phoneme or two. All the criteria used by linguists in this kind of case lead to the conclusion that they are single phonemes (a phoneme and not a succession of two phonemes, /k + w/, for example). The phonetic and phonological criteria are moreover clearly confirmed by the morphological data, since, in the system of verbal alternations, these complex segments are treated as single phonemes: agem, "to draw, to fetch water," and akwer, "to steal," have exactly the same morphology as asem, "to be jealous," arem, "to taste," ames, "to be dirty," or afeg, "to fly" — verbs with two radical consonants (type aCC). At the level of notation, both scientific and everyday, it is therefore necessary to represent the labiovelarized consonants by a single letter, possibly accompanied by a diacritic. In fact, actual practices are diverse. The scientific type of representation does not seem adequate for everyday use, because it can easily be produced only on a microcomputer. The Berber Documentation File used a small ° (as in n°) above the writing line: k°, g°..., which has the advantage of being easy to achieve since it is part of the general typographical conventions of Latin script and is present on a typewriter keyboard. Mouloud Mammeri, then the Berber Study Group of Vincennes (Tisuraf review), and a majority of Kabyle practitioners over the last twenty years have used the succession "consonant + semi-vowel /w/" (Cw) on the same line of writing; for all the reasons explained previously, this practice is not a good solution and should be abandoned, especially as it sets Kabyle apart within the Berber ensemble. In this study, we were interested in two velar consonants, /g/ and /k/, and their labiovelarized counterparts, /gw/ and /kw/ (we adopted the addition of "w" to represent labiovelarization, for ease of writing in graphical mode). The aim is to characterize these four consonants in order to see whether they have different places of articulation and whether they are distinct (whether each velar is distinct from its labiovelarized counterpart). This characterization is done using locus equations. Keywords: berber consonants, labiovelarization, locus equations, acoustical characterization, kabylian dialect, algerian language
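A locus equation is a linear regression of the F2 frequency at the consonant-vowel transition onset against the F2 frequency at the vowel midpoint, F2_onset = slope * F2_vowel + intercept; the slope and intercept characterize the consonant's place of articulation, so each of the four consonants gets its own regression line. A minimal sketch with made-up formant values (the measurements shown are hypothetical, not the study's data):

```python
import numpy as np

def locus_equation(f2_onset_hz, f2_vowel_hz):
    """Fit F2_onset = slope * F2_vowel + intercept for one consonant context."""
    slope, intercept = np.polyfit(f2_vowel_hz, f2_onset_hz, deg=1)
    return slope, intercept

# Hypothetical measurements for /k/ and its labiovelarized counterpart /kw/
# across five vowel contexts.
f2_vowel    = np.array([2200, 1800, 1500, 1100, 900], dtype=float)
f2_onset_k  = np.array([2100, 1850, 1650, 1400, 1300], dtype=float)
f2_onset_kw = np.array([1900, 1700, 1500, 1250, 1150], dtype=float)

print("/k/  slope, intercept:", locus_equation(f2_onset_k, f2_vowel))
print("/kw/ slope, intercept:", locus_equation(f2_onset_kw, f2_vowel))
```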
Procedia PDF Downloads 76
5358 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada
Authors: Bilel Chalghaf, Mathieu Varin
Abstract:
Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR), have the potential to overcome the limitations of aerial imagery. To date, few studies have used that data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual high tree species ( > 17m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR
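The model comparison above (SVM, CART, RF, k-NN, and LDA, each tuned per approach and level) follows a standard train/tune/evaluate pattern. A minimal scikit-learn sketch of the global, all-species comparison, using placeholder object-level features and labels in place of the WorldView-3/LiDAR variables and field data (hyperparameters are illustrative, not the tuned values from the study):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.random((300, 16))                # placeholder: 16 selected spectral/LiDAR variables per crown
y = rng.integers(0, 11, size=300)        # placeholder labels for the 11 tree species

models = {
    "SVM": SVC(kernel="rbf", C=10.0),
    "CART": DecisionTreeClassifier(max_depth=8),
    "RF": RandomForestClassifier(n_estimators=200),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```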
Procedia PDF Downloads 134
5357 Characterization of a Lipolytic Enzyme of Pseudomonas nitroreducens Isolated from Mealworm's Gut
Authors: Jung-En Kuan, Whei-Fen Wu
Abstract:
In this study, a symbiotic bacterium from the yellow mealworm's (Tenebrio molitor) mid-gut was isolated, characterized by growth on minimal-tributyrin medium. After PCR amplification of its 16S rDNA, the resulting nucleotide sequences were analyzed with phylogenetic trees, and the isolate was accordingly designated Pseudomonas nitroreducens D-01. Next, by searching for lipolytic enzymes in its protein data bank, one potential lipolytic α/β hydrolase was identified, again using PCR amplification and nucleotide sequencing. To express this lipolytic gene from a plasmid, target-gene primers carrying the C-terminal his-tag sequences were designed. Using the vector pET21a, the recombinant lipolytic hydrolase D gene with his-tag nucleotides was successfully cloned, placing the lipolytic D gene under the control of the T7 promoter. After transformation of the resulting plasmid into Escherichia coli BL21 (DE3), the IPTG inducer was used for induction of the recombinant protein. The protein products were then purified on a metal-ion affinity column, and the purified protein was found capable of forming a clear zone on a tributyrin agar plate. Briefly, its enzyme activities were determined by degradation of p-nitrophenyl esters, and the yellow end-product, p-nitrophenol, was measured at OD 405 nm. Specifically, this lipolytic enzyme efficiently targets p-nitrophenyl butyrate. It is most active at 40°C and pH 8 in potassium phosphate buffer. In thermal stability assays, the activity drops dramatically when the temperature rises above 50°C. In metal ion assays, MgCl₂ and NH₄Cl enhance the enzyme activity, while MnSO₄, NiSO₄, CaCl₂, ZnSO₄, CoCl₂, CuSO₄, FeSO₄, and FeCl₃ reduce it; NaCl has no effect. Most organic solvents, such as hexane, methanol, ethanol, acetone, isopropanol, chloroform, and ethyl acetate, decrease the activity, whereas DMSO increases it. All the surfactants tested (Triton X-100, Tween 80, Tween 20, and Brij35) decrease the lipolytic activity. Using the Lineweaver-Burk double-reciprocal method, the enzyme kinetic parameters were determined as Km = 0.488 mM, Vmax = 0.0644 mM/min, and kcat = 3.01x10³ s⁻¹, with an overall catalytic efficiency kcat/Km of 6.17x10³ mM⁻¹ s⁻¹. Finally, based on the phylogenetic analysis of its homologous conserved region, this lipolytic protein is classified as a type IV lipase. Keywords: enzyme, esterase, lipolytic hydrolase, type IV
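The kinetic constants quoted above come from a Lineweaver-Burk (double-reciprocal) fit, 1/v = (Km/Vmax)·(1/[S]) + 1/Vmax, with kcat = Vmax/[E]. A minimal sketch of that calculation using made-up rate data and an assumed enzyme concentration (not the study's measurements):

```python
import numpy as np

s = np.array([0.1, 0.2, 0.5, 1.0, 2.0])            # substrate concentration, mM (assumed)
v = np.array([0.011, 0.019, 0.033, 0.042, 0.052])  # initial rate, mM/min (assumed)

slope, intercept = np.polyfit(1 / s, 1 / v, deg=1)  # linear fit of 1/v against 1/[S]
Vmax = 1 / intercept                                 # mM/min
Km = slope * Vmax                                    # mM

enzyme_conc_mM = 2.0e-5                              # assumed total enzyme concentration
kcat = (Vmax / 60.0) / enzyme_conc_mM                # turnover number, s^-1
print(f"Km = {Km:.3f} mM, Vmax = {Vmax:.4f} mM/min, "
      f"kcat = {kcat:.1f} s^-1, kcat/Km = {kcat / Km:.1f} mM^-1 s^-1")
```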
Procedia PDF Downloads 133
5356 Prevalence of Near Visual Impairment and Associated Factors among School Teachers in Gondar City, North West Ethiopia, 2022
Authors: Bersufekad Wubie
Abstract:
Introduction: Near visual impairment is defined as presenting near visual acuity worse than N6 at a distance of 40 cm. Teachers' regular duties, such as reading books, writing on the blackboard, and recognizing students' faces, require good near vision; if a teacher has near visual impairment, their work output suffers. Objective: The study aimed to assess the prevalence of near vision impairment and its associated factors among school teachers in Gondar city, Northwest Ethiopia, August 2022. Methods: An institution-based cross-sectional study design with a multistage sampling technique was used to select 567 teachers in Gondar city schools. The study was conducted in selected schools from May 1 to May 30, 2022. Trained data collectors used well-structured Amharic- and English-language questionnaires and ophthalmic instruments for the examination. The collected data were checked for completeness, entered into Epi Data version 4.6, and exported to SPSS version 26 for further analysis. Binary and multivariate logistic regression models were fitted to identify factors associated with the outcome variable. Result: The prevalence of near visual impairment was 64.6% (confidence interval: 60.3%–68.4%). Near visual impairment was significantly associated with age ≥ 35 years (AOR: 4.90, 95% CI: 3.15, 7.65), prolonged years of teaching experience (AOR: 3.29, 95% CI: 1.70, 4.62), a history of ocular surgery (AOR: 1.96, 95% CI: 1.10, 4.62), smoking (AOR: 2.21, 95% CI: 1.22, 4.07), a history of ocular trauma (AOR: 1.80, 95% CI: 1.11, 3.18), and uncorrected refractive error (AOR: 2.01, 95% CI: 1.13, 4.03). Conclusion and recommendations: This study showed that the prevalence of near vision impairment among school teachers was high and that it is not a problem of the presbyopic age group alone; it also occurs at younger ages. Teachers' ocular health should therefore be well accommodated in school eye health programmes. Keywords: Gondar, near visual impairment, school, teachers
Procedia PDF Downloads 138
5355 Investigating the Effects of Empowering the Employees in Managing Crimes by the Police
Authors: Akbar Salimi, Mehdi Moghimi
Abstract:
Goal: Human resource empowerment is a new strategy for achieving competitive advantage. The aim of the research is to understand crime management by the police using this strategy. Method: The research is applied in terms of its goal and is a survey-type study. The intended sample included all 52 police officers of one police station. The data were collected with a researcher-made four-choice questionnaire after its validity and reliability were confirmed. Findings: Taking the Melhem pattern as the framework, four dimensions of empowerment were identified, the triangle of crime was explained, and four corresponding hypotheses were formulated. Results: Given that the whole population was surveyed (a census), all four hypotheses were supported, using the averages of the data received and taking 50% as the criterion. Keywords: management, empowerment, employees, police
Procedia PDF Downloads 373