Search results for: standard procedures process
20099 National Assessment for Schools in Saudi Arabia: Score Reliability and Plausible Values
Authors: Dimiter M. Dimitrov, Abdullah Sadaawi
Abstract:
The National Assessment for Schools (NAFS) in Saudi Arabia consists of standardized tests in Mathematics, Reading, and Science for school grade levels 3, 6, and 9. One main goal is to classify students into four categories of NAFS performance (minimal, basic, proficient, and advanced) by schools and the entire national sample. The NAFS scoring and equating are performed on a bounded scale (D-scale: ranging from 0 to 1) in the framework of the recently developed “D-scoring method of measurement.” The specificity of the NAFS measurement framework and data complexity presented both challenges and opportunities to (a) the estimation of score reliability for schools, (b) setting cut-scores for the classification of students into categories of performance, and (c) generating plausible values for distributions of student performance on the D-scale. The estimation of score reliability at the school level was performed in the framework of generalizability theory (GT), with students “nested” within schools and test items “nested” within test forms. The GT design was executed via multilevel modeling syntax code in R. Cut-scores (on the D-scale) for the classification of students into performance categories were derived via a recently developed standard-setting method referred to as the “Response Vector for Mastery” (RVM) method. For each school, the classification of students into categories of NAFS performance was based on distributions of plausible values for the students’ scores on NAFS tests by grade level (3, 6, and 9) and subject (Mathematics, Reading, and Science). Plausible values (on the D-scale) for each individual student were generated via random selection from a logit-normal distribution with parameters derived from the student’s D-score and its conditional standard error, SE(D). All procedures related to D-scoring, equating, generating plausible values, and classification of students into performance levels were executed via a computer program in R developed for the purpose of NAFS data analysis.
Keywords: large-scale assessment, reliability, generalizability theory, plausible values
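As a minimal illustration of the plausible-value step described above, the sketch below draws values on the bounded D-scale from a logit-normal distribution whose parameters come from a student's D-score and SE(D). The study's software is written in R; this sketch is in Python, and the delta-method mapping of SE(D) to the logit scale is an assumption rather than the documented NAFS procedure.

```python
import numpy as np

def plausible_values(d_score, se_d, n_draws=5, rng=None):
    """Draw plausible values on the bounded (0, 1) D-scale.

    Assumes a logit-normal model: the D-score is mapped to the logit scale,
    its conditional SE is propagated with a delta-method approximation, and
    normal draws are mapped back with the logistic function. The exact NAFS
    parameterisation is not given in the abstract, so this is illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    mu = np.log(d_score / (1.0 - d_score))        # logit of the D-score
    sigma = se_d / (d_score * (1.0 - d_score))    # delta-method SE on the logit scale
    z = rng.normal(mu, sigma, size=n_draws)       # normal draws on the logit scale
    return 1.0 / (1.0 + np.exp(-z))               # back to the bounded D-scale

# Example: a student with D = 0.62 and SE(D) = 0.05
print(plausible_values(0.62, 0.05, n_draws=5, rng=np.random.default_rng(1)))
```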
Procedia PDF Downloads 182
20098 Chromatography Study of Fundamental Properties of Medical Radioisotope Astatine-211
Authors: Evgeny E. Tereshatov
Abstract:
Astatine-211 is considered one of the most promising radionuclides for Targeted Alpha Therapy. In order to develop reliable procedures to label biomolecules and utilize efficient delivery vehicle principles, one should understand the main chemical characteristics of astatine. The short half-life of 211At (~7.2 h) and absence of any stable isotopes of this element are limiting factors towards studying the behavior of astatine. Our team has developed a procedure for rapid and efficient isolation of astatine from irradiated bismuth material in nitric acid media based on 3-octanone and 1-octanol extraction chromatography resins. This process has been automated and it takes 20 min from the beginning of the target dissolution to the At-211 fraction elution. Our next step is to consider commercially available chromatography resins and their applicability in astatine purification in the same media. Results obtained along with the corresponding sorption mechanisms will be discussed.
Keywords: astatine-211, chromatography, automation, mechanism, radiopharmaceuticals
Procedia PDF Downloads 92
20097 Culturing of Bovine Pre-Compacted Morulae in TCM-199 and BAF in a Standard 5% CO2 Laboratory Incubator and in the Vagina of a Goat Doe
Authors: Daniel M. Barry
Abstract:
Since more than half a century ago, attempts have been made to culture cells and embryos outside the body (in vitro or ex vivo). This was done with different culture media and in various “incubators”. In the present study two different culture media were used: a standard TCM-199 culture medium and first trimester amniotic fluid (BAF) collected sterilely from pregnant cows after slaughter. Two different culture conditions were also investigated, the standard laboratory CO2 incubator versus culturing bovine embryos in the vagina of a goat doe. Two experiments were done: Firstly the permeability of different receptacles to CO2 gas was analyzed for possible culture in the vagina. Four-well plates and straws were used to incubate TCM-199 and BAF for a period of 120 h in the presence or absence of 5% CO2 gas. The pH values were measured and recorded every 24 h. In the second experiment pre-compacted morula stage bovine embryos were cultured in the above culture media in sealed 0.25 mL straws in a standard laboratory incubator and in the vagina of a goat doe. Evaluation was done on (1) stage of development and (2) number of blastomeres after 96 h of culture. In the first experiment it was shown that the CO2 gas diffused out of the 4-well plate as well as through the wall of the straws in the absence of CO2 gas, while in the presence of CO2 the pH of both media stabilized between 7.3 and 7.5. This meant that the semen straws were permeable to CO2 gas and could therefore be used as receptacles for culturing early stage bovine embryos. In the second experiment no statistical differences (p>0.05) were found in the number of pre-compacted bovine embryos that developed to the blastocyst stage, or the hatched blastocyst stage, neither for the culture medium used, or the method of culturing in the two incubators. Neither was there any difference (p>0.05) in the number of blastomeres that developed at the blastocyst stage between the two types of incubators. The bovine embryos tended to develop more blastomeres when cultured in BAF than when cultured in TCM-199 in both the standard laboratory incubator and when using the vagina of a goat doe as an incubator.Keywords: alternative culture, bovine embryos, vagina, bovine amniotic fluid, incubator
Procedia PDF Downloads 490
20096 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques
Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad
Abstract:
In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01.
Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet
Procedia PDF Downloads 137
20095 Optimization of Electrocoagulation Process Using Duelist Algorithm
Authors: Totok R. Biyanto, Arif T. Mardianto, M. Farid R. R., Luthfi Machmudi, Kandi Mulakasti
Abstract:
The main objective of this research is to optimize the electrocoagulation process design as a post-treatment for biologically treated vinasse effluent. A first-principles model was used, and the three independent variables that affect the energy consumption of the electrocoagulation process, i.e., current density, electrode distance, and treatment time, were chosen as the optimized variables. The process conditions were a vinasse pH, electrical conductivity, and temperature of about 6.5, 28.5 mS/cm, and 52 °C, respectively. Aluminum was chosen as the electrode material for the electrocoagulation process. The Duelist algorithm was used as the optimization technique due to its capability to reach a global optimum. The optimization results show that the optimal process is reached at a current density of 2.9976 A/m2, an electrode distance of 1.5 cm, and an electrolysis time of 119 min. The optimized energy consumption during the process is 34.02 Wh.
Keywords: optimization, vinasse effluent, electrocoagulation, energy consumption
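To make the setup concrete, the sketch below frames the three decision variables and their bounds as a global-optimisation problem in Python. The abstract does not give the first-principles energy model or the Duelist Algorithm implementation, so the objective below is a placeholder built from the ohmic cell voltage, the minimum charge-dose constraint is hypothetical, and SciPy's differential evolution stands in for the Duelist Algorithm.

```python
import numpy as np
from scipy.optimize import differential_evolution

KAPPA = 28.5 * 0.1     # vinasse conductivity: 28.5 mS/cm converted to S/m
MIN_CHARGE = 1.5e4     # hypothetical required charge dose per m^2 of electrode [C/m^2]

def energy_objective(x):
    """Placeholder energy model: ohmic cell voltage U = j*d/kappa, energy E = U*j*t
    per m^2 of electrode, plus a penalty if the charge dose j*t is insufficient.
    The study's first-principles model is not reproduced here."""
    j, d, t = x                                   # A/m^2, m, s
    energy = (j * d / KAPPA) * j * t / 3600.0     # Wh per m^2 of electrode
    shortfall = max(0.0, MIN_CHARGE - j * t)      # hypothetical treatment requirement
    return energy + 1e-3 * shortfall              # penalised objective

bounds = [(0.5, 10.0),           # current density, A/m^2
          (0.005, 0.03),         # electrode distance, m
          (10 * 60, 180 * 60)]   # treatment time, s

# Generic global optimiser used here only as a stand-in for the Duelist Algorithm.
res = differential_evolution(energy_objective, bounds, seed=0)
j_opt, d_opt, t_opt = res.x
print(f"j = {j_opt:.3f} A/m2, d = {d_opt * 100:.2f} cm, t = {t_opt / 60:.0f} min, "
      f"objective = {res.fun:.3f}")
```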
Procedia PDF Downloads 469
20094 The Effect of Transparent Oil Wood Stain on the Colour Stability of Spruce Wood during Weathering
Authors: Eliska Oberhofnerova, Milos Panek, Stepan Hysek, Martin Lexa
Abstract:
Nowadays, the use of wood, both indoors and outdoors, is constantly increasing. However, wood is a natural organic material and, outdoors, is subjected to a degradation process caused by abiotic factors (solar radiation, rain, moisture, wind, dust, etc.). This process affects only the surface layers of wood, but neglecting the basic rules of wood protection increases the possibility of attack by biological agents and thereby impairs the function of the wooden element. Wood degradation can be slowed by proper surface treatment, especially in the case of less naturally durable wood species such as spruce. Modern coating systems are subject to many requirements, such as colour stability, hydrophobicity, low volatile organic compound (VOC) content, long service life, and easy maintenance. The aim of this study is to evaluate the colour stability of spruce wood (Picea abies), as the basic parameter indicating coating durability, treated with two layers of transparent natural oil wood stain and exposed to outdoor conditions. The test specimens were exposed to natural weathering for 2 years and to artificial weathering in a UV chamber for 2000 hours. The colour parameters were measured before and during exposure with a spectrophotometer in the CIELab colour space. Untreated and treated wood, and the two testing procedures, were compared. The results showed a significant effect of the coating on the colour stability of wood, as expected. Nevertheless, the increasing colour changes observed during exposure differed according to the applied testing procedure, natural or artificial.
Keywords: colour stability, natural and artificial weathering, spruce wood, transparent coating
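Colour change in the CIELab space is typically summarised as a single ΔE value between two readings; a minimal sketch of that calculation is given below. The abstract does not state which ΔE formula was used, so the common CIE76 form is assumed, and the example readings are hypothetical.

```python
import numpy as np

def delta_e_cie76(lab_ref, lab_t):
    """Total colour difference Delta-E*ab (CIE76) between two CIELab readings."""
    lab_ref, lab_t = np.asarray(lab_ref, float), np.asarray(lab_t, float)
    return float(np.linalg.norm(lab_t - lab_ref))

# Hypothetical readings for an untreated spruce sample: before exposure and
# after some weathering interval, given as (L*, a*, b*)
before = (82.1, 4.3, 21.5)
after = (74.8, 7.9, 28.2)
print(f"Delta-E = {delta_e_cie76(before, after):.1f}")
```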
Procedia PDF Downloads 220
20093 The Falling Point of Lubricant
Authors: Arafat Husain
Abstract:
Lubricants are among the most widely used resources in today’s world, and many major economies depend on them to function. To verify that lubricants are not adulterated, efficient methods are needed to detect which fluid has been added to a lubricant. To observe such malpractice, we develop the following method. An elastic ball is thrown at a fixed force onto a probability circle submerged in the lubricant, and the pitching distance and the point of fall are recorded. The ratio of the falling distance to the pitching distance is then calculated: if the measured ratio is greater than one, the fluid is less viscous, and if the ratio is smaller, the lubricant is more viscous. The falling point of the pure lubricant is first determined at the fixed force, since every pure lubricant has a characteristic falling point. The lubricant is then adulterated and the falling point noted: if the falling point is less than the standard value, the adulterant is a solid, while for a liquid adulterant the falling point will be greater than the standard value. Hence, comparison with the standard falling point indicates the efficiency of the lubricant.
Keywords: falling point of lubricant, falling point ratios, probability circle, octane number
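A small sketch of the decision rule described above follows; the distances, falling points, and tolerance are illustrative values, not data from the study.

```python
def assess_lubricant(pitch_distance, fall_distance, fall_point, standard_fall_point,
                     tolerance=0.05):
    """Classify a lubricant sample with the falling-point test described above.

    Distances share one unit; the 5% tolerance band around the standard
    falling point is an assumption for illustration.
    """
    ratio = fall_distance / pitch_distance
    viscosity = "less viscous" if ratio > 1.0 else "more viscous"
    if fall_point < standard_fall_point * (1.0 - tolerance):
        verdict = "solid adulterant suspected"
    elif fall_point > standard_fall_point * (1.0 + tolerance):
        verdict = "liquid adulterant suspected"
    else:
        verdict = "within standard falling point"
    return ratio, viscosity, verdict

print(assess_lubricant(pitch_distance=12.0, fall_distance=14.5,
                       fall_point=9.0, standard_fall_point=10.0))
```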
Procedia PDF Downloads 495
20092 The Contrastive Survey of Phonetic Structure in Two Iranian Dialects
Authors: Iran Kalbasi, Foroozandeh Zardashti
Abstract:
Dialectology is a branch of sociolinguistics that studies systematic language variation. Dialects are branches of a single language that differ from each other structurally, morphologically, and phonetically. In Iran, these dialects and language varieties carry a great deal of cultural load, and studying them has linguistic and cultural importance. In this study, the phonetic structure of two Iranian dialects, Bakhtiyari Lori of Masjedsoleyman and Shushtari, both in Khuzestan Province of Iran, has been surveyed. The statistical population includes twenty speakers of the two dialects. The theoretical basis of this research is structuralism. The data were collected through interviews based on a questionnaire consisting of 3000 words, 410 sentences, and 110 simple and complex verbs. The data are analysed and described synchronically, and the phonetic characteristics of the two dialects and standard Persian are then compared. At the phonetic level, there are clear differences between these two dialects and standard Persian.
Keywords: standard language, dialectology, Bakhtiyari Lori dialect of Masjedsoleyman, Shushtari dialect, vowel, consonant
Procedia PDF Downloads 593
20091 Attention and Memory in the Music Learning Process in Individuals with Visual Impairments
Authors: Lana Burmistrova
Abstract:
Introduction: The influence of visual impairments on several cognitive processes used in the music learning process is an increasingly important area in special education and cognitive musicology. Many children have several visual impairments due to the refractive errors and irreversible inhibitors. However, based on the compensatory neuroplasticity and functional reorganization, congenitally blind (CB) and early blind (EB) individuals use several areas of the occipital lobe to perceive and process auditory and tactile information. CB individuals have greater memory capacity, memory reliability, and less false memory mechanisms are used while executing several tasks, they have better working memory (WM) and short-term memory (STM). Blind individuals use several strategies while executing tactile and working memory n-back tasks: verbalization strategy (mental recall), tactile strategy (tactile recall) and combined strategies. Methods and design: The aim of the pilot study was to substantiate similar tendencies while executing attention, memory and combined auditory tasks in blind and sighted individuals constructed for this study, and to investigate attention, memory and combined mechanisms used in the music learning process. For this study eight (n=8) blind and eight (n=8) sighted individuals aged 13-20 were chosen. All respondents had more than five years music performance and music learning experience. In the attention task, all respondents had to identify pitch changes in tonal and randomized melodic pairs. The memory task was based on the mismatch negativity (MMN) proportion theory: 80 percent standard (not changed) and 20 percent deviant (changed) stimuli (sequences). Every sequence was named (na-na, ra-ra, za-za) and several items (pencil, spoon, tealight) were assigned for each sequence. Respondents had to recall the sequences, to associate them with the item and to detect possible changes. While executing the combined task, all respondents had to focus attention on the pitch changes and had to detect and describe these during the recall. Results and conclusion: The results support specific features in CB and EB, and similarities between late blind (LB) and sighted individuals. While executing attention and memory tasks, it was possible to observe the tendency in CB and EB by using more precise execution tactics and usage of more advanced periodic memory, while focusing on auditory and tactile stimuli. While executing memory and combined tasks, CB and EB individuals used passive working memory to recall standard sequences, active working memory to recall deviant sequences and combined strategies. Based on the observation results, assessment of blind respondents and recording specifics, following attention and memory correlations were identified: reflective attention and STM, reflective attention and periodic memory, auditory attention and WM, tactile attention and WM, auditory tactile attention and STM. The results and the summary of findings highlight the attention and memory features used in the music learning process in the context of blindness, and the tendency of the several attention and memory types correlated based on the task, strategy and individual features.Keywords: attention, blindness, memory, music learning, strategy
Procedia PDF Downloads 184
20090 The Effect of an e-Learning Program of Basic Cardiopulmonary Resuscitation for Students of an Emergency Medical Technician Program
Authors: Itsaree Padphai, Jiranan Pakpeian, Suksun Niponchai
Abstract:
This is a descriptive study which aims to: 1) compare knowledge before and after using the e-Learning program entitled “Basic Cardiopulmonary Resuscitation for Students in an Emergency Medical Technician Diploma Program”, and 2) assess the students’ satisfaction after using the program. The research concerns teaching and learning management supplemented with an e-Learning system; the purposively selected sample comprised 44 first-year, class-16 students of an emergency medical technician diploma program attending class in the second semester of academic year 2012 at Sirindhorn College of Public Health, Khon Kaen province. The research tools included 1) a questionnaire on the general information of the respondents, 2) knowledge tests administered before and after using the e-Learning program, and 3) an assessment of satisfaction with the e-Learning program. The statistics used in the data analysis were percentage, mean, standard deviation, and the paired t-test as inferential statistics. 1. Most respondents were female (37 students, representing 84.09 percent). The average age was 19.5 years (standard deviation 0.81), with a maximum of 21 years and a minimum of 19 years. Thirty-five students (79.95 percent) stated that they preferred teaching and learning with the e-Learning system. 2. A comparison of knowledge before and after using the e-Learning program showed that the mean score before was 6.64 (standard deviation 1.94) and after was 18.84 (standard deviation 1.03); the post-test knowledge was significantly higher than the pre-test knowledge (P value < 0.001). 3. Students’ satisfaction after using the e-Learning program was at a very good level, with a mean of 4.93 (standard deviation 0.11).
Keywords: e-Learning, cardiopulmonary resuscitation, diploma program, Khon Kaen Province
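The pre/post comparison above relies on a paired t-test; the sketch below shows how such a test could be run in Python. The individual scores are not reported in the abstract, so the data here are synthetic draws consistent only with the reported means and standard deviations, and are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pre/post knowledge scores for 44 students (the real data are
# summarised in the abstract only by means and standard deviations).
pre = np.clip(rng.normal(6.64, 1.94, 44).round(), 0, 20)
post = np.clip(rng.normal(18.84, 1.03, 44).round(), 0, 20)

t_stat, p_value = stats.ttest_rel(post, pre)   # paired t-test, as in the study
print(f"mean pre = {pre.mean():.2f}, mean post = {post.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3g}")
```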
Procedia PDF Downloads 399
20089 Experimental Investigation of Absorbent Regeneration Techniques to Lower the Cost of Combined CO₂ and SO₂ Capture Process
Authors: Bharti Garg, Ashleigh Cousins, Pauline Pearson, Vincent Verheyen, Paul Feron
Abstract:
The presence of SO₂ in power plant flue gases makes flue gas desulfurization (FGD) an essential requirement prior to post combustion CO₂ (PCC) removal facilities. Although most of the power plants worldwide deploy FGD in order to comply with environmental regulations, generally the achieved SO₂ levels are not sufficiently low for the flue gases to enter the PCC unit. The SO₂ level in the flue gases needs to be less than 10 ppm to effectively operate the PCC installation. The existing FGD units alone cannot bring down the SO₂ levels to or below 10 ppm as required for CO₂ capture. It might require an additional scrubber along with the existing FGD unit to bring the SO₂ to the desired levels. The absence of FGD units in Australian power plants brings an additional challenge. SO₂ concentrations in Australian power station flue gas emissions are in the range of 100-600 ppm. This imposes a serious barrier on the implementation of standard PCC technologies in Australia. CSIRO’s developed CS-Cap process is a unique solution to capture SO₂ and CO₂ in a single column with single absorbent which can potentially bring cost-effectiveness to the commercial deployment of carbon capture in Australia, by removing the need for FGD. Estimated savings of removing SO₂ through a similar process as CS-Cap is around 200 MMUSD for a 500 MW Australian power plant. Pilot plant trials conducted to generate the proof of concept resulted in 100% removal of SO₂ from flue gas without utilising standard limestone-based FGD. In this work, removal of absorbed sulfur from aqueous amine absorbents generated in the pilot plant trials has been investigated by reactive crystallisation and thermal reclamation. More than 95% of the aqueous amines can be reclaimed back from the sulfur loaded absorbent via reactive crystallisation. However, the recovery of amines through thermal reclamation is limited and depends on the sulfur loading on the spent absorbent. The initial experimental work revealed that reactive crystallisation is a better fit for CS-Cap’s sulfur-rich absorbent especially when it is also capable of generating K₂SO₄ crystals of highly saleable quality ~ 99%. Initial cost estimation carried on both the technologies resulted in almost similar capital expenditure; however, the operating cost is considerably higher in thermal reclaimer than that in crystalliser. The experimental data generated in the laboratory from both the regeneration techniques have been used to generate the simulation model in Aspen Plus. The simulation model illustrates the economic benefits which could be gained by removing flue gas desulfurization prior to standard PCC unit and replacing it with a CS-Cap absorber column co-capturing CO₂ and SO₂, and it's absorbent regeneration system which would be either reactive crystallisation or thermal reclamation.Keywords: combined capture, cost analysis, crystallisation, CS-Cap, flue gas desulfurisation, regeneration, sulfur, thermal reclamation
Procedia PDF Downloads 127
20088 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics
Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir
Abstract:
Due to its favourable material characteristics, fiber reinforced plastics are amongst the main topics of all actual lightweight construction megatrends. Especially in transportation trends ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners) the expected economic and environmental impact is huge. In naval transportation components like yacht bodies, antenna masts, decorative structures like deck lamps, light houses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide gas lasers (CO₂), frequency tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each used material for a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, work piece feed velocity, intensity, irradiation time etc. The results are compared with results obtained with well-known material processing lasers, such as a Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂-laser, the Tm-laser offers essential advantages for future laser processes like cutting, welding, ablating for repair and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber which enables hand guided processing, eye safety which results from the wavelength, excellent beam quality and brilliance due to the fiber nature. There is one more feature that is economically absolutely important for boat, automotive and military projects manufacturing that the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables selective removal of it for repair procedures.Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone
Procedia PDF Downloads 193
20087 Life Cycle Assessment of Mass Timber Structure, Construction Process as System Boundary
Authors: Mahboobeh Hemmati, Tahar Messadi, Hongmei Gu
Abstract:
Today, life cycle assessment (LCA) is a leading method for mitigating the environmental impacts of the building sector. In this paper, LCA is used to quantify the greenhouse gas (GHG) emissions during the construction phase of the largest mass timber residential structure in the United States, Adohi Hall, a 200,000-square-foot, 708-bed complex located on the campus of the University of Arkansas. The energy used for building operation is the dominant source of emissions in the building industry. Lately, however, efforts to improve the emissions efficiency of building operation have been successful. As a result, attention has shifted to embodied carbon, which is becoming more noticeable in the building life cycle. Most studies to date have focused on the manufacturing stage, and only a few have addressed the construction process; in particular, little data is available on the environmental impacts associated with the construction of mass timber. This study therefore presents an assessment of the environmental impact of the construction processes of the newly built mass timber building mentioned above. The system boundary covers modules A4 and A5 of the building LCA standard EN 15978: module A4 includes material and equipment transportation, and module A5 covers the construction and installation process. The research proceeds in two stages: first, quantifying the materials and equipment deployed in the building, and second, determining the embodied carbon associated with transporting construction materials to the site and running the equipment that installs them. The Global Warming Potential (GWP) of the building is the primary metric considered in this research. The outcomes of this study bring to the fore a better understanding of emission hotspots during the construction process. Moreover, a comparative analysis of the mass timber construction process with that of a theoretically similar steel building will enable an effective assessment of the environmental efficiency of mass timber.
Keywords: construction process, GWP, LCA, mass timber
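Conceptually, the A4 and A5 contributions described above reduce to sums of activity quantities multiplied by emission factors. The short sketch below illustrates that bookkeeping; all quantities and factors are hypothetical placeholders, not the Adohi Hall inventory.

```python
# Hypothetical inventory for construction-stage emissions (modules A4 and A5).
transport = [  # (material, tonnes, km hauled, kg CO2e per tonne-km)
    ("mass timber panels", 4200, 900, 0.062),
    ("steel connections", 310, 400, 0.062),
]
equipment = [  # (machine, hours of operation, litres of diesel per hour)
    ("mobile crane", 1200, 18.0),
    ("telehandler", 900, 9.0),
]
DIESEL_KG_CO2E_PER_L = 2.68  # typical combustion factor, assumed

a4 = sum(t * km * ef for _, t, km, ef in transport)                   # module A4
a5 = sum(h * lph * DIESEL_KG_CO2E_PER_L for _, h, lph in equipment)   # module A5
print(f"A4 transport: {a4 / 1000:.1f} t CO2e, A5 installation: {a5 / 1000:.1f} t CO2e, "
      f"total: {(a4 + a5) / 1000:.1f} t CO2e")
```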
Procedia PDF Downloads 166
20086 A Generalized Weighted Loss for Support Vector Classification and Multilayer Perceptron
Authors: Filippo Portera
Abstract:
Standard algorithms usually employ a loss in which each error is simply the absolute difference between the true value and the prediction, in the case of a regression task. In the present work, we present several error weighting schemes that generalize this consolidated routine. We study both a binary classification model for Support Vector Classification and a regression network for the Multilayer Perceptron. The results prove that the error is never worse than with the standard procedure, and in several cases it is better.
Keywords: loss, binary classification, MLP, weights, regression
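To make the idea of error weighting concrete, the sketch below implements a weighted absolute-error loss that reduces to the standard mean absolute error when all weights are equal. The specific weighting schemes of the paper are not reproduced; weighting by target magnitude is only an illustration.

```python
import numpy as np

def weighted_l1_loss(y_true, y_pred, weights=None):
    """Weighted absolute-error loss.

    With weights = None this reduces to the plain mean absolute error;
    per-sample weights generalise it in the spirit of the error weighting
    schemes discussed above (the paper's exact schemes are not shown here).
    """
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = np.abs(y_true - y_pred)
    if weights is None:
        weights = np.ones_like(err)
    weights = np.asarray(weights, float)
    return float(np.sum(weights * err) / np.sum(weights))

y_true = np.array([1.0, 2.0, 3.0, 10.0])
y_pred = np.array([1.2, 1.7, 3.5, 7.0])
print(weighted_l1_loss(y_true, y_pred))                          # plain MAE
print(weighted_l1_loss(y_true, y_pred, weights=np.abs(y_true)))  # larger targets weigh more
```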
Procedia PDF Downloads 95
20085 Ways for University to Conduct Research Evaluation: Based on National Research University Higher School of Economics Example
Authors: Svetlana Petrikova, Alexander Yu Kostinskiy
Abstract:
Management of research evaluation in the Higher School of Economics (HSE) originates from the HSE Academic Fund created in 2004 to facilitate and support academic research and presents its results to international academic community. As the means to inspire the applicants, science projects went through competitive selection process evaluated by the group of experts. Drastic development of HSE, quantity of applied projects for each Academic Fund competition and the need to coordinate the conduct of expert evaluation resulted in founding of the Office for Research Evaluation in 2013. The Office’s primary objective is management of research evaluation of science projects. The standards to conduct the evaluation are defined as follows: - The exercise of the process approach, the unification of the functioning of department. - The uniformity of regulatory, organizational and methodological framework. - The development of proper on-line evaluation system. - The broad involvement of external Russian and international experts, the renouncement of the usage of own employees. - The development of an algorithm to make a correspondence between experts and science projects. - The methodical usage of opened/closed international and Russian databases to extend the expert database. - The transparency of evaluation results – free access to assessment while keeping experts confidentiality. The management of research evaluation of projects is based on the sole standard, organization and financing. The standard way of conducting research evaluation at HSE is based upon Regulations on basic principles for research evaluation at HSE. These Regulations have been developed from the moment of establishment of the Office for Research Evaluation and are based on conventional corporate standards for regulatory document management. The management system of research evaluation is implemented on the process approach basis. Process approach means deployment of work as a process, which is the aggregation of interrelated and interacting activities processing inputs into outputs. Inputs are firstly client asking for the assessment to be conducted, defining the conditions for organizing and carrying of the assessment and secondly the applicant with proper for the competition application; output is assessment given to the client. While exercising process approach to clarify interrelation and interacting main parties or subjects of the assessment are determined and the way for interaction between them forms up. Parties to expert assessment are: - Ordering Party – The department of the university taking the decision to subject a project to expert assessment; - Providing Party – The department of the university authorized to provide such assessment by the Ordering Party; - Performing Party – The legal and natural entities that have expertise in the area of research evaluation. Experts assess projects in accordance with criteria and states of expert opinions approved by the Ordering Party. Objects of assessment generally are applications or HSE competition project reports. Mainly assessments are deployed for internal needs, i.e. the most ordering parties are HSE branches and departments, but assessment can also be conducted for external clients. The financing of research evaluation at HSE is based on the established corporate culture and traditions of HSE.Keywords: expert assessment, management of research evaluation, process approach, research evaluation
Procedia PDF Downloads 253
20084 Cadaveric Study of Lung Anatomy: A Surgical Overview
Authors: Arthi Ganapathy, Rati Tandon, Saroj Kaler
Abstract:
Introduction: A thorough knowledge of variations in lung anatomy is of prime significance during surgical procedures like lobectomy, pneumonectomy, and segmentectomy of lungs. The arrangement of structures in the lung hilum act as a guide in performing such procedures. The normal pattern of arrangement of hilar structures in the right lung is eparterial bronchus, pulmonary artery, hyparterial bronchus and pulmonary veins from above downwards. In the left lung, it is pulmonary artery, principal bronchus and pulmonary vein from above downwards. The arrangement of hilar structures from anterior to posterior in both the lungs is pulmonary vein, pulmonary artery, and principal bronchus. The bronchial arteries are very small and usually the posterior most structures in the hilum of lungs. Aim: The present study aims at reporting the variations in hilar anatomy (arrangement and number) of lungs. Methodology: 75 adult formalin fixed cadaveric lungs from the department of Anatomy AIIMS New Delhi were observed for variations in the lobar anatomy. Arrangement of pulmonary hilar structures was meticulously observed, and any deviation in the pattern of presentation was recorded. Results: Among the 75 adult lung specimens observed 36 specimens were of right lung and the rest of left lung. Seven right lung specimens showed only 2 lobes with an oblique fissure dividing them and one left lung showed 3 lobes. The normal pattern of arrangement of hilar structures was seen in 22 right lungs and 23 left lungs. Rest of the lung specimens (14 right and 16 left) showed a varied pattern of arrangement of hilar structures. Some of them showed alterations in the sequence of arrangement of pulmonary artery, pulmonary veins, bronchus, and others in the number of these structures. Conclusion: Alterations in the pattern of arrangement of structures in the lung hilum are quite frequent. A compromise in knowledge of such variations will result in inadvertent complications like intraoperative bleeding during surgical procedures.Keywords: fissures, hilum, lobes, pulmonary
Procedia PDF Downloads 224
20083 Distributed Manufacturing (DM)- Smart Units and Collaborative Processes
Authors: Hermann Kuehnle
Abstract:
Developments in ICT totally reshape manufacturing as machines, objects and equipment on the shop floors will be smart and online. Interactions with virtualizations and models of a manufacturing unit will appear exactly as interactions with the unit itself. These virtualizations may be driven by providers with novel ICT services on demand that might jeopardize even well established business models. Context aware equipment, autonomous orders, scalable machine capacity or networkable manufacturing unit will be the terminology to get familiar with in manufacturing and manufacturing management. Such newly appearing smart abilities with impact on network behavior, collaboration procedures and human resource development will make distributed manufacturing a preferred model to produce. Computing miniaturization and smart devices revolutionize manufacturing set ups, as virtualizations and atomization of resources unwrap novel manufacturing principles. Processes and resources obey novel specific laws and have strategic impact on manufacturing and major operational implications. Mechanisms from distributed manufacturing engaging interacting smart manufacturing units and decentralized planning and decision procedures already demonstrate important effects from this shift of focus towards collaboration and interoperability.Keywords: autonomous unit, networkability, smart manufacturing unit, virtualization
Procedia PDF Downloads 526
20082 Comics Scanlation and Publishing Houses Translation
Authors: Sharifa Alshahrani
Abstract:
Comics is a multimodal text wherein meaning is created by taking in all modes of expression at once. It uses two different semiotic modes, the verbal and the visual modes, together to make meaning and these different semiotic modes can be socially and culturally shaped to give meaning. Therefore, comics translation cannot treat comics as a monomodal text by translating only the verbal mode inside or outside the speech balloons as the cultural differences are encoded in the visual mode as well. Due to the development of the internet and editing software, comics translation is not anymore confined to the publishing houses and official translation as scanlation, or the fan translation took the initiative in translating comics for being emotionally attracted to the culture and genre. Scanlation is carried out by volunteering fans who translate out of passion. However, quality is one of the debatable issues relating to scanlation and fan translation. This study will investigate how the dynamic multimodal relationship in comics is exploited and interpreted in the translation by exploring the translation strategies and procedures adopted by the publishing houses and scanlation in interpreting comics into Arabic using three analytical frameworks; cultural references model, multimodal relation model and translation strategies and procedures models.Keywords: comics, multimodality, translation, scanlation
Procedia PDF Downloads 212
20081 Simulation of the Flow in a Circular Vertical Spillway Using a Numerical Model
Authors: Mohammad Zamani, Ramin Mansouri
Abstract:
Spillways are among the most important hydraulic structures of dams, providing the stability of the dam and the downstream areas at the time of a flood. A circular vertical spillway with various inlet forms is very effective when there is not enough space for other spillway types. Hydraulic flow in a vertical circular spillway falls into three regimes: free, orifice, and under pressure (submerged). In this research, the hydraulic flow characteristics of a circular vertical spillway are investigated with a CFD model. Two-dimensional unsteady RANS equations were solved numerically using the finite volume method. The PISO scheme was applied for the velocity-pressure coupling. The most widely used two-equation turbulence models, k-ε and k-ω, were chosen to model the Reynolds shear stress term. The power-law scheme was used for the discretization of the momentum, k, ε, and ω equations. The VOF method (geometric reconstruction algorithm) was adopted for interface simulation. Three types of computational grids (coarse, intermediate, and fine) were used to discretize the simulation domain. To simulate the flow, the k-ε (Standard, RNG, Realizable) and k-ω (Standard and SST) models were used, and to find the best wall function, two types, the standard and the non-equilibrium wall function, were investigated. The laminar model did not produce satisfactory flow depth and velocity along the morning-glory spillway. The results of the most commonly used two-equation turbulence models (k-ε and k-ω) were identical, and the standard wall function produced better results than the non-equilibrium wall function; thus, for the remaining simulations, standard k-ε with the standard wall function was preferred. The trajectory profile of the water jet served as a further comparison criterion. The results show that the fine computational grid, a velocity condition at the flow inlet boundary, and a pressure condition at the boundaries in contact with air provide the best possible results. The standard wall function is chosen for the wall treatment, and the standard k-ε turbulence model agrees most consistently with the experimental results. As the jet approaches the end of the basin, the difference between the computational and experimental results grows. The mesh with 10602 nodes, the standard k-ε turbulence model, and the standard wall function provide the best results for modeling the flow in a vertical circular spillway. There was good agreement between numerical and experimental results for the upper and lower nappe profiles. Regarding water level over the crest and discharge, at low water levels the numerical results are in good agreement with the experimental ones, but as the water level increases, the difference between the numerical and experimental discharge grows. Regarding the flow coefficient, as the P/R ratio decreases, the difference between the numerical and experimental results increases.
Keywords: circular vertical, spillway, numerical model, boundary conditions
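The discharge and flow-coefficient comparisons above can be summarised with a simple calculation: back out a dimensionless coefficient from each discharge-head pair and report the relative difference between the numerical and experimental values. The sketch below assumes a weir-type relation Q = Cd·L·√(2g)·H^1.5 with crest length L = 2πR; the normalisation convention and all numbers are illustrative, not the study's data.

```python
import numpy as np

G = 9.81  # m/s^2

def flow_coefficient(q, radius, head):
    """Back out a dimensionless flow coefficient from discharge and head.

    Uses Q = Cd * L * sqrt(2g) * H^1.5 with crest length L = 2*pi*R. The
    normalisation is an assumption; the paper does not state which
    definition of the flow coefficient it uses.
    """
    crest_length = 2.0 * np.pi * radius
    return q / (crest_length * np.sqrt(2.0 * G) * head ** 1.5)

# Hypothetical numerical vs experimental discharges at increasing heads
heads = np.array([0.02, 0.04, 0.06, 0.08])      # m
q_num = np.array([0.008, 0.022, 0.039, 0.057])  # m^3/s
q_exp = np.array([0.008, 0.021, 0.036, 0.050])  # m^3/s

for h, qn, qe in zip(heads, q_num, q_exp):
    diff = 100.0 * (qn - qe) / qe
    print(f"H = {h:.2f} m: Cd_num = {flow_coefficient(qn, 0.10, h):.3f}, "
          f"Cd_exp = {flow_coefficient(qe, 0.10, h):.3f}, diff = {diff:+.1f}%")
```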
Procedia PDF Downloads 86
20080 A Minimally Invasive Approach Using Bio-Miniatures Implant System for Full Arch Rehabilitation
Authors: Omid Allan
Abstract:
The advent of ultra-narrow diameter implants initially offered an alternative to wider conventional implants. However, their design limitations have restricted their applicability primarily to overdentures and cement-retained fixed prostheses, often with unpredictable long-term outcomes. The introduction of the new Miniature Implants has revolutionized the field of implant dentistry, leading to a more streamlined approach. The utilization of Miniature Implants has emerged as a promising alternative to the traditional approach that entails the traumatic sequential bone drilling procedures and the use of conventional implants for full and partial arch restorations. The innovative "BioMiniatures Implant System serves as a groundbreaking bridge connecting mini implants with standard implant systems. This system allows practitioners to harness the advantages of ultra-small implants, enabling minimally invasive insertion and facilitating the application of fixed screw-retained prostheses, which were only available to conventional wider implant systems. This approach streamlines full and partial arch rehabilitation with minimal or even no bone drilling, significantly reducing surgical risks and complications for clinicians while minimizing patient morbidity. The ultra-narrow diameter and self-advancing features of these implants eliminate the need for invasive and technically complex procedures such as bone augmentation and guided bone regeneration (GBR), particularly in cases involving thin alveolar ridges. Furthermore, the absence of a microcap between the implant and abutment eliminates the potential for micro-leakage and micro-pumping effects, effectively mitigating the risk of marginal bone loss and future peri-implantitis. The cumulative experience of restoring over 50 full and partial arch edentulous cases with this system has yielded an outstanding success rate exceeding 97%. The long-term success with a stable marginal bone level in the study firmly establishes these implants as a dependable alternative to conventional implants, especially for full arch rehabilitation cases. Full arch rehabilitation with these implants holds the promise of providing a simplified solution for edentulous patients who typically present with atrophic narrow alveolar ridges, eliminating the need for extensive GBR and bone augmentation to restore their dentition with fixed prostheses.Keywords: mini-implant, biominiatures, miniature implants, minimally invasive dentistry, full arch rehabilitation
Procedia PDF Downloads 74
20079 Extraction, Characterization and Application of Natural Dyes from the Fresh Rind of Index Colour 5 Mangosteen (Garcinia mangostana L.)
Authors: Basitah Taif
Abstract:
This study aimed to explore and utilize the fresh rind of Index Colour 5 mangosteen as an emerging raw material for the production of natural dyes. Rind from fresh Index Colour 5 mangosteen was used to extract the dyes. The resulting extracts were tested on silk fabrics via three types of mordanting and dyeing procedures: pre-mordanting, simultaneous mordanting, and post-mordanting. The application of freeze-drying methodology and mechanizable equipment helped to produce an excellent range of natural colours. Silk fabric treated by simultaneous mordanting and dyeing with the Index Colour 5 extract produced a brilliant shade of red, although the colour from this index was also found to be sensitive to light and washing during the fastness tests. The preliminary evaluation and instrumental analysis allowed us to examine whether applying different mordanting and dyeing procedures to the same extract samples and concentrations affected the colours and shades of the fabric samples.
Keywords: natural dye, freeze-drying, Garcinia mangostana Linn, mordanting
Procedia PDF Downloads 462
20078 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring
Authors: Seung-Lock Seo
Abstract:
This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but during unusual events or faults it is generally difficult to collect process information, and the noisy data of industrial processes can be almost impossible to analyze. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data helps to eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated on discrete batch process data. The results showed that the monitoring performance improved significantly in terms of the monitoring success rate for the given process faults.
Keywords: data mining, process data, monitoring, safety, industrial processes
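A minimal control-chart-style sketch of the idea, noise filtering followed by limits learned from good-operation data, is given below. The abstract does not specify the filtering or monitoring schemes actually compared, so the moving-average filter, three-sigma limits, and synthetic data here are assumptions for illustration only.

```python
import numpy as np

def moving_average(x, window=5):
    """Simple noise filter: centred moving average with edge padding."""
    kernel = np.ones(window) / window
    return np.convolve(np.pad(x, window // 2, mode="edge"), kernel, mode="valid")

rng = np.random.default_rng(0)

# Hypothetical batch-process variable: normal operation plus a late drift fault.
normal = 50 + rng.normal(0, 1.0, 200)
faulty = 50 + rng.normal(0, 1.0, 100) + np.linspace(0, 6, 100)

# Control limits estimated from filtered normal-operation data only.
ref = moving_average(normal)
ucl, lcl = ref.mean() + 3 * ref.std(), ref.mean() - 3 * ref.std()

monitored = moving_average(faulty)
alarms = (monitored > ucl) | (monitored < lcl)
print(f"UCL = {ucl:.2f}, LCL = {lcl:.2f}, alarmed samples: {alarms.sum()} of {alarms.size}")
```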
Procedia PDF Downloads 401
20077 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed, even at a small angle. Once documents have been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Considerable research effort has gone into this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared on performance criteria, the most important being the accuracy of skew angle detection, the range of detectable skew angles, processing speed, computational complexity, and, consequently, the memory space used. The standard Hough Transform has successfully been applied to text document skew angle estimation. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine the angular step size is; higher accuracy therefore consumes more time and memory, especially when the number of pixels is considerably large. Whenever the Hough transform is used, there is a tradeoff between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the contradiction between memory space, running time, and accuracy. Our algorithm starts with an angle estimate accurate to zero decimal places using the standard Hough Transform, achieving minimal running time and space but limited accuracy. Then, to increase accuracy, supposing the estimated angle found with the basic Hough algorithm is x degrees, the basic algorithm is run again over a narrow range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. Our skew estimation and correction procedure for text images is implemented in MATLAB. The memory space and processing time are also tabulated, assuming skew angles between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: Hough transform, skew detection, skew angle, skew correction, text document
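The coarse-to-fine refinement described above is easy to illustrate: estimate the skew at whole-degree steps, then repeatedly re-scan a narrow bracket around the current estimate with a ten-times finer step. The study's implementation is in MATLAB; the sketch below is an independent Python illustration using a simple Hough-style voting score on synthetic "text lines", and its scoring function, bracket width, and test data are all assumptions rather than the paper's algorithm.

```python
import numpy as np

def hough_angle_score(points, theta_deg, rho_bin=1.0):
    """Score a candidate skew angle: when the angle matches the text lines,
    the projected rho values pile up into a few bins, so the histogram
    variance is high."""
    theta = np.deg2rad(theta_deg)
    rho = points[:, 0] * np.cos(theta) - points[:, 1] * np.sin(theta)
    edges = np.arange(rho.min(), rho.max() + rho_bin, rho_bin)
    hist, _ = np.histogram(rho, bins=edges)
    return hist.var()

def estimate_skew(binary_image, search=(-45.0, 45.0), levels=3):
    """Coarse-to-fine skew estimation: whole degrees first, then each level
    re-scans around the current estimate with a 10x finer step."""
    points = np.argwhere(binary_image > 0).astype(float)  # (row, col) of foreground pixels
    lo, hi, step = search[0], search[1], 1.0
    best = 0.0
    for _ in range(levels):
        angles = np.arange(lo, hi + step / 2, step)
        scores = [hough_angle_score(points, a) for a in angles]
        best = float(angles[int(np.argmax(scores))])
        lo, hi, step = best - step, best + step, step / 10.0
    return best

# Synthetic binarised page: sparse "text lines" skewed by 3.6 degrees.
rng = np.random.default_rng(0)
skew_true = 3.6
img = np.zeros((400, 600), dtype=np.uint8)
slope = np.tan(np.deg2rad(skew_true))
for y0 in range(40, 360, 20):
    xs = rng.integers(20, 580, 150)
    ys = np.rint(y0 + slope * xs).astype(int)
    img[ys, xs] = 1

print(f"estimated skew: {estimate_skew(img):.2f} degrees (true value {skew_true})")
```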
Procedia PDF Downloads 159
20076 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS
Authors: David A. Harness
Abstract:
Natural evolution of ATP cognitive systems is to meet AI peer review standards. ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real world problem of the proposed AI peer review challenge: Determine which conjecture forms the higher confidence level constructive proof between Standard Model of Physics SU(n) lattice gauge group operation vs. present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. Resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological constant vacuum energy density in units of pascals. Said self-adjoint group operation exclusively operates on the stress energy momentum tensor of the Einstein field equations, introducing quantization directly on the 4D spacetime level, essentially reformulating the Yang-Mills virtual superpositioned particle compounded lattice gauge groups quantization of the vacuum—into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural language parsing in context deep learning.Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks
Procedia PDF Downloads 179
20075 Impact of Calcium Carbide Waste Dumpsites on Soil Chemical and Microbial Characteristics
Authors: C. E. Ihejirika, M. I. Nwachukwu, R. F. Njoku-Tony, O. C. Ihejirika, U. O. Enwereuzoh, E. O. Imo, D. C. Ashiegbu
Abstract:
Disposal of industrial solid wastes in the environment is a major environmental challenge. This study investigated the effects of calcium carbide waste dumpsites on soil quality. Soil samples were collected with a hand auger from three different dumpsites at varying depths and made into composite samples, which were subjected to standard analytical procedures. pH varied from 10.38 to 8.28, nitrate from 5.6 mg/kg to 9.3 mg/kg, and phosphate from 8.8 mg/kg to 12.3 mg/kg, while calcium carbide decreased from 10% to 3%. Calcium carbide was absent in control soil samples. Bacterial counts from the dumpsites ranged from 1.8 × 10⁵ cfu/g to 2.5 × 10⁵ cfu/g, while fungal counts ranged from 0.8 × 10³ cfu/g to 1.4 × 10³ cfu/g. Bacterial isolates included Pseudomonas spp., Flavobacterium spp., and Achromobacter spp., while fungal isolates included Penicillium notatum, Aspergillus niger, and Rhizopus stolonifer. No organism was isolated from the dumpsites at a soil depth of 0-15 cm, while there were isolates at other soil depths. Toxicity might be due to the alkaline condition of the dumpsites. Calcium carbide might be bactericidal and fungicidal, leading to impaired cellular physiology, growth retardation, death, a general loss of biodiversity, and a reduction of ecosystem processes. Detoxification of calcium carbide waste before disposal on soil might be the best management option.
Keywords: biodiversity, calcium-carbide, denitrification, toxicity
Procedia PDF Downloads 546
20074 Effects of Dimensional Sizes of Mould on the Volumetric Shrinkage Strain of Lateritic Soil
Authors: John E. Sani, Moses George
Abstract:
The paper presents the results of a laboratory study carried out on lateritic soil to determine the effects of mould size on the volumetric shrinkage strain (VSS), using three mould sizes, i.e., the split former mould, the Proctor mould, and the California bearing ratio (CBR) mould, at three energy levels: British standard light (BSL), West African standard (WAS), and British standard heavy (BSH). Compactions were done at moulding water contents of -2% to +6% relative to the optimum moisture content (OMC). At -2% to +2% moulding water content, the volumetric shrinkage strain for the split former mould met the requirement of not more than 4%, while at +4% and +6% only the WAS and BSH energy levels met the requirement. The Proctor mould and the CBR mould, on the other hand, gave lower volumetric shrinkage strains at all compactive efforts, and the values were below the 4% safe VSS value.
Keywords: lateritic soil, volumetric shrinkage strain, molding water content, compactive effort
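For reference, the sketch below computes a volumetric shrinkage strain value and checks it against the 4% limit cited above, assuming VSS is expressed as the volume change on drying relative to the initial as-compacted volume; the volumes shown are hypothetical.

```python
def volumetric_shrinkage_strain(initial_volume, final_volume):
    """Volumetric shrinkage strain in percent: change in volume on drying
    relative to the initial (as-compacted) volume."""
    return 100.0 * (initial_volume - final_volume) / initial_volume

# Hypothetical specimen volumes in cm^3 (values are illustrative only).
vss = volumetric_shrinkage_strain(943.0, 915.0)
print(f"VSS = {vss:.1f}% -> {'meets' if vss <= 4.0 else 'exceeds'} the 4% requirement")
```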
Procedia PDF Downloads 532
20073 Increased Stability of Rubber-Modified Asphalt Mixtures to Swelling, Expansion and Rebound Effect during Post-Compaction
Authors: Fernando Martinez Soto, Gaetano Di Mino
Abstract:
The application of rubber into bituminous mixtures requires attention and care during mixing and compaction. Rubber modifies the properties because it reacts in the internal structure of bitumen at high temperatures changing the performance of the mixture (interaction process of solvents with binder-rubber aggregate). The main change is the increasing of the viscosity and elasticity of the binder due to the larger sizes of the rubber particles by dry process but, this positive effect is counteracted by short mixing times, compared to wet technology, and due to the transport processes, curing time and post-compaction of the mixtures. Therefore, negative effects as swelling of rubber particles, rebounding effect of the specimens and thermal changes by different expansion of the structure inside the mixtures, can change the mechanical properties of the rubberized blends. Based on the dry technology, different asphalt-rubber binders using devulcanized or natural rubber (truck and bus tread rubber), have served to demonstrate these effects and how to solve them into two dense-gap graded rubber modified asphalt concrete mixes (RUMAC) to enhance the stability, workability and durability of the compacted samples by Superpave gyratory compactor method. This paper specifies the procedures developed in the Department of Civil Engineering of the University of Palermo during September 2016 to March 2017, for characterizing the post-compaction and mix-stability of the one conventional mixture (hot mix asphalt without rubber) and two gap-graded rubberized asphalt mixes according granulometry for rail sub-ballast layers with nominal size of Ø22.4mm of aggregates according European standard. Thus, the main purpose of this laboratory research is the application of ambient ground rubber from scrap tires processed at conventional temperature (20ºC) inside hot bituminous mixtures (160-220ºC) as a substitute for 1.5%, 2% and 3% by weight of the total aggregates (3.2%, 4.2% and, 6.2% respectively by volumetric part of the limestone aggregates of bulk density equal to 2.81g/cm³) considered, not as a part of the asphalt binder. The reference bituminous mixture was designed with 4% of binder and ± 3% of air voids, manufactured for a conventional bitumen B50/70 at 160ºC-145ºC mix-compaction temperatures to guarantee the workability of the mixes. The proportions of rubber proposed are #60-40% for mixtures with 1.5 to 2% of rubber and, #20-80% for mixture with 3% of rubber (as example, a 60% of Ø0.4-2mm and 40% of Ø2-4mm). The temperature of the asphalt cement is between 160-180 ºC for mixing and 145-160 ºC for compaction, according to the optimal values for viscosity using Brookfield viscometer and 'ring and ball' - penetration tests. These crumb rubber particles act as a rubber-aggregate into the mixture, varying sizes between 0.4mm to 2mm in a first fraction, and 2-4mm as second proportion. Ambient ground rubber with a specific gravity of 1.154g/cm³ is used. The rubber is free of loose fabric, wire, and other contaminants. It was found optimal results in real beams and cylindrical specimens with each HMA mixture reducing the swelling effect. Different factors as temperature, particle sizes of rubber, number of cycles and pressures of compaction that affect the interaction process are explained.Keywords: crumb-rubber, gyratory compactor, rebounding effect, superpave mix-design, swelling, sub-ballast railway
Procedia PDF Downloads 243
20072 Applying Biosensors’ Electromyography Signals through an Artificial Neural Network to Control a Small Unmanned Aerial Vehicle
Authors: Mylena McCoggle, Shyra Wilson, Andrea Rivera, Rocio Alba-Flores
Abstract:
This work introduces the use of EMG (electromyography) signals from muscle sensors to develop an Artificial Neural Network (ANN) for pattern recognition to control a small unmanned aerial vehicle. The objective of this endeavor is to interface drone applications beyond direct manual control. The MyoWare Muscle Sensor contains three EMG electrodes (dual and single type) used to collect signals from the posterior (extensor) and anterior (flexor) forearm and the bicep. The raw voltages from each sensor were fed to an Arduino Uno, and a data processing algorithm was developed to interpret the voltage signals produced when flexing, resting, and moving the arm. Each sensor collected eight values over a two-second period for the duration of one minute per assessment. During each two-second interval, the movements alternated between a resting reference class and an active motion class, resulting in control of the drone with left and right movements. This paper further investigated adding up to three sensors to differentiate between hand gestures controlling the principal motions of the drone (left, right, up, and land). The hand gestures chosen to execute these movements were a resting position, a thumbs up, a hand swipe-right motion, and a flexing position. MATLAB was used to collect, process, and analyze the signals from the sensors, and a machine learning protocol was used to classify the hand gestures. To generate the input vector to the ANN, the mean, root mean square, and standard deviation were computed for every two-second interval of the hand gestures. The neuromuscular information was then used to train an artificial neural network with one hidden layer of 10 neurons to categorize the four targets, one for each hand gesture. Once training was completed, the resulting network interpreted the processed inputs and returned the probabilities of each class. Once an output probability of matching a specific target class was greater than or equal to 80%, the drone would perform the expected motion. Each movement was then sent from the computer to the drone through a Wi-Fi network connection. These procedures have been successfully tested and integrated into trial flights, where the drone responded successfully in real time to predefined command inputs classified by the machine learning algorithm through the MyoWare sensor interface. The full paper describes in detail the database of hand gestures, the details of the ANN architecture, and the confusion matrix results.
Keywords: artificial neural network, biosensors, electromyography, machine learning, MyoWare muscle sensors, Arduino
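The pipeline above, window features (mean, RMS, standard deviation) feeding a one-hidden-layer network of 10 neurons with an 80% confidence threshold, can be sketched as follows. The study used MATLAB and real MyoWare recordings; this Python sketch uses synthetic three-channel windows and scikit-learn's MLPClassifier purely to illustrate the structure, so the gesture amplitude profiles and window length are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

GESTURES = ["rest", "thumbs up", "swipe right", "flex"]
rng = np.random.default_rng(0)

def window_features(window):
    """Mean, RMS and standard deviation of one two-second window, per channel."""
    return np.concatenate([window.mean(axis=0),
                           np.sqrt((window ** 2).mean(axis=0)),
                           window.std(axis=0)])

def synth_window(gesture_id, n_samples=16, n_channels=3):
    """Synthetic stand-in for a two-second, three-channel MyoWare recording."""
    base = np.array([[0.10, 0.10, 0.10],   # rest
                     [0.80, 0.20, 0.30],   # thumbs up
                     [0.30, 0.90, 0.20],   # swipe right
                     [0.50, 0.40, 0.90]])  # flex
    return base[gesture_id] + 0.05 * rng.normal(size=(n_samples, n_channels))

X = np.array([window_features(synth_window(g)) for g in range(4) for _ in range(60)])
y = np.repeat(np.arange(4), 60)

# One hidden layer of 10 neurons, matching the network size described above.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0).fit(X, y)

probs = clf.predict_proba(window_features(synth_window(2)).reshape(1, -1))[0]
best = int(np.argmax(probs))
if probs[best] >= 0.80:   # 80% confidence threshold before sending a command
    print(f"command: '{GESTURES[best]}' gesture (p = {probs[best]:.2f})")
else:
    print("confidence below 80%, no command sent")
```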
Procedia PDF Downloads 174
20071 Evaluating Forecasts Through Stochastic Loss Order
Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio
Abstract:
We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, as is customarily done in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests, and are robust to the correlation, autocorrelation, and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is also wider, and because they test the whole loss distribution instead of just loss moments, they can be used to study forecast distributions as well. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
Keywords: forecast evaluation, stochastic order, multiple comparison, non parametric test
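As a rough illustration of comparing whole loss distributions rather than mean losses, the sketch below checks on a grid whether one procedure's empirical loss CDF lies everywhere above another's (i.e., its losses are stochastically smaller). This is not the formal test proposed in the paper, only a descriptive point check on simulated loss samples of different sizes.

```python
import numpy as np

def empirical_cdf(sample, grid):
    """P(X <= x) evaluated on a grid from a sample."""
    sample = np.sort(np.asarray(sample, float))
    return np.searchsorted(sample, grid, side="right") / sample.size

def losses_stochastically_smaller(loss_a, loss_b, n_grid=200):
    """True if procedure A's losses are stochastically smaller than B's, i.e.
    the empirical CDF of A lies on or above that of B at every grid point."""
    grid = np.linspace(min(loss_a.min(), loss_b.min()),
                       max(loss_a.max(), loss_b.max()), n_grid)
    return bool(np.all(empirical_cdf(loss_a, grid) >= empirical_cdf(loss_b, grid)))

rng = np.random.default_rng(1)
# Hypothetical absolute-error losses of two forecast procedures; note that the
# two samples do not need to have the same size.
loss_a = np.abs(rng.normal(0.0, 1.0, 150))
loss_b = np.abs(rng.normal(0.5, 1.3, 120))

print("A's losses stochastically smaller than B's:",
      losses_stochastically_smaller(loss_a, loss_b))
print("mean losses:", round(loss_a.mean(), 3), round(loss_b.mean(), 3))
```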
Procedia PDF Downloads 89
20070 Trends in Domestic Terms of Trade of Agricultural Sector of Pakistan
Authors: Anwar Hussain, Muhammad Iqbal
Abstract:
Changes in the prices of agricultural commodities, combined with changes in population and agricultural productivity, affect farmers’ profitability and standard of living. This study estimates various domestic terms of trade for the agricultural sector and assesses the volatility in the standard of living and profitability of farmers. The terms of trade were estimated for Pakistan and its provinces using producer price indices, consumer price indices, input price indices, and quantity indices, with data for the period 1990-91 to 2008-09. The domestic terms of trade of the agricultural sector improved under both approaches, i.e., the ratio of producer price indices to consumer price indices and the real per capita income approach. The cross-province estimates indicated that the terms of trade also improved for Khyber Pakhtunkhwa, Sindh, and Punjab, while Balochistan’s domestic terms of trade deteriorated drastically. In other words, the standard of living of farmers in Pakistan and its provinces, except Balochistan, improved. Using input prices, however, the domestic terms of trade deteriorated for Pakistan as a whole and for its provinces, which shows that the profitability of farmers declined during the study period: farmers pay more for inputs than they receive for their produce. This further indicates that poverty at the grassroots level has increased. In sum, the standard of living of farmers improved but their profitability declined, which indicates that farmers do not rely entirely on farm income but also use other sources of income for their livelihood. The study supports granting subsidies on farm inputs so as to improve the profitability of farmers.
Keywords: agricultural terms of trade, farmers’ profitability, farmers’ standard of living, consumer and producer price indices, quantity indices
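As a minimal numerical illustration of the first approach above, the sketch below computes the barter terms of trade as the ratio of a producer price index to a consumer price index, plus an income terms of trade that also weights in a quantity index. The index numbers are hypothetical, not the study's Pakistani data.

```python
# Illustrative computation of two terms-of-trade measures; all index numbers
# below are hypothetical placeholders.
years = [2000, 2004, 2008]
producer_price_index = [100.0, 128.0, 171.0]   # prices received by farmers
consumer_price_index = [100.0, 121.0, 160.0]   # prices paid by farmers
quantity_index = [100.0, 109.0, 118.0]         # volume of agricultural output

for yr, ppi, cpi, qi in zip(years, producer_price_index, consumer_price_index, quantity_index):
    barter_tot = 100.0 * ppi / cpi              # net barter terms of trade
    income_tot = barter_tot * qi / 100.0        # income terms of trade
    print(f"{yr}: barter ToT = {barter_tot:.1f}, income ToT = {income_tot:.1f}")
```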
Procedia PDF Downloads 466