Search results for: laboratory and commercial scales
5252 Measuring Firms’ Patent Management: Conceptualization, Validation, and Interpretation
Authors: Mehari Teshome, Lara Agostini, Anna Nosella
Abstract:
The current knowledge-based economy extends intellectual property rights (IPRs) research beyond legal themes toward more strategic and organizational perspectives. Among the diverse types of IPRs, patents are the strongest and best-known form of legal protection, influencing commercial success and market value. Indeed, from our pilot survey, we understood that firms are less likely to manage their patents and actively use them as a tool for achieving competitive advantage; rather, they invest resources and effort in patent applications. In this regard, the literature also confirms that insights into how firms manage their patents from a holistic, strategic perspective, and how the portfolio value of patents can be optimized, are scarce. Although patent management is an important business tool and a few scales exist to measure some of its dimensions, to the best of our knowledge, no systematic attempt has been made to develop a valid and comprehensive measure of it. Considering this theoretical and practical point of view, the aim of this article is twofold: to develop a framework for patent management encompassing all relevant dimensions with their respective constructs and measurement items, and to validate the measurement using survey data from practitioners. Methodology: We used a six-step methodological approach (i.e., specifying the domain of the construct, item generation, scale purification, internal consistency assessment, scale validation, and replication). Accordingly, we carried out a systematic review of 182 articles on patent management from ISI Web of Science. For each article, we mapped the relevant constructs, their definitions, and associated features, as well as the items used to measure these constructs, when provided. This theoretical analysis was complemented by interviews with experts in patent management to obtain more practical feedback on how patent management is carried out in firms.
Afterwards, we carried out a questionnaire survey to purify our scales and validate them statistically. Findings: The analysis allowed us to design a framework for patent management, identifying its core dimensions (i.e., generation, portfolio management, exploitation and enforcement, and intelligence) and support dimensions (i.e., strategy and organization). Moreover, we identified the relevant activities for each dimension, as well as the most suitable items to measure them. For example, the core dimension generation includes constructs such as state-of-the-art analysis, freedom-to-operate analysis, patent watching, securing freedom to operate, patent potential, and patent geographical scope. Originality and the Study Contribution: This study represents a first step towards the development of sound scales to measure patent management with an overarching approach, thus laying the basis for a recognized landmark within the research area of patent management. Practical Implications: The new scale can be used to assess the level of sophistication of a company's patent management and to compare it with other firms in the industry, evaluating their ability to manage the different activities involved in patent management. In addition, the resulting framework can be used as a guide that supports managers in improving patent management in their firms.
Keywords: patent, management, scale, development, intellectual property rights (IPRs)
Procedia PDF Downloads 149
5251 Learning in the Virtual Laboratory via Design of Automation Process for Wooden Hammers Marking
Authors: A. Javorova, J. Oravcova, K. Velisek
Abstract:
The article summarizes experience with teaching methodologies for technical subjects using a number of software products to solve the specific assigned task described in this paper. The task concerns the problems of automation and mechanization in industry; specifically, it focuses on introducing automation in the wood industry. The article describes the design of an automation process for marking wooden hammers. Similar problems are solved by students in the CA laboratory.
Keywords: CA system, education, simulation, subject
Procedia PDF Downloads 296
5250 Interior Outdoors of Tomorrow: A Study on the Rising Influence of the 'Interior' Vocabulary in the Design of Outdoor Spaces and the Fading Role of the Architectural Discourse
Authors: Massimo Imparato
Abstract:
The study aims to identify the background of contemporary trends in the design of commercial outdoor spaces, and the reasons for the radical change in the traditional relationship between architecture and interior design, where the latter is taking over the construction of the visual narrative framing the users' experience, which was ruled in the past by the architectural discourse. The design of commercial interiors, in fact, influences the way in which their outdoor spaces are organized and used more than ever before, and reflects the multi-faceted changes in consumers' behaviors and their interaction with the built environment. The study starts with an analysis of the evolution of sheltered outdoor spaces to achieve a broader understanding of the shift in the meaning of notions such as the private and public domains, and to consider the varied ways of interaction/integration between the building and its exterior space. The study identifies the major social, physical, and cultural aspects influencing the design of contemporary commercial outdoor spaces, suggests a new framework for their understanding, and draws methodological guidelines for the development of a structured approach to the design of commercial outdoors. The purpose of the paper is to stress the influence of interior design on the public realm, to indicate new directions in this field of research, and to provide new methodological tools for interior design professionals.
Keywords: interior design, landscape design, visual narrative, outdoor design
Procedia PDF Downloads 299
5249 Challenges for Implementing Standards Compliant with ISO/IEC 17025 for Narcotics and DNA Laboratories
Authors: Blerim Olluri
Abstract:
A forensic science laboratory in Kosovo had never been organized at the level of most modern forensic science laboratories. This was made possible after the war of 1999 with help and support from the United States. The United States Government/ICITAP provided 9.5 million dollars to support this project, and this support has greatly benefited law enforcement in Kosovo. With the establishment of operative procedures of work and the law on the Kosovo Agency of Forensics (KAF), accreditation of the KAF laboratories under ISO/IEC 17025 became mandatory. Since 2012, the DNA/Serology and Narcotics laboratories have been reviewing and harmonizing their procedures according to ISO/IEC 17025. The focus of this work was to create quality manuals, procedures, work instructions, quality documentation, and quality records. Furthermore, during this time the validation of working methods was performed by scientifically qualified KAF personnel, without any help from other foreign agencies or an accreditation body. In October 2014, we had the first evaluation based on ISO/IEC 17025 standards. According to the initial report of this assessment, we had nonconformities in test and calibration methods and in accommodation and environmental conditions. We identified several issues that are of extreme importance to KAF. One of the most important is to create a professional group of KAF experts to work on all the obligations required by ISO/IEC 17025. The conclusion we draw from this path to accreditation is that the laboratories need to take corrective action, and all nonconformances must be addressed and corrected before accreditation can be granted.
Keywords: accreditation, assessment, narcotics, DNA
Procedia PDF Downloads 364
5248 Protection of Stakeholders under the Transitional Commercial Code of Eritrea: Comparative Analysis with the 2018 Company Law of the People's Republic of China
Authors: Hayle Makda Gebru
Abstract:
Companies are indispensable to society. They are the building blocks of every development in a country, producing a continuous supply of goods and services for the people and, in turn, obliged to pay taxes, which enhances the economy of the nation. For the proper functioning of companies, their relationships with their stakeholders must be secure. The major stakeholders are suppliers, consumers, employees, creditors, etc. The law plays an important role in enhancing the relationship between these different stakeholders. If the law fails to keep track of this relationship, both the company and its stakeholders remain unprotected, and as a result, the potential benefits are prejudiced. This paper makes a comparative analysis of the types and formation of companies under the Transitional Commercial Code of Eritrea (TCrCE) and the Company Law of the People's Republic of China. In particular, the paper addresses the legal lacuna under the TCrCE in handling the failure of shareholders to pay the promised capital. The methodology of the study is an analysis of the two countries' laws using practical cases. After analyzing the practical problems on the ground through real cases, this paper calls on Eritrea to update its outdated Commercial Code to give proper protection to stakeholders.
Keywords: companies, company law of the People's Republic of China, transitional commercial code of Eritrea, protection of stakeholders, failure to pay the promised capital
Procedia PDF Downloads 71
5247 Urban Laboratory for Community Involvement in the Urban Design Process
Authors: Anja Jutraz, Tadeja Zupancic
Abstract:
This article explores the urban laboratory, a combination of different physical and digital methods and tools for public participation in urban design. The city consists of built and unbuilt environments and can be defined as a community of people who live there. Communities should have the option to express opinions and decide about the future of their city, from the early stages of the design process onwards. In this paper, we present the possibility of involving the community in the renewal of Banská Štiavnica in Slovakia (more exactly, the old mining shaft Michal Štolna and its lake) and methods to promote community building. As a case study, we present the eTHNo project, Education about the Technical, Historical and Natural opportunities of Michal Štolna. Moreover, we discuss the possibility of using virtual digital tools for public participation in urban design, focusing especially on the Virtual Urban Laboratory, VuLab.
Keywords: community building, digital tools, public participation, urban design
Procedia PDF Downloads 574
5246 Preliminary Prospecting on the Distribution of Citrus Tristeza Disease in Orchards in the Province of Chlef
Authors: Ibrahim Djelloul Berkane
Abstract:
A survey was conducted to assess the presence of Citrus tristeza virus (CTV) in one of the main citrus-growing regions of Algeria, the Chlef region, using the direct tissue blot immunoassay (DTBIA) technique and the double antibody sandwich ELISA (DAS-ELISA). Samples were collected for laboratory analysis from a citrus nursery, propagation yards, and commercial orchards of the main cultivated citrus varieties. Of the plants tested in orchards, 0.91% were infected with CTV, while no positive case was detected in the nursery or the yards; however, an alarming rate of 10.5% of the orchard plants tested in the commune of Chettia were infected with the tristeza virus. The investigation launched to identify tristeza vector species revealed the presence of an important vector, Aphis gossypii.
Keywords: aphis, chlef, citrus, DAS-ELISA, DTBIA, tristeza
Procedia PDF Downloads 303
5245 The Relationship between the Two-Spatial World and the Decrease in the Area of Commercial Properties
Authors: Syedhossein Vakili
Abstract:
According to some experts, the two-spatialization of the world means the establishment of a new virtual space placed alongside the physical space. This dualization of space has had various effects, one of which is reducing the need for buildings and economizing on the area of business premises through the use of virtual space in place of part of the physical space. Before virtual space was known, a commercial or educational institution had to block a large part of its capital to acquire the physical spaces and buildings needed for its daily activities; today, thanks to the addition of virtual space to physical space, it is possible to carry out those activities more widely within a limited environment with a minimum of physical space, drastically reducing costs. To understand the impact of virtual space on the reduction of physical space, the researcher used official country reports on the average area specified in permits for the construction of commercial and educational units in the period from 2014 to 2023, and compared the average capital required in the purely physical period with that in the period of the two-spatialization of the world over the mentioned ten-year span. Using an analytical and comparative method, the study shows that virtual space has greatly reduced the investment that business owners must make to provide the premises required for their activities by reducing the need for physical space, and has made commercial activities more profitable.
Keywords: two spatialization, building area, cyberspace, physical space, virtual place
Procedia PDF Downloads 62
5244 Analyzing Music Theory in Different Countries: Comparing Greece and China
Authors: Baoshan Wang
Abstract:
The present study investigates how music theory has developed differently across countries owing to their diverse histories, religions, and cultures. It is unknown how these various factors may contribute to differences in music theory across countries. We therefore examine the differences between China and Greece, which have developed unique music theories over time; specifically, our analysis looks at musical notation and scales. Tonal music, for example, originates in Greece and harbors quite complex notation and scaling: there are seven notes in each scale and seven modes of scales, each mode of the diatonic scale has a unique temperament, and two of the modes are most commonly used in modern music. In contrast, we find that Chinese music has only five notes in its scales. Interestingly, a unique feature of Chinese music theory is that there is no half-step, resulting in a highly divergent and culture-specific sound. Fascinatingly, these differences may arise from the contrasting ways in which Western and Eastern musicians perceive music. While Western musicians tend to believe in music 'without borders,' Eastern musicians generally embrace differing perspectives. Yet the vast majority of colleges and music conservatories teach the borderless theory of Western music, which renders the music educational system incomplete. This is critically important because learning music is not simply a profession for musicians; rather, it is an intermediary that facilitates understanding and appreciation of different countries' cultures and religions. Education is undoubtedly the optimal way to promote different countries' music theory so that people across the world can learn more about music and, in turn, about each other.
Even though Western music theory is predominantly taught, it is crucial that we also pursue an understanding of other countries' music, because their unique aspects contribute to the systematic completeness of music theory in its entirety.
Keywords: culture, development, music theory, music history, religion, western music
Procedia PDF Downloads 95
5243 Establish a Company in Turkey for Foreigners
Authors: Mucahit Unal, Ibrahim Arslan
Abstract:
The new Turkish Commercial Code (TCC) No. 6102 was published in the Official Gazette on February 14, 2011. As stated in the new Turkish Commercial Code No. 6102 and Law No. 6103 on the Validity and Application of the Turkish Commercial Code, the TCC came into effect on July 1, 2012. The basic purposes of the TCC are to form corporate governance coherent with international standards, to provide transparency in company management, to align the Turkish Commercial Code rules with European Union legislation, and to simplify company establishment for foreign investors so as to draw investment to the Turkish market. In this context, according to the TCC, joint stock companies and limited liability companies can be established with only a single shareholder; the single shareholder can be a foreigner; all board of directors members can be foreigners; and all shareholders and board members can be non-resident foreigners. Additionally, the TCC does not require physical participation in general shareholders' and board meetings: it allows these meetings to be held electronically, and their resolutions may also be approved via electronic signatures. Through this amendment, foreign investors no longer have to deal with red tape, and the TCC spares foreign companies unnecessary travel expenses. In accordance with all these amendments to the TCC, investing in the Turkish market is easy, simple, and transparent for foreign investors, and investors can establish a company in Turkey irrespective of nationality or place of residence. This article aims to analyze 'Establish a Company in Turkey for Foreigners' and to inform investors about investing in (especially establishing a company in) the Turkish market.
Keywords: establish a company, foreigner investors, invest in Turkish market, Turkish commercial code
Procedia PDF Downloads 264
5242 Predicting Wearable Technology Readiness in a South African Government Department: Exploring the Influence of Wearable Technology Acceptance and Positive Attitude
Authors: Henda J Thomas, Cornelia PJ Harmse, Cecile Schultz
Abstract:
Wearables are one of the technologies that will flourish within the fourth industrial revolution and digital transformation arenas, allowing employers to integrate collected data into organisational information systems. The study aimed to investigate whether wearable technology readiness can predict employees' acceptance of wearing wearables in the workplace. The factors of technology readiness predisposition that predict acceptance of, and positive attitudes towards, wearable use in the workplace were examined. A quantitative research approach was used. The population consisted of 8 081 employees of the South African Department of Employment and Labour (DEL). Census sampling was used; data-collection questionnaires were sent electronically to all 8 081 employees, and 351 were returned. The measuring instrument used in this study was the Technology Readiness and Acceptance Model (TRAM). Four hypotheses were formulated to investigate the relationship between readiness and acceptance of wearables in the workplace. The results showed consistent prediction of technology acceptance (TA) by the eagerness, optimism, and discomfort technology readiness (TR) scales. The TR scales of optimism and eagerness were consistent positive predictors of the TA scales, while discomfort proved to be a negative predictor for two of the three TA scales. Insecurity was found not to be a predictor of TA. It was recommended that the digital transformation policy of the DEL be revised. Wearables in the workplace should be embraced from the viewpoint of convenience, automation, and seamless integration with the DEL information systems. The empirical contribution of this study can be seen in the fact that positive attitude emerged as a factor that extends the TRAM. In this study, positive attitude is identified as a new dimension of the TRAM not found in the original TA model or subsequent TRAM studies.
Furthermore, this study found that Perceived Usefulness (PU) and Behavioural Intention to Use (BIU) could not be separated but formed one factor. The methodological contribution of this study can lead to the development of a Wearable Readiness and Acceptance Model (WRAM). To the best of our knowledge, no author has yet introduced the WRAM into the body of knowledge.
Keywords: technology acceptance model, technology readiness index, technology readiness and acceptance model, wearable devices, wearable technology, fourth industrial revolution
Procedia PDF Downloads 89
5241 Impact of pH Control on Peptide Profile and Antigenicity of Whey Hydrolysates
Authors: Natalia Caldeira De Carvalho, Tassia Batista Pessato, Luis Gustavo R. Fernandes, Ricardo L. Zollner, Flavia Maria Netto
Abstract:
Protein hydrolysates are ingredients of enteral diets and hypoallergenic formulas. Enzymatic hydrolysis is the most commonly used method for reducing the antigenicity of milk proteins. The antigenicity and physicochemical characteristics of protein hydrolysates depend on the reaction parameters; among them, pH has been pointed out as of major importance. Hydrolysis reactions at laboratory scale are commonly carried out under controlled pH (pH-stat). From the industrial point of view, however, controlling pH during the hydrolysis reaction may be infeasible. This study evaluated the impact of pH control on the physicochemical properties and antigenicity of whey protein hydrolysates obtained with Alcalase. Whey protein isolate (WPI) solutions containing 3 and 7% protein (w/v) were hydrolyzed with Alcalase at 50 and 100 U g-1 protein at 60°C for 180 min. The reactions were carried out under controlled and uncontrolled pH conditions: hydrolyses performed under controlled pH (pH-stat) were initially adjusted to and maintained at pH 8.5, while hydrolyses carried out without pH control were only initially adjusted to pH 8.5. The degree of hydrolysis (DH) was determined by the OPA method, the peptide profile was evaluated by RP-HPLC, and the molecular mass distribution by SDS-PAGE/Tricine. The residual α-lactalbumin (α-La) and β-lactoglobulin (β-Lg) concentrations were determined using commercial ELISA kits. The specific IgE- and IgG-binding capacity of the hydrolysates was evaluated by ELISA, using polyclonal antibodies obtained by immunization of female BALB/c mice with α-La, β-Lg, and BSA. In hydrolysis under uncontrolled pH, the pH dropped from 8.5 to 7.0 during the first 15 min, remaining constant throughout the rest of the process. No significant difference was observed between the DH of the hydrolysates obtained under controlled and uncontrolled pH conditions.
Although all hydrolysates showed a hydrophilic character and low-molecular-mass peptides, hydrolysates obtained with and without pH control exhibited different chromatographic profiles. Hydrolysis under uncontrolled pH released predominantly peptides between 3.5 and 6.5 kDa, while hydrolysis under controlled pH released peptides smaller than 3.5 kDa. Hydrolysis with Alcalase under all conditions studied decreased the α-La and β-Lg concentrations detected by the commercial kits by 99.9%. In general, the β-Lg concentrations detected in the hydrolysates obtained under uncontrolled pH were significantly higher (p<0.05) than those detected in hydrolysates produced with pH control. The anti-α-La and anti-β-Lg IgE and IgG responses to all hydrolysates decreased significantly compared with WPI; levels of specific IgE and IgG to the hydrolysates were below 25 and 12 ng ml-1, respectively. Despite the differences in peptide composition and in α-La and β-Lg concentrations, no significant difference was found between the IgE- and IgG-binding capacities of hydrolysates obtained with or without pH control. These results highlight the impact of pH on the characteristics of the hydrolysates and on their concentrations of antigenic protein. A divergence between antigen detection by commercial ELISA kits and the specific IgE- and IgG-binding response was found in this study; this shows that lower protein detection does not imply lower protein antigenicity. Thus, the use of commercial kits for allergen contamination analysis should be cautious.
Keywords: allergy, enzymatic hydrolysis, milk protein, pH conditions, physicochemical characteristics
Procedia PDF Downloads 303
5240 Performance of the New Laboratory-Based Algorithm for HIV Diagnosis in Southwestern China
Authors: Yanhua Zhao, Chenli Rao, Dongdong Li, Chuanmin Tao
Abstract:
The Chinese Centers for Disease Control and Prevention (CCDC) issued a new laboratory-based algorithm for HIV diagnosis in April 2016, which initially screens with a combination HIV-1/HIV-2 antigen/antibody fourth-generation immunoassay (IA) followed, when reactive, by an HIV-1/HIV-2 undifferentiated antibody IA in duplicate. Reactive specimens with concordant results undergo supplemental tests with western blots or HIV-1 nucleic acid tests (NATs), while specimens with discordant results (a reactive screen but a non-reactive antibody IA) receive HIV-1 NATs, p24 antigen tests, or follow-up testing after 2-4 weeks. However, little data evaluating the application of the new algorithm have been reported to date. This study evaluated the performance of the new laboratory-based HIV diagnostic algorithm in an inpatient population of Southwest China over the initial 6 months, compared with the old algorithm. Plasma specimens collected from inpatients from May 1, 2016, to October 31, 2016, were submitted to the laboratory for HIV screening performed by both the new testing algorithm and the old version. The sensitivity and specificity of the algorithms and the differences in the categorized numbers of plasma specimens were calculated. Under the new algorithm, 170 of the total 52 749 plasma specimens were confirmed as HIV-infected (0.32%). The sensitivity and specificity of the new algorithm were 100% (170/170) and 100% (52 579/52 579), respectively, while 167 HIV-1-positive specimens were identified by the old algorithm, with sensitivity 98.24% (167/170) and specificity 100% (52 579/52 579). Three acute HIV-1 infections (AHIs) and two early HIV-1 infections (EHIs) were identified by the new algorithm; the former were missed by the old procedure. Compared with the old version, the new algorithm produced fewer WB-indeterminate results (2 vs. 16, p = 0.001), which led to fewer follow-up tests.
The new HIV testing algorithm is therefore more sensitive for detecting acute HIV-1 infections while maintaining the ability to verify established HIV-1 infections, and it can dramatically decrease the number of WB-indeterminate specimens.
Keywords: algorithm, diagnosis, HIV, laboratory
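The two-stage decision flow described in this abstract can be sketched as a small function. The function name and its result labels below are illustrative assumptions for exposition, not CCDC terminology or an official implementation.

```python
def classify_specimen(fourth_gen_reactive, antibody_ia_reactive):
    """Sketch of the 2016 CCDC laboratory-based HIV screening flow.

    fourth_gen_reactive: result of the HIV-1/HIV-2 antigen/antibody
        fourth-generation immunoassay screen.
    antibody_ia_reactive: result of the HIV-1/HIV-2 undifferentiated
        antibody immunoassay run in duplicate (consulted only when the
        screen is reactive).
    """
    if not fourth_gen_reactive:
        # Non-reactive screen: report the specimen as HIV-negative.
        return "non-reactive"
    if antibody_ia_reactive:
        # Concordant reactive results: confirm with supplemental testing.
        return "supplemental: western blot or HIV-1 NAT"
    # Discordant results: possibly an acute infection the antibody assay
    # missed, so resolve with NAT, p24 antigen, or follow-up testing.
    return "resolve: HIV-1 NAT, p24 antigen, or 2-4 week follow-up"
```

The discordant branch is what lets the algorithm catch acute infections such as the three AHIs reported above, since antigen reactivity precedes antibody seroconversion.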
Procedia PDF Downloads 401
5239 The Processing of Context-Dependent and Context-Independent Scalar Implicatures
Authors: Liu Jia’nan
Abstract:
Default accounts hold that there exists a kind of scalar implicature that can be processed without context and that enjoys a psychological privilege over scalar implicatures that depend on context. In contrast, Relevance Theorists regard context as a must, because all scalar implicatures have to meet the requirement of relevance in discourse. However, Katsos' experimental results showed that although adults quantitatively rejected under-informative utterances with lexical scales (context-independent) and ad hoc scales (context-dependent) at almost the same rate, they still regarded violations of utterances with lexical scales as much more severe than those with ad hoc scales. Neither the default account nor Relevance Theory can fully explain this result, which raises two questions: (1) Is it possible that the strange discrepancy is due to factors other than the generation of scalar implicature? (2) Are the ad hoc scales truly formed under the possible influence of mental context; that is, do the participants generate scalar implicatures with ad hoc scales, rather than just comparing semantic differences among target objects in the under-informative utterance? In our Experiment 1, question (1) will be answered by a replication of Katsos' Experiment 1. Test materials will be shown in PowerPoint in the form of pictures, and each procedure will be conducted under the guidance of a tester in a quiet room. Our Experiment 2 is intended to answer question (2). The pictorial test material will be transformed into literal words in DMDX, and the target sentence will be shown word by word to participants in the soundproof room in our lab. Reading times of the target parts, i.e., the words containing scalar implicatures, will be recorded.
We presume that in the lexical-scale group, a standardized, pragmatically driven mental context will help generate the scalar implicature once the scalar word occurs, leading participants to expect the upcoming words to be informative. Thus, if the new input after the scalar word is under-informative, more time will be needed for the extra semantic processing. In the ad hoc-scale group, however, the scalar implicature may hardly be generated without the support of a fixed mental context for the scale; whether the new input is informative or not will therefore not matter, and the reading times of the target parts will be the same in informative and under-informative utterances. The human mind may be a dynamic system in which many factors co-occur. If Katsos' experimental result is reliable, will it shed light on the interplay of default accounts and context factors in scalar implicature processing? Based on our experiments, we may be able to assume that no single dominant processing paradigm is plausible. Furthermore, in the processing of scalar implicature, the semantic interpretation and the pragmatic interpretation may be made in a dynamic interplay in the mind. For the lexical scale, the pragmatic reading may prevail over the semantic reading because of its greater exposure in daily language use, which may also let the possible default or standardized paradigm override the role of context. However, the objects in an ad hoc scale are not usually treated as scalar members in mental context, and thus the lexical-semantic association of the objects may prevent their pragmatic reading from generating a scalar implicature. Only when sufficient contextual factors are highlighted can the pragmatic reading gain privilege and generate the scalar implicature.
Keywords: scalar implicature, ad hoc scale, dynamic interplay, default account, Mandarin Chinese processing
Procedia PDF Downloads 324
5238 Analysis of the Annual Proficiency Testing Procedure for Intermediate Reference Laboratories Conducted by the National Reference Laboratory from 2013 to 2017
Authors: Reena K., Mamatha H. G., Somshekarayya, P. Kumar
Abstract:
Objectives: The annual proficiency testing of intermediate reference laboratories is conducted by the National Reference Laboratory (NRL) to assess the laboratories' ability to correctly identify Mycobacterium tuberculosis and to determine its drug susceptibility pattern. The proficiency testing results from 2013 to 2017 were analyzed to determine which laboratories were consistent in reporting quality results and which had difficulty in doing so. Methods: A panel of twenty cultures was sent to each of these laboratories. The laboratories were expected to grow the cultures, set up drug susceptibility testing by all the methods for which they were certified, and report the results within the stipulated time period. The turnaround time for reporting results, the specificity, sensitivity, positive and negative predictive values, and the efficiency of each laboratory in identifying the cultures were analyzed. Results: Most of the laboratories reported their results within the stipulated time period; however, there were enormous delays in reporting from a few of the laboratories, mainly due to improper functioning of the biosafety level III laboratory. Only 40% of the laboratories achieved 100% efficiency in solid culture using Lowenstein-Jensen medium. This was expected, as solid culture and drug susceptibility testing are not used for diagnosing drug resistance: rapid molecular methods such as the line probe assay and GeneXpert are used to determine drug resistance, while automated liquid culture systems such as the mycobacterial growth indicator tube are used to monitor the patient's prognosis during treatment. It was observed that 90% of the laboratories achieved 100% efficiency in the liquid culture method. Almost all laboratories achieved 100% efficiency in the line probe assay, which is the method of choice for detecting drug-resistant tuberculosis.
Conclusion: Since the liquid culture and line probe assay technologies are routinely used for the detection of drug-resistant tuberculosis, the laboratories exhibited a higher level of efficiency in these methods than in solid culture and drug susceptibility testing, which are rarely used. The infrastructure of the laboratory should be maintained properly so that samples can be processed safely and results declared on time.
Keywords: annual proficiency testing, drug susceptibility testing, intermediate reference laboratory, national reference laboratory
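The laboratory performance measures named in the abstract above (sensitivity, specificity, predictive values, efficiency) follow directly from a panel's confusion counts. A minimal sketch, with hypothetical counts rather than the NRL's data:

```python
# Hedged sketch of how panel results might be scored. The counts below are
# hypothetical, not taken from the NRL report.
def panel_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, PPV, NPV and overall efficiency."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    efficiency = (tp + tn) / (tp + fp + tn + fn)  # overall agreement
    return sensitivity, specificity, ppv, npv, efficiency

# Example: a 20-culture panel with 14 resistant and 6 susceptible strains,
# one false-negative and one false-positive call by the laboratory.
print(panel_metrics(tp=13, fp=1, tn=5, fn=1))
```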
Procedia PDF Downloads 182
5237 Developing a Third Degree of Freedom for Opinion Dynamics Models Using Scales
Authors: Dino Carpentras, Alejandro Dinkelberg, Michael Quayle
Abstract:
Opinion dynamics models use an agent-based modeling approach to model people’s opinions. A model's properties are usually explored by testing its two 'degrees of freedom': the interaction rule and the network topology. The latter defines the connections, and thus the possible interactions, among agents. The interaction rule, instead, determines how agents select each other and update their own opinions. Here we show the existence of a third degree of freedom. It can be used for turning one model into another, or for changing a model’s output by up to 100% of its initial value. Opinion dynamics models represent the evolution of real-world opinions parsimoniously. Thus, it is fundamental to know how a real-world opinion (e.g., supporting a candidate) can be turned into a number. Specifically, we want to know whether, by choosing a different opinion-to-number transformation, the model’s dynamics would be preserved. This transformation is typically not addressed in the opinion dynamics literature. However, it has already been studied in psychometrics, a branch of psychology. In this field, real-world opinions are converted into numbers using abstract objects called 'scales.' These scales can be converted one into another, in the same way as we convert meters to feet. Thus, in our work, we analyze how this scale transformation may affect opinion dynamics models. We perform our analysis both using mathematical modeling and validating it via agent-based simulations. To distinguish between scale transformation and measurement error, we first analyze the case of perfect scales (i.e., no error or noise). Here we show that a scale transformation may change the model’s dynamics up to a qualitative level, meaning that a researcher may reach a totally different conclusion, even using the same dataset, just by slightly changing the way the data are pre-processed. Indeed, we quantify that this effect may alter the model’s output by 100%.
By using two models from the standard literature, we show that a scale transformation can turn one model into the other. This transformation is exact, and it holds for every result. Lastly, we also test the case of real-world data (i.e., finite precision). We perform this test using a 7-point Likert scale, showing how even a small scale change may result in different predictions or a different number of opinion clusters. Because of this, we think that the scale transformation should be considered a third degree of freedom for opinion dynamics. Indeed, its properties have a strong impact both on theoretical models and on their application to real-world data.
Keywords: degrees of freedom, empirical validation, opinion scale, opinion dynamics
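The scale effect described above can be illustrated with a toy example (not the paper's own models): the same set of opinions, grouped with the same gap rule, yields a different number of clusters after a monotone rescaling such as squaring.

```python
# Minimal, hypothetical illustration of the paper's point: whether opinions
# form one cluster or several can depend purely on the opinion-to-number
# scale chosen, not on the opinions themselves.

def count_clusters(values, gap=0.155):
    """Group sorted values; a jump larger than `gap` starts a new cluster."""
    vs = sorted(values)
    n = 1
    for a, b in zip(vs, vs[1:]):
        if b - a > gap:
            n += 1
    return n

raw = [0.5, 0.6, 0.7, 0.8, 0.9, 1.0]   # opinions on one scale
rescaled = [v ** 2 for v in raw]       # same opinions, monotone rescaling

print(count_clusters(raw), count_clusters(rescaled))  # 1 vs 3
```

Equal 0.1 spacings on the raw scale become unequal after squaring, so the same gap rule splits the rescaled opinions into three groups.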
Procedia PDF Downloads 155
5236 Oil Contents, Mineral Compositions, and Their Correlations in Wild and Cultivated Safflower Seeds
Authors: Rahim Ada, Mustafa Harmankaya, Sadiye Ayse Celik
Abstract:
The safflower seed contains about 25-40% solvent extract and 20-33% fiber. It is well known that dietary phospholipids effectively lower serum cholesterol levels. The nutrient composition of safflower seed changes depending on region, soil, and genotype. This research used naturally selected safflower lines (A22, A29, A30, C12, E1, F4, G8, G12, J27) and three commercial varieties (Remzibey, Dincer, Black Sun1). The research was conducted under field conditions for two years (2009 and 2010) in a randomized complete block design with three replications under the ecological conditions of Konya, Turkey. Oil contents, mineral contents, and their correlations were determined. According to the results, oil content ranged from 22.38% to 34.26%, while the minerals were within the following ranges: 1469.04-2068.07 mg kg-1 for Ca, 7.24-11.71 mg kg-1 for B, 13.29-17.41 mg kg-1 for Cu, 51.00-79.35 mg kg-1 for Fe, 3988-6638.34 mg kg-1 for K, 1418.61-2306.06 mg kg-1 for Mg, 11.37-17.76 mg kg-1 for Mn, 4172.33-7059.58 mg kg-1 for P, and 32.60-59.00 mg kg-1 for Zn. Correlation analyses carried out separately for the commercial varieties and the wild lines showed that, in the commercial varieties, oil content was negatively correlated with all the investigated minerals except K and Zn.
Keywords: safflower, oil, quality, mineral content
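A sketch of the kind of correlation analysis reported, using Pearson's r on illustrative values (not the paper's data):

```python
# Pearson correlation between oil content and one mineral across genotypes.
# All values below are illustrative stand-ins, not the study's measurements.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

oil = [22.4, 25.1, 28.3, 30.2, 32.8, 34.3]   # % oil, illustrative
ca  = [2060, 1980, 1820, 1700, 1560, 1470]   # Ca in mg/kg, illustrative
print(round(pearson_r(oil, ca), 3))          # strongly negative, as reported for Ca
```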
Procedia PDF Downloads 267
5235 An Effective Decision-Making Strategy Based on Multi-Objective Optimization for Commercial Vehicles in Highway Scenarios
Authors: Weiming Hu, Xu Li, Xiaonan Li, Zhong Xu, Li Yuan, Xuan Dong
Abstract:
Maneuver decision-making plays a critical role in high-performance intelligent driving. This paper proposes a risk assessment-based decision-making network (RADMN) to address the problem of driving strategy for commercial vehicles. RADMN integrates two networks, aiming at identifying the degree of collision and rollover risk and providing decisions to ensure the effectiveness and reliability of the driving strategy. In the risk assessment module, the risk degrees of backward collision, forward collision, and rollover are quantified for hazard recognition. In the decision module, a deep reinforcement learning based on multi-objective optimization (DRL-MOO) algorithm is designed, which comprehensively considers the risk degree and motion states of each traffic participant. To evaluate the performance of the proposed framework, Prescan/Simulink joint simulations were conducted in highway scenarios. Experimental results validate the effectiveness and reliability of the proposed RADMN. The output driving strategy can guarantee safety and provides key technical support for the realization of autonomous driving of commercial vehicles.
Keywords: decision-making strategy, risk assessment, multi-objective optimization, commercial vehicle
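The abstract does not specify the DRL-MOO internals; the following hypothetical sketch shows only the underlying multi-objective idea: score each candidate maneuver by a weighted combination of the quantified risk degrees and choose the minimum-cost one. All maneuver names, risk values, and weights are invented for illustration.

```python
# Hypothetical scalarization of the three risk objectives named above
# (forward collision, backward collision, rollover); not the paper's method.
def maneuver_cost(risks, weights):
    """risks: dict of risk degrees in [0, 1]; weights: same keys."""
    return sum(weights[k] * risks[k] for k in risks)

weights = {"forward_collision": 0.4, "backward_collision": 0.2, "rollover": 0.4}
candidates = {
    "keep_lane":   {"forward_collision": 0.7, "backward_collision": 0.1, "rollover": 0.0},
    "change_left": {"forward_collision": 0.2, "backward_collision": 0.3, "rollover": 0.3},
    "brake":       {"forward_collision": 0.3, "backward_collision": 0.6, "rollover": 0.0},
}
# Pick the maneuver with the lowest combined risk cost.
best = min(candidates, key=lambda m: maneuver_cost(candidates[m], weights))
print(best)
```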
Procedia PDF Downloads 135
5234 Digital Geomatics Trends for Production and Updating Topographic Map by Using Digital Generalization Procedures
Authors: O. Z. Jasim
Abstract:
An accurate digital map must satisfy its users in two main respects: first, the map must be visually readable, and second, all map elements must be well represented. These two requirements hold especially true for map generalization, which aims at simplifying the representation of cartographic data. Maps at different scales are very important for decision-making, from master plans to all the infrastructure maps used in civil engineering. The cartographer cannot simply project the data onto a piece of paper; he also has to worry about its readability. The map layout of any geodatabase is very important, as it helps users read, analyze, and extract information from the map. Many principles and guidelines of generalization can be found in the cartographic literature. A manual reduction method for generalization depends on the experience of the map maker and therefore produces inconsistent results. Digital generalization, rooted in conventional cartography, has become an increasing concern in both the Geographic Information System (GIS) and mapping fields. This project is intended to review the state of the art of the new technology, help in understanding the needs and plans for the implementation of a digital generalization capability, and increase knowledge of the production of topographic maps.
Keywords: cartography, digital generalization, mapping, GIS
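Line simplification is a core digital-generalization operation. The abstract names no specific algorithm, so the classic Douglas-Peucker procedure serves here as a representative sketch: points closer than a tolerance to the chord of a polyline segment are dropped, recursively.

```python
# Douglas-Peucker polyline simplification, a standard generalization
# primitive (illustrative; the project itself may use other operators).
import math

def perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Keep only points farther than `tol` from the chord, recursively."""
    if len(points) < 3:
        return points
    d, idx = max((perp_dist(p, points[0], points[-1]), i)
                 for i, p in enumerate(points[1:-1], 1))
    if d <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], tol)
    return left[:-1] + douglas_peucker(points[idx:], tol)

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, tol=1.0))
```

With tolerance 1.0 the near-collinear wiggles are removed and only the corner points survive.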
Procedia PDF Downloads 306
5233 Loading by Number Strategy for Commercial Vehicles
Authors: Ramalan Musa Yerima
Abstract:
The paper titled “Loading by Number” explains a strategy developed recently by the Zonal Commanding Officer of the Federal Road Safety Corps of Nigeria, covering Sokoto, Kebbi and Zamfara States of Northern Nigeria. The strategy is aimed at reducing competition, which will invariably lead to a reduction in speed, reduction in dangerous driving, reduction in crash rate, reduction in injuries, reduction in property damage and reduction in deaths through road traffic crashes (RTC). This research paper presents a study focused on enhancing the safety of commercial vehicles. The background of this study highlights the alarming statistics related to commercial vehicle crashes in Nigeria, with a focus on Sokoto, Kebbi and Zamfara States, which often result in significant damage to property, loss of lives, and economic costs. The aim is to investigate and propose an effective strategy to enhance the safety of commercial vehicles. The study recognizes the pressing need for heightened safety measures in commercial transportation, as it impacts not only the well-being of drivers and passengers but also overall public safety. To achieve the objectives, an examination of accident data, including causes and contributing factors, was performed to identify critical areas for improvement. The major finding of the study reveals that when competition comes into play within the realm of commercial driving, it has detrimental effects on road safety and resource management. Commercial drivers are pushed to complete their routes quickly and deliver goods on time, or they push themselves to arrive quickly for more passengers and new contracts. This competitive environment, fuelled by internal and external pressures such as tight deadlines, poverty and greed, often leads to sad endings.
The study recommends that if a strategy called Loading by Number is integrated with other safety measures, such as driver training programs, regulatory enforcement, and infrastructure improvements, commercial vehicle safety can be significantly enhanced. The "Loading by Number" approach is designed to ensure that the sequence of departure of drivers from motor park ‘A’ is communicated to the motor park officials of park ‘B’ and is followed when assigning them returning passengers, regardless of who arrives first. In conclusion, this paper underscores the significance of improving the safety measures of commercial vehicles, as they are often larger and heavier than other vehicles on the road; whenever they are involved in accidents, the consequences can be more severe. Commercial vehicles are also frequently involved in long-haul or interstate transportation, which means they cover longer distances and spend more time on the road. This increased exposure to driving conditions increases the probability of accidents occurring. By implementing the suggested measures, policymakers, transportation authorities, and industry stakeholders can work collectively towards ensuring a safer commercial transportation system.
Keywords: commercial, safety, strategy, transportation
Procedia PDF Downloads 62
5232 Defining the Customers' Color Preference for the Apparel Industry in Terms of Chromaticity Coordinates
Authors: Banu Hatice Gürcüm, Pınar Arslan, Mahmut Yalçın
Abstract:
Fashion designers create lots of dresses, suits, shoes, and other clothing and accessories, which are purchased every year by consumers. Fashion trends, sketches of designs, and accessories affect apparel goods, but colors make the finishing touches to an outfit. In all fields of apparel, men's, women's, and children's wear, including casual wear, suits, sportswear, formal wear, outerwear, maternity, and intimate apparel, color sells. Thus, specialization in color in apparel is a basic concern each season. The perception of color is the key to sales for every sector of the textile business. The mechanism of color perception, cognition in the brain, and color emotion are unique subjects that scientists have been investigating for many years. The parameters of color may not correspond to visual scales, since human emotions induced by color are completely subjective. However, with very few exceptions, each manufacturer identifies its top-selling colors for each season through seasonal sales reports. This paper examines sensory and instrumental methods for quantifying the color of fabrics and investigates the relationship between fabric color and sales numbers. The 5 top-selling colors for each season are taken from 10 leading apparel companies in the same segment. The compilation is based on the companies' sales over 5 to 10 years. The research’s main concern is the correlation between the magnitude of seasonal color sales figures and the CIE chromaticity coordinates. The colors are chosen from the globally accepted Pantone Textile Color System, and the three-dimensional measurement system CIE L*a*b* (CIELAB) is used, with L* representing the degree of lightness of a color, a* the degree of color ranging from magenta to green, and b* the degree of color ranging from blue to yellow. The objective of this paper is to demonstrate the feasibility of relating color preference to a laboratory instrument yielding measurements in the CIELAB system.
Our approach is to obtain a total of one hundred reference fabrics to be measured on a laboratory spectrophotometer calibrated to the CIELAB color system. Relationships between the CIE tristimulus values (X, Y, Z) and CIELAB (L*, a*, b*) are examined and reported herein.
Keywords: CIELAB, CIE tristimulus, color preference, fashion
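The XYZ-to-CIELAB conversion underlying such spectrophotometer measurements is standardized; a minimal sketch, assuming the D65 reference white:

```python
# Standard CIE XYZ -> CIELAB conversion (D65/2-degree white point assumed;
# the study's instrument may be calibrated to a different illuminant).
def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
    def f(t):
        # cube root above the CIE cutoff, linear segment below it
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    L = 116 * fy - 16          # lightness
    a = 500 * (fx - fy)        # magenta-green axis
    b = 200 * (fy - fz)        # blue-yellow axis
    return L, a, b

# Sanity check: the reference white itself maps to L* = 100, a* = b* = 0.
print(xyz_to_lab(95.047, 100.0, 108.883))
```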
Procedia PDF Downloads 336
5231 Strategy of Loading by Number for Commercial Vehicles
Authors: Ramalan Musa Yerima
Abstract:
The paper titled “Loading by number” explained a strategy developed recently by the Zonal Commanding Officer of the Federal Road Safety Corps of Nigeria, covering Sokoto, Kebbi and Zamfara States of Northern Nigeria. The strategy is aimed at reducing competition, which will invariably lead to a reduction in speed, reduction in dangerous driving, reduction in crash rate, reduction in injuries, reduction in property damages and reduction in death through road traffic crashes (RTC). This research paper presents a study focused on enhancing the safety of commercial vehicles. The background of this study highlights the alarming statistics related to commercial vehicle crashes in Nigeria with a focus on Sokoto, Kebbi and Zamfara States, which often result in significant damage to property, loss of lives, and economic costs. The aim is to investigate and propose an effective strategy to enhance the safety of commercial vehicles. The study recognizes the pressing need for heightened safety measures in commercial transportation, as it impacts not only the well-being of drivers and passengers but also the overall public safety. To achieve the objectives, an examination of accident data, including causes and contributing factors, was performed to identify critical areas for improvement. The major finding of the study reveals that when competition comes into play within the realm of commercial driving, it has detrimental effects on road safety and resource management. Commercial drivers are pushed to complete their routes quickly and deliver goods on time, or they push themselves to arrive quickly for more passengers and new contracts. This competitive environment, fuelled by internal and external pressures such as tight deadlines, poverty and greed, often leads to sad endings.
The study recommends that if a strategy called loading by number is integrated with other multiple safety measures, such as driver training programs, regulatory enforcement, and infrastructure improvements, commercial vehicle safety can be significantly enhanced. The "Loading by Number” approach is designed to ensure that the sequence of departure of drivers from motor park ‘A’ is communicated to motor park officials of park ‘B’, which is considered sequentially when giving them returning passengers, regardless of who arrives first. In conclusion, this paper underscores the significance of improving the safety measures of commercial vehicles, as they are often larger and heavier than other vehicles on the road. Whenever they are involved in accidents, the consequences can be more severe. Commercial vehicles are also frequently involved in long-haul or interstate transportation, which means they cover longer distances and spend more time on the road. This increased exposure to driving conditions increases the probability of accidents occurring. By implementing the suggested measures, policymakers, transportation authorities, and industry stakeholders can work collectively toward ensuring a safer commercial transportation system.
Keywords: commercial, safety, strategy, transport
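A hypothetical sketch of the "Loading by Number" sequencing rule as described: park B loads returning passengers in park A's departure order, not in order of arrival. The bus identifiers below are invented for illustration.

```python
# Hypothetical sketch of the sequencing rule: park A's departure order is
# the loading order at park B, removing the incentive to race.
from collections import deque

departure_sequence_a = deque(["bus-07", "bus-12", "bus-03"])  # order of leaving park A

def loading_order_at_b(arrivals_at_b):
    """Return the order in which park B assigns returning passengers:
    park A's departure sequence, restricted to buses that have arrived."""
    return [bus for bus in departure_sequence_a if bus in arrivals_at_b]

# bus-03 may have arrived at park B first, but bus-07 left park A first,
# so bus-07 loads first: speeding to park B gains a driver nothing.
print(loading_order_at_b({"bus-03", "bus-07", "bus-12"}))
```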
Procedia PDF Downloads 64
5230 Efficiency of the Slovak Commercial Banks Applying the DEA Window Analysis
Authors: Iveta Řepková
Abstract:
The aim of this paper is to estimate the efficiency of the Slovak commercial banks employing the Data Envelopment Analysis (DEA) window analysis approach during the period 2003-2012. The research is based on unbalanced panel data for the Slovak commercial banks. An undesirable output was included in the analysis of banking efficiency. It was found that the most efficient banks were Postovabanka, UniCredit Bank, and Istrobanka in the CCR model, and Slovenskasporitelna, Istrobanka, and UniCredit Bank in the BCC model. On the contrary, the least efficient banks were Privatbanka and CitiBank. We found that the largest banks in the Slovak banking market were less efficient than medium-sized and small banks. A further result of the paper is that average efficiency increased during the period 2003-2008 and then decreased during 2010-2011 as a result of the financial crisis.
Keywords: data envelopment analysis, efficiency, Slovak banking sector, window analysis
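The paper applies full multi-input, multi-output DEA, which requires solving one linear program per bank. As a hedged illustration only: with a single input and a single output, input-oriented CCR efficiency reduces to each bank's output/input ratio relative to the best ratio in the sample. Bank names and figures below are invented.

```python
# Degenerate single-input/single-output special case of DEA-CCR efficiency;
# the paper's multi-dimensional model needs a linear-programming solver.
banks = {                       # hypothetical (input: costs, output: loans)
    "Bank A": (100, 80),
    "Bank B": (150, 90),
    "Bank C": (200, 180),
}

def ccr_single(data):
    """Efficiency = own output/input ratio divided by the best ratio."""
    ratios = {k: out / inp for k, (inp, out) in data.items()}
    best = max(ratios.values())
    return {k: r / best for k, r in ratios.items()}

print(ccr_single(banks))   # the frontier bank scores exactly 1.0
```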
Procedia PDF Downloads 359
5229 Parking Service Effectiveness at Commercial Malls
Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal
Abstract:
We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers due to the lack of other alternatives. The difficulty and the relatively long time wasted in finding a parking spot at the mall are real annoyances. We applied queuing analysis to one of the major malls that offers paid parking (1040 parking spots) in addition to free parking. Patrons of the mall usually complained of the traffic jams and delays at the entrance to the paid parking (the average delay to park exceeds 15 min for about 62% of the patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison we made indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller yet important malls do, including the one we studied. It was suggested that the well-designed inlets and outlets of that gigantic mall permit smooth parking despite being totally free, even though the mall is the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification. Simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the parking garage entrance layout, and with the inclusion of drivers’ behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels.
The outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait.
Keywords: commercial malls, parking service, queuing analysis, simulation modeling
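A sketch of the kind of queuing calculation such an analysis rests on: an M/M/c model of the check-in gates, with illustrative rates rather than the mall's measured data.

```python
# M/M/c gate model: Poisson arrivals at rate lam, exponential service at
# rate mu per gate, c gates. Rates below are illustrative only.
import math

def erlang_c(lam, mu, c):
    """Probability that an arriving car must wait (Erlang C formula)."""
    a = lam / mu                      # offered load
    rho = a / c                       # utilization, must be < 1
    s = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / (math.factorial(c) * (1 - rho))
    return top / (s + top)

def mean_wait(lam, mu, c):
    """Average wait in queue (in minutes if the rates are per minute)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# e.g. 4 cars/min arriving, each gate processing 1.5 cars/min, 3 gates:
print(round(mean_wait(lam=4, mu=1.5, c=3), 2))   # about 1.6 minutes
```

Adding a fourth gate or speeding up ticketing drops this wait sharply, which is exactly the kind of what-if the authors' simulation model is built to explore with non-Poisson inputs.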
Procedia PDF Downloads 340
5228 Investigation of the Decisive Factors on the Slump Loss: A Case Study of Cement Factors (Portland Cement Type 2)
Authors: M. B. Ahmadi, A. A. Kaffash B., B. Mobaraki
Abstract:
Slump loss, which refers to the gradual reduction of workability and of the amount of slump in fresh concrete over time, is one of the significant challenges in the ready-mixed concrete industry. Therefore, accurate knowledge of the factors affecting slump loss is crucial in this field. In this paper, an attempt was made to investigate the effect of cement produced by different units on the slump of concrete in a laboratory setting. For this purpose, 12 cement samples were prepared from 6 different production units. Physical and chemical tests were performed on the cement samples. Subsequently, a laboratory concrete mix with a slump of 13 ± 1 cm was prepared with each cement sample, and the slump was measured at 0, 15, 30, 45, and 60 minutes. Although the environmental factors, mix design specifications, and execution conditions (factors that significantly influence the slump loss trend) were constant in all 12 laboratory concrete mixes, the slump loss trends differed among them. These trends were categorized based on the results, and relationships between the slump loss percentage at 60 minutes, the water-cement ratio, and the LOI and K2O values of the different cements were introduced.
Keywords: concrete, slump loss, portland cement, efficiency
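The slump-loss measure used above reduces to a simple percentage of the initial slump lost over the 60-minute window; a minimal sketch with illustrative readings:

```python
# Slump loss as a percentage of the initial reading. The readings below
# are illustrative, not measurements from the study.
def slump_loss_pct(readings):
    """readings: slump values (cm) at successive times; first = initial."""
    return 100 * (readings[0] - readings[-1]) / readings[0]

# slump measured at 0, 15, 30, 45 and 60 minutes, as in the study's protocol:
mix = [13.0, 11.5, 10.0, 8.5, 7.0]
print(round(slump_loss_pct(mix), 1))   # about 46.2 % of the slump lost in one hour
```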
Procedia PDF Downloads 79
5227 Assessment of the Economic Potential of Lead Contaminated Brownfield for Growth of Oil Producing Crop Like Helianthus annuus (Sunflower)
Authors: Shahenaz Sidi, S. K. Tank
Abstract:
When sparsely used industrial and commercial facilities are retired or abandoned, one of the biggest issues that arises is what to do with the remaining land. This land, referred to as a ‘brownfield site’ or simply ‘brownfield’, is often contaminated with waste and pollutants left behind by the defunct industrial facilities and factories that stood on it. Phytoremediation has proved to be a promising, greener and cleaner technology for remediating such land, unlike chemical and excavation methods. Helianthus annuus is a hyperaccumulator of lead and can be used in remediation procedures for metal-contaminated soils. It is a fast-growing crop, which favours soil stabilization. Its tough leaves and stems are rarely eaten by animals, and the seeds (actively eaten by birds) have very low concentrations of potentially toxic elements, representing a low risk to the food web. The study is conducted to determine the phytoextraction potential of the plant and the prospects for eventual seed harvesting and commercial oil production on remediated soil.
Keywords: brownfield, phytoextraction, helianthus, oil, commercial
Procedia PDF Downloads 338
5226 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on an encoding scheme (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, the deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. For scene classification, there are scattered objects of different sizes, categories, layouts, numbers, and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained on the ImageNet and Places datasets, respectively, are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in a different scale range. A scale-wise CNN adaptation is reasonable, since each object in a scene appears at its own specific scale. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the per-scale representations are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at different scales. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to the results at a single scale). The result is comparable to state-of-the-art performance, which proves that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
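The scale-wise normalization step can be sketched as L2-normalizing each scale's aggregated vector before average pooling, so that scales producing larger activations do not dominate the pooled representation. Toy two-dimensional vectors stand in for real Fisher vectors here:

```python
# Sketch of scale-wise normalization + average pooling; toy vectors, not
# actual Fisher vectors computed from CNN activations.
import math

def l2_normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v] if n else v

def pool_scales(per_scale_vectors):
    """L2-normalize each scale's vector, then average-pool across scales."""
    normed = [l2_normalize(v) for v in per_scale_vectors]
    k = len(normed)
    return [sum(v[i] for v in normed) / k for i in range(len(normed[0]))]

# Two scales pointing the same way but with very different magnitudes:
scales = [[3.0, 4.0], [0.06, 0.08]]
print(pool_scales(scales))   # both scales contribute equally (about [0.6, 0.8])
```

Without the per-scale normalization, the first scale's 50x larger magnitude would swamp the second in the pooled vector.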
Procedia PDF Downloads 333
5225 Large Amplitude Vibration of Sandwich Beam
Authors: Youssef Abdelli, Rachid Nasri
Abstract:
The large-amplitude free vibration analysis of three-layered symmetric sandwich beams is carried out using two different approaches. The governing nonlinear partial differential equations of motion in free natural vibration are derived using Hamilton's principle. The formulation leads to two nonlinear partial differential equations that are coupled in both axial and bending deformations. In the first approach, the method of multiple scales is applied directly to the governing equation, which is a nonlinear partial differential equation. In the second approach, we discretize the governing equation using Galerkin's procedure and then apply the shooting method to the resulting ordinary differential equations. In order to check the validity of the solutions obtained by the two approaches, they are compared with the solutions obtained numerically by the finite difference method.
Keywords: finite difference method, large amplitude vibration, multiple scales, nonlinear vibration
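As a hedged scalar illustration of the multiple-scales idea (a Duffing oscillator stands in for the sandwich-beam equations, which the abstract does not give): for u'' + u + eps*u^3 = 0, the method of multiple scales predicts the amplitude-dependent frequency w = 1 + 3*eps*a^2/8 at first order, which can be checked against direct numerical integration.

```python
# Duffing oscillator as a stand-in nonlinear vibration problem: compare the
# multiple-scales frequency prediction with an RK4-integrated period.
import math

def duffing_period_numeric(a, eps, dt=1e-4):
    """Start at u = a, u' = 0; the period is 4x the first zero-crossing time."""
    u, v, t = a, 0.0, 0.0
    def acc(u):
        return -(u + eps * u ** 3)
    while u > 0:
        # classical RK4 step for the system (u' = v, v' = acc(u))
        k1u, k1v = v, acc(u)
        k2u, k2v = v + dt / 2 * k1v, acc(u + dt / 2 * k1u)
        k3u, k3v = v + dt / 2 * k2v, acc(u + dt / 2 * k2u)
        k4u, k4v = v + dt * k3v, acc(u + dt * k3u)
        u += dt / 6 * (k1u + 2 * k2u + 2 * k3u + k4u)
        v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
    return 4 * t

a, eps = 1.0, 0.1
t_num = duffing_period_numeric(a, eps)
t_ms = 2 * math.pi / (1 + 3 * eps * a ** 2 / 8)   # multiple-scales prediction
print(t_num, t_ms)   # the first-order prediction is within about 0.1%
```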
Procedia PDF Downloads 463
5224 Forensic Entomology in Algeria
Authors: Meriem Taleb, Ghania Tail, Fatma Zohra Kara, Brahim Djedouani, T. Moussa
Abstract:
Forensic entomology is the use of insects and their arthropod relatives as silent witnesses to aid legal investigations by interpreting information concerning a death. The main purpose of forensic entomology is to establish the postmortem interval, or PMI. The postmortem interval is a matter of crucial importance in investigations of homicide and other untimely deaths when the body is found three or more days after death. Forensic entomology has grown immensely as a discipline in the past thirty years. In Algeria, forensic entomology was introduced in 2010 by the National Institute for Criminalistics and Criminology of the National Gendarmerie (NICC). However, all the work that has been done so far in this growing field in Algeria has remained unknown at both the national and international levels. In this context, the aim of this paper is to describe the state of forensic entomology in Algeria. The Laboratory of Entomology of the NICC is the only one of its kind in Algeria. It started its activities in 2010 with two specialists. The main missions of the laboratory are the estimation of the PMI through the analysis of entomological evidence, and the determination of whether the body was moved. Currently, the laboratory performs different tasks, such as the expert work required by investigators to estimate the PMI from insects. The estimation is performed using the accumulated degree days (ADD) method in most cases, except those where the cadaver is in dry decay. To assure the quality of the entomological evidence, crime scene personnel are trained by the Laboratory of Entomology of the NICC. Recently, undergraduate and graduate students have been studying carrion ecology and insect activity in different geographic locations of Algeria using rabbit and wild boar cadavers as animal models. The Laboratory of Entomology of the NICC has also been involved in some of these research projects.
Entomotoxicology experiments are also conducted in collaboration with the Toxicology Department of the NICC. Owing to the hard work performed by the Laboratory of Entomology of the NICC, official bodies have been increasingly adopting the use of entomological evidence in criminal investigations in Algeria, which is commendable. It is important, therefore, that steps are taken to fill in the gaps in the knowledge necessary for entomological evidence to have a useful future in criminal investigations in Algeria.
Keywords: forensic entomology, corpse, insects, postmortem interval, expertise, Algeria
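The accumulated degree days (ADD) method named above can be sketched as a running thermal sum: each day contributes its mean temperature above a species-specific base threshold, and the PMI estimate is the day the running total reaches the ADD that species needs to attain the observed development stage. The figures below are illustrative, not casework data.

```python
# Sketch of the accumulated degree days (ADD) method; base temperature,
# ADD requirement and daily means are illustrative stand-ins.
def days_to_reach(add_required, daily_mean_temps, base_temp=10.0):
    """Return the day (1-based) on which the accumulated degree days first
    reach `add_required`, or None if the record is too short."""
    total = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        total += max(0.0, temp - base_temp)   # only heat above the base counts
        if total >= add_required:
            return day
    return None

temps = [24.0, 26.0, 25.0, 23.0, 27.0, 28.0]   # daily mean temperatures, degrees C
print(days_to_reach(add_required=60.0, daily_mean_temps=temps))
```

Reading the result backwards from the insects' development stage gives the minimum PMI estimate.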
Procedia PDF Downloads 409
5223 Permeodynamic Particulate Matter Filtration for Improved Air Quality
Authors: Hamad M. Alnagran, Mohammed S. Imbabi
Abstract:
Particulate matter (PM) in the air we breathe is detrimental to health. Overcoming this problem has attracted interest and prompted research on the use of PM filtration in commercial buildings and homes. The consensus is that tangible health benefits can result from the use of PM filters in most urban environments to clean up a building’s fresh air supply and thereby reduce the exposure of residents to airborne PM. The authors have investigated and are developing a new large-scale Permeodynamic Filtration Technology (PFT) capable of permanently filtering and removing airborne PM from outdoor spaces, thus also benefiting internal spaces such as the interiors of buildings. Theoretical models were developed, and laboratory trials were carried out to determine, and validate through measurement, the permeodynamic filtration efficiency and pressure drop as functions of the PM particle size distribution. The conclusion is that PFT offers a potentially viable, cost-effective, end-of-pipe solution to the problem of airborne PM.
Keywords: air filtration, particulate matter, particle size distribution, permeodynamic
Procedia PDF Downloads 204