Search results for: Process Quality.
5252 Kazakhstani Koreans' Conflict of Linguistic Identity: In-between the Sovietized and Kazakhstani Citizens
Authors: Soon-ok Myong, Byong-soon Chun
Abstract:
This paper aims to identify the ethnic Kazakhstani Koreans' political process of identity formation by exploring their narratives and practices concerning the state language, as represented in the course of their becoming citizens of a newly independent state. The Russophone Kazakhstani Koreans' inability to speak the official language of their state is regarded as failing a basic requirement of citizenship, so that they are becoming marginalized from the public sphere. Their contradictory attitude, which at once demonstrates nominal acceptance and practical rejection of the obligatory state language, reveals a high internal barrier between self-language and other-language. In this paper, the ethnic Korean group's conflicting linguistic identity is seen not as a free and simple choice, but as a dynamic struggle and political process in which the subject's past experiences and memories intersect with external elements of pressure.
Keywords: Ethnic Kazakhstani Koreans, Soviet Koreans' Russification, Linguistic Identity, Russian-Kazakh Dichotomy.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1597
5251 How Team Efficacy Beliefs Impact Project Performance: An Empirical Investigation of Team Potency in Capital Projects in the Process Industries
Authors: C. Scott-Young, D. Samson
Abstract:
Team efficacy beliefs show promise in enhancing team performance. Using a model-based quantitative research design, we investigated the antecedents and performance consequences of generalized team efficacy (potency) in a sample of 56 capital projects executed by 15 Fortune 500 companies in the process industries. Empirical analysis of our field survey identified that generalized team efficacy beliefs were positively associated with an objective measure of project cost performance. Regression analysis revealed that team competence, empowering leadership, and performance feedback all predicted generalized team efficacy beliefs. Tests of mediation revealed that generalized team efficacy fully mediated between these three inputs and project cost performance.
Keywords: Team efficacy, Potency, Leadership, Feedback, Project cost.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2174
5250 Determination of Yield and Some Quality Characteristics of Winter Canola (Brassica napus ssp. oleifera L.) Cultivars
Abstract:
Canola is a specific edible type of rapeseed, developed in the 1970s, which contains about 40 percent oil. This research was carried out to determine the yield and some quality characteristics of winter canola cultivars during the 2010-2011 vegetation period in Central Anatolia, Turkey. The Oase, Dante, Californium, Excalibur, Elvis, ES Hydromel, Licord, Orkan, Vectra, Nelson, Champlain and NK Petrol winter canola varieties were used as material. The field experiment was set up in a “Randomized Complete Block Design” with three replications on 21 September 2010. Seed yield, oil content, protein content, oil yield and protein yield were examined. There were significant differences between the cultivars in seed yield, oil content, oil yield and protein yield, but not in protein content. The highest seed yield (6348 kg ha-1) was obtained from NK Petrol, while the lowest seed yield (3949 kg ha-1) was obtained from the Champlain cultivar. The highest oil content (46.73%) was observed in Oase and the lowest (41.87%) in Vectra. The highest oil yield (2950 kg ha-1) was determined for NK Petrol, while the lowest (1681 kg ha-1) was determined for Champlain. The highest protein yield (1539.3 kg ha-1) was obtained from NK Petrol and the lowest (976.5 kg ha-1) from Champlain. The main purpose of cultivating oil crops is to increase the oil yield per unit area. According to the results of this research, the NK Petrol cultivar, which ranked first in both seed yield and oil yield, is the most suitable winter canola cultivar for local conditions.
Keywords: Cultivar, Oil yield, Rapeseed, Seed Yield.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2266
5249 Selective Encryption using ISMA Cryp in Real Time Video Streaming of H.264/AVC for DVB-H Application
Authors: Jay M. Joshi, Upena D. Dalal
Abstract:
Multimedia information availability has increased dramatically with the advent of video broadcasting on handheld devices. But with this availability come problems of maintaining the security of information that is displayed in public. ISMA Encryption and Authentication (ISMACryp) is one of the chosen technologies for service protection in DVB-H (Digital Video Broadcasting - Handheld), the TV system for portable handheld devices. ISMACryp encrypts H.264/AVC (advanced video coding) content while leaving all structural data as it is. Two modes of ISMACryp are available: CTR mode (Counter) and CBC mode (Cipher Block Chaining). Both modes are based on the 128-bit AES algorithm. AES is complex and requires a long execution time, which is not suitable for real-time applications such as live TV. The proposed system aims to gain a deep understanding of video data security in multimedia technologies and to provide security for real-time video applications using selective encryption for H.264/AVC. Five levels of security are proposed in this paper, based on the content of the NAL unit in the Baseline Constrained profile of H.264/AVC. The selective encryption at the different levels encrypts the intra-prediction modes, residual data, inter-prediction modes, or motion vectors only. Experimental results show that the fifth level, which is full ISMACryp, provides the highest level of security at the cost of more encryption time, while the first level, which encrypts only motion vectors, provides a lower level of security with lower execution time, without compromising the compression or the quality of the visual content. The encryption scheme is combined with the compression process at low cost and keeps the file format unchanged, with some direct operations supported. Simulations were carried out in MATLAB.
Keywords: AES-128, CAVLC, H.264, ISMACryp
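The level structure described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the element names, the level table, and the SHA-256-based keystream (a stand-in for the AES-128 counter mode that the actual ISMACryp modes use) are all assumptions made for the sketch.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Counter-mode keystream built from SHA-256; a stand-in for AES-128 CTR.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

# Hypothetical level table: which classes of NAL-unit syntax elements are
# encrypted at each of the five security levels (level 5 ~ full ISMACryp).
LEVELS = {
    1: {"motion_vectors"},
    2: {"motion_vectors", "inter_prediction_modes"},
    3: {"motion_vectors", "inter_prediction_modes", "intra_prediction_modes"},
    4: {"motion_vectors", "inter_prediction_modes", "intra_prediction_modes",
        "residual_data"},
    5: {"motion_vectors", "inter_prediction_modes", "intra_prediction_modes",
        "residual_data", "headers"},
}

def selective_encrypt(elements: dict, level: int, key: bytes, nonce: bytes) -> dict:
    """XOR-encrypt only the element classes selected by the security level;
    everything else is left in the clear, so the file format is unchanged."""
    out = {}
    for name, payload in elements.items():
        if name in LEVELS[level]:
            ks = keystream(key, nonce + name.encode(), len(payload))
            out[name] = bytes(a ^ b for a, b in zip(payload, ks))
        else:
            out[name] = payload
    return out
```

Because XOR with the same keystream is its own inverse, applying `selective_encrypt` twice with the same key, nonce and level restores the original elements.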
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2057
5248 Carcass Characteristics and Qualities of Philippine White Mallard (Anas boschas L.) and Pekin (Anas platyrhynchos L.) Duck
Authors: Jerico M. Consolacion, Maria Cynthia R. Oliveros
Abstract:
The Philippine White Mallard duck was compared with the Pekin duck for potential meat production. A total of 50 ducklings were randomly assigned to five (5) pens per treatment after one month of brooding. Each pen containing five (5) ducks was considered a replicate. The ducks were raised until 12 weeks of age and slaughtered at the end of the growing period. Meat from both breeds was analyzed. The data were subjected to the independent-sample t-test at the 5% level of significance. Results showed that post-mortem pH (0, 20 minutes, 50 minutes, 1 hour and 20 minutes, 1 hour and 50 minutes, and 24 hours) did not differ significantly (P>0.05) between breeds. However, Pekin ducks (89.84±0.71) had a significantly higher water-holding capacity than Philippine White Mallard ducks (87.93±0.63) (P<0.05). Meat color (CIE L, a, b) revealed no significant differences in the lightness, redness, and yellowness of the breast skin of the two breeds (P>0.05), except for the yellowness of the lean muscles of the Pekin duck breast. Pekin duck meat (1.15±0.04) had a significantly higher crude fat content than Philippine White Mallard meat (0.47±0.58). The study showed that breed is a factor with pronounced effects on some of the parameters. However, these results should be considered preliminary information on the meat quality of the Philippine White Mallard duck; further studies are needed to fully understand and utilize this breed for meat production and to develop different meat products from it.
Keywords: Crude fat, meat quality, water-holding capacity.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1181
5247 Recycling of Sclareolide in the Crystallization Mother Liquid of Sclareolide by Adsorption and Chromatography
Authors: Xiang Li, Kui Chen, Bin Wu, Min Zhou
Abstract:
Sclareolide is made from sclareol by oxidative synthesis and subsequent crystallization, while the crystallization mother liquor still contains 15%~30% wt of sclareolide to be reclaimed. Since the reaction material, sclareol, is provided as a plant extract, many complex impurities exist in the mother liquor. Due to the difficulty of recycling sclareolide after solvent recovery, it is common practice for factories to discard the mother liquor, which not only results in a loss of sclareolide but also adds an extra environmental burden. In this paper, a process based on adsorption and elution is presented for recycling sclareolide from the mother liquor. After pretreatment of the crystallization mother liquor with HZ-845 resin to remove part of the impurities, sclareolide is adsorbed on HZ-816 resin. The HZ-816 resin loaded with sclareolide is then eluted with an elution solvent. Finally, the eluent containing sclareolide is concentrated and fed back into the crystallization step of the process. By adopting this recycle from the mother liquor, the total yield of sclareolide increases from 86% to 90% while a stable purity of the final sclareolide product is maintained.
Keywords: Sclareolide, resin, adsorption, chromatography.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1844
5246 Development and Optimization of Automated Dry-Wafer Separation
Authors: Tim Giesen, Christian Fischmann, Fabian Böttinger, Alexander Ehm, Alexander Verl
Abstract:
In a state-of-the-art industrial production line for photovoltaic products, the handling and automation processes are of particular importance. While being processed into a fully functional crystalline solar cell, an as-cut photovoltaic wafer is subject to numerous repeated handling steps. With respect to stricter requirements on productivity and fewer rejections due to defects, the mechanical stress on the thin wafers has to be reduced to a minimum, since fragility increases with decreasing wafer thickness. In relation to the increasing wafer fragility, research at the Fraunhofer Institutes IPA and CSP showed a negative correlation between multiple handling processes and wafer integrity. Recent work therefore focused on the analysis and optimization of the dry wafer stack separation process with compressed air. Achieving a wafer-sensitive, capable process together with a high production throughput rate is the basic motivation of this research.
Keywords: Automation, Photovoltaic Manufacturing, Thin Wafer, Material Handling
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1677
5245 Cost Valuation Method for Development Concurrent Phase Appropriate Requirement Valuation Using the Example of Load Carrier Development in the Lithium-Ion-Battery Production
Authors: Achim Kampker, Christoph Deutskens, Heiner Hans Heimes, Mathias Ordung, Felix Optehostert
Abstract:
In the past years, electric mobility has become part of the public discussion. The trend toward fully electrified vehicles instead of vehicles fueled with fossil energy has notably gained momentum. Today nearly every big car manufacturer produces and sells fully electrified vehicles, but electrified vehicles are still not as competitive as conventionally powered vehicles. As the traction battery represents the largest cost driver, lowering its price is a crucial objective. In addition to improvements in the product and production processes, a non-negligible but widely underestimated cost driver of production can be found in logistics, since neither the production technology nor the logistics systems are continuous yet. This paper presents an approach to evaluate cost factors for different designs of load carrier systems. Due to numerous interdependencies, the combination of cost factors for a particular scenario is not transparent. This negatively affects actions for cost reduction, even though cost reduction is one of the major goals of simultaneous engineering processes. Therefore a concurrent and phase-appropriate cost valuation method is necessary to provide cost transparency. In this paper the four phases of this cost valuation method are defined and explained; they are based upon a new approach that integrates the logistics development process into the integrated product and process development.
Keywords: Research and development, technology and innovation, lithium-ion battery production, load carrier development process, cost valuation method.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2287
5244 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems
Authors: Bassam Istanbouli
Abstract:
With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive and slow, and will have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation and logistics perspective, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization's global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP software package for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project. If the blueprints created are normalized, then the software developers and configurators can use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP).
Both methods start with the customer requirements, then blueprint the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing those processes will result in normalized software.
Keywords: Blueprint, ERP, SDLC, Modular.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 405
5243 Exploration of Influential Factors on First Year Architecture Students’ Productivity
Authors: Shima Nikanjam, Badiossadat Hassanpour, Adi Irfan Che Ani
Abstract:
The design process in architecture education is based upon the learning-by-doing method, which leads students to understand how to design by practicing rather than studying. First-year design studios, as the starting educational stage, provide integrated knowledge and skills of design for newly joined architecture students. Within the basic design studio environment, students are guided to transfer their abstract thoughts into concrete visual decisions under the supervision of design educators for the first time. Therefore, introductory design studios have a predominant impact on students’ operational thinking and designing. Architectural design thinking is quite different from students’ educational backgrounds and learning habits. This educational challenge at basic design studios creates a pressing need to study the reality of design education in the foundation year and to define appropriate educational methods and convenient project types, with the intention of enhancing the quality of architecture education. Material for this study has been gathered through long-term direct observation of a first-year, second-semester design studio at the Faculty of Architecture at EMU (known as FARC 102) during the fall and spring semesters of the 2014-15 academic year. Distribution of a questionnaire among the case study students, interviews with third- and fourth-studio students who passed through the same methods of education in the past two years, and interviews with instructors are the other methods used in this research. The results of this study reveal a risk of a mismatch between the implemented teaching method, project type and scale at this particular level and the students’ learning styles. Although such a risk could be expected to some extent due to the variety of student profiles, the recommendations can support educators in reaching maximum compatibility.
Keywords: Architecture education, basic design studio, educational method, forms creation skill.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1629
5242 Star-Hexagon Transformer Supported UPQC
Authors: Yash Pal, A. Swarup, Bhim Singh
Abstract:
A new topology of unified power quality conditioner (UPQC) is proposed for different power quality (PQ) improvements in a three-phase four-wire (3P-4W) distribution system. For neutral current mitigation, a star-hexagon transformer is connected in shunt near the load, along with a three-leg voltage source inverter (VSI) based UPQC. For the mitigation of the source neutral current, the use of passive elements is advantageous over active compensation due to ruggedness and lower control complexity. In addition, by connecting a star-hexagon transformer for neutral current mitigation, the overall rating of the UPQC is reduced. The performance of the proposed topology of the 3P-4W UPQC is evaluated for power-factor correction, load balancing, neutral current mitigation and mitigation of voltage and current harmonics. A simple control algorithm based on the Unit Vector Template (UVT) technique is used as the control strategy of the UPQC for mitigation of the different PQ problems. In this control scheme, the current/voltage control is applied over the fundamental supply currents/voltages instead of the fast-changing APF currents/voltages, thereby reducing the computational delay. Moreover, no extra control is required for neutral source current compensation; hence the number of current sensors is reduced. The performance of the proposed topology is analyzed through simulation results using MATLAB software with its Simulink and Power System Blockset toolboxes.
Keywords: Power-factor correction, Load balancing, UPQC, Voltage and current harmonics, Neutral current mitigation, Star-hexagon transformer.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2338
5241 A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions
Authors: Mohammad Reza Ghasemi, Ali Ehsani
Abstract:
In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), which operates directly on real variables, was employed as the optimization technique. Yet, since optimization via GAs is a long process and most of the time is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Keywords: Composite Laminates, GA, Multi-objective Optimization, Neural Networks, RBFNN.
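The surrogate-assisted loop described above can be sketched in miniature. This is an illustrative sketch only, under loudly stated assumptions: the one-dimensional objective stands in for the expensive structural analysis, and a nearest-neighbour predictor stands in for the RBFNN surrogate; none of these names or parameter values come from the paper.

```python
import random

def expensive_analysis(x):
    # Stand-in for the costly structural analysis (e.g. a Tsai-Hill check);
    # the true optimum of this toy objective is at x = 0.3.
    return (x - 0.3) ** 2

class NearestNeighbourSurrogate:
    """Simplified stand-in for the RBFNN: predict the value of the
    closest previously analyzed design."""
    def __init__(self):
        self.samples = []
    def fit(self, x, y):
        self.samples.append((x, y))
    def predict(self, x):
        return min(self.samples, key=lambda s: abs(s[0] - x))[1]

def hybrid_ga(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    surrogate = NearestNeighbourSurrogate()
    pop = [rng.random() for _ in range(pop_size)]
    for x in pop:                       # seed the surrogate with true analyses
        surrogate.fit(x, expensive_analysis(x))
    for _ in range(generations):
        # real-coded variation: blend crossover plus Gaussian mutation
        children = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            c = 0.5 * (a + b) + rng.gauss(0, 0.05)
            children.append(min(1.0, max(0.0, c)))
        # the surrogate screens children; only the best few get the true analysis
        children.sort(key=surrogate.predict)
        for c in children[:3]:
            surrogate.fit(c, expensive_analysis(c))
        merged = pop + children
        merged.sort(key=surrogate.predict)
        pop = merged[:pop_size]
    return min(pop, key=expensive_analysis)
```

The design choice mirrors the abstract: almost all fitness evaluations inside the GA loop are cheap surrogate predictions, and the expensive analysis is reserved for the most promising candidates.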
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1475
5240 Properties of the Research Teaching Organization of Military Masters
Authors: A. Bulatbayeva, A. Kusainov, P. Makhanova, A. Bashchikulov
Abstract:
The article reveals the properties of designing research-oriented teaching for military master's students and, in this context, offers a program by which military master's students master the methodology of research work. A model for organizing this process, developed and tested in the course of practical teaching activity, is also considered. As a whole, the research orientation of master's preparation leaves its mark on the content of education, the forms of organization of the educational process, and the scientific work of the master's students. In this connection, the properties of organizing research-oriented teaching and the model of how military master's students master the methodology of research work offered in this article can be taken into account when designing the content of master's preparation.
Keywords: Military master's students, Methodology of research work, Military knowledge, Research teaching.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1244
5239 A New Extended Group Mutual Exclusion Algorithm with Low Message Complexity in Distributed Systems
Authors: S. Dehghan, A.M. Rahmani
Abstract:
The group mutual exclusion (GME) problem is an interesting generalization of the mutual exclusion problem. In group mutual exclusion, multiple processes can enter a critical section simultaneously if they belong to the same group. In the extended group mutual exclusion problem, each process is a member of multiple groups at the same time. As a result, after a process selects a group and enters the critical section, other processes that share that group can also enter the critical section at the same moment, which avoids their unnecessary blocking. This paper presents a quorum-based distributed algorithm for the extended group mutual exclusion problem. The message complexity of our algorithm is O(4Q) in the best case and O(5Q) in the worst case, where Q is the quorum size.
Keywords: Group Mutual Exclusion (GME), Extended GME, Distributed systems.
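The group-entry rule can be made concrete with a toy model. This sketch only illustrates the safety property (concurrent entry is allowed exactly for the group currently in use, and in the extended variant a process belongs to several groups); it is not the paper's quorum-based algorithm, and the class and method names are invented for the illustration.

```python
class GroupMutex:
    """Toy model of (extended) group mutual exclusion: processes that share
    the currently open group may occupy the critical section together."""
    def __init__(self):
        self.open_group = None
        self.inside = set()

    def try_enter(self, pid, groups):
        # 'groups' is the set of groups the process belongs to (extended GME).
        if self.open_group is None:
            self.open_group = min(groups)   # open one of the process's groups
            self.inside.add(pid)
            return True
        if self.open_group in groups:       # same group: concurrent entry
            self.inside.add(pid)
            return True
        return False                        # different group: blocked

    def leave(self, pid):
        self.inside.discard(pid)
        if not self.inside:
            self.open_group = None          # session closes when empty
```

A real distributed algorithm replaces the shared object with message exchanges among a quorum, which is where the O(4Q)/O(5Q) message counts come from.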
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1535
5238 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented here involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find the images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Also, similar performance is achieved compared to a model trained on an exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.
Keywords: Computer vision, deep learning, object detection, semiconductor.
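The iterative selection-and-labeling procedure can be sketched as follows. The novelty scorer here is an arbitrary stand-in for the paper's U-shaped network, and the function names are invented for illustration; in the paper's setting the scorer would also be updated between rounds.

```python
def select_batch(unlabeled, score_fn, batch_size):
    """Rank unlabeled images by a novelty score and return the top batch."""
    return sorted(unlabeled, key=score_fn, reverse=True)[:batch_size]

def active_labeling_loop(images, score_fn, label_fn, batch_size, rounds):
    """Alternate algorithmic image selection with manual labeling."""
    labeled, unlabeled = [], list(images)
    for _ in range(rounds):
        if not unlabeled:
            break
        for img in select_batch(unlabeled, score_fn, batch_size):
            labeled.append((img, label_fn(img)))  # the human annotation step
            unlabeled.remove(img)
        # a real pipeline would retrain/update the novelty scorer here
    return labeled, unlabeled
```

With a score function that simply ranks integers, two rounds of batch size two would pick the four highest-scoring "images" first, leaving the rest unlabeled.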
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 839
5237 Hexavalent Chromium Pollution Abatement by use of Scrap Iron
Authors: Marius Gheju, Laura Cocheci
Abstract:
In this study, the reduction of Cr(VI) by scrap iron, a cheap and locally available industrial waste, was investigated in a continuous system. The greater scrap iron efficiency observed in the first two sections of the column filling indicates that most of the reduction process was carried out in the bottom half of the column filling. This was ascribed to a constant decrease of the Cr(VI) concentration inside the filling as the water front passes from the bottom to the top end of the column. While the bottom section of the column filling was heavily passivated with secondary mineral phases, the top section was less affected by the passivation process; therefore the column filling would likely ensure the reduction of Cr(VI) for time periods longer than 216 hours. The experimental results indicate that fixed-bed columns packed with scrap iron could be successfully used for the first step of treating Cr(VI)-polluted wastewater. However, the mass of the scrap iron filling should be carefully estimated, since it significantly affects the Cr(VI) reduction efficiency.
Keywords: Hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1847
5236 Changes in Amino Acids Content in Muscle of European Eel (Anguilla anguilla) in Relation to Body Size
Authors: L. Gómez-Limia, I. Franco, T. Blanco, S. Martínez
Abstract:
European eels (Anguilla anguilla) belong to the order Anguilliformes and the family Anguillidae. They are generally classified as warm-water fish. Eels have great commercial value in Europe and Asian countries. Eels can reach high weights, although their commercial size is relatively low in some countries. The capture of larger eels would facilitate the recovery of the species, as well as provide a greater number of glass eels or elvers for aquaculture. In recent years, the demand for and price of eels have increased significantly. However, the European eel is considered critically endangered on the International Union for Conservation of Nature (IUCN) Red List. The biochemical composition of fish is an important aspect of quality and affects the nutritional value and consumption quality of fish. In addition, knowing this composition can help predict an individual's condition for recovery purposes. Fish is known to be an important source of protein rich in essential amino acids. However, there is very little information about changes in the amino acid composition of European eels with increasing size. The aim of this study was to evaluate the effect of two different weight categories on the amino acid content in the muscle tissue of wild European eels. European eels were caught in the River Ulla (Galicia, NW Spain) during winter. The eels were slaughtered by immersion in ice water, then purchased and transferred to the laboratory. The eels were subdivided into two groups according to weight. The samples were kept frozen (-20 °C) until analysis. Frozen eels were defrosted, and the white muscle between the head and the anal opening was extracted in order to determine the amino acid composition. Thirty eels were used for each group. Liquid chromatography was used for the separation and quantification of amino acids. The results show that the eels are rich in glutamic acid, leucine, lysine, threonine, valine, isoleucine and phenylalanine.
The analysis showed that there are significant differences (p < 0.05) between eels of different sizes. Histidine, threonine, lysine, hydroxyproline, serine, glycine, arginine, alanine and proline were higher in small eels. European eel muscle presents between 45 and 46% essential amino acids in the total amino acids. European eels are a well-balanced, high-quality protein source with respect to the ratio of essential to non-essential (E/NE) amino acids. However, eels of higher weight showed a better ratio of essential to non-essential amino acids.
Keywords: European eels, amino acids, HPLC, body size.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 851
5235 A Review on Stormwater Harvesting and Reuse
Authors: Fatema Akram, Mohammad G. Rasul, M. Masud K. Khan, M. Sharif I. I. Amir
Abstract:
Australia is a country of some 7.7 million square kilometers with a population of about 22.6 million. At present, water security is a major challenge for Australia. In some areas the use of water resources is approaching, and in some parts exceeding, the limits of sustainability. A focal point of proposed national water conservation programs is the recycling of both urban stormwater and treated wastewater, but so far this is not widely practiced in Australia, and stormwater in particular is neglected. In Australia, only 4% of stormwater and rainwater is recycled, whereas less than 1% of reclaimed wastewater is reused within urban areas. Therefore, accurate monitoring, assessment and prediction of the availability, quality and use of this precious resource are required for better management. As stormwater is usually of better quality than untreated sewage or industrial discharge, it has better public acceptance for recycling and reuse, particularly for non-potable uses such as irrigation and the watering of lawns and gardens. Existing stormwater recycling practice lags far behind research, and no robust technologies have been developed for this purpose. Therefore, there is a clear need for modern technologies for assessing the feasibility of stormwater harvesting and reuse. Numerical modeling has, in recent times, become a popular tool for doing this job. It includes the complex hydrological and hydraulic processes of the study area. The hydrologic model computes the stormwater quantity needed to design the system components, and the hydraulic model helps to route the flow through the stormwater infrastructure. Nowadays a water quality module is incorporated into these models. Integration of a Geographic Information System (GIS) with these models provides the extra advantage of managing spatial information.
For the overall management of a stormwater harvesting project, however, a Decision Support System (DSS) plays an important role, incorporating a database with the model and GIS for the proper management of temporal information. Additionally, a DSS includes evaluation tools and a graphical user interface. This research aims to critically review and discuss all aspects of stormwater harvesting and reuse, such as the available guidelines for stormwater harvesting and reuse, public acceptance of water reuse, and the scope of and recommendations for future studies. In addition, this paper identifies, explains and addresses the importance of modern technologies capable of proper management of stormwater harvesting and reuse.
Keywords: Stormwater Management, Stormwater Harvesting and Reuse, Numerical Modeling, Geographic Information System (GIS), Decision Support System (DSS), Database.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3068
5234 A Wavelet-Based Watermarking Method Exploiting the Contrast Sensitivity Function
Authors: John N. Ellinas, Panagiotis Kenterlis
Abstract:
The efficiency of an image watermarking technique depends on the preservation of visually significant information. This is attained by embedding the watermark transparently with the maximum possible strength. The current paper presents an approach for still image digital watermarking in which the watermark embedding process employs the wavelet transform and incorporates Human Visual System (HVS) characteristics. The sensitivity of a human observer to contrast with respect to spatial frequency is described by the Contrast Sensitivity Function (CSF). The strength of the watermark within the decomposition subbands, which occupy an interval on the spatial frequencies, is adjusted according to this sensitivity. Moreover, the watermark embedding process is carried out over the subband coefficients that lie on edges, where distortions are less noticeable. The experimental evaluation of the proposed method shows very good results in terms of robustness and transparency.
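The CSF-weighted embedding idea can be sketched in a few lines. The following is a minimal illustration, not the authors' scheme: it uses a single-level Haar transform implemented by hand, an assumed constant CSF weight for the high-frequency subband, and a simple correlation detector; the edge-selective embedding step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar2d(x):
    """One-level 2D Haar transform (even image dimensions assumed)."""
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row averages
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2, :], x[1::2, :] = a + d, a - d
    return x

# Assumed CSF-style weight: the high-frequency HH band, where contrast
# sensitivity is low, tolerates a stronger mark.
CSF_WEIGHT_HH = 2.0
BASE_STRENGTH = 0.5

image = rng.uniform(0, 255, size=(64, 64))
watermark = rng.choice([-1.0, 1.0], size=(32, 32))  # spread-spectrum mark

ll, lh, hl, hh = haar2d(image)
marked = ihaar2d(ll, lh, hl, hh + BASE_STRENGTH * CSF_WEIGHT_HH * watermark)

# Correlation detector on the received image's HH band; the unmarked
# baseline is computed only to show the embedded strength explicitly.
_, _, _, hh_rx = haar2d(marked)
response = float(np.mean(hh_rx * watermark))
baseline = float(np.mean(hh * watermark))
print(f"detector response: {response:.3f} (unmarked baseline {baseline:.3f})")
```

A blind detector would simply threshold `response`; a real CSF-driven scheme would derive a distinct weight per subband and decomposition level from the CSF curve.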
Keywords: Image watermarking, wavelet transform, human visual system, contrast sensitivity function.
5233 Informative, Inclusive and Transparent Planning Methods for Sustainable Heritage Management
Authors: Mathilde Kirkegaard
Abstract:
The paper will focus on the management of heritage that integrates the local community, and will argue for an obligation to integrate this social aspect in heritage management. By broadening the understanding of heritage, sustainable heritage management takes its departure in more than the continual conservation of the physicality of heritage. The social aspect, or the local community, is overlooked in many governed heritage management situations and is not managed through community-based urban planning methods, e.g. citizen inclusion, a transparent process, informative and inviting initiatives, etc. Historical sites are often described with embracing terms such as “ours” and “us”: “our history” and “a history that is part of us”. Heritage is not something static; it is a link between the life that has been lived within the historical frames and the life that defines it today. This view of heritage is rooted in the effort to ensure that heritage sites, besides securing the national historical interest, have a value for the people affected by them: those living in them or visiting them. Antigua Guatemala is a UNESCO-defined heritage site that is ‘threatened’ by tourism, habitation and recreation. In other words, ‘the use’ of the site is considered a threat to the preservation of the heritage. Contradictorily, the same types of use (tourism and habitation) can also be considered a development opportunity, and perhaps even a sustainable management solution. ‘The use’ of heritage is interlinked with the perspective that heritage sites ought to have a value for people today; in other words, the heritage sites should have contemporary substance. Heritage is entwined in its context of physical structures and the social layer. A synergy between the use of heritage and the knowledge about the heritage can generate a sustainable preservation solution.
The paper will exemplify this symbiosis with different examples of heritage management centred on local community inclusion. The inclusive method is not new in architectural planning, and it refers to a balance between top-down and bottom-up decision making. It can be pursued through designs of an inclusive nature. Catalyst architecture is a planning method that strives to move the process of design solutions into the public space. Through process-orientated designs, or catalyst designs, the community can gain insight into the process or be invited to participate in it. A balance between bottom-up and top-down in the development process of a heritage site can, in relation to management measures, be understood to generate a socially sustainable solution. The ownership and engagement that can be created among the local community, along with a use that can ultimately yield an economic benefit, can allow maintenance and preservation to be delegated. Informative, inclusive and transparent planning methods can generate a heritage management that is long-term due to the collective understanding and effort. This method handles sustainable management on two levels: the current preservation necessities and the long-term management, while ensuring a value for people today.
Keywords: Community, intangible, inclusion, planning, heritage.
5232 Visual Construction of Youth in Czechoslovak Press Photographs: 1959-1989
Authors: Jana Teplá
Abstract:
This text focuses on the visual construction of youth in press photographs in socialist Czechoslovakia. It deals with photographs in a magazine for young readers, Mladý svět, published by the Socialist Union of Youth of Czechoslovakia. The aim of this study was to develop a methodological tool for uncovering the values and the ideological messages in the strategies used in the visual construction of reality in the socialist press. Two methods of visual analysis were applied to the photographs, a quantitative content analysis and a social semiotic analysis. The social semiotic analysis focused on images representing youth in their free time. The study shows that the meaning of a socialist press photograph is a result of a struggle for ideological power between formal and informal ideologies. This struggle takes place within the process of production of the photograph and also within the process of interpretation of the photograph.
Keywords: Ideology, press photography, socialist regime, social semiotics, youth.
5231 A Comparison of Single Decision Tree, Decision Tree Forest and Group Method of Data Handling to Evaluate the Surface Roughness in Machining Process
Authors: S. Ghorbani, N. I. Polushin
Abstract:
The machinability of three workpiece materials (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron) in turning operations was investigated using different types of cutting tool (conventional, a cutting tool with holes in the toolholder, and a cutting tool filled with composite material) under dry conditions on a turning machine at different levels of spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev), depth of cut (0.05-0.15 mm) and tool overhang (41-65 mm). Experimentation was performed as per Taguchi’s orthogonal array. To evaluate the relative importance of the factors affecting surface roughness, the single decision tree (SDT), decision tree forest (DTF) and group method of data handling (GMDH) were applied.
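A single decision tree of the kind compared above can be sketched as a greedy, SSE-minimising regression tree. The data below are synthetic values drawn from the stated parameter ranges with an assumed toy roughness model, not the paper's measurements.

```python
import numpy as np

def grow_tree(X, y, depth=0, max_depth=3, min_leaf=3):
    """Minimal CART-style regression tree: greedy SSE-minimising splits."""
    if depth == max_depth or len(y) < 2 * min_leaf:
        return float(y.mean())
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= t
            if left.sum() < min_leaf or (~left).sum() < min_leaf:
                continue
            sse = (((y[left] - y[left].mean()) ** 2).sum()
                   + ((y[~left] - y[~left].mean()) ** 2).sum())
            if best is None or sse < best[0]:
                best = (sse, j, t, left)
    if best is None:
        return float(y.mean())
    _, j, t, left = best
    return (j, t,
            grow_tree(X[left], y[left], depth + 1, max_depth, min_leaf),
            grow_tree(X[~left], y[~left], depth + 1, max_depth, min_leaf))

def predict(node, x):
    """Walk the tree until a leaf (a float) is reached."""
    while isinstance(node, tuple):
        j, t, lo, hi = node
        node = lo if x[j] <= t else hi
    return node

# Synthetic turning data: speed (rpm), feed (mm/rev), depth of cut (mm),
# overhang (mm), with an assumed toy roughness response.
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(630, 1000, 200),
                     rng.uniform(0.05, 0.075, 200),
                     rng.uniform(0.05, 0.15, 200),
                     rng.uniform(41, 65, 200)])
y = (40 * X[:, 1] + 5 * X[:, 2] + 0.02 * X[:, 3]
     - 0.001 * X[:, 0] + rng.normal(0, 0.05, 200))

tree = grow_tree(X, y, max_depth=4)
preds = np.array([predict(tree, x) for x in X])
print("training RMSE:", float(np.sqrt(np.mean((preds - y) ** 2))))
```

A decision tree forest averages many such trees grown on resampled data; GMDH instead builds a layered polynomial network, which this sketch does not cover.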
Keywords: Decision tree forest, GMDH, surface roughness, Taguchi method, turning process.
5230 Biological Soil Conservation Planning by Spatial Multi-Criteria Evaluation Techniques (Case Study: Bonkuh Watershed in Iran)
Authors: Ali Akbar Jamali
Abstract:
This paper discusses the site selection process for biological soil conservation planning. It was supported by a value-focused approach and spatial multi-criteria evaluation techniques. A first set of spatial criteria was used to design a number of potential sites. Next, a new set of spatial and non-spatial criteria was employed, including the natural factors and the financial costs, together with the degree of suitability of the Bonkuh watershed for biological soil conservation planning, to recommend the most acceptable program. The whole process was facilitated by a new software tool that supports spatial multi-criteria evaluation (SMCE) in the GIS software ILWIS. The application of this tool, combined with continual feedback from the public, has provided an effective methodology for solving a complex decision problem in biological soil conservation planning.
Keywords: GIS, biological soil conservation planning, spatial multi-criteria evaluation, Iran.
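The core of an SMCE overlay can be sketched as a weighted sum of standardized criterion rasters. The rasters and weights below are hypothetical stand-ins, not values from the Bonkuh study; ILWIS performs this with considerably richer tooling.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical criterion rasters over a 20x20 grid of a watershed.
rainfall = rng.uniform(200, 600, (20, 20))  # mm/yr (benefit criterion)
slope = rng.uniform(0, 40, (20, 20))        # percent (cost criterion)
cost = rng.uniform(1, 10, (20, 20))         # relative financial cost (cost criterion)

def standardize(raster, benefit=True):
    """Rescale a criterion map to 0..1; invert cost criteria."""
    s = (raster - raster.min()) / (raster.max() - raster.min())
    return s if benefit else 1.0 - s

# Assumed criterion weights (in practice elicited from experts/the public).
weights = {"rainfall": 0.5, "slope": 0.3, "cost": 0.2}
suitability = (weights["rainfall"] * standardize(rainfall)
               + weights["slope"] * standardize(slope, benefit=False)
               + weights["cost"] * standardize(cost, benefit=False))

best_cell = np.unravel_index(np.argmax(suitability), suitability.shape)
print("most suitable cell:", best_cell,
      "score:", round(float(suitability.max()), 3))
```

Because the weights sum to one and each standardized layer lies in [0, 1], the composite suitability index is itself bounded in [0, 1], which makes sites directly comparable.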
5229 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.
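The signal-handling idea behind such a tool can be sketched in Python (the iFDAQ itself is not Python; this is an analogy, and the signal choice and report format are assumptions). On a POSIX system, a registered handler intercepts a signal and records a report instead of letting the process die silently:

```python
import io
import os
import signal
import traceback
from datetime import datetime

REPORTS = []

def debug_report_handler(signum, frame):
    """On a fault-style signal, capture a report for later analysis."""
    buf = io.StringIO()
    buf.write(f"--- debug report ({datetime.now().isoformat()}) ---\n")
    buf.write(f"signal: {signal.Signals(signum).name} in pid {os.getpid()}\n")
    traceback.print_stack(frame, file=buf)  # where the process was interrupted
    REPORTS.append(buf.getvalue())

# Register for a user signal; a real deployment would also cover SIGSEGV
# and friends via lower-level facilities.
signal.signal(signal.SIGUSR1, debug_report_handler)

# Simulate a running process receiving the signal.
os.kill(os.getpid(), signal.SIGUSR1)
print(REPORTS[-1].splitlines()[1])
```

The key property mirrored here is that the handler is passive until a signal arrives, so instrumenting a process this way adds no steady-state overhead.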
5228 An Overview of Project Management Application in Computational Fluid Dynamics
Authors: Sajith Sajeev
Abstract:
The application of Computational Fluid Dynamics (CFD) is widespread in engineering and industry, including aerospace, automotive, and energy. CFD simulations necessitate the use of intricate mathematical models and substantial computational power to accurately describe the behavior of fluids. The implementation of CFD projects can be difficult, and a well-structured approach to project management is required to assure the timely and cost-effective delivery of high-quality results. This paper's objective is to provide an overview of project management in CFD, including its problems, methodologies, and best practices. The study opens with a discussion of the difficulties connected with CFD project management, such as the complexity of the mathematical models, the need for extensive computational resources, and the difficulties associated with validating and verifying the results. The study then examines the project management methodologies typically employed in CFD, such as the Traditional/Waterfall model, Agile and Scrum. Comparisons are made between the advantages and disadvantages of each technique, and suggestions are made for their effective implementation in CFD projects. The study concludes with a discussion of best practices for project management in CFD, including the use of a well-defined project scope, a clear project plan, and effective teamwork. It also highlights the significance of continuous process improvement and the use of metrics to monitor progress and discover improvement opportunities. This paper provides a complete overview of project management in CFD and is a resource for project managers, researchers, and practitioners in the field who wish to implement efficient project management methods; it can aid in enhancing project outcomes, reducing risks, and improving the productivity of CFD projects.
Keywords: Project management, Computational Fluid Dynamics, Traditional/Waterfall methodology, Agile methodology, Scrum methodology.
5227 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework
Authors: J. Grira, Y. Bédard, S. Roche
Abstract:
The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during the GDD, some foreseeable risks may be overlooked and not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and ultimately to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts to discover implicit requirements and risks.
Keywords: Collaborative risk analysis, intention of use, geospatial database design, geospatial data misuse.
5226 Mathematical Models for Overall Gas Transfer Coefficient Using Different Theories and Evaluating Their Measurement Accuracy
Authors: Shashank.B. Thakre, Lalit.B. Bhuyar, Samir.J. Deshmukh
Abstract:
Oxygen transfer, the process by which oxygen is transferred from the gaseous to the liquid phase, is a vital part of the wastewater treatment process. Because of the low solubility of oxygen and the consequent low rate of oxygen transfer, sufficient oxygen to meet the requirement of aerobic waste does not enter through the normal surface air-water interface. Many theories have been proposed to explain the mechanism of gas transfer and the absorption of non-reacting gases in a liquid, of which the two-film theory is important. An existing mathematical model determines an approximate value of the overall gas transfer coefficient. The overall gas transfer coefficient in the case of the penetration theory is 1.13 times greater than that obtained from the two-film theory; the difference is due to the differing assumptions of the two theories. The paper aims at the development of a mathematical model which determines the value of the overall gas transfer coefficient with greater accuracy than the existing model.
Keywords: Theories, dissolved oxygen, mathematical model, gas transfer coefficient, accuracy.
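The 1.13 factor quoted above has a simple algebraic origin: if the two theories are matched by taking the film thickness as sqrt(D * t_c), the penetration-theory (Higbie) coefficient exceeds the two-film value by exactly 2/sqrt(pi) ≈ 1.13, independently of D and t_c. A short check, with illustrative assumed values for diffusivity and contact time:

```python
import math

D = 2.1e-9  # m^2/s, typical O2 diffusivity in water (assumed value)
t_c = 0.4   # s, assumed surface contact time

# Penetration theory (Higbie): k_L = 2 * sqrt(D / (pi * t_c))
k_penetration = 2.0 * math.sqrt(D / (math.pi * t_c))

# Two-film theory: k_L = D / delta, with the matched film thickness
# delta = sqrt(D * t_c) so that the comparison is on equal footing.
delta = math.sqrt(D * t_c)
k_film = D / delta

ratio = k_penetration / k_film  # = 2 / sqrt(pi), regardless of D and t_c
print(f"k_pen / k_film = {ratio:.3f}")  # → 1.128
```

This also shows why the two theories diverge for different gases: the film model predicts k_L proportional to D, while the penetration model predicts k_L proportional to sqrt(D).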
5225 The Effects of Shot and Grit Blasting Process Parameters on Steel Pipes Coating Adhesion
Authors: Saeed Khorasanizadeh
Abstract:
The adhesion strength of the exterior or interior coating of steel pipes is very important. Increasing coating adhesion can extend coating life, raise the safety factor of a transmission pipeline, and reduce corrosion rates and costs. Steel pipe surfaces are prepared before coating by shot and grit blasting, a mechanical method. Parameters affecting this process include abrasive particle size, distance to the surface, abrasive flow rate, the abrasive's physical properties and shape, abrasive selection, machine type and power, the standard of surface cleanliness, roughness, blasting time and humidity. This research intended to find conditions that improve surface preparation, adhesion strength and the corrosion resistance of the coating. This paper therefore studies the effect of varying the abrasive flow rate, abrasive particle size and blasting time on steel surface roughness, including over-blasting, using a centrifugal blasting machine. A number of steel samples (according to API 5L X52) were prepared and coated with epoxy powder, and the coating adhesion strengths were compared by pull-off testing. The results show that increasing the abrasive particle size and flow rate increases steel surface roughness and coating adhesion strength, whereas increasing the blasting time causes over-blasting, raising surface temperature and hardness and ultimately decreasing surface roughness and coating adhesion strength.
Keywords: Surface preparation, abrasive particles, adhesion strength.
5224 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the easy creation and classification of institutional risk profiles supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the identification of the most important risk factors. Subsequently, risk profiles employ a risk factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors along a required dimension of analysis. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of the digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision making, the aggregated information about the risk factors is presented as a multidimensional vector; the goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
Keywords: Linked open data, information integration, digital libraries, data mining.
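The naive Bayes recommendation step can be sketched with a minimal Gaussian classifier over risk-factor vectors; the factor dimensions, group labels and numbers below are hypothetical, not taken from the survey described in the paper.

```python
import numpy as np

class TinyGaussianNB:
    """Minimal Gaussian naive Bayes over multidimensional risk-factor vectors."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.log_prior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # Per-class Gaussian log-likelihood, summed over independent factors.
        ll = -0.5 * (np.log(2 * np.pi * self.var)
                     + (X[:, None, :] - self.mu) ** 2 / self.var).sum(axis=2)
        return self.classes[np.argmax(ll + self.log_prior, axis=1)]

# Hypothetical profiles: [format obsolescence, software support, migration cost]
rng = np.random.default_rng(3)
low = rng.normal([0.2, 0.9, 0.1], 0.05, (30, 3))   # "low endangerment" group 0
high = rng.normal([0.8, 0.2, 0.7], 0.05, (30, 3))  # "high endangerment" group 1
X = np.vstack([low, high])
y = np.array([0] * 30 + [1] * 30)

clf = TinyGaussianNB().fit(X, y)
new_profile = np.array([[0.75, 0.25, 0.65]])  # an institution's aggregated vector
print("recommended group:", clf.predict(new_profile)[0])
```

The "naive" independence assumption is what makes the method practical for semi-automatic use: each risk factor contributes a separate, inspectable term to the per-group score.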
5223 A New Analytical Approach to Reconstruct Residual Stresses Due to Turning Process
Authors: G.H. Farrahi, S.A. Faghidian, D.J. Smith
Abstract:
Due to turning operations, a thin layer with high tensile residual stresses can be found on the component surface, which can dangerously affect the fatigue performance of the component. In this paper an analytical approach is presented to reconstruct the residual stress field from a limited, incomplete set of measurements. The Airy stress function is used as the primary unknown to directly solve the equilibrium equations and satisfy the boundary conditions. In this new method there exists the flexibility to impose the physical conditions that govern the behavior of residual stress, in order to achieve a meaningful, complete stress field. The analysis is also coupled to a least squares approximation and a regularization method to provide stability of the inverse problem. The power of this new method is demonstrated by analyzing experimental measurements, achieving good agreement between the model prediction and the results obtained from residual stress measurement.
Keywords: Residual stress, limited measurements, inverse problems, turning process.
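The least-squares-plus-regularization step can be sketched with a Tikhonov fit of a polynomial stress profile to a few noisy measurements; the profile, depths and noise level below are assumed for illustration and stand in for the paper's Airy-function formulation.

```python
import numpy as np

def tikhonov_fit(A, b, lam):
    """Regularized least squares: argmin ||A c - b||^2 + lam * ||c||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Assumed 'true' near-surface residual stress profile (MPa) vs depth (mm);
# illustrative only, not the paper's measured field.
rng = np.random.default_rng(4)
depth = np.linspace(0.0, 0.5, 12)                 # limited measurement points
true_coeffs = np.array([350.0, -1800.0, 1900.0])  # tensile at surface, decaying
A = np.vander(depth, 3, increasing=True)          # basis: [1, z, z^2]
b = A @ true_coeffs + rng.normal(0, 10.0, depth.size)  # noisy measurements

coeffs = tikhonov_fit(A, b, lam=1e-6)
surface_stress = coeffs[0]
print(f"reconstructed surface stress ≈ {surface_stress:.0f} MPa")
```

The regularization term is what stabilises the inverse problem: with few, noisy measurements the unregularized normal equations can amplify noise, while the `lam` penalty trades a small bias for a bounded, physically plausible solution.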