Search results for: internationalization process
13319 A Goal-Driven Crime Scripting Framework
Authors: Hashem Dehghanniri
Abstract:
Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or overly complicated picture of the crime commission process, which defeats the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts; the approach was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., the identification of SCP measures or policy-making.
Keywords: attack modelling, crime commission process, crime script, situational crime prevention
Procedia PDF Downloads 128
13318 Neighborhood Sustainability Assessment Tools: A Conceptual Framework for Their Use in Building Adaptive Capacity to Climate Change
Authors: Sally Naji, Julie Gwilliam
Abstract:
Climate change remains a challenging matter for humans and the built environment in the 21st century, and the need to consider adaptation to climate change in the development process is paramount. However, there remains a lack of information regarding how we should prepare responses to this issue, such as through developing organized and sophisticated tools enabling the adaptation process. This study aims to build a systematic framework with which to investigate the potential that Neighborhood Sustainability Assessment (NSA) tools might offer in enabling both the analysis and the promotion of adaptive capacity to climate change. The framework presented in this paper addresses this issue in three main phases. The first part attempts to link sustainability and climate change in the context of adaptive capacity. It is argued that in deciding to promote sustainability in the context of climate change, both the resilience and vulnerability processes become central. However, there is still a gap in the current literature regarding how the sustainable development process can respond to climate change, as well as how the resilience of practical strategies might be evaluated. It is suggested that integrating the sustainability assessment process with both resilience thinking and vulnerability analysis might provide important components for addressing adaptive capacity to climate change. A critical review of the existing literature is presented, illustrating the current lack of work integrating these three concepts in the context of adaptive capacity to climate change. The second part aims to identify the most appropriate scale at which to address the built environment for climate change adaptation. It is suggested that the neighborhood scale can be considered more suitable than either the building or urban scales.
It then presents the example of NSAs and discusses the need to explore their potential role in promoting adaptive capacity to climate change. The third part of the framework presents a comparison among three example NSAs: BREEAM Communities, LEED-ND, and CASBEE-UD. These three tools were selected as the most developed and comprehensive assessment tools currently available at the neighborhood scale. This study concludes that NSAs are likely to provide the basis for an organized framework to address the practical process of analyzing, and in turn promoting, adaptive capacity to climate change. It is further argued that vulnerability (exposure and sensitivity) and resilience (interdependence and recovery) form essential aspects to be addressed in future assessment of NSAs' capability to adapt to both short- and long-term climate change impacts. Finally, it is acknowledged that further work is now required to understand impact assessment in terms of the range of physical sectors (water, energy, transportation, building, land use, and ecosystems) and actor and stakeholder engagement, as well as a detailed evaluation of the NSA indicators, together with a barriers diagnosis process.
Keywords: adaptive capacity, climate change, NSA tools, resilience, sustainability
Procedia PDF Downloads 381
13317 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills
Authors: Kyle De Freitas, Margaret Bernard
Abstract:
Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have few technical data mining skills and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either the use of predefined questions or a guided data mining process, and shows how the questions developed and analyses conducted can be reused and extended over time.
Keywords: educational data mining, learning management system, learning analytics, EDM framework
Procedia PDF Downloads 327
13316 Valorization of Natural Vegetable Substances from Tunisia: Purification of Two Food Additives, Anthocyanins and Locust Bean Gum
Authors: N. Bouzouita, A. Snoussi , H. Ben Haj Koubaier, I. Essaidi, M. M. Chaabouni, S. Zgoulli, P. Thonart
Abstract:
Color is one of the most important quality attributes for the food industry. Grape marc, a complex lignocellulosic material, is one of the most abundant and low-value byproducts generated after the pressing process. The aim of this work is the development of a process for the purification, by microfiltration, ultrafiltration, and nanofiltration, and spray drying of anthocyanins of Tunisian origin. Locust bean gum is the ground endosperm of the seeds of the carob fruit; owing to its remarkable water-binding properties, it is widely used to improve the texture of food and is largely employed in the food industry. The purification of LBG drastically reduces its ash and protein contents while substantially increasing its galactomannan content.
Keywords: carob, food additives, grape pomace, locust bean gum, natural colorant, nanofiltration, thickener, ultrafiltration
Procedia PDF Downloads 336
13315 The Development and Validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers
Authors: Ian Phil Canlas, Mageswary Karpudewan, Joyce Magtolis, Rosario Canlas
Abstract:
This study reports the development and validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers (ADRRQT). The questionnaire is a combination of Likert-scale and open-ended questions grouped into two parts. The first part includes questions relating to general awareness of disaster risk reduction, whereas the second part comprises questions regarding the integration of disaster risk reduction into the teaching process. The entire process of developing and validating the ADRRQT is described in this study. Statistical and qualitative findings revealed that the ADRRQT is significantly valid and reliable and has the potential to measure awareness of disaster risk reduction among stakeholders in the field of teaching. Moreover, it also shows the potential to be adopted in other fields.
Keywords: awareness, development, disaster risk reduction, questionnaire, validation
Procedia PDF Downloads 229
13314 Using Single Decision Tree to Assess the Impact of Cutting Conditions on Vibration
Authors: S. Ghorbani, N. I. Polushin
Abstract:
Vibration during the machining process is crucial since it affects the cutting tool, machine, and workpiece, leading to tool wear, tool breakage, and unacceptable surface roughness. This paper applies a nonparametric statistical method, the single decision tree (SDT), to identify factors affecting vibration in the machining process. Workpiece material (AISI 1045 steel, AA2024 aluminum alloy, A48-class 30 gray cast iron), cutting tool (conventional, cutting tool with holes in the toolholder, cutting tool filled with epoxy-granite), tool overhang (41-65 mm), spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev), and depth of cut (0.05-0.15 mm) were used as input variables, while vibration was the output parameter. It is concluded that workpiece material is the most important parameter for natural frequency, followed by cutting tool and overhang.
Keywords: cutting condition, vibration, natural frequency, decision tree, CART algorithm
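A CART-style regression tree ranks input factors by how much a single split reduces variance in the output, which is how a single decision tree identifies the dominant factor. The sketch below illustrates that criterion with invented toy data (the factor names echo the abstract, but the numbers are not the paper's measurements):

```python
# Illustrative only: rank two machining factors by the variance reduction
# achievable with one CART-style split. Toy data, not the study's.

def variance(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def best_split_gain(xs, ys):
    """Largest variance reduction achievable by one split on this feature."""
    base = variance(ys)
    best = 0.0
    for threshold in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= threshold]
        right = [y for x, y in zip(xs, ys) if x > threshold]
        weighted = (len(left) * variance(left)
                    + len(right) * variance(right)) / len(ys)
        best = max(best, base - weighted)
    return best

# Toy observations: (tool overhang in mm, spindle speed in rpm, vibration).
rows = [(41, 630, 0.2), (65, 630, 0.9), (41, 1000, 0.3), (65, 1000, 1.0)]
features = {"overhang": [r[0] for r in rows], "speed": [r[1] for r in rows]}
target = [r[2] for r in rows]
ranking = sorted(features, key=lambda f: best_split_gain(features[f], target),
                 reverse=True)
print(ranking)  # ['overhang', 'speed'] for this toy data
```

A full SDT repeats this greedy split selection recursively; the first chosen feature is the "most important parameter" in the sense the abstract uses.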
Procedia PDF Downloads 338
13313 Application of the Quality Function Deployment (QFD) Tool in the Design of Aero Pumps Based on Systems Engineering
Authors: Z. Soleymani, M. Amirzadeh
Abstract:
Quality Function Deployment (QFD) was developed in the 1960s in Japan and introduced in 1983 in America and Europe. This paper presents a real application of this technique, considering how QFD can be applied to the design and production of aero fuel pumps. When designing a product under a systems engineering process, the first step is the identification of customer needs and their translation into engineering parameters. Since each design change after the production process leads to extra labor costs and increased product quality risk, QFD can benefit sales by meeting customer expectations. Once the needs are well identified, the use of the QFD tool can lead to better communication and less deviation in the design and production phases; ultimately, it leads to products with well-defined technical attributes.
Keywords: customer voice, engineering parameters, gear pump, QFD
Procedia PDF Downloads 251
13312 Interpretation of Heritage Revitalization
Authors: Jarot Mahendra
Abstract:
The primary objective of this paper is to provide a view on the interpretation of the revitalization of heritage buildings. This objective is achieved by analyzing the concept of interpretation from the perspectives of law, urban spatial planning, and stakeholders, and then developing a theoretical framework of interpretation in cultural resources management around the issues of identity, heritage as a process, and authenticity in heritage. In the revitalization of heritage buildings, interpretation of these three issues can be used as a communication process to express the meaning and relation of heritage to the community, so as to avoid the conflicts that arise and develop as a result of the differing perspectives of stakeholders. Using a case study in Indonesia, this study focuses on the revitalization of the heritage site of the National Gallery of Indonesia (GNI). GNI is a cultural institution that, in carrying out its function as the center of Indonesian art development and as an art museum, uses several historical buildings, some designated as heritage and some not yet designated according to the regulations applicable in Indonesia. The revitalization of the heritage buildings was undertaken as a step to meet space needs for GNI's current functions. In the revitalization master plan, there are physical interventions on heritage buildings and the removal of some historic buildings, with new buildings to be constructed in their place. A research matrix was used to map out the main elements of the study (the concept of GNI revitalization, heritage as identity, heritage as a process, and authenticity in heritage). Expert interviews and document studies were the main tools used in collecting data. Qualitative data were then analyzed through content analysis and template analysis.
This study identifies the significance of the historic buildings (both heritage buildings and buildings not defined as heritage) as carrying important historical, architectural, educational, and cultural value. This significance becomes the basis for revisiting the revitalization master plan, which is then reviewed against applicable regulations and the spatial layout of Jakarta. The interpretation that is built is as follows: (1) GNI is one element of the embodiment of the National Cultural Center in the context of the region, where the National Monument, National Museum, and National Library are in the same area, so the heritage gives identity not only to the culture of the past but also to that of the current community; (2) heritage should be seen as a dynamic cultural process moving toward cultural change in the community, in which heritage must develop along with urban development, so that heritage buildings can remain alive side by side with modern buildings while still observing the principles of heritage preservation; (3) the authenticity of heritage should balance the cultural heritage conservation approach with urban development, where authenticity can serve as a 'value transmitter', so that it can be used to evaluate, preserve, and manage heritage buildings by considering both tangible and intangible aspects.
Keywords: authenticity, culture process, identity, interpretation, revitalization
Procedia PDF Downloads 150
13311 Monitoring the Drying and Grinding Process during Production of Celitement through a NIR-Spectroscopy Based Approach
Authors: Carolin Lutz, Jörg Matthes, Patrick Waibel, Ulrich Precht, Krassimir Garbev, Günter Beuchle, Uwe Schweike, Peter Stemmermann, Hubert B. Keller
Abstract:
Online measurement of product quality is a challenging task in cement production, especially in the production of Celitement, a novel environmentally friendly hydraulic binder. The mineralogy and chemical composition of clinker in ordinary Portland cement production are measured by X-ray diffraction (XRD) and X-ray fluorescence (XRF), where only crystalline constituents can be detected. However, only a small part of the Celitement components can be measured via XRD, because most constituents have an amorphous structure. This paper describes the development of algorithms suitable for online monitoring of the final processing step of Celitement based on NIR data. For calibration, intermediate products were dried at different temperatures and ground for variable durations. The products were analyzed using XRD and thermogravimetric analysis together with NIR spectroscopy to investigate the dependency between the drying and milling processes on the one hand and the NIR signal on the other. As a result, different characteristic parameters have been defined. A short overview of the Celitement process and the challenging tasks of online measurement and evaluation of product quality will be presented. Subsequently, methods for the systematic development of near-infrared calibration models and the determination of the final calibration model will be introduced. The application of the model to experimental data illustrates that NIR spectroscopy allows a quick and sufficiently exact determination of crucial process parameters.
Keywords: calibration model, Celitement, cementitious material, NIR spectroscopy
Procedia PDF Downloads 502
13310 Innovation in Information Technology Services: Framework to Improve the Effectiveness and Efficiency of Information Technology Service Management Processes, Projects and Decision Support Management
Authors: Pablo Cardozo Herrera
Abstract:
In a dynamic market for Information Technology (IT) services, with high quality demands and high performance requirements at decreasing costs, it is imperative that IT companies invest organizational effort in increasing the effectiveness of their Information Technology Service Management (ITSM) processes, both by improving ITSM project management and by providing solid support to the strategic decision-making of IT directors. In this article, the author presents an analysis of issues common to IT companies around the world, whose unmet strategic information needs mean that their ITSM process and project management do not achieve the expected effectiveness and efficiency. In response to the issues raised, the author proposes a framework consisting of an innovative theoretical model of ITSM management and a technological solution aligned with the Information Technology Infrastructure Library (ITIL) good-practice guidance and ISO/IEC 20000-1 requirements. The article describes research showing that the proposed framework is able to integrate, manage, and coordinate, in a holistic, measurable, and auditable way, all ITSM processes and projects of an IT organization, and to use the resulting effectiveness assessment in strategic decision-making, increasing the process maturity level and improving the organization's capacity for efficient management.
Keywords: innovation in IT services, ITSM processes, ITIL and ISO/IEC 20000-1, IT service management, IT service excellence
Procedia PDF Downloads 398
13309 Modeling of Age Hardening Process Using Adaptive Neuro-Fuzzy Inference System: Results from Aluminum Alloy A356/Cow Horn Particulate Composite
Authors: Chidozie C. Nwobi-Okoye, Basil Q. Ochieze, Stanley Okiy
Abstract:
This research reports on the modeling of the age hardening process using an adaptive neuro-fuzzy inference system (ANFIS). The age hardening output (hardness) was predicted using ANFIS. The input parameters were ageing time, temperature, and percentage composition of cow horn particles (CHp%). The results show that the correlation coefficient (R) of the predicted hardness values versus the measured values was 0.9985. Subsequently, values outside the experimental data points were predicted. When the temperature was kept constant and the other input parameters were varied, the average relative error of the predicted values was 0.0931%. When the temperature was varied and the other input parameters kept constant, the average relative error of the hardness predictions was 80%. The results show that ANFIS trained on coarse experimental data points is not very effective in predicting process outputs in the age hardening operation of A356 alloy/CHp particulate composites. The fine experimental data required by ANFIS makes it more expensive for modeling and optimizing age hardening operations of A356 alloy/CHp particulate composites.
Keywords: adaptive neuro-fuzzy inference system (ANFIS), age hardening, aluminum alloy, metal matrix composite
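The two evaluation metrics quoted in this abstract, the correlation coefficient R and the average relative error of predictions, are straightforward to compute. A minimal sketch (the hardness values below are invented placeholders, not the paper's measurements):

```python
# Sketch of the two metrics used to judge the ANFIS predictions:
# Pearson R and mean absolute relative error. Data are hypothetical.
import math

def pearson_r(pred, meas):
    """Correlation coefficient R between predicted and measured values."""
    n = len(pred)
    mp, mm = sum(pred) / n, sum(meas) / n
    cov = sum((p - mp) * (m - mm) for p, m in zip(pred, meas))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    sm = math.sqrt(sum((m - mm) ** 2 for m in meas))
    return cov / (sp * sm)

def avg_relative_error(pred, meas):
    """Mean absolute relative error, in percent."""
    return 100 * sum(abs(p - m) / abs(m) for p, m in zip(pred, meas)) / len(pred)

# Hypothetical hardness values, not the study's data.
measured  = [62.0, 68.5, 74.2, 80.1, 85.3]
predicted = [61.8, 68.9, 74.0, 80.5, 85.0]
print(round(pearson_r(predicted, measured), 4))
print(round(avg_relative_error(predicted, measured), 3))
```

A high R alongside a large relative error on extrapolated points, as reported here, is exactly the pattern that motivates the paper's caution about coarse training data.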
Procedia PDF Downloads 155
13308 An AK-Chart for the Non-Normal Data
Authors: Chia-Hau Liu, Tai-Yue Wang
Abstract:
Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold, or may be difficult to verify, because in practice not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism based on integrating a one-class classification method with an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the type I error, so it is easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control charts.
Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data
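The type I error allocation mentioned here can be made concrete with a distance-based one-class rule: place the control limit at the (1 - α) empirical quantile of in-control distances, so α is chosen directly without any normality assumption. This is a simplified stand-in for the paper's classifier, with invented data:

```python
# Toy one-class control rule: learn a distance threshold from in-control
# data at a chosen type I error alpha, then flag new multivariate points.
# Not the paper's AK-chart; a minimal illustration of the alpha-allocation idea.

def fit_one_class(train, alpha=0.05):
    center = [sum(col) / len(train) for col in zip(*train)]
    dists = sorted(
        sum((x - c) ** 2 for x, c in zip(row, center)) ** 0.5 for row in train
    )
    # control limit at the (1 - alpha) empirical quantile of training distances
    limit = dists[min(len(dists) - 1, int((1 - alpha) * len(dists)))]
    return center, limit

def out_of_control(point, center, limit):
    return sum((x - c) ** 2 for x, c in zip(point, center)) ** 0.5 > limit

# Invented in-control observations scattered in [-0.5, 0.5) x [-0.5, 0.5).
in_control = [(0.1 * i % 1.0 - 0.5, 0.07 * i % 1.0 - 0.5) for i in range(100)]
center, limit = fit_one_class(in_control, alpha=0.05)
print(out_of_control((3.0, 3.0), center, limit))  # True: a large shift is flagged
```

Because the limit is an empirical quantile, roughly an α fraction of in-control points fall outside it by construction, which is what "an easy way to allocate the type I error" refers to.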
Procedia PDF Downloads 423
13307 Efficiency of a Semantic Approach in Teaching Foreign Languages
Authors: Genady Shlomper
Abstract:
During the process of language teaching, each teacher faces some general and some specific problems. Some of these problems are common to all languages because they follow the rules of cognition, consciousness, perception, understanding, and memory, and the physiological and psychological principles pertaining to the human race irrespective of origin and nationality. Still, every language is a distinctive system, possessing individual properties and an obvious identity as a result of development under specific natural, geographical, cultural, and historical conditions. The individual properties emerge in the script, phonetics, morphology, and syntax. All these problems can and should be the subject of detailed research and scientific analysis, mainly for practical considerations and language teaching requirements. There are some formidable obstacles in the language acquisition process. Among the first to be mentioned is the existence of concepts and entire categories in foreign languages that are absent in the language of the students. Such phenomena reflect specific ways of thinking and world-outlook, which were shaped during their evolution. Hindi is the national language of India and belongs to the group of Indo-Iranian languages from the Indo-European family of languages. The lecturer has gained experience in teaching the Hindi language to native speakers of Uzbek, Russian, and Hebrew. He will show the difficulties in the fields of phonetics, morphology, and syntax that students have to deal with during acquisition of the language. In the proposed lecture, the lecturer will share his experience in making the process of language teaching more efficient by using a non-formal semantic approach.
Keywords: applied linguistics, foreign language teaching, language teaching methodology, semantics
Procedia PDF Downloads 357
13306 Multivariate Simulations of the Process of Forming the Automotive Connector Forging from ZK60 Alloy
Authors: Anna Dziubinska
Abstract:
The article presents the results of numerical simulations of a new process for forging the automotive connector from a cast preform. The high-strength ZK60 alloy (belonging to the Mg-Zn-Zr group of Mg alloys) was selected for the numerical tests. Currently, this part is produced industrially by multi-stage forging consisting of bending, preforming, and finishing operations. The use of a cast preform would enable this component to be forged in one operation. However, obtaining specific mechanical properties requires inducing a certain level of strain within the forged part. Therefore, the design of the preform, its shape, and its volume are of paramount importance. In the work presented in this article, preforms of different shapes were designed and assessed using Finite Element (FE) analysis. The research was funded by the Polish National Agency for Academic Exchange within the framework of the Bekker programme.
Keywords: automotive connector, forging, magnesium alloy, numerical simulation, preform, ZK60
Procedia PDF Downloads 134
13305 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System
Authors: Imran Dayan, Ashiqul Khan
Abstract:
Internal corporate fraud, fraud carried out by the internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, it is ultimately harmful to the entity in the long run. Internal fraud often relies upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context. Such processes are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations in their business processes, and the Enterprise Resource Planning (ERP) systems at the heart of those processes are a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect fraudulent activities hidden within day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of internal fraud by developing a framework that measures fraud risk through process mining techniques, thereby finding risky designs and loose ends within the business processes. This framework helps not only in identifying existing cases of fraud in the records of the event log, but also in signalling the overall riskiness of certain business processes, drawing attention to the redesign of such processes to reduce the chance of future internal fraud while improving internal control within the organisation.
The research adds value by applying the concepts of process mining to the analysis of a modern record of business processes, the ERP event log, and develops a framework that should be useful to internal stakeholders for strengthening internal control, as well as providing external auditors with a tool to use in cases of suspicion. The research proves its usefulness through several case studies conducted on large corporations with complex business processes and an ERP in place.
Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining
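The core idea, flagging traces in the ERP event log that deviate from the expected process, can be sketched as a simple conformance check. The activity names and the ordering rules below are invented for illustration; they are not the framework's actual rules:

```python
# Toy conformance check in the spirit of process mining on ERP event logs:
# a purchase-to-pay trace is risky if payment precedes approval or occurs
# without a goods receipt. Activity names are hypothetical.

def fraud_flags(trace):
    flags = []
    if "pay_invoice" in trace:
        if ("approve_po" not in trace
                or trace.index("pay_invoice") < trace.index("approve_po")):
            flags.append("payment without prior approval")
        if "receive_goods" not in trace:
            flags.append("payment without goods receipt")
    return flags

log = [
    ["create_po", "approve_po", "receive_goods", "pay_invoice"],  # conformant
    ["create_po", "pay_invoice", "approve_po"],                   # risky
    ["create_po", "approve_po", "pay_invoice"],                   # risky
]
risky = [trace for trace in log if fraud_flags(trace)]
print(len(risky))  # 2
```

A process-mining framework generalises this: instead of hand-written rules, the reference model is discovered from the log itself, and the fraction of deviating traces per process becomes the riskiness signal.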
Procedia PDF Downloads 337
13304 Prediction of Compressive Strength of Concrete from Early Age Test Results Using Design of Experiments (RSM)
Authors: Salem Alsanusi, Loubna Bentaher
Abstract:
Response surface methods (RSM) provide statistically validated predictive models that can then be manipulated to find optimal process configurations. Variation transmitted to responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates 'finding the flats' on the surfaces generated by RSM. The dual response approach to RSM captures the standard deviation of the output as well as the average, accounting for unknown sources of variation. Dual response plus propagation of error provides a more useful model of overall response variation. In our case, we implemented this technique to predict the compressive strength of concrete at 28 days of age, since waiting 28 days is quite time-consuming while it remains important to ensure the quality control process. This paper investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at the 28th day. The data used for this study were obtained from experimental programmes carried out at the University of Benghazi, Civil Engineering Department. A total of 114 data sets were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents, namely cement, coarse aggregate, fine aggregate, and water, were utilized in all mixes, with different mix proportions and different water-cement ratios. The proposed mathematical models are capable of predicting the required concrete compressive strength from early ages.
Keywords: mix proportioning, response surface methodology, compressive strength, optimal design
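The paper fits RSM polynomial surfaces over the mix factors; as a deliberately simplified stand-in for the "predict 28-day strength from early-age results" idea, the sketch below fits the common strength-versus-log(age) line by least squares and extrapolates. The strength values are invented, and this one-variable fit is not the paper's dual-response model:

```python
# Minimal stand-in (not the paper's RSM model): least-squares fit of
# strength = a + b * ln(age), extrapolated to 28 days. Data are invented.
import math

def fit_log_age(ages, strengths):
    xs = [math.log(t) for t in ages]
    mx, my = sum(xs) / len(xs), sum(strengths) / len(strengths)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, strengths))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda t: a + b * math.log(t)

# Hypothetical cube strengths (MPa) at 3, 7 and 14 days.
model = fit_log_age([3, 7, 14], [14.0, 22.0, 29.0])
print(round(model(28), 1))  # predicted 28-day strength (MPa)
```

A full RSM treatment replaces the single log(age) regressor with linear, quadratic, and interaction terms in the mix factors (cement content, water-cement ratio, etc.), fitted by the same least-squares principle.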
Procedia PDF Downloads 269
13303 Scaling-Down an Agricultural Waste Biogas Plant Fermenter
Authors: Matheus Pessoa, Matthias Kraume
Abstract:
Scale-down rules in process engineering help translate industrial-scale parameters to the lab scale. Several scale-down rules available in the literature, such as impeller power number, agitation-device power input, impeller tip speed, Reynolds number, and cavern development, were investigated in order to stipulate the rotational speed at which to operate an 11 L working-volume lab-scale bioreactor within industrial process parameters. Herein, xanthan gum was used as a model fluid with a viscosity representative of the fermentation broth of a hypothetical biogas plant using sewage sludge and sugar beet pulp as substrate, in a vessel with H/D = 1 and central agitation. The results showed that the cavern development strategy was the best method for establishing a rotational speed for bioreactor operation, while the other rules yielded values that were unrealistic for the purposes proposed in this article.
Keywords: anaerobic digestion, cavern development, scale-down rules, xanthan gum
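Why the different rules give wildly different lab speeds follows directly from their scaling laws: for the same fluid, matching the impeller Reynolds number scales the speed with (D_full/D_lab)^2, while matching tip speed scales it only with D_full/D_lab. A sketch with invented dimensions (not the study's vessel):

```python
# Sketch of two competing scale-down rules. Dimensions are hypothetical.

def reynolds(rho, n, d, mu):
    """Impeller Reynolds number, Re = rho * N * D^2 / mu (N in rev/s, D in m)."""
    return rho * n * d ** 2 / mu

def lab_speed_same_re(n_full, d_full, d_lab):
    """Lab speed matching Re for the same fluid: N_lab = N_full * (D_full/D_lab)^2."""
    return n_full * (d_full / d_lab) ** 2

def lab_speed_same_tip(n_full, d_full, d_lab):
    """Lab speed matching tip speed (pi * N * D): N_lab = N_full * D_full / D_lab."""
    return n_full * d_full / d_lab

n_full, d_full, d_lab = 0.5, 2.0, 0.1  # rev/s, m, m (invented)
print(round(lab_speed_same_re(n_full, d_full, d_lab), 1))   # 200.0 rev/s
print(round(lab_speed_same_tip(n_full, d_full, d_lab), 1))  # 10.0 rev/s
```

A 20x geometric scale-down thus demands a 400x speed increase to hold Re but only 20x to hold tip speed, illustrating why several of the literature rules produced impractical rotational speeds for the 11 L fermenter.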
Procedia PDF Downloads 494
13302 A Multi-Criteria Decision Method for the Recruitment of Academic Personnel Based on the Analytical Hierarchy Process and the Delphi Method in a Neutrosophic Environment
Authors: Antonios Paraskevas, Michael Madas
Abstract:
For a university to maintain its international competitiveness in education, it is essential to recruit high-quality academic staff, as they constitute its most valuable asset. This selection plays a significant role in achieving strategic objectives, particularly by emphasizing a firm commitment to an exceptional student experience and innovative, high-quality teaching and learning practices. In this vein, the appropriate selection of academic staff is a very important factor in the competitiveness, efficiency, and reputation of an academic institute. Within this framework, our work presents a comprehensive methodological concept that emphasizes the multi-criteria nature of the problem and shows how decision-makers can utilize our approach in order to arrive at an appropriate judgment. The conceptual framework introduced in this paper is built upon a hybrid neutrosophic method based on the Neutrosophic Analytical Hierarchy Process (N-AHP), which uses the theory of neutrosophic sets and is considered suitable given the significant degree of ambiguity and indeterminacy observed in the decision-making process. To this end, our framework extends the N-AHP by incorporating the Neutrosophic Delphi Method (N-DM). By applying the N-DM, we can take into consideration the importance of each decision-maker and their preferences per evaluation criterion. To the best of our knowledge, the proposed model is the first to apply the Neutrosophic Delphi Method to the selection of academic staff. As a case study, we applied our method to a real problem of academic personnel selection, with the main goal of enhancing the algorithm proposed in previous scholars' work, thus addressing the inherent ineffectiveness that becomes apparent when traditional multi-criteria decision-making methods deal with situations of this kind.
As a further result, we show that our method demonstrates greater applicability and reliability when compared to other decision models.
Keywords: analytical hierarchy process, Delphi method, multi-criteria decision-making method, neutrosophic set theory, personnel recruitment
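The AHP core step this framework builds on, deriving criterion weights from pairwise comparisons, can be illustrated in its classical (non-neutrosophic) form with the row geometric mean, a standard approximation of the principal eigenvector. The comparison matrix below is invented:

```python
# Classical AHP weight derivation (not the paper's neutrosophic extension):
# priority weights from a pairwise comparison matrix via row geometric means.
import math

def ahp_weights(matrix):
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical matrix: criterion 1 (e.g. teaching record) strongly preferred
# over criteria 2 and 3; reciprocals fill the lower triangle.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # roughly [0.648, 0.23, 0.122]
```

The neutrosophic variant in the paper replaces each crisp judgment with a (truth, indeterminacy, falsity) triple before aggregation, which is how it accommodates the ambiguity the abstract describes; the weighting logic above is otherwise the same.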
Procedia PDF Downloads 202
13301 Ideological Manipulations and Cultural-Norm Constraints
Authors: Masoud Hassanzade Novin, Bahloul Salmani
Abstract:
Translation cannot be considered a simple linguistic act. With the rise of the descriptive approach in the late 1970s and 1980s, the translation process came to address social aspects as well as linguistic ones. To treat translation as cross-cultural communication, through which various cultures communicate under ideological and cultural constraints, a contrastive analysis was conducted in this paper to reveal the distortions imposed on translated texts. The corpus of the study comprised the novel 1984, written by George Orwell, and its Persian translations, which were analyzed through qualitative research based on critical discourse analysis (CDA), Toury's norms, and Lefevere's concept of ideology. The results of the study revealed that ideology and cultural constraints are important stimuli which can control the process of translation.
Keywords: critical discourse analysis, ideology, norms, translated texts
Procedia PDF Downloads 337
13300 High Pressure Delignification Process for Nanocrystalline Cellulose Production from Agro-Waste Biomass
Authors: Sakinul Islam, Nhol Kao, Sati Bhattacharya, Rahul Gupta
Abstract:
Nanocrystalline cellulose (NCC) has been widely used for miscellaneous applications due to its superior properties over other nanomaterials. However, the major problems associated with the production of NCC are long reaction times, low production rates and inefficient processes. The mass production of NCC within a short period of time is still a great challenge. The main objective of this study is to produce NCC from rice husk agro-waste biomass by a high pressure delignification process (HPDP), followed by bleaching and hydrolysis processes. The HPDP has not been explored for NCC production from rice husk biomass (RHB) until now. In order to produce NCC, powdered rice husk (PRH) was placed in a stainless steel reactor at 80 ˚C under 5 bars. An aqueous solution of NaOH (4M) was used for the dissolution of lignin and other amorphous impurities from PRH. After set experimental times (1 h, 3.5 h and 6 h), bleaching and hydrolysis were carried out on the delignified samples. NaOCl (20%) and H2SO4 (4M) solutions were used for the bleaching and hydrolysis processes, respectively. The NCC suspension from hydrolysis was sonicated and neutralized with a buffer solution for the various characterisations. Finally, the NCC suspension was dried and analyzed by FTIR, XRD, SEM, AFM and TEM. The chemical compositions of NCC and PRH were estimated by TAPPI (Technical Association of the Pulp and Paper Industry) standard methods to assess product purity. It was found that the 6 h HPDP was more efficient for producing good-quality NCC than the 1 h and 3.5 h treatments, which gave low separation of non-cellulosic components from RHB. The analyses indicated a crystallinity of 71%, with particles 20-50 nm in diameter and 100-200 nm in length.Keywords: nanocrystalline cellulose, NCC, high pressure delignification, bleaching, hydrolysis, agro-waste biomass
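The abstract reports a crystallinity of 71% from XRD but does not state the estimation method; a common choice for cellulose is the Segal crystallinity index, sketched below with hypothetical peak intensities chosen only to reproduce the reported value:

```python
def segal_crystallinity_index(i_200, i_am):
    """Segal crystallinity index from XRD intensities:
    CrI = (I_200 - I_am) / I_200 * 100, where I_200 is the (200) peak
    intensity (around 22.5 deg 2-theta for cellulose I) and I_am the
    amorphous minimum (around 18 deg 2-theta). Sketch only; the paper
    does not specify which crystallinity method was used."""
    return (i_200 - i_am) / i_200 * 100.0

# Hypothetical intensities reproducing the reported ~71 % crystallinity.
cri = segal_crystallinity_index(i_200=1000.0, i_am=290.0)
```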
Procedia PDF Downloads 265
13299 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings
Authors: Sorin Valcan, Mihail Gaianu
Abstract:
Labeling is a very costly and time-consuming process that aims to generate datasets for training neural networks across several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and the distribution of effort. This paper presents the modifications made to an algorithm used to generate ground truth data for 2D eye location on infrared images of drivers, in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions become tougher, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in a much better performance of the trained neural networks.Keywords: labeling automation, infrared camera, driver monitoring, eye detection, convolutional neural networks
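The trade-off described above (tougher restrictions yielding a more accurate but smaller dataset) can be sketched as a simple label filter that drops, rather than manually corrects, any frame failing stricter checks; all field names and thresholds here are hypothetical, not from the paper:

```python
def filter_labels(labels, min_conf=0.9, interocular_px=(40, 120)):
    """Keep only auto-generated eye labels that pass stricter checks:
    a minimum detector confidence and a plausible interocular distance.
    Rejected frames are dropped, so the dataset shrinks while its
    label quality rises. Thresholds are hypothetical."""
    kept = []
    lo, hi = interocular_px
    for lab in labels:
        (lx, ly), (rx, ry) = lab["left"], lab["right"]
        dist = ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5
        if lab["conf"] >= min_conf and lo <= dist <= hi:
            kept.append(lab)
    return kept

labels = [
    {"left": (100, 200), "right": (160, 200), "conf": 0.95},  # plausible
    {"left": (100, 200), "right": (110, 200), "conf": 0.99},  # eyes too close
    {"left": (100, 200), "right": (160, 200), "conf": 0.50},  # low confidence
]
kept = filter_labels(labels)  # only the first label survives
```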
Procedia PDF Downloads 119
13298 Leaching of Copper from Copper Ore Using Sulphuric Acid in the Presence of Hydrogen Peroxide as an Oxidizing Agent: An Optimized Process
Authors: Hilary Rutto
Abstract:
Acids are the reagents most commonly used to remove copper ions from copper ores by leaching. It is important that the process conditions are optimized to improve the leaching efficiency. In the present study, the effects of pH, an oxidizing agent (hydrogen peroxide), stirring speed, solid-to-liquid ratio and acid concentration on the leaching of copper ions from the ore were investigated using a pH-stat apparatus. Copper ions were analyzed at the end of each experiment using atomic absorption spectroscopy (AAS). Results showed that leaching efficiency improved with an increase in acid concentration, stirring speed, oxidizing agent and pH, and decreased with an increase in the solid-to-liquid ratio.Keywords: leaching, copper, oxidizing agent, pH stat apparatus
Procedia PDF Downloads 378
13297 Design of the Fiber Lay-Up for the Composite Wind Turbine Blade in VARTM
Authors: Tzai-Shiung Li, Wen-Bin Young
Abstract:
The wind turbine blade sustains various kinds of loading during the operating and parked states. Due to the increasing size of wind turbine blades, it is important to arrange the composite materials in a way that reaches the optimal utilization of the material strength. In the vacuum assisted resin transfer molding fabrication process, the fiber content of the turbine blade depends on the vacuum pressure. In this study, a design of the fiber lay-up for vacuum assisted resin transfer molding is conducted to achieve efficient utilization of the material strength. The design is for a wind turbine blade consisting of shell skins with or without a spar structure.Keywords: resin film infiltration, vacuum assisted resin transfer molding process, wind turbine blade, composite materials
Procedia PDF Downloads 385
13296 Statistically Significant Differences of Carbon Dioxide and Carbon Monoxide Emission in Photocopying Process
Authors: Kiurski S. Jelena, Kecić S. Vesna, Oros B. Ivana
Abstract:
Experimental results confirmed the temporal variation of carbon dioxide and carbon monoxide concentrations during the working shift of the photocopying process in a small photocopying shop in Novi Sad, Serbia. The statistically significant differences in the target gases were examined with two-way analysis of variance without replication, followed by Scheffe's post hoc test. Statistically significant differences were found for carbon monoxide emission, as indicated by F-values (12.37 and 31.88) greater than Fcrit (6.94), in contrast to carbon dioxide emission (F-values of 1.23 and 3.12, both less than Fcrit). Scheffe's post hoc test indicated that sampling point A (near the photocopier machine) and the second time interval contribute the most to carbon monoxide emission.Keywords: analysis of variance, carbon dioxide, carbon monoxide, photocopying indoor, Scheffe's test
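The reported Fcrit of 6.94 equals F(0.05; 2, 4), which is consistent with a 3×3 design (e.g. three sampling points by three time intervals). A minimal sketch of two-way ANOVA without replication, on made-up data rather than the study's measurements, could look like this:

```python
import numpy as np
from scipy.stats import f as f_dist

def two_way_anova_no_rep(x):
    """Two-way ANOVA without replication on an (r x c) table:
    factor A = rows (e.g. sampling points), factor B = columns
    (e.g. time intervals); the residual term serves as error."""
    x = np.asarray(x, dtype=float)
    r, c = x.shape
    grand = x.mean()
    ss_rows = c * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_cols = r * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    df_r, df_c, df_e = r - 1, c - 1, (r - 1) * (c - 1)
    f_rows = (ss_rows / df_r) / (ss_err / df_e)
    f_cols = (ss_cols / df_c) / (ss_err / df_e)
    f_crit = (f_dist.ppf(0.95, df_r, df_e), f_dist.ppf(0.95, df_c, df_e))
    return f_rows, f_cols, f_crit

# Hypothetical 3x3 concentration table (rows: sampling points,
# columns: time intervals); values are illustrative only.
x = [[1, 2, 3], [2, 4, 4], [3, 4, 6]]
f_rows, f_cols, f_crit = two_way_anova_no_rep(x)
```

An observed F greater than the corresponding element of `f_crit` indicates a statistically significant factor at the 5% level, exactly the comparison the abstract reports.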
Procedia PDF Downloads 329
13295 Design of a Tool for Generating Test Cases from BPMN
Authors: Prat Yotyawilai, Taratip Suwannasart
Abstract:
Business Process Model and Notation (BPMN) is increasingly important in business process modeling and in creating functional models, and it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent; however, although most studies use UML models in software testing, few use the BPMN model in creating test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are used in generating test cases.Keywords: software testing, test case, BPMN, flow graph
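Once a flow graph has been extracted from the BPMN model, one simple way to derive test cases is to enumerate the simple paths from the start event to the end event, each path covering one route through the gateways. The sketch below uses a hypothetical approval process, not the paper's tool:

```python
def generate_test_paths(edges, start, end):
    """Enumerate simple start-to-end paths in a flow graph derived
    from a BPMN model; each path is a candidate test case. Sketch
    only: the paper's tool also uses extracted component details."""
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == end:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid revisiting nodes on a path
                stack.append((nxt, path + [nxt]))
    return paths

# Hypothetical process: an exclusive gateway after "review" splits into
# approval and rejection branches that re-join before the end event.
edges = [("start", "review"), ("review", "approve"), ("review", "reject"),
         ("approve", "notify"), ("reject", "notify"), ("notify", "end")]
cases = generate_test_paths(edges, "start", "end")  # two candidate test cases
```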
Procedia PDF Downloads 557
13294 Efficient Position Based Operation Code Authentication
Authors: Hashim Ali, Sheheryar Khan
Abstract:
Security for applications has always been a key issue of concern. In general, security means granting access to legitimate users and denying non-authorized access to the system. Shoulder surfing is an observation technique used to hack an account or to enter a system: a malicious observer captures or records the fingers of a user while he is entering sensitive inputs (PIN, passwords, etc.) and may thereby obtain the user’s password credentials. It is very difficult for a novice user to protect himself from shoulder surfing or an unaided observer in a public place while accessing his account. To secure the user account, there are five factors of authentication: “(i) something you have, (ii) something you are, (iii) something you know, (iv) somebody you know, (v) something you process”. A fifth-factor authentication technique, “something you process”, has been developed to provide a novel approach for the user. In this paper, we have applied position-based operation code authentication in a way that is easier and more user-friendly for the user.Keywords: shoulder surfing, malicious observer, sensitive inputs, authentication
Procedia PDF Downloads 273
13293 The Analysis of Gizmos Online Program as Mathematics Diagnostic Program: A Story from an Indonesian Private School
Authors: Shofiayuningtyas Luftiani
Abstract:
Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by being integrated into instructional programs. The program features inquiry-based simulations, in which students conduct explorations using a worksheet while teachers use the teacher guidelines to direct and assess students’ performance. In this study, the discussion about Gizmos highlights its features as an assessment medium for mathematics learning for secondary school students. The discussion is based on a case study and a literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium refers to diagnostic assessment. As part of the diagnostic assessment, teachers review the student exploration sheets, analyze students’ difficulties in particular, and consider the findings when planning the future learning process. This assessment is important since the teacher needs data about students’ persistent weaknesses. Additionally, the program also helps build students’ understanding through its interactive simulations. Currently, the assessment over-emphasizes the students’ answers in the worksheet against the provided answer keys, even though students exercise several skills: translating the question, running the simulation and answering the question. Instead, the assessment should involve multiple perspectives on, and sources of, students’ performance, since the teacher should adjust the instructional programs to the complexity of students’ learning needs and styles. Consequently, an approach that improves the assessment components is selected to challenge the current assessment. The purpose of this challenge is to involve not only cognitive diagnosis but also the analysis of skills and errors.
Concerning the selected setting for this diagnostic assessment, which combines cognitive diagnosis, skills analysis and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide providing a set of criteria for the assessment. Without a precise rubric, the teacher will likely document and follow up ineffectively on the data about students at risk of failure. Furthermore, teachers who employ Gizmos for diagnostic assessment might encounter some obstacles. Based on the conditions of assessment in the selected setting, the obstacles involve time constraints, reluctance to take on a higher teaching burden, and students’ behavior. Consequently, the teacher who chooses Gizmos with these approaches has to plan, implement and evaluate the assessment. The main point of this assessment is not the result on the students’ worksheet. Rather, the diagnostic assessment is a two-stage process: prompting and then effectively following up on both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a diagnostic assessment medium reflects the effort to improve the mathematical learning process.Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis
Procedia PDF Downloads 183
13292 The Determinants of Co-Production for Value Co-Creation: Quadratic Effects
Authors: Li-Wei Wu, Chung-Yu Wang
Abstract:
Recently, interest has been generated in the search for a new reference framework for value creation that is centered on the co-creation process. Co-creation implies cooperative value creation between service firms and customers and requires the building of experiences as well as the resolution of problems through the combined effort of the parties in the relationship. For customers, value is always co-created through their participation in services, and customers ultimately determine the value of the service in use. This new approach emphasizes that a customer’s participation in the service process is indispensable to value co-creation. An important feature of service in the context of exchange is co-production, which implies that a certain amount of participation is needed from customers to co-produce a service and hence co-create value. Co-production no doubt helps customers better understand and take charge of their own roles in the service process. Thus, this proposal encourages co-production, thereby facilitating value co-creation that is reflected in both customers and service firms. Four determinants of co-production are identified in this study, namely, commitment, trust, asset specificity, and decision-making uncertainty. Commitment is an essential dimension that directly results in successful cooperative behaviors. Trust helps establish a relational environment that is fundamental to cross-border cooperation. Asset specificity motivates co-production because this determinant may enhance the return on asset investment. Decision-making uncertainty prompts customers to collaborate with service firms in making decisions. In other words, customers adjust their roles and become increasingly engaged in co-production when commitment, trust, asset specificity, and decision-making uncertainty are enhanced.
Although studies have examined the preceding effects, to the best of our knowledge none has empirically examined the simultaneous effects of all these curvilinear relationships in a single study. When these determinants become excessive, however, customers will not engage in the co-production process. In brief, we suggest that the relationships of commitment, trust, asset specificity, and decision-making uncertainty with co-production are curvilinear, that is, inverted U-shaped. These forms of curvilinear relationship have not been identified in the existing literature on co-production; they therefore complement extant linear approaches. Most importantly, we aim to consider both the bright and the dark sides of the determinants of co-production.Keywords: co-production, commitment, trust, asset specificity, decision-making uncertainty
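The standard way to test such an inverted U-shape empirically is to regress co-production on a determinant and its square and check for a negative squared-term coefficient; below is a sketch on simulated data (not the study's data), using trust as the example determinant:

```python
import numpy as np

# Simulate an inverted U-shaped effect of trust on co-production:
# positive linear term, negative quadratic term, plus noise.
rng = np.random.default_rng(0)
trust = rng.uniform(0, 10, 200)
coproduction = 2 + 1.5 * trust - 0.12 * trust**2 + rng.normal(0, 0.3, 200)

# Fit y = b0 + b1*x + b2*x^2 by ordinary least squares; a negative,
# significant b2 supports the curvilinear (inverted-U) hypothesis.
X = np.column_stack([np.ones_like(trust), trust, trust**2])
b, *_ = np.linalg.lstsq(X, coproduction, rcond=None)

# Level of trust beyond which more trust reduces co-production.
turning_point = -b[1] / (2 * b[2])
```

In a full analysis the squared term's standard error and p-value would also be inspected; the sketch shows only the fitted shape and its turning point.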
Procedia PDF Downloads 189
13291 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing
Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig
Abstract:
Empirical mode decomposition (EMD), a data-driven time-series decomposition method, has the advantage of not assuming that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from a mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal that causes the EMD mode mixing problem, namely a signal exhibiting intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is also completely avoided, and the computation load is reduced roughly six times compared with EEMD with an ensemble number of 50.Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting
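The sifting process the title refers to is the core of EMD: interpolate envelopes through the local extrema and subtract their mean. A minimal single-iteration sketch follows, with an artificial intermittent signal of the kind that triggers mode mixing; real implementations additionally handle end effects and iterate sifting to a stopping criterion:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One sifting step of EMD: fit cubic-spline upper/lower envelopes
    through the local maxima/minima and subtract the envelope mean.
    Minimal sketch only -- no end-effect handling, no iteration."""
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return x  # too few extrema to build envelopes
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2

t = np.linspace(0, 1, 1000)
# A slow 5 Hz oscillation with a brief 60 Hz burst near t = 0.5: an
# intermittent component, the classic trigger of EMD mode mixing.
x = np.sin(2 * np.pi * 5 * t)
x += 0.3 * np.sin(2 * np.pi * 60 * t) * (np.abs(t - 0.5) < 0.1)
h = sift_once(t, x)
```

Because the burst distorts the extrema pattern only locally, the extracted component mixes scales there, which is what EEMD's noise ensembles (and the paper's cheaper alternative) aim to prevent.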
Procedia PDF Downloads 398
13290 Performance, Scalability and Reliability Engineering: Shift Left and Shift Right Approach
Authors: Jyothirmayee Pola
Abstract:
Ideally, test-driven development (TDD), agile, or any other process should be able to define and implement the performance, scalability, and reliability (PSR) of the product with a high quality of service (QoS), and should be able to fix any PSR issues at lower cost before they reach production. Most PSR test strategies for new product introduction (NPI) include assumptions about production load requirements, but these are never accurate. NPE (new product enhancement) strategies include assumptions for the new features being developed, whereas the workload distribution for older features can be derived by analyzing production transactions. This paper discusses how to shift PSR left toward the design phase of the release management process to obtain a better QoS with respect to PSR for any product under development. It also explains the ROI for future customer onboarding, both for service-oriented architectures (SOA) and microservices architectures, and how to define PSR requirements.Keywords: component PSR, performance engineering, performance tuning, reliability, return on investment, scalability, system PSR
Procedia PDF Downloads 77