Search results for: standard procedures process
19234 A CPS Based Design of Industrial Ecosystems
Authors: Maryam Shayan
Abstract:
Chemical process simulation (CPS) software has long been used by chemical process designers to design, test, optimize, and integrate process plants. Industrial ecology researchers can bring these same problem-solving advantages to the design and operation of industrial ecosystems. This paper gives industrial ecology researchers and practitioners an introduction to CPS and an overview of chemical engineering design principles. It highlights recent research demonstrating that CPS can be used to model industrial ecosystems, and discusses the benefits of using CPS to address some of the technical challenges facing organizations participating in an industrial ecosystem. CPS can be used to (i) quantitatively evaluate and compare the potential environmental and economic benefits of material and energy linkages; (ii) solve general design, retrofit, or operational problems; (iii) help identify complex and often counterintuitive solutions; and (iv) evaluate what-if scenarios. CPS should be a useful addition to the industrial ecology toolkit.
Keywords: chemical process simulation (CPS), process plants, industrial ecosystems, chemical engineering design
Procedia PDF Downloads 280
19233 Effect of Injection Moulding Process Parameters on Tensile Strength Using the Taguchi Method
Authors: Gurjeet Singh, M. K. Pradhan, Ajay Verma
Abstract:
The plastics industry plays a very important role in the economy of any country, generally accounting for a leading share of it. Because metals and their alloys are finite resources, producing plastic products and components, which find application in many industrial as well as household consumer products, is beneficial; about 50% of plastic products are manufactured by the injection moulding process. To produce better-quality products, the quality characteristics and performance of the product must be controlled. The process parameters play a significant role in the production of plastics, so their control is essential. This paper describes the effect of parameter selection on the injection moulding process, with the aim of defining suitable parameters for producing plastic products. Selecting process parameters by trial and error is neither desirable nor acceptable, as it tends to increase cost and time; hence, optimization of the processing parameters of the injection moulding process is essential. The experiments were designed with Taguchi's orthogonal arrays to achieve results with the least number of experiments. The material studied is polypropylene. The tensile strength of specimens produced by the injection moulding machine was tested on a universal testing machine. Using the Taguchi technique with Minitab 14 software, the best values of injection pressure, melt temperature, packing pressure, and packing time were obtained. We found that the process parameter packing pressure contributes most to the production of plastic products with good tensile strength.
Keywords: injection moulding, tensile strength, polypropylene, Taguchi
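Taguchi analysis of the kind described above ranks parameter settings by signal-to-noise (S/N) ratio; for a response like tensile strength, the larger-the-better form applies. A minimal sketch follows (the replicate values are hypothetical illustrations, not data from the study):

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better S/N ratio:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)

# Hypothetical tensile-strength replicates (MPa) at two parameter settings
setting_a = [32.1, 33.0, 31.8]
setting_b = [35.4, 36.0, 35.1]

sn_a = sn_larger_is_better(setting_a)
sn_b = sn_larger_is_better(setting_b)
# The setting with the higher S/N ratio is preferred
```

In a full Taguchi study, this S/N ratio would be averaged per factor level across the orthogonal array to rank each factor's contribution.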
Procedia PDF Downloads 288
19232 Platelet Transfusion Thresholds for Pediatrics; A Retrospective Study
Authors: Hessah Alsulami, Majedah Aldosari
Abstract:
Introduction: A platelet threshold of 10x10^9/L is recommended for clinically stable thrombocytopenic pediatric patients. Transfusion at a higher threshold (in the absence of research evidence, generally 40x10^9/L, as determined by clinical circumstances) may be required for patients with signs of bleeding, high fever, hyperleukocytosis, a rapid fall in platelet count, a concomitant coagulation abnormality, critical illness, or impaired platelet function (including drug-induced), and for patients undergoing invasive procedures. Method: This study is a retrospective observational analysis of platelet transfusion thresholds in a single secondary pediatric hospital in Riyadh. From the blood bank database, the list of patients who received platelet transfusions in the second half of 2018 was retrieved. Patients were divided into two groups: group A, those belonging to the high-threshold category (such as those with bleeding, high fever, a rapid fall in platelet count, or impaired platelet function, or those undergoing invasive procedures), and group B, those who were not. We then examined the pre- and post-transfusion platelet levels for each group. The data were analyzed using GraphPad software and expressed as mean ± SD. Result: A total of 112 transfusion episodes in 61 patients (38% female) were analyzed. Ages ranged from 24 days to 8 years. The distribution of platelet transfusion episodes was 64% (n=72) for group A and 36% (n=40) for group B. The mean pre-transfusion platelet count was 46x10^3 ± 11x10^3 for group A and 28x10^3 ± 6x10^3 for group B; the mean post-transfusion platelet count was 61x10^3 ± 14x10^3 and 60x10^3 ± 24x10^3 for groups A and B, respectively. The rise in mean platelet count after transfusion was significantly greater among stable patients (group B) than among unstable patients (group A) (P < 0.001).
Conclusion: The platelet count threshold for transfusion varied with clinical condition and was higher among the unstable patient group, as expected. For stable patients, the threshold was higher than it should be, indicating that clinicians do not follow the guidelines in this regard. The rise in platelet count after transfusion was greater among stable patients.
Keywords: platelet, transfusion, threshold, pediatric
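The group comparison above (rise in platelet count between two independent groups) can be sketched with a Welch-type t statistic; the per-episode rises below are illustrative values chosen to mirror the reported group means, not the study's data:

```python
import math
import statistics

def welch_t(x, y):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

# Illustrative per-episode platelet rises (10^3/uL)
rise_a = [14, 15, 16, 15, 14, 16]   # unstable patients (group A)
rise_b = [31, 33, 32, 30, 34, 32]   # stable patients (group B)
t = welch_t(rise_b, rise_a)         # large positive t: greater rise in B
```

A p-value would then be read from the t distribution with Welch-Satterthwaite degrees of freedom; GraphPad performs that step internally.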
Procedia PDF Downloads 71
19231 An Investigation of the Use of Visible Spectrophotometric Analysis of Lead in an Herbal Tea Supplement
Authors: Salve Alessandria Alcantara, John Armand E. Aquino, Ma. Veronica Aranda, Nikki Francine Balde, Angeli Therese F. Cruz, Elise Danielle Garcia, Antonie Kyna Lim, Divina Gracia Lucero, Nikolai Thadeus Mappatao, Maylan N. Ocat, Jamille Dyanne L. Pajarillo, Jane Mierial A. Pesigan, Grace Kristin Viva, Jasmine Arielle C. Yap, Kathleen Michelle T. Yu, Joanna J. Orejola, Joanna V. Toralba
Abstract:
Lead is a neurotoxic metallic element that slowly accumulates in bones and tissues, especially when present in products taken on a regular basis, such as herbal tea supplements. Although sensitive analytical instruments are available, the USP limit test for lead is still widely used. Because of its serious shortcomings, however, Lang and colleagues developed a spectrophotometric method for the determination of lead in all types of samples, and that method was adapted in this study. The procedure performed was divided into three parts: digestion, extraction, and analysis. For digestion, HNO3 and CH3COOH were used. Afterwards, masking agents and 0.003% and 0.001% dithizone in CHCl3 were added and used for the extraction. For the analysis, the standard addition method and colorimetry were performed, in triplicate under two conditions. The first condition, using 25 µg/mL of standard, resulted in very low absorbances with an r2 of 0.551. This led to the use of a higher concentration, 1 mg/mL, for the second condition. Precipitation of lead cyanide was observed, and the absorbance readings were relatively higher but remained between 0.15 and 0.25, resulting in a very low r2 of 0.429. LOQ and LOD were not computed due to the limitations of the Milton Roy spectrophotometer. The method performed has a shorter digestion time and uses fewer but more accessible reagents. However, the optimum ratio of the dithizone-lead complex must be observed in order to obtain reliable results while exploring other standard concentrations.
Keywords: herbal tea supplement, lead-dithizone complex, standard addition, visible spectroscopy
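In the standard addition method used above, the sample is spiked with increasing known amounts of standard, absorbance is regressed on the amount added, and the unknown concentration is recovered from the magnitude of the x-intercept. A minimal sketch (the spike levels and absorbances are hypothetical, not the study's readings):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical readings: absorbance of sample spiked with lead standard
added = [0.0, 10.0, 20.0, 30.0]        # µg/mL of standard added
absorbance = [0.12, 0.22, 0.32, 0.42]  # measured absorbance
slope, intercept = linear_fit(added, absorbance)
c_unknown = intercept / slope          # µg/mL, magnitude of the x-intercept
```

Standard addition compensates for matrix effects because the calibration is built inside the sample matrix itself.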
Procedia PDF Downloads 387
19230 From Intuitive to Constructive Audit Risk Assessment: A Complementary Approach to CAATTs Adoption
Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy
Abstract:
The audit risk model has faced limitations and difficulties in practice, leading auditors to apply it only at a conceptual level. The qualitative approach to assessing risks has produced inconsistent risk assessments, affecting both audit quality and decision-making on the adoption of computer-assisted audit tools and techniques (CAATTs). This study investigates risk factors impacting the implementation of the audit risk model and proposes a complementary risk-based instrument, key risk indicators (KRIs), to form substantive risk judgments and mitigate a heightened risk of material misstatement (RMM). It addresses the question of how risk factors impact the implementation of the audit risk model, improve risk judgments, and aid the adoption of CAATTs. A three-stage scale development procedure was used, involving a pretest with exploratory factor analysis and a subsequent study employing confirmatory factor analysis for construct validation, based on two independent samples totaling 767 participants. The authors also test the ability of the KRIs to predict the audit effort needed to mitigate a heightened RMM. The suggested KRIs, comprising two risk components and seventeen risk items, are found to have high predictive power in determining the audit effort needed to reduce the RMM, and are validated as an effective instrument for risk assessment and for decision-making on the adoption of CAATTs. This study contributes to the existing literature by taking a holistic approach to risk assessment and providing a quantitative expression of assessed risks. It bridges the gap between intuitive risk evaluation and the theoretical domain, clarifying the mechanism of risk assessments.
It also helps improve the uniformity and quality of risk assessments, aiding audit standard-setters in issuing updated guidelines on CAATTs adoption. A few limitations and recommendations for future research should be mentioned. First, the scale was developed in the Israeli auditing market, which follows the International Standards on Auditing (ISAs). Although ISAs are adopted in European countries, future studies could focus on countries that adopt additional or local auditing standards for greater generalizability. Second, this study revealed risk factors that have a material impact on the assessed risk, but additional risk factors could influence the assessment of the RMM; future research could therefore investigate other risk segments, such as operational and financial risks, to broaden the generalizability of our results. Third, although the sample size fits accepted scale development procedures and permits drawing conclusions, future research may develop standardized measures based on larger samples to reduce equivocal results and suggest an extended risk model.
Keywords: audit risk model, audit effort, CAATTs adoption, key risk indicators, sustainability
Procedia PDF Downloads 77
19229 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options
Authors: Rong-Tsorng Wang
Abstract:
In this paper, we derive a pricing formula for catastrophe equity put options (CatEPuts) with nonhomogeneous losses and approximated compound distributions. We assume that the loss claims arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustered occurrence of loss claims, that the sizes of loss claims form a sequence of independent and identically distributed random variables, and that the accumulated loss therefore follows a compound distribution, which is approximated by a heavy-tailed distribution. A numerical example is given to calibrate the parameters, and we discuss how the value of a CatEPut is affected by changes in the parameters of the proposed pricing model.
Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model
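The ingredients named above (NHPP claim arrivals, i.i.d. claim sizes, a compound accumulated loss) can be illustrated with a Monte Carlo sketch. This is a simplified stand-in for the paper's closed-form approach: it assumes lognormal claim sizes, simulates the NHPP by thinning, and pays the put only when the accumulated loss exceeds a trigger, while ignoring any feedback of losses on the equity price. All parameter names are illustrative assumptions:

```python
import math
import random

def simulate_nhpp_times(rate_fn, rate_max, horizon, rng):
    """Sample claim arrival times of a nonhomogeneous Poisson process
    by thinning a homogeneous process of rate rate_max >= rate_fn(t)."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > horizon:
            return times
        if rng.random() < rate_fn(t) / rate_max:
            times.append(t)

def price_cateput(s0, strike, loss_trigger, horizon, r, sigma,
                  rate_fn, rate_max, loss_mu, loss_sigma,
                  n_paths, seed=0):
    """Monte Carlo price of a CatEPut paying max(K - S_T, 0) only when
    the accumulated catastrophe loss over [0, T] exceeds loss_trigger."""
    rng = random.Random(seed)
    total_payoff = 0.0
    for _ in range(n_paths):
        n_claims = len(simulate_nhpp_times(rate_fn, rate_max, horizon, rng))
        loss = sum(rng.lognormvariate(loss_mu, loss_sigma)
                   for _ in range(n_claims))
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * horizon
                            + sigma * math.sqrt(horizon) * z)
        if loss > loss_trigger:
            total_payoff += max(strike - s_t, 0.0)
    return math.exp(-r * horizon) * total_payoff / n_paths
```

A time-varying `rate_fn` such as `lambda t: 2.0 + math.cos(t)` models the clustering of claim occurrences that motivates the NHPP assumption.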
Procedia PDF Downloads 167
19228 Markov Switching of Conditional Variance
Authors: Josip Arneric, Blanka Skrabic Peric
Abstract:
Forecasting volatility, i.e., fluctuations in returns, has long been of interest to portfolio managers, option traders, and market makers seeking higher profits or less risky positions. Because volatility is time-varying in high-frequency data and periods of high volatility tend to cluster, the most commonly used models are of the GARCH type. However, standard GARCH models show high volatility persistence, i.e., integrated behaviour of the conditional variance, which makes volatility difficult to predict. Due to these practical limitations, different approaches based on Markov switching models have been proposed in the literature. In such situations, models in which the parameters are allowed to change over time are more appropriate, because they allow part of the model to depend on the state of the economy. The empirical analysis demonstrates that a Markov switching GARCH model resolves the problem of excessive persistence and outperforms single-regime GARCH models in forecasting volatility for selected emerging markets.
Keywords: emerging markets, Markov switching, GARCH model, transition probabilities
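The regime-switching idea can be illustrated with a minimal simulation: a two-state Markov chain selects between a calm and a turbulent volatility regime. This sketch uses constant within-regime volatility for brevity; a full MS-GARCH model would additionally let the conditional variance follow a GARCH recursion inside each regime. All parameter values are illustrative:

```python
import random

def simulate_ms_volatility(n, p_stay, sigmas, seed=0):
    """Simulate n returns whose volatility switches between two regimes
    according to a first-order Markov chain.
    p_stay[k] is the probability of remaining in regime k."""
    rng = random.Random(seed)
    state = 0
    states, returns = [], []
    for _ in range(n):
        if rng.random() > p_stay[state]:
            state = 1 - state          # regime switch
        states.append(state)
        returns.append(rng.gauss(0.0, sigmas[state]))
    return states, returns

# Calm regime (sigma = 1%) vs turbulent regime (sigma = 5%)
states, rets = simulate_ms_volatility(4000, (0.97, 0.95), (0.01, 0.05))
```

High `p_stay` values reproduce volatility clustering: the process lingers in each regime, producing the quiet and turbulent episodes that single-regime GARCH must mimic with near-unit persistence.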
Procedia PDF Downloads 455
19227 Impact of Flavor on Food Product Quality, A Case Study of Vanillin Stability during Biscuit Preparation
Authors: N. Yang, R. Linforth, I. Fisk
Abstract:
The influence of food processing and the choice of flavour solvent was investigated using biscuits prepared with vanillin flavour as an example. Powdered vanillin was either added directly to the dough or dissolved in a flavour solvent and then mixed into the dough. The impact of two commonly used flavour solvents on food quality was compared: propylene glycol (PG) and triacetin (TA). An analytical approach for vanillin detection was developed by chromatography (HPLC-PDA), and a standard extraction method for vanillin was also established. The results show the impact of solvent choice on vanillin levels during biscuit preparation. After baking, TA, the more heat-resistant solvent, retained more vanillin than PG, so TA is the better solvent for products that undergo a heating process. The results also illustrate the impact of mixing and baking on vanillin stability in the matrices. The average loss of vanillin was 33% during mixing and 13% during baking, indicating that the binding of vanillin to fat or flour before baking may cause a larger loss than evaporation during baking.
Keywords: biscuit, flavour stability, food quality, vanillin
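The two sequential losses reported above compound multiplicatively, which is worth making explicit: a 33% mixing loss followed by a 13% baking loss leaves roughly 58% of the original vanillin. A one-line helper, using the abstract's own figures:

```python
def cumulative_retention(losses):
    """Overall fraction retained after sequential processing steps,
    each losing the given fraction of the flavour compound."""
    retained = 1.0
    for loss in losses:
        retained *= (1.0 - loss)
    return retained

# Average losses reported: 33% during mixing, 13% during baking
overall = cumulative_retention([0.33, 0.13])  # about 0.58 of the original
```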
Procedia PDF Downloads 508
19226 Factors Determining the Vulnerability to Occupational Health Risk and Safety of Call Center Agents in the Philippines
Authors: Lito M. Amit, Venecio U. Ultra, Young-Woong Song
Abstract:
Business process outsourcing (BPO) in the Philippines is expanding rapidly, accounting for more than 2% of total employment. The BPO industry is currently confronted with several issues pertaining to sustainable productivity, such as the staffing gap, high employee turnover and workforce retention, and the occupational health and safety (OHS) of call center agents. We conducted a survey of OHS programs and health concerns among call center agents in the Philippines and determined the sociocultural factors that affect their vulnerability to occupational health risks and hazards. The majority of agents affirmed that OHS programs are implemented and that OHS orientation and emergency procedures were conducted at the start of employment, and they perceived the working environment as favorable and convenient except for occasional noise disturbance, acoustic shock, and visual and voice fatigue. Male agents adjust more easily than female agents to the demands and changes of the work environment and to flexible work schedules. Female agents have a higher tendency to feel pressured and humiliated by low work performance, experience a higher incidence of emotional and psychological abuse, and experience more physical stress than male agents. The majority of call center agents worked night shifts, and regardless of other factors, night shift work brings higher stress. While working in a call center, a higher incidence of headaches, insomnia, burnout, suppressed anger, anxiety, and depression was experienced by female agents, younger agents (21-25 years old), and those on night shifts than by their counterparts. The most common musculoskeletal disorders included pain in the neck, shoulders, and back, and hand and wrist disorders, commonly experienced by female and younger workers. About 30% experienced symptoms of cardiovascular and gastrointestinal disorders and weakened immune systems.
Overall, these findings show variable vulnerability across different subpopulations of call center agents and are important for occupational health risk prevention and management towards a sustainable human resource base for the BPO industry in the Philippines.
Keywords: business process outsourcing industry, health risk of call center agents, socio-cultural determinants, Philippines
Procedia PDF Downloads 494
19225 Different Cognitive Processes in Selecting Spatial Demonstratives: A Cross-Linguistic Experimental Survey
Authors: Yusuke Sugaya
Abstract:
Our research is a cross-linguistic experimental investigation of the cognitive processes involved in the distance judgments required to select demonstratives in deictic usage. Speakers may take the addressee's judgment into account or apply certain criteria for distance judgment when producing demonstratives. While language and cultural differences can be assumed, it remains unclear how these differences manifest across languages. We conducted online experiments with speakers of six languages (Japanese, Spanish, Irish, English, Italian, and French) in which a wide variety of drawings were presented on a screen, with conditions varied along three dimensions: addressee, comparisons, and standard. The results revealed distinct features associated with demonstratives in each language, highlighting differences from a comparative standpoint. For one thing, a specific reference point (the standard) influenced selection in Japanese and Spanish, whereas competitors exerted a comparatively stronger influence in English and Italian.
Keywords: demonstratives, cross-linguistic experiment, distance judgment, social cognition
Procedia PDF Downloads 52
19224 The Effect of Online Learning During the COVID-19 Pandemic on Student Mental
Authors: Adelia Desi Agnesita
Abstract:
The advent of a new disease called COVID-19 brought major changes to the world, one of which concerns the process of learning and teaching. Learning that was formerly offline is now done online, requiring students to adapt to the new learning process. Because the COVID-19 pandemic occurred almost worldwide, activities that involve many people had to be avoided, including teaching and learning. In Indonesia, since March 2020, college learning has turned into online, long-distance learning to prevent the spread of COVID-19. Online learning presents students with obstacles including poor signal, heavy workloads, lack of focus, difficulty sleeping, and resulting stress.
Keywords: learning, online, COVID-19, pandemic
Procedia PDF Downloads 214
19223 Experimental Investigations on the Mechanism of Stratified Liquid Mixing in a Cylinder
Authors: Chai Mingming, Li Lei, Lu Xiaoxia
Abstract:
In this paper, the mechanism of stratified liquid mixing in a cylinder is investigated, focusing on the effects of Rayleigh-Taylor instability (RTI) and rotation of the cylinder on mixing at the liquid interface. For miscible liquids, the planar laser-induced fluorescence (PLIF) technique is applied to record the concentration field of one liquid, and the intensity of segregation (IOS) is used to describe the mixing status. For immiscible liquids, a high-speed camera is used to record the development of the interface. The RTI experiment indicates that the instability plays a major role in the mixing process: large-scale mixing is triggered, and subsequently the span of the stripes decreases, showing that mesoscale mixing is emerging. The rotation experiments show that the spin-down process plays a major role in liquid mixing, during which the upper liquid falls rapidly along the wall and crashes into the lower liquid, exciting many interface instabilities; the liquids mix rapidly during spin-down. It can be concluded that whatever means are adopted to speed up liquid mixing, the fundamental mechanism is interface instabilities, which increase the interfacial area between the liquids and the relative velocity of the two liquids.
Keywords: interface instability, liquid mixing, Rayleigh-Taylor instability, spin-down process, spin-up process
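The intensity of segregation used above is commonly defined, for a concentration field normalized to [0, 1], as the variance of the field divided by its maximum possible (fully segregated) variance, so that 1 means fully segregated and 0 means perfectly mixed. A minimal sketch of that definition:

```python
import statistics

def intensity_of_segregation(conc):
    """IOS = var(c) / (cbar * (1 - cbar)) for concentrations in [0, 1].
    Returns 1 for a fully segregated field, 0 for a perfectly mixed one."""
    cbar = statistics.mean(conc)
    var = statistics.pvariance(conc)
    return var / (cbar * (1.0 - cbar))
```

In a PLIF study, `conc` would be the normalized pixel intensities of a recorded frame, and the IOS would be tracked frame by frame to quantify mixing progress.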
Procedia PDF Downloads 301
19222 Design of Data Management Software System Supporting Rendezvous and Docking with Various Spaceships
Authors: Zhan Panpan, Lu Lan, Sun Yong, He Xiongwen, Yan Dong, Gu Ming
Abstract:
The function of the two-spacecraft docking network, i.e., communication with and control of a docking target by various spaceships, is realized in the space lab data management system. To solve the problems of the complex data communication modes between the space lab and various spaceships, and of poor software reuse caused by non-standard protocols, a data management software system supporting rendezvous and docking with various spaceships has been designed. The software system is based on the CCSDS Spacecraft Onboard Interface Services (SOIS) and consists of a software driver layer, a middleware layer, and an application layer. The software driver layer hides the various device interfaces behind a uniform device driver framework. The middleware layer is divided into three sublayers: the transfer layer, the application support layer, and the system business layer. Communication over the space lab platform bus and the docking bus is realized in the transfer layer. The application support layer provides inter-task communication and unified time management for the software system. The data management functions are realized in the system business layer, which contains the telemetry management service, telecontrol management service, flight status management service, rendezvous and docking management service, and so on. The application layer accomplishes the tasks defined for the space lab data management system using the standard interfaces supplied by the middleware layer. Thanks to the layered architecture, the rendezvous and docking tasks and the rendezvous and docking management service are independent within the software system; the rendezvous and docking tasks are activated and executed according to the particular spaceship. In this way, the communication management functions for the independent flight mode, the combined mode with the manned spaceship, and the combined mode with the cargo spaceship are achieved separately.
The software architecture defines standard application interfaces for the services in each layer. Different requirements of the space lab can be supported through the use of these standard per-layer services, effectively improving the scalability and flexibility of the data management software; the system can also dynamically expand the number of visiting spaceships and adapt to their protocols. The software system has been applied in the data management subsystem of the space lab and verified during its flight. The results of this research can provide a basis for the design of the data management system of a future space station.
Keywords: space lab, rendezvous and docking, data management, software system
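The layering described above (a uniform driver interface, a transfer sublayer that is bus-agnostic, and a business-layer service that dispatches per spaceship type) can be sketched in miniature. This is an illustrative analogy, not the flight software's actual API; all class and method names are assumptions:

```python
class LoopbackDriver:
    """Stand-in device driver exposing the uniform send/receive interface
    that the driver layer presents for every bus device."""
    def __init__(self):
        self._frames = []
    def send(self, frame: bytes) -> None:
        self._frames.append(frame)
    def receive(self) -> bytes:
        return self._frames.pop(0)

class TransferLayer:
    """Middleware transfer sublayer: moves packets over whichever bus
    the driver layer wraps (platform bus or docking bus)."""
    def __init__(self, driver):
        self.driver = driver
    def transmit(self, packet: bytes) -> None:
        self.driver.send(packet)
    def deliver(self) -> bytes:
        return self.driver.receive()

class DockingService:
    """System-business-layer service: dispatches packets to the protocol
    handler registered for the currently visiting spaceship type."""
    def __init__(self):
        self._handlers = {}
    def register(self, ship_type: str, handler) -> None:
        self._handlers[ship_type] = handler
    def handle(self, ship_type: str, packet: bytes):
        return self._handlers[ship_type](packet)
```

Registering a new handler per visiting spaceship, rather than branching inside the service, is what lets the manned, cargo, and independent flight modes coexist without modifying the lower layers.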
Procedia PDF Downloads 368
19221 Changes in Geospatial Structure of Households in the Czech Republic: Findings from Population and Housing Census
Authors: Jaroslav Kraus
Abstract:
Spatial information about demographic processes is a standard part of statistical outputs in the Czech Republic, including the Population and Housing Census held in 2011. The census is the starting point for a follow-up study devoted to two basic types of households: single-person households and households of one complete family. Together they make up more than 80 percent of all households, but their share and spatial structure have been changing over the long term. The increase in single-person households results from the long-term decrease in fertility and increase in divorce, but also from the possibility of living separately. Some regions of the Czech Republic show traditional demographic behaviour, while others, such as the capital Prague, show changing patterns. In line with international standards, the population census is based on the concept of the currently resident population. Three types of geospatial approach are used in the analysis: (i) measures of geographic distribution; (ii) cluster mapping to identify the locations of statistically significant hot spots, cold spots, spatial outliers, and similar features; and (iii) pattern analysis as a starting point for more in-depth analyses (geospatial regression) in the future. For this type of data, the numbers of households by type are treated as distinct objects, and all events in a meaningfully delimited study region (e.g., municipalities) are included in the analysis. Commonly produced measures of central tendency and spread include identification of the centre of the point set (at NUTS3 level) and of the median centre, along with standard distance, weighted standard distance, and standard deviational ellipses.
Establishing that clustering exists in the census household dataset does not by itself provide a detailed picture of the nature and pattern of that clustering, so simple hot-spot (and cold-spot) identification techniques are applied. Once the spatial structure of households is determined, a particular measure of autocorrelation can be constructed by defining a way of measuring the difference between location attribute values. The most widely used measure is Moran's I, which is applied to municipal units for which the numerical ratio is calculated. Local statistics arise naturally out of any of the methods for measuring spatial autocorrelation and can be applied to develop localized variants of almost any standard summary statistic. Local Moran's I gives an indication of the homogeneity and diversity of household data at the municipal level.
Keywords: census, geo-demography, households, the Czech Republic
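Global Moran's I, named above as the planned measure, compares attribute deviations at neighbouring units against overall variance; values near +1 indicate clustering of similar values, near 0 spatial randomness. A minimal sketch with a toy adjacency matrix (four units in a row, not census data):

```python
def morans_i(values, weights):
    """Global Moran's I for values observed at n spatial units.
    weights[i][j] is the spatial weight between units i and j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)

# Four municipalities in a line, rook adjacency, a smoothly rising attribute
values = [1.0, 2.0, 3.0, 4.0]
weights = [[0, 1, 0, 0],
           [1, 0, 1, 0],
           [0, 1, 0, 1],
           [0, 0, 1, 0]]
i_stat = morans_i(values, weights)  # positive: neighbours resemble each other
```

For the census application, `values` would be the household-type ratio per municipality and `weights` the municipal contiguity matrix.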
Procedia PDF Downloads 96
19220 Aerogel Fabrication Via Modified Rapid Supercritical Extraction (RSCE) Process - Needle Valve Pressure Release
Authors: Haibo Zhao, Thomas Andre, Katherine Avery, Alper Kiziltas, Deborah Mielewski
Abstract:
Silica aerogels were fabricated through a modified rapid supercritical extraction (RSCE) process. The aerogels were made using a tetramethyl orthosilicate precursor, placed in a hot press, and brought to the supercritical point of the solvent, ethanol. A needle valve was used to control the pressure release without a pressure controller. The resulting aerogels were characterized for their physical and chemical properties and compared to silica aerogels created using similar methods; aerogels fabricated using this modified RSCE method were found to have properties similar to those reported for the unmodified RSCE method. A silica aerogel-infused glass blanket composite and a graphene-reinforced silica aerogel composite were also successfully fabricated by the new method. The modified RSCE process and system is a prototype for better gas outflow control at a lower equipment cost. Potentially, this process could evolve into a continuous, low-cost, high-volume production process that meets automotive requirements.
Keywords: aerogel, automotive, rapid supercritical extraction process, low cost production
Procedia PDF Downloads 184
19219 Wastewater Treatment Using Ternary Hybrid Advanced Oxidation Processes Through Heterogeneous Fenton
Authors: komal verma, V. S. Moholkar
Abstract:
This study addresses the challenge of effectively treating and mineralizing industrial wastewater before its discharge into natural water bodies such as rivers and lakes, particularly the wastewater produced by chemical process industries, including refineries and the petrochemical, fertilizer, pharmaceutical, pesticide, and dyestuff industries. These wastewaters often contain recalcitrant organic pollutants that conventional techniques, such as microbial processes, cannot efficiently degrade. To tackle this issue, a ternary hybrid technique comprising adsorption, a heterogeneous Fenton process, and sonication was employed, and its effectiveness was evaluated for treating and mineralizing wastewater from a fertilizer plant in Northeast India. The study comprises several key components, starting with the synthesis of an Fe3O4@AC nanocomposite by co-precipitation. The nanocomposite was comprehensively characterized using standard techniques, including FTIR, FE-SEM, EDX, TEM, BET surface area analysis, XRD, and magnetic property determination by VSM. Next, the process parameters of wastewater treatment were statistically optimized, with the level of COD (chemical oxygen demand) removal as the response variable. The adsorption characteristics and kinetics of the Fe3O4@AC nanocomposite were also assessed in detail. The remarkable outcome of this study is the successful application of the ternary hybrid technique, which led to nearly complete mineralization (TOC removal) of the fertilizer industry wastewater. The results highlight the potential of the Fe3O4@AC nanocomposite and the ternary hybrid technique as a promising solution for tackling challenging wastewater pollutants from various chemical process industries.
This paper reports investigations into the mineralization of industrial wastewater (COD = 3246 mg/L, TOC = 2500 mg/L) using a ternary (ultrasound + Fenton + adsorption) hybrid advanced oxidation process. Fe3O4-decorated activated charcoal (Fe3O4@AC) nanocomposites (surface area = 538.88 m2/g; adsorption capacity = 294.31 mg/g) were synthesized by co-precipitation. The wastewater treatment process was optimized using a central composite statistical design. At the optimum conditions, viz. pH = 4.2, H2O2 loading = 0.71 M, and adsorbent dose = 0.34 g/L, the reductions in the COD and TOC of the wastewater were 94.75% and 89%, respectively. This results from synergistic interactions between the adsorption of pollutants onto activated charcoal and surface Fenton reactions induced by the leaching of Fe2+/Fe3+ ions from the Fe3O4 nanoparticles. Micro-convection generated by sonication assisted faster mass transport (adsorption/desorption) of pollutants between the Fe3O4@AC nanocomposite and the solution. The net result of this synergism was strong interaction and reaction between radicals and pollutants, resulting in effective mineralization of the wastewater. The Fe3O4@AC showed excellent recovery (> 90 wt%) and reusability (> 90% COD removal) over 5 successive treatment cycles. LC-MS analysis revealed effective (> 50%) degradation of more than 25 major contaminants (herbicides and pesticides) after treatment with the ternary hybrid AOP. A toxicity test using the seed germination technique revealed an approximately 60% reduction in the toxicity of the wastewater after treatment.
Keywords: chemical oxygen demand (COD), Fe3O4@AC nanocomposite, kinetics, LC-MS, RSM, toxicity
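The removal percentages reported above translate directly into residual pollutant loads, a conversion worth making explicit when comparing against discharge limits. A small worked calculation using the abstract's own figures:

```python
def percent_removal(initial, final):
    """Percentage of a pollutant removed, e.g. COD or TOC in mg/L."""
    return 100.0 * (initial - final) / initial

def residual(initial, removal_percent):
    """Residual concentration after the stated percentage removal."""
    return initial * (1.0 - removal_percent / 100.0)

# Figures from the abstract: COD 3246 mg/L (94.75% removed),
# TOC 2500 mg/L (89% removed)
residual_cod = residual(3246.0, 94.75)  # about 170 mg/L remaining
residual_toc = residual(2500.0, 89.0)   # about 275 mg/L remaining
```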
Procedia PDF Downloads 72
19218 A Survey of 2nd Year Students' Frequent Writing Error and the Effects of Participatory Error Correction Process
Authors: Chaiwat Tantarangsee
Abstract:
The purposes of this study are 1) to study the effects of a participatory error correction process and 2) to find out the students' satisfaction with such a process. This is a quasi-experimental study with a single group, in which data were collected 5 times, preceding and following 4 experimental rounds of the participatory error correction process, which included providing coded indirect corrective feedback in the students' texts together with error treatment activities. The sample comprised 28 2nd year English Major students, Faculty of Humanities and Social Sciences, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tools for data collection were 5 writing tests of short texts and a questionnaire. Based on formative evaluation of the students' writing ability prior to and after each of the 4 experiments, the findings show higher student scores, with statistical significance at the 0.05 level. Moreover, in terms of the effect size of the process, the means of the students' scores prior to and after the 4 experiments yielded d values of 1.0046, 1.1374, 1.297, and 1.0065, respectively. It can be concluded that the participatory error correction process enables all of the students to learn equally well and improves their ability to write short texts. Finally, the students' overall satisfaction with the participatory error correction process is at a high level (Mean = 4.32, S.D. = 0.92).
Keywords: coded indirect corrective feedback, participatory error correction process, error treatment, humanities and social sciences
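The effect sizes quoted (d ≈ 1.0 to 1.3) are consistent with Cohen's d computed from pre- and post-test score means and a pooled standard deviation. A sketch under that common formulation; the score vectors below are invented for illustration and are not the study's data:

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d with a pooled standard deviation (equal group sizes)."""
    pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

# Invented pre/post writing-test scores, for illustration only
pre = [10, 12, 11, 13, 9, 10, 12, 11]
post = [14, 15, 13, 16, 12, 14, 15, 13]
d = cohens_d(pre, post)
assert d > 0.8  # a "large" effect by Cohen's convention, as in the study
```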
Procedia PDF Downloads 523
19217 Appropriate Depth of Needle Insertion during Rhomboid Major Trigger Point Block
Authors: Seongho Jang
Abstract:
Objective: To investigate an appropriate depth of needle insertion during trigger point injection into the rhomboid major muscle. Methods: Sixty-two patients who visited our department with shoulder or upper back pain participated in this study. The distance between the skin and the rhomboid major muscle (SM) and the distance between the skin and the rib (SB) were measured using ultrasonography. The subjects were divided into 3 groups according to BMI: BMI less than 23 kg/m2 (underweight or normal group); 23 kg/m2 or more to less than 25 kg/m2 (overweight group); and 25 kg/m2 or more (obese group). The mean ± standard deviation (SD) of SM and SB were calculated for each group. The range between the mean + 1 SD of SM and the mean - 1 SD of SB was defined as the safe margin. Results: The underweight or normal group's SM, SB, and safe margin were 1.2 ± 0.2, 2.1 ± 0.4, and 1.4 to 1.7 cm, respectively. The overweight group's SM and SB were 1.4 ± 0.2 and 2.4 ± 0.9 cm, respectively; a safe margin could not be calculated for this group. The obese group's SM, SB, and safe margin were 1.8 ± 0.3, 2.7 ± 0.5, and 2.1 to 2.2 cm, respectively. Conclusion: This study will help to set a standard depth for safe needle insertion into the rhomboid major muscle in an effective manner without causing complications.
Keywords: pneumothorax, rhomboid major muscle, trigger point injection, ultrasound
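The safe margin defined above (from mean + 1 SD of SM up to mean - 1 SD of SB) can be computed directly from the reported group statistics; a minimal sketch:

```python
def safe_margin(sm_mean, sm_sd, sb_mean, sb_sd):
    """Safe needle-depth range: from mean + 1 SD of skin-to-muscle (SM)
    up to mean - 1 SD of skin-to-rib (SB); None when no safe range exists."""
    lower, upper = sm_mean + sm_sd, sb_mean - sb_sd
    return (lower, upper) if lower < upper else None

# Group statistics from the abstract, all in cm
groups = {
    "underweight/normal": (1.2, 0.2, 2.1, 0.4),  # safe margin 1.4 to 1.7 cm
    "overweight":         (1.4, 0.2, 2.4, 0.9),  # no safe margin (1.6 > 1.5)
    "obese":              (1.8, 0.3, 2.7, 0.5),  # safe margin 2.1 to 2.2 cm
}
for name, stats in groups.items():
    print(name, safe_margin(*stats))
```

This reproduces why no safe margin exists for the overweight group: the lower bound (1.6 cm) exceeds the upper bound (1.5 cm).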
Procedia PDF Downloads 290
19216 Profiling of Apoptotic Protein Expressions after Trabectedin Treatment in Human Prostate Cancer Cell Line PC-3 by Protein Array Technology
Authors: Harika Atmaca, Emir Bozkurt, Latife Merve Oktay, Selim Uzunoglu, Ruchan Uslu, Burçak Karaca
Abstract:
Microarrays have been developed for highly parallel enzyme-linked immunosorbent assay (ELISA) applications. The most common protein arrays are produced using multiple monoclonal antibodies, since these are robust molecules that can be easily handled and immobilized by standard procedures without loss of activity. Protein expression profiling with protein array technology allows simultaneous analysis of the expression pattern of a large number of proteins. Trabectedin, a tetrahydroisoquinoline alkaloid derived from a Caribbean tunicate, Ecteinascidia turbinata, has been shown to have antitumor effects. Here, we used a novel proteomic approach to explore the mechanism of action of trabectedin in the prostate cancer cell line PC-3 by apoptosis antibody microarray. The XTT cell proliferation kit and the Cell Death Detection ELISA Plus Kit (Roche) were used for measuring cytotoxicity and apoptosis. The Human Apoptosis Protein Array (R&D Systems), which consists of 35 apoptosis-related proteins, was used to assess the omic protein expression pattern. Trabectedin induced cytotoxicity and apoptosis in prostate cancer cells in a time- and concentration-dependent manner. The expression levels of the death receptor pathway molecules TRAIL-R1/DR4, TRAIL-R2/DR5, TNF R1/TNFRSF1A, and FADD were significantly increased by 4.0-, 21.0-, 4.20-, and 11.5-fold by trabectedin treatment in PC-3 cells. Moreover, expression of the mitochondrial pathway pro-apoptotic proteins Bax, Bad, Cytochrome c, and Cleaved Caspase-3 was induced by 2.68-, 2.07-, 2.8-, and 4.5-fold, while expression of the anti-apoptotic proteins Bcl-2 and Bcl-XL was reduced by 3.5- and 5.2-fold in PC-3 cells. Proteomic (antibody microarray) analysis suggests that trabectedin may exert its mechanism of action via the induction of both the intrinsic and extrinsic apoptotic pathways.
The antibody microarray platform can be utilised to explore the molecular mechanism of action of novel anticancer agents.
Keywords: trabectedin, prostate cancer, omic protein expression profile, apoptosis
Procedia PDF Downloads 442
19215 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of enterprises is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are used in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of a generated production schedule, the quality of production schedules in manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
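As a sketch of the kind of measurement described, workstation utilization can be derived from start/complete timestamps in an event log. The log format and values below are hypothetical, not the schema of any particular scheduling system:

```python
from datetime import datetime

# Hypothetical event log: (workstation, activity start, activity end)
log = [
    ("WS1", "2024-01-01 08:00", "2024-01-01 10:00"),
    ("WS1", "2024-01-01 10:30", "2024-01-01 12:00"),
    ("WS2", "2024-01-01 08:00", "2024-01-01 09:00"),
]

def utilization(log, horizon_hours: float) -> dict:
    """Fraction of the scheduling horizon each workstation spent busy."""
    fmt = "%Y-%m-%d %H:%M"
    busy = {}
    for ws, start, end in log:
        dur = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
        busy[ws] = busy.get(ws, 0.0) + dur.total_seconds() / 3600
    return {ws: hours / horizon_hours for ws, hours in busy.items()}

print(utilization(log, horizon_hours=8.0))
```

A large spread between stations (here WS1 at 43.75% versus WS2 at 12.5% of an 8-hour shift) is exactly the kind of load-imbalance signal the evaluation criteria above are meant to surface.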
Procedia PDF Downloads 279
19214 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. Here, we have considered an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and the parameters needed to define the multiclass Gaussian process classification model. We first investigated the process of inducing a posterior distribution over the various parameters and the latent function by using variational Bayesian approximation and an importance sampling method, and next we derived the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence
Procedia PDF Downloads 444
19213 Climate Change Adaptation in the U.S. Coastal Zone: Data, Policy, and Moving Away from Moral Hazard
Authors: Thomas Ruppert, Shana Jones, J. Scott Pippin
Abstract:
State and federal government agencies within the United States have recently invested substantial resources into studies of future flood risk conditions associated with climate change and sea-level rise. A review of numerous case studies has uncovered several key themes that speak to an overall incoherence within current flood risk assessment procedures in the U.S. context. First, there are substantial local differences in the quality of available information about basic infrastructure, particularly with regard to local stormwater features and essential facilities that are fundamental components of effective flood hazard planning and mitigation. Second, there can be substantial mismatch between regulatory Flood Insurance Rate Maps (FIRMs) as produced by the National Flood Insurance Program (NFIP) and other 'current condition' flood assessment approaches. This is of particular concern in areas where FIRMs already seem to underestimate extant flood risk, which can only be expected to become a greater concern if future FIRMs do not appropriately account for changing climate conditions. Moreover, while there are incentives within the NFIP’s Community Rating System (CRS) to develop enhanced assessments that include future flood risk projections from climate change, the incentive structures seem to have counterintuitive implications that would tend to promote moral hazard. In particular, a technical finding of higher future risk seems to make it easier for a community to qualify for flood insurance savings, with much of these prospective savings applied to individual properties that have the most physical risk of flooding. However, there is at least some case study evidence to indicate that recognition of these issues is prompting broader discussion about the need to move beyond FIRMs as a standalone local flood planning standard. The paper concludes with approaches for developing climate adaptation and flood resilience strategies in the U.S. 
that move away from the social welfare model being applied through NFIP and toward more of an informed risk approach that transfers much of the investment responsibility over to individual private property owners.
Keywords: climate change adaptation, flood risk, moral hazard, sea-level rise
Procedia PDF Downloads 108
19212 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao
Abstract:
A compositional reservoir simulation model (CMG-GEM) was used for a cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production stages. The study of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process, to distinguish the parameters with the maximum effect on oil recovery, and to comprehend the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because plugging of the porous media reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques among CO2 storage mechanisms in geological formations. However, the effects of the above uncertain parameters on CO2 enhanced oil recovery have not been understood systematically, so it is necessary to study the most significant parameters that dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.
Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation
Procedia PDF Downloads 386
19211 Risk Management in Industrial Supervision Projects
Authors: Érick Aragão Ribeiro, George André Pereira Thé, José Marques Soares
Abstract:
Several problems in industrial supervision software development projects may lead to delay or cancellation. These problems can be avoided or contained by using methods for the identification, analysis, and control of risks, which give an overview of the possible problems that can occur in projects and of their immediate solutions. We therefore propose a risk management method applied to the teaching and development of industrial supervision software. The method, developed through a literature review and experience from previous projects, is divided into management phases and has basic features that were validated with experimental research carried out by mechatronics engineering students and professionals. Management is conducted through the stages of identification, analysis, planning, monitoring, control, and communication of risks. Programmers prioritize risks by considering the severity and the probability of occurrence of each risk. The outputs of the method indicate which risks have occurred or are about to occur. The first results indicate which risks occur at different stages of a project and which risks have a high probability of occurring. The results show the efficiency of the proposed method compared to other methods, demonstrating improved software quality and guiding developers in their decisions. This new way of developing supervision software helps students identify design problems, evaluate the software developed, and propose effective solutions. We conclude that risk management optimizes the development of industrial process control software and yields a higher-quality product.
Keywords: supervision software, risk management, industrial supervision, project management
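The prioritization step can be sketched as a score combining severity ("gravity") and probability of occurrence. The register entries and the 1-5 scales below are assumptions for illustration, not the authors' exact scheme:

```python
# Hypothetical risk register for a supervision-software project; the
# severity/probability scales (1-5) are an assumed convention, not the
# paper's documented method.
risks = [
    {"name": "PLC driver incompatibility", "severity": 5, "probability": 2},
    {"name": "unclear HMI requirements",   "severity": 3, "probability": 4},
    {"name": "developer turnover",         "severity": 4, "probability": 1},
]

# Score each risk as severity x probability
for r in risks:
    r["score"] = r["severity"] * r["probability"]

# Highest score first: this ordering drives which risks get monitored most closely
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{r["name"]}: {r["score"]}')
```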
Procedia PDF Downloads 356
19210 Iterative Reconstruction Techniques as a Dose Reduction Tool in Pediatric Computed Tomography Imaging: A Phantom Study
Authors: Ajit Brindhaban
Abstract:
Background and Purpose: Computed Tomography (CT) scans have become the largest source of radiation in radiological imaging. The purpose of this study was to compare the quality of pediatric CT images reconstructed using Filtered Back Projection (FBP) with images reconstructed using different strengths of the Iterative Reconstruction (IR) technique, and to perform a feasibility study to assess the use of IR techniques as a dose reduction tool. Materials and Methods: An anthropomorphic phantom representing a 5-year-old child was scanned, in two stages, using a Siemens Somatom CT unit. In stage one, scans of the head, chest, and abdomen were performed using standard protocols recommended by the scanner manufacturer. Images were reconstructed using FBP and 5 different strengths of IR. Contrast-to-Noise Ratios (CNR) were calculated from the average CT number and its standard deviation measured in regions of interest created in the lung, bone, and soft tissue regions of the phantom. Paired t-tests and one-way ANOVA were used to compare the CNR of FBP images with that of IR images, at the p = 0.05 level. The lowest IR strength that produced the highest CNR was identified. In the second stage, scans of the head were performed with mA(s) values decreased in proportion to the increase in CNR relative to the standard FBP protocol. CNR values in this stage were compared using paired t-tests at the p = 0.05 level. Results: Images reconstructed using the IR technique had higher CNR values (p < 0.01) in all regions compared to the FBP images, at all strengths of IR. The CNR increased with increasing IR strength up to 3 in the head and chest images; increases beyond this strength were insignificant. In abdomen images, CNR continued to increase up to strength 5. The results also indicated that IR techniques improve CNR by up to a factor of 1.5. Based on the CNR values of IR images at strength 3 and the CNR values of FBP images, a reduction in mA(s) of about 20% was identified.
The images of the head acquired at 20% reduced mA(s) and reconstructed using IR at strength 3 had similar CNR to the FBP images at standard mA(s). In the head scans of the phantom used in this study, it was demonstrated that similar CNR can be achieved even when the mA(s) is reduced by about 20%, provided IR with a strength of 3 is used for reconstruction. Conclusions: The IR technique produced better image quality than FBP at all IR strengths. The IR technique can provide approximately 20% dose reduction in pediatric head CT while maintaining the same image quality as the FBP technique.
Keywords: filtered back projection, image quality, iterative reconstruction, pediatric computed tomography imaging
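CNR as computed from ROI statistics can be sketched as follows. The abstract does not state the exact formula, so a common definition (ROI contrast divided by noise standard deviation) is assumed here, and the HU values are invented to reproduce the reported improvement factor of about 1.5:

```python
def cnr(mean_roi: float, mean_bg: float, sd_bg: float) -> float:
    """Contrast-to-noise ratio: ROI contrast relative to background noise.
    One common definition; the study's exact formula is not stated."""
    return abs(mean_roi - mean_bg) / sd_bg

# Illustrative CT numbers (HU): same contrast, noise reduced by IR at strength 3
fbp = cnr(mean_roi=50.0, mean_bg=10.0, sd_bg=8.0)   # FBP reconstruction
ir3 = cnr(mean_roi=50.0, mean_bg=10.0, sd_bg=5.3)   # IR reconstruction, less noise

# IR improves CNR by roughly the factor of 1.5 reported in the abstract
print(ir3 / fbp)
```

Because IR lowers the noise SD at fixed contrast, CNR rises in inverse proportion, which is what justifies trading the gain for a lower mA(s).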
Procedia PDF Downloads 148
19209 Defect Management Life Cycle Process for Software Quality Improvement
Authors: Aedah Abd Rahman, Nurdatillah Hasim
Abstract:
Software quality issues require special attention, especially in view of the demand for quality software products that meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high-quality software products and reduce the number of defects. The research question of this study is how to produce high-quality software while reducing the number of defects. Therefore, the objective of this paper is to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices, and standards. A framework for the defect management life cycle is then proposed. The major contribution of this study is to define a defect management road map for software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high-quality software products and contributes towards continuous software process improvement.
Keywords: defects, defect management, life cycle process, software quality
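The life cycle idea can be sketched as a small state machine over defect states. The phase names below are an assumption for illustration; the paper's exact road map is not reproduced here:

```python
# Illustrative defect life cycle; the state names are an assumed convention,
# not the framework defined in the paper.
TRANSITIONS = {
    "new":      {"assigned"},
    "assigned": {"fixed", "rejected"},
    "fixed":    {"verified", "reopened"},
    "reopened": {"assigned"},
    "verified": {"closed"},
    "rejected": {"closed"},
    "closed":   set(),
}

def advance(state: str, nxt: str) -> str:
    """Move a defect to the next state, enforcing the life cycle."""
    if nxt not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {nxt}")
    return nxt

s = "new"
for step in ["assigned", "fixed", "verified", "closed"]:
    s = advance(s, step)
print(s)
```

Enforcing transitions this way is what turns an ad hoc bug list into a managed process: a defect cannot be closed without passing through verification.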
Procedia PDF Downloads 306
19208 Quantitative and Fourier Transform Infrared Analysis of Saponins from Three Kenyan Ruellia Species: Ruellia prostrata, Ruellia lineari-bracteolata and Ruellia bignoniiflora
Authors: Christine O. Wangia, Jennifer A. Orwa, Francis W. Muregi, Patrick G. Kareru, Kipyegon Cheruiyot, Eric Guantai
Abstract:
Ruellia (syn. Dipteracanthus) species are wild perennial creepers belonging to the Acanthaceae family. These species are reported to possess anti-inflammatory, analgesic, antioxidant, gastroprotective, anticancer, and immuno-stimulant properties. Phytochemical screening of both aqueous and methanolic extracts of Ruellia species revealed the presence of saponins. Saponins have been reported to possess anti-inflammatory, antioxidant, immuno-stimulant, antihepatotoxic, antibacterial, anticarcinogenic, and antiulcerogenic activities. The objective of this study was to quantify and analyze the Fourier transform infrared (FTIR) spectra of saponins in crude extracts of three Kenyan Ruellia species, namely Ruellia prostrata (RPM), Ruellia lineari-bracteolata (RLB), and Ruellia bignoniiflora (RBK). Sequential organic extraction of the ground whole-plant material was done using petroleum ether (PE), chloroform, ethyl acetate (EtOAc), and absolute methanol by cold maceration, while aqueous extraction was by hot maceration. The plant powders and extracts were mixed with spectroscopic-grade KBr and compressed into pellets. The infrared spectra were recorded using a Shimadzu 8000 series FTIR spectrophotometer in the range of 3500-500 cm-1. Quantitative determination of the saponins was done using standard procedures. Quantitative analysis showed that RPM had the highest quantity of crude saponins (2.05% ± 0.03), followed by RLB (1.4% ± 0.15) and RBK (1.25% ± 0.11). FTIR spectra revealed the spectral peaks characteristic of saponins in the RPM, RLB, and RBK plant powders and aqueous and methanol extracts: O-H absorption (3265-3393 cm-1), C-H absorption ranging from 2851 to 2924 cm-1, C=C absorbance (1628-1655 cm-1), and oligosaccharide linkage (C-O-C) absorption due to sapogenins (1036-1042 cm-1). The crude saponins from RPM, RLB, and RBK showed peaks similar to those of their respective extracts.
The presence of the saponins in extracts of RPM, RLB, and RBK may be responsible for some of the biological activities reported in the Ruellia species.
Keywords: Ruellia bignoniiflora, Ruellia linearibracteolata, Ruellia prostrata, saponins
Procedia PDF Downloads 181
19207 Kazakh Language Assessment in a New Multilingual Kazakhstan
Authors: Karlygash Adamova
Abstract:
This article is focused on the KazTest as one of the most important high-stakes tests and the key tool in Kazakh language assessment. The research also includes a brief introduction to the language policy in Kazakhstan. In particular, the policy is set to change significantly, turning from bilingualism (Kazakh, Russian) to a multilingual policy (three languages: Kazakh, Russian, English). Therefore, the current status of the abovementioned languages is described. Owing to the various educational reforms in the country, the language evaluation system should also be improved and moderated. The research presents the most significant test in Kazakhstan, the KazTest, which is aimed at evaluating Kazakh language proficiency. Assessment is an ongoing process that encompasses a wide area of knowledge about the productive performance of learners. A test is widely defined as a standardized method of research, testing, diagnostics, verification, etc. The two most important characteristics of any test as the main element of assessment, namely validity and reliability, are also described in this paper. Since a test is assumed to be an indicator of knowledge, it is highly important to take these properties into account in its preparation and design.
Keywords: multilingualism, language assessment, testing, language policy
Procedia PDF Downloads 136
19206 Treadmill Negotiation: The Stagnation of the Israeli – Palestinian Peace Process
Authors: Itai Kohavi, Wojciech Nowiak
Abstract:
This article explores the stagnation of the Israeli-Palestinian peace negotiation process and the reasons behind the failure of more than 12 international initiatives to resolve the conflict. Twenty-seven top members of the Israeli national security elite (INSE) were interviewed, including heads of the negotiation teams, the National Security Council, the Mossad, and other intelligence and planning arms. The interviewees provided their insights on the Israeli challenges in reaching a sustainable and stable peace agreement and in dealing with the international pressure on Israel to negotiate a peace agreement while preventing anti-Israeli UN decisions and sanctions. The findings revealed a decision tree, with red herring deception strategies implemented to postpone the negotiation process and to delay major decisions during it. Beyond the possible applications to the Israeli-Palestinian conflict, the findings shed more light on the phenomenon of rational deception of allies in a negotiation process, a subject researched less frequently than deception of rivals.
Keywords: deception, Israeli-Palestinian conflict, negotiation, red herring, terrorist state, treadmill negotiation
Procedia PDF Downloads 303
19205 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability
Authors: Chen-Fang Tsai, Shin-Li Lu
Abstract:
Distribution-free control charts are an emerging area within statistical process control in recent years. Several researchers have developed nonparametric control charts and investigated their detection capability. The major advantage of nonparametric control charts is that the underlying process is not assumed to follow a normal or any other parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. These are extended to generally weighted moving average (GWMA) control charts by utilizing design and adjustment parameters for monitoring changes in process variability, namely the NG-S and NG-M control charts. Statistical performance is also investigated for the NG-S and NG-M control charts with run rules. Moreover, a sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
Keywords: distribution-free control chart, EWMA control charts, GWMA control charts
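The EWMA recursion underlying these charts is z_t = λx_t + (1 − λ)z_{t−1}, with time-varying control limits. A sketch of the classic parametric version; the nonparametric NE-/NG-type charts discussed above apply the same recursion to rank-based test statistics rather than raw observations as shown here:

```python
# Classic EWMA chart: z_t = lam * x_t + (1 - lam) * z_{t-1}, with
# limits mu0 +/- L * sigma * sqrt(lam / (2 - lam) * (1 - (1 - lam)**(2t))).
def ewma_chart(xs, mu0, sigma, lam=0.2, L=3.0):
    """Return (z_t, lower limit, upper limit, signal?) for each observation."""
    z, out = mu0, []
    for t, x in enumerate(xs, start=1):
        z = lam * x + (1 - lam) * z
        half = L * sigma * (lam / (2 - lam) * (1 - (1 - lam) ** (2 * t))) ** 0.5
        out.append((z, mu0 - half, mu0 + half, abs(z - mu0) > half))
    return out

# Illustrative readings: in control at first, then a sustained upward drift
readings = [0.1, -0.3, 0.2, 1.8, 2.1, 2.4]
for z, lo, hi, signal in ewma_chart(readings, mu0=0.0, sigma=1.0):
    print(f"z={z:+.3f} limits=({lo:+.3f}, {hi:+.3f}) signal={signal}")
```

With the default λ = 0.2 and L = 3, the accumulated weighting lets the chart flag the small sustained shift that a single-observation rule would miss, which is the design rationale the GWMA variants tune further via their adjustment parameters.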
Procedia PDF Downloads 272