Search results for: conventional computing
4236 Effectiveness of Shock Wave Therapy Versus Intermittent Mechanical Traction on Mechanical Low Back Pain and Disabilities
Authors: Ahmed Assem Abd El Rahim
Abstract:
Background: Mechanical low back pain (LBP) is a serious physical and social health problem. Purpose: To examine the impact of shock wave therapy versus intermittent mechanical traction on mechanical LBP and disability. Subjects: Sixty male mechanical LBP patients aged 20-35 years, recruited from the Sohag University orthopedic hospital outpatient clinic, were randomly assigned to three groups. Methods: Study Group A: 20 patients underwent shock wave therapy plus conventional physical therapy. Study Group B: 20 patients underwent intermittent mechanical traction plus conventional physical therapy. Control Group C: 20 patients underwent conventional physical therapy alone. Three sessions were applied weekly for four weeks. Pain was quantified using the McGill Pain Questionnaire, disability was measured with the Roland Morris Disability Questionnaire, and range of motion (ROM) was evaluated with a back range of motion (BROM) device pre- and post-therapy. Results: All three groups (A, B, and C) showed a reduction in pain and disability and an increase in flexion and extension ROM at the end of the four-week program. Mean post-therapy pain scores were 15.3, 9.47, and 23.07 in groups A, B, and C, respectively; mean disability scores were 8.44, 4.87, and 11.8; mean flexion ROM values were 25.53, 29.06, and 23.9; and mean extension ROM values were 11.73, 15.53, and 9.85. Patients who received intermittent mechanical traction plus conventional physical therapy (group B) showed a significant reduction in pain and disability and improvement in flexion and extension ROM (P<0.001) after the therapy program. Conclusion: Shock wave therapy and intermittent mechanical traction, as well as conventional physical therapy, can be beneficial for patients with mechanical LBP.
Keywords: mechanical low back pain, shock wave, mechanical, low back pain
Procedia PDF Downloads 54
4235 An Experimental Study on Heat and Flow Characteristics of Water Flow in Microtube
Authors: Zeynep Küçükakça, Nezaket Parlak, Mesut Gür, Tahsin Engin, Hasan Küçük
Abstract:
In the current research, single-phase fluid flow and heat transfer characteristics are experimentally investigated. The experiments cover the transition zone for Reynolds numbers ranging from 100 to 4800, using fused silica and stainless steel microtubes with diameters of 103-180 µm. The applicability of the Logarithmic Mean Temperature Difference (LMTD) method is demonstrated, and an experimental method is developed to calculate the heat transfer coefficient. Heat is supplied by a water jacket surrounding the microtubes, and heat transfer coefficients are obtained by the LMTD method. The results are compared with data obtained from correlations available in the literature. The experimental results indicate that the Nusselt numbers of microtube flows do not accord with conventional results when the Reynolds number is lower than 1000; beyond that, the Nusselt number approaches the conventional theoretical prediction. Moreover, scaling effects at the micro scale, such as axial conduction, viscous heating, and entrance effects, are discussed. Regarding fluid characteristics, the friction factor is well predicted by conventional theory, and the conventional friction prediction is valid for water flow through microtubes with a relative surface roughness of less than about 4%.
Keywords: microtube, laminar flow, friction factor, heat transfer, LMTD method
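The LMTD-based evaluation described in this abstract can be sketched in a few lines. This is a generic illustration with made-up numbers, not the authors' procedure or data: the average heat transfer coefficient is recovered as h = Q / (A · ΔT_lm), where ΔT_lm is the log-mean of the temperature differences between the two fluids at the two ends of the tube.

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference (K) from the temperature
    differences at the inlet and outlet ends of the tube."""
    if math.isclose(dt_in, dt_out):
        return dt_in  # limiting value when both ends have the same difference
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def heat_transfer_coefficient(q, area, dt_in, dt_out):
    """Average heat transfer coefficient h = Q / (A * LMTD), W/(m^2 K)."""
    return q / (area * lmtd(dt_in, dt_out))

# Illustrative numbers only (not from the study): a 150 µm microtube,
# 0.05 m long, transferring 0.5 W with end temperature differences 40 K and 20 K.
d, length = 150e-6, 0.05
area = math.pi * d * length  # inner surface area, m^2
h = heat_transfer_coefficient(0.5, area, 40.0, 20.0)
```

Note that the tube dimensions, heat load, and temperature differences above are placeholders; the study's actual values would come from the measured water-jacket and bulk-fluid temperatures.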
Procedia PDF Downloads 460
4234 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays
Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal
Abstract:
Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since a design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on electronics used in space are much greater than on Earth. Thus, developing fault-tolerant techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault-tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance, and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is needed to find a compromise between the overhead introduced by fault-tolerance techniques and system robustness. We study applications in which the exact final output value is not always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, our proposed method considers a threshold margin that depends on the user's application. Given this threshold margin, a failure occurs only when the difference between the erroneous output value and the expected output value exceeds the margin.
The ACMVF is then calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating ACMVF is implemented on a Zynq-7000 FPGA platform. The system uses the Single Event Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented on the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is tolerated, the number of counted failures is reduced by 41% to 59% compared with the number counted by the conventional vulnerability factor calculation. This means the estimation accuracy of configuration memory vulnerability to SEUs is improved by up to 58% when a 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
Keywords: fault tolerance, FPGA, single event upset, approximate computing
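The failure-counting rule behind ACMVF can be sketched as follows. The numbers are illustrative only (not the paper's 32-bit-adder measurements), and the SEM-IP-based fault-injection flow is replaced by a plain list of observed outputs:

```python
def acmvf(expected, observed_outputs, threshold):
    """Approximate-based Configuration Memory Vulnerability Factor:
    fraction of SEU injections whose output deviates from the expected
    value by more than the user-defined threshold margin."""
    failures = sum(1 for out in observed_outputs
                   if abs(out - expected) > threshold)
    return failures / len(observed_outputs)

# Toy example (hypothetical values): an adder expected to produce 1000,
# with the outputs observed after each of eight SEU injections.
outputs = [1000, 1003, 998, 1500, 1000, 920, 1001, 1000]
strict = acmvf(1000, outputs, threshold=0)     # conventional: any deviation fails
relaxed = acmvf(1000, outputs, threshold=100)  # 10% margin of the expected value
```

With a zero margin this reduces to the conventional vulnerability factor; widening the margin can only keep or reduce the counted failure fraction, which is the effect the paper quantifies.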
Procedia PDF Downloads 198
4233 Islamic Financial Instrument, Standard Parallel Salam as an Alternative to Conventional Derivatives
Authors: Alireza Naserpoor
Abstract:
Derivatives are among the most important innovations of the past decades. They have changed the whole way of operation of the stock, commodity, and currency markets. Besides many advantages, conventional derivatives contracts have some disadvantages too: they have been blamed for raising volatility, increasing bankruptcies, and contributing to financial crises. The Standard Parallel Salam (SPS) contract, an Islamic financial product, is meanwhile a financing instrument that investors can use for risk management. SPS is a Shari’ah-compliant contract and an alternative to conventional derivatives. Although unstructured forms of it have been used in several Islamic countries, this contract was introduced as a structured, standard financial instrument on the Iran Mercantile Exchange (IME) in 2014. In this paper, after introducing parallel Salam, we examine a collection of international experience and local measures regarding the launch of the standard parallel Salam contract, describe standard scenarios for trading this instrument, including two main approaches to using SPS, and report practical experience with this instrument on the IME. Afterwards, we make a comparison between SPS and futures contracts as a conventional derivative.
Keywords: futures contracts, hedging, shari’ah compliant instruments, standard parallel salam
Procedia PDF Downloads 390
4232 Clinical Effectiveness of Bulk-fill Resin Composite: A Review
Authors: Taraneh Estedlal
Abstract:
The objective of this study was to review in-vivo and in-vitro studies comparing the effectiveness of bulk-fill and conventional resin composites with regard to marginal adaptation, polymerization shrinkage, and other mechanical properties. The PubMed and Scopus databases were searched for in-vitro studies and randomized clinical trials comparing the incidence of fractures, color stability, marginal adaptation, pain and discomfort, recurrent caries, occlusion, pulpal reaction, and proper proximal contacts of restorations made with conventional and bulk-fill resins. Despite differences in physical and in-vitro properties, the failure rate of conventional and flowable bulk-fill resin composites was not significantly different from that of sculptable bulk-fill resin composites.
Keywords: polymerization shrinkage, color stability, marginal adaptation, recurrent caries, occlusion, pulpal reaction
Procedia PDF Downloads 145
4231 Spatial-Temporal Awareness Approach for Extensive Re-Identification
Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush
Abstract:
Recent developments in AI and edge computing play a critical role in capturing meaningful events such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. Immediately following the detection of a meaningful event, the objects related to the event must be tracked and traced. In an extensive environment, the challenge becomes severe as the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues arising from the complexity of a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor-intensive and time-consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.
Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness
Procedia PDF Downloads 112
4230 Cloud Shield: Model to Secure User Data While Using Content Delivery Network Services
Authors: Rachna Jain, Sushila Madan, Bindu Garg
Abstract:
Cloud computing is the key powerhouse in numerous organizations due to the shifting of their data to the cloud environment. In recent years it has been observed that cloud-based services are being used on a large scale for content storage, distribution, and processing. Various issues have been observed in the cloud computing environment that need to be addressed; security and privacy are the topmost concerns. In this paper, a novel security model is proposed to secure data by utilizing CDN services such as image-to-icon conversion. A CDN service is a content delivery service that converts an image to an icon, a Word document to a PDF, LaTeX to a PDF, etc. The presented model converts an image into an icon while keeping the image secret; security is imparted so that the image can be encrypted and decrypted by the data owner only. The paper also discusses how the server performs multiplication and selection on encrypted data without decryption. The data can be an image, a Word document, or an audio or video file. Moreover, the proposed model is capable of multiplying images, encrypting them, and sending them to a server application for conversion. The prime objective is to encrypt an image and convert the encrypted image to an image icon by utilizing homomorphic encryption.
Keywords: cloud computing, user data security, homomorphic encryption, image multiplication, CDN service
Procedia PDF Downloads 334
4229 D-Wave Quantum Computing Ising Model: A Case Study for Forecasting of Heat Waves
Authors: Dmytro Zubov, Francesco Volponi
Abstract:
In this paper, a D-Wave quantum computing Ising model is used for forecasting positive extremes of daily mean air temperature. Forecast models are designed with two to five qubits, which represent 2, 3, 4, and 5 days of historical data, respectively. The Ising model's real-valued weights and dimensionless coefficients are calculated using daily mean air temperatures from 119 places around the world, as well as sea level data (Aburatsu, Japan). In comparison with current methods, this approach is better suited to predicting heat wave values because it does not require estimating a probability distribution from scarce observations. The proposed quantum computing forecast algorithm is simulated on a traditional computer architecture, with combinatorial optimization of the Ising model parameters, for the Ronald Reagan Washington National Airport dataset with a 1-day lead time on the learning sample (1975-2010). Analysis of the forecast accuracy (the ratio of successful predictions to the total number of predictions) on the validation sample (2011-2014) shows that the Ising model with three qubits has 100% accuracy, which is quite significant compared to other methods; however, the number of identified heat waves is small (only one out of nineteen in this case). The models with 2, 4, and 5 qubits have 20%, 3.8%, and 3.8% accuracy, respectively. The presented three-qubit forecast model is applied to the prediction of heat waves at five other locations: Aurel Vlaicu, Romania (28.6% accuracy); Bratislava, Slovakia (21.7%); Brussels, Belgium (33.3%); Sofia, Bulgaria (50%); and Akhisar, Turkey (21.4%). These predictions are not ideal, but they are not zeros; they can be used independently or together with predictions generated by other methods. The loss of human life, as well as the environmental, economic, and material damage, from extreme air temperatures could be reduced if some heat waves are predicted; even a small success rate implies a large socio-economic benefit.
Keywords: heat wave, D-wave, forecast, Ising model, quantum computing
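As a rough sketch of the idea (not the authors' trained model, weights, or data), a few-qubit Ising-style predictor can score a linear combination of binary day-states plus pairwise couplings and flag a heat wave when the score exceeds a bias. All numerical values below are hypothetical:

```python
def ising_predict(spins, weights, coupling, bias):
    """Predict a heat-wave day from n binary 'qubit' states (+1 if the
    corresponding past day exceeded a temperature threshold, -1 otherwise).
    Score = linear field terms plus uniform pairwise couplings."""
    n = len(spins)
    score = sum(w * s for w, s in zip(weights, spins))
    score += coupling * sum(spins[i] * spins[j]
                            for i in range(n) for j in range(i + 1, n))
    return score > bias

# Hypothetical 3-qubit model: three consecutive hot days predict a heat wave,
# while a broken run of hot days does not.
hot_run = [+1, +1, +1]
mixed = [+1, -1, +1]
p1 = ising_predict(hot_run, [0.5, 0.5, 0.5], coupling=0.2, bias=1.0)
p2 = ising_predict(mixed, [0.5, 0.5, 0.5], coupling=0.2, bias=1.0)
```

In the paper's setting, the real-valued weights and couplings would be fitted by combinatorial optimization on the learning sample rather than chosen by hand as here.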
Procedia PDF Downloads 497
4228 Sensory Evaluation of Meat from Broiler Birds Fed Detoxified Jatropha Curcas and Those Fed Conventional Feed
Authors: W. S. Lawal, T. A. Akande
Abstract:
Four different methods were employed to detoxify Jatropha curcas: a physical method (soaking and drying), a chemical method (use of methylated spirit, hexane, and methene), a biological method (use of Aspergillus niger with sun-drying for 7 days, followed by Bacillus licheniformis), and finally a combined method (a combination of all these methods). Phorbol ester analysis was carried out after detoxification, and the combined method was found to perform best (P>0.05). One hundred broiler birds were used to further test the effect of Jatropha detoxified by the combined method: 50 birds were fed Jatropha-based feed, at 10 birds per treatment replicated five times, and this was repeated for another 50 birds fed conventional feed. The Jatropha-based feed was compounded at an 8% inclusion level. At the end of the 8th week, 8 birds were sacrificed from each treatment, and one bird each was fried, roasted, boiled, and grilled from both the conventional and Jatropha-fed groups; panelists were served the meat for evaluation. It was found that feeding Jatropha to poultry birds has no effect on the taste of the meat.
Keywords: phorbol ester, inclusion level, tolerance level, Jatropha curcas
Procedia PDF Downloads 422
4227 Audio Information Retrieval in Mobile Environment with Fast Audio Classifier
Authors: Bruno T. Gomes, José A. Menezes, Giordano Cabral
Abstract:
With the popularity of smartphones, mobile apps have emerged to meet diverse needs; however, the resources at their disposal are limited, whether in hardware, due to low computing power, or in software, which lacks the robustness of the desktop environment. For example, automatic audio classification (AC) tasks, a subarea of musical information retrieval (MIR), require fast processing and a good success rate, but the mobile platform has limited computing power and the best AC tools are only available for desktop. To solve these problems, the fast classifier adapts the most widespread MIR technologies to mobile environments, seeking a balance between speed and robustness. In the end, we found that it is possible to enjoy the best of MIR in mobile environments. This paper presents the results obtained and the difficulties encountered.
Keywords: audio classification, audio extraction, mobile environment, musical information retrieval
Procedia PDF Downloads 544
4226 A Novel Approach to Design and Implement Context Aware Mobile Phone
Authors: G. S. Thyagaraju, U. P. Kulkarni
Abstract:
Context-aware computing refers to a general class of computing systems that can sense their physical environment and adapt their behaviour accordingly. Context-aware computing makes systems aware of situations of interest, enhances services to users, automates systems, and personalizes applications. Context-aware services have been introduced into mobile devices such as PDAs and mobile phones. In this paper we present a novel approach used to realize a context-aware mobile. The context-aware mobile phone (CAMP) proposed in this paper senses the user's situation automatically and provides the services required by the user's context. The proposed system is developed using artificial intelligence techniques: a Bayesian network, fuzzy logic, and a rough-sets-theory-based decision table. The Bayesian network classifies incoming calls (high-priority calls, low-priority calls, and unknown calls), fuzzy linguistic variables and membership degrees define the context situations, and decision-table-based rules drive service recommendation. To exemplify and demonstrate the effectiveness of the proposed methods, the context-aware mobile phone is tested in a college campus scenario including different locations such as the library, classroom, meeting room, administrative building, and college canteen.
Keywords: context aware mobile, fuzzy logic, decision table, Bayesian probability
Procedia PDF Downloads 365
4225 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Analyzing large-scale recurrent event data currently poses many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method using parametric frailty models is proposed. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual subset. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
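The split-estimate-combine step can be illustrated with a toy example. The sketch below estimates a simple mean rather than fitting parametric frailty models, and uses subset-size weights as a stand-in for the paper's likelihood-based weights; for this linear estimator, the weighted combination recovers the full-data estimate exactly:

```python
import random

def combine_estimates(estimates, weights):
    """Weighted combination of per-subset estimators; reduces to a
    simple mean when all weights are equal."""
    total_w = sum(weights)
    return sum(w * e for w, e in zip(estimates, weights)) / total_w

# Divide: split the data into K random subsets.
random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(10_000)]
K = 10
random.shuffle(data)
subsets = [data[i::K] for i in range(K)]

# Conquer: estimate on each subset, then combine with size-proportional weights.
estimates = [sum(s) / len(s) for s in subsets]
weights = [len(s) for s in subsets]
combined = combine_estimates(estimates, weights)
full = sum(data) / len(data)  # the full-data estimator, for comparison
```

Each subset fits in memory and can be processed in parallel; the asymptotic-equivalence result in the abstract is what justifies trusting the combined estimator in place of the full-data one for the frailty-model case.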
Procedia PDF Downloads 165
4224 Performance of Modified Wedge Anchorage System for Pre-Stressed FRP Bars
Authors: Othman S. Alsheraida, Sherif El-Gamal
Abstract:
Fiber Reinforced Polymers (FRPs) are composite materials with exceptional properties that are capable of replacing conventional steel reinforcement in reinforced and pre-stressed concrete structures. However, the main obstacle to their wide use in pre-stressed concrete applications is the anchorage system. Due to the weakness of FRP in the transverse direction, the pre-stressing capacity of FRP bars is limited. This paper investigates a modification of the conventional wedge anchorage system for stressing FRP bars in pre-stressed applications. Epoxy adhesive, glass FRP (GFRP) bars, and a conventional steel wedge were used. The GFRP bars were encased in epoxy at the anchor zone, and the wedge system was used in a pull-out test. The results showed a loading capacity of 47.6 kN, which is 69% of the bar's ultimate capacity. Additionally, a nylon wedge was made with the same dimensions as the steel wedge and tested on GFRP bars without the epoxy layer. The nylon wedge showed a loading capacity of 19.7 kN, which is only 28.5% of the ultimate bar capacity.
Keywords: anchorage, concrete, epoxy, FRP, pre-stressed
Procedia PDF Downloads 296
4223 Waste Management Option for Bioplastics Alongside Conventional Plastics
Authors: Dan Akesson, Gauthaman Kuzhanthaivelu, Martin Bohlen, Sunil K. Ramamoorthy
Abstract:
Bioplastics can be defined as polymers derived partly or completely from biomass. Bioplastics can be biodegradable, such as polylactic acid (PLA) and polyhydroxyalkanoates (PHA), or non-biodegradable, such as biobased polyethylene (bio-PE), polypropylene (bio-PP), and polyethylene terephthalate (bio-PET). The usage of such bioplastics is expected to increase in the future due to new-found interest in sustainable materials. At the same time, these plastics become a new type of waste in the recycling stream. Most countries do not have a separate bioplastics collection that would allow them to be recycled or composted. After a brief introduction of bioplastics such as PLA in the UK, these plastics were once again replaced by conventional plastics in many establishments due to the lack of commercial composting. Recycling companies fear contamination of conventional plastics in the recycling stream, since separating and recycling bioplastics would require investment in expensive new equipment. This project studies what happens when bioplastics contaminate conventional plastics. Three commonly used conventional plastics were selected for this study: polyethylene (PE), polypropylene (PP), and polyethylene terephthalate (PET). In order to simulate contamination, two biopolymers, either polyhydroxyalkanoate (PHA) or thermoplastic starch (TPS), were blended with the conventional polymers. The amount of bioplastic in the conventional plastic was either 1% or 5%. The blended plastics were processed again to observe the effect of degradation. The contamination results showed that the tensile strength and modulus of PE were almost unaffected, whereas the elongation was clearly reduced, indicating increased brittleness of the plastic. Generally, PP is slightly more sensitive to contamination than PE. This can be explained by the fact that the melting point of PP is higher than that of PE and, as a consequence, the biopolymer degrades more quickly.
However, the reduction of the tensile properties of PP is relatively modest. Impact strength is generally a more sensitive test method for contamination: again, PE is relatively unaffected, but for PP there is a relatively large reduction of impact properties already at 1% contamination. PET is a polyester and, by its very nature, more sensitive to degradation than PE and PP. PET also has a much higher melting point than PE and PP, so the biopolymer degrades quickly at the processing temperature of PET. In terms of tensile strength, PET can tolerate 1% contamination without any reduction; however, the impact strength shows a strong reduction of properties already at 1% contamination. The thermal properties show the change in crystallinity. The blends were also characterized by SEM: a biphasic morphology can be seen, as the two polymers are not truly miscible, which also contributes to reduced mechanical properties. The study shows that PE is relatively robust against contamination, while polypropylene (PP) is sensitive and polyethylene terephthalate (PET) can be quite sensitive to contamination.
Keywords: bioplastics, contamination, recycling, waste management
Procedia PDF Downloads 225
4222 Critical Assessment of Herbal Medicine Usage and Efficacy by Pharmacy Students
Authors: Anton V. Dolzhenko, Tahir Mehmood Khan
Abstract:
The ability to make evidence-based decisions is a critically important skill required of practicing pharmacists, and the development of this skill is incorporated into the pharmacy curriculum. We aimed in our study to assess pharmacy students' perceptions of herbal medicines and their ability to evaluate information on herbal medicines professionally. The current Monash University pharmacy curriculum does not provide comprehensive study material on herbal medicines, so students must find the information on their own, assess its quality, and make a professional decision; in the pharmacy course, students are trained to apply this process to conventional medicines. In our survey of 93 undergraduate students from years 1-4 of the pharmacy course at Monash University Malaysia, we found that students' views on herbal medicines are sometimes shaped by common beliefs, which affect their ability to draw evidence-based conclusions about the therapeutic potential of herbal medicines. The use of herbal medicines is widespread, and 95.7% of the participating students had prior experience using them. On a scale of 1 to 10, students rated the importance of acquiring herbal medicine knowledge at 8.1±1.6. More than half (54.9%) agreed that herbal medicines have the same clinical significance as conventional medicines in treating diseases. Even more students agreed that healthcare settings should give equal importance to conventional and herbal medicine use (80.6%) and that herbal medicines should comply with quality control procedures as strict as those for conventional medicines (84.9%). The latter statement also indicates that students take the safety issues associated with the use of herbal medicines seriously, which was further confirmed by 94.6% of students saying that safety and toxicity information on herbs and spices is important to pharmacists and 95.7% admitting that drug-herb interactions may affect therapeutic outcomes.
Only 36.5% of students consider herbal medicines a safer alternative to conventional medicines. The students obtain information on herbal medicines from various sources and media: most (81.7%) obtain it from the Internet, and only 20.4% mentioned lectures, workshops, or seminars as a source of such information. We can therefore conclude that students who have attained the skills to critically assess the therapeutic properties of conventional medicines have the potential to use those skills for evidence-based decisions about herbal medicines.
Keywords: evidence-based decision, pharmacy education, student perception, traditional medicines
Procedia PDF Downloads 282
4221 Advanced CoMP Scheme for LTE-based V2X System
Authors: Su-Hyun Jung, Young-Su Ryu, Yong-Jun Kim, Hyoung-Kyu Song
Abstract:
In this paper, a highly efficient coordinated multi-point (CoMP) scheme for vehicular communication is proposed. The proposed scheme controls the transmit power and applies the appropriate transmission scheme for each situation. The proposed CoMP scheme provides performance comparable to the conventional dynamic cell selection (DCS) scheme while offering improved power efficiency compared with the conventional joint transmission (JT) scheme. Simulation results show that the proposed scheme achieves enhanced performance with high power efficiency and improves cell capacity.
Keywords: CoMP, LTE-A, V2I, V2V, V2X
Procedia PDF Downloads 583
4220 Impact of Organic Farming on Soil Fertility and Microbial Activity
Authors: Menuka Maharjan
Abstract:
In the name of food security, agricultural intensification through conventional farming is being implemented in Nepal. The government's focus on increasing agricultural production largely ignores soil health as well as human health, leading to serious soil degradation, i.e., reduced soil fertility and microbial activity, and to health hazards in the country. Organic farming, by contrast, is a sustainable agriculture approach that can address the challenge of sustaining food security while protecting the environment, creating a win-win situation for both people and the environment. However, people have limited knowledge of the significance of organic farming for environmental conservation and food security, especially in developing countries like Nepal. Thus, the objective of the study was to assess the impacts of organic farming on soil fertility and microbial activity compared to conventional farming and forest in Chitwan, Nepal. Total soil organic carbon (C) was highest under organic farming (24 mg C g⁻¹ soil), followed by conventional farming (15 mg C g⁻¹ soil) and forest (9 mg C g⁻¹ soil) in the topsoil layer (0-10 cm depth). A similar trend was found for total nitrogen (N) content in all three land uses, with organic farming soil possessing the highest total N content at both 0-10 cm and 10-20 cm depth. Microbial biomass C and N were also highest under organic farming, especially in the topsoil layer (350 and 46 mg g⁻¹ soil, respectively). Similarly, microbial biomass phosphorus (P) was higher under organic farming (3.6 and 1.0 mg P kg⁻¹ at 0-10 and 10-20 cm depth, respectively) than under conventional farming and forest at both depths. However, conventional farming and forest soils had similar microbial biomass (C, N, and P) content. After conversion of forest, the P stock significantly increased by 373% and 170% in soil under organic farming at 0-10 and 10-20 cm depth, respectively.
Under conventional farming, the P stock increased by 64% and 36% at 0-10 cm and 10-20 cm depth, respectively, compared to forest. Overall, organic farming practices, i.e., crop rotation, residue input, and farmyard manure application, significantly alter soil fertility and microbial activity. Organic farming is emerging as a sustainable land use system that can address the issues of food security and environmental conservation by increasing sustainable agricultural production and carbon sequestration, respectively, supporting the goals of sustainable development.
Keywords: organic farming, soil fertility, microbial biomass, food security
Procedia PDF Downloads 176
4219 The Effect of Ethylene Glycol on Cryopreserved Bovine Oocytes
Authors: Sri Wahjuningsih, Nur Ihsan, Hadiah
Abstract:
In embryo transfer programs, to address the limited production of embryos in vivo, in vitro embryo production has become a relatively inexpensive alternative approach. One potential source of embryos is immature oocytes subjected to in vitro maturation and in vitro fertilization. However, an obstacle encountered is that mammalian oocytes have very limited viability and cannot be stored for a long time, so oocyte cryopreservation is needed. This research was conducted to determine the optimal concentration of ethylene glycol as a cryoprotectant for oocyte freezing. The material used in this research was immature oocytes, taken from an abattoir and aspirated from follicles 2-6 mm in diameter. The ethylene glycol concentrations used were 0.5 M, 1 M, 1.5 M, and 2 M. The freezing method used was the conventional method combined with a five-step protocol for washing oocytes free of cryoprotectant after thawing. The results showed that ethylene glycol concentration had a significant effect (P<0.05) on oocyte quality after thawing and in vitro maturation. It was concluded that 1.5 M was the best concentration for freezing oocytes using the conventional method.Keywords: bovine, conventional freezing, ethylene glycol, oocytes
Procedia PDF Downloads 364
4218 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods
Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno
Abstract:
The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc, west of the collision zone, and the Halmahera arc to its east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partitioned region, the type of distribution of earthquake occurrence in each partitioned region, the mean earthquake occurrence in each partitioned region, and the correlation between the partitioned regions. We count earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify the partitioned regions based on correlation into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.Keywords: Molluca Collision Zone, partitioned regions, conventional statistical methods, earthquakes, classification, disaster management
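The correlation-based classification the abstract describes can be sketched as follows. The annual earthquake counts and the 0.8 cut-off between "strong" and "very strong" are illustrative assumptions; the abstract does not publish its series or thresholds.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def classify(r):
    """Map a correlation coefficient to the paper's two classes.
    The 0.8 cut-off is an assumed illustration; the abstract does not
    state the thresholds actually used."""
    return "very strong" if abs(r) >= 0.8 else "strong"

# Hypothetical annual shallow-earthquake counts for two partitioned regions
region_a = [12, 18, 9, 25, 14, 30, 11]
region_b = [10, 20, 8, 27, 15, 28, 13]
print(classify(pearson(region_a, region_b)))
```

With counts that track each other this closely, the pair lands in the "very strong" class; weakly related partitions would fall in "strong".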
Procedia PDF Downloads 498
4217 Asymmetric Warfare: Exploratory Study of the Implicit Defense Strategy of the People's Republic of China in 2012-2016
Authors: María Victoria Alvarez Magañini, Lautaro Nahuel Rubbi
Abstract:
According to different theories, a hegemonic war between the United States and the People's Republic of China seems imminent. However, it is clear that China's conventional military capacity is currently inferior to that of the United States. Nevertheless, the conditions that in the past were considered indicators of the outcome of asymmetric warfare are, in a possible asymmetric war scenario, no longer taken as such. Military capacity is not the only main indicator of victory; the organization and use of forces are also an essential part of it. This paper analyzes the Chinese defense strategy in relation to the concept of asymmetric warfare in the face of a possible war with the United States. The starting point is the application of the theory corresponding to the aforementioned concept, focusing on recent developments by the People's Republic of China in the field of non-conventional defense. A comparative analysis of the conventional forces of both countries is also carried out.Keywords: asymmetric warfare, China, United States, hegemonic warfare
Procedia PDF Downloads 264
4216 Comparative Study of Fenton and Activated Carbon Treatment for Dyeing Waste Water
Authors: Prem Mohan, Namrata Jariwala
Abstract:
In recent years, approximately 10,000 dyes have been used by the dyeing industry, which makes dyeing wastewater complex in nature and very difficult to treat by conventional methods. Here an attempt has been made to treat dyeing wastewater by a conventional and an advanced method for removal of COD. The Fenton process is the advanced method, and activated carbon treatment is the conventional method. Experiments have been done on synthetic wastewater prepared from three different dyes: acidic, disperse, and reactive. Experiments have also been conducted on real effluent obtained from industry. The optimum dose of catalyst and hydrogen peroxide in the Fenton process and the optimum activated carbon dose for each of these wastewaters were obtained. With Fenton treatment, COD removal of up to 95% was obtained, whereas 70% removal was obtained with activated carbon treatment.Keywords: activated carbon, advanced oxidation process, dyeing waste water, Fenton oxidation process
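The removal efficiencies reported above follow the standard percent-removal formula. The influent and effluent COD values below are hypothetical, chosen only to reproduce percentages of the same magnitude as those reported.

```python
def cod_removal(influent_cod, effluent_cod):
    """Percent COD removed by a treatment step (influent and effluent in mg/L)."""
    return 100.0 * (influent_cod - effluent_cod) / influent_cod

# Hypothetical dyeing-wastewater CODs in mg/L (not the study's measurements)
print(round(cod_removal(2000, 100), 1))   # Fenton-like removal: 95.0
print(round(cod_removal(2000, 600), 1))   # activated-carbon-like removal: 70.0
```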
Procedia PDF Downloads 211
4215 Perceptions of Corporate Governance and Business Ethics Practices in Kuwaiti Islamic and Conventional Banks
Authors: Khaled Alotaibi, Salah Alhamadi, Ibraheem Almubarak
Abstract:
The study explores both corporate governance (CG) and business ethics (BE) practices in Kuwaiti banks and the relationship between CG and BE, using an accountability framework. By examining the perceptions of key stakeholder groups, this study investigates the practices of BE and CG in Islamic banks (IBs) compared to conventional banks (CBs). We contribute to the scarce studies concerned with the relations between CG and BE. We employed a questionnaire survey for a random sample of crucial relevant stakeholder groups. The empirical analysis of the participants' perceptions highlights the importance of applying CG regulations and BE in Kuwaiti banks and the clear link between the two concepts. We find that the main concern is not the absence of CG and BE codes, but the lack of consistent enforcement of the regulations. Such a system needs to be strictly and effectively implemented in Kuwaiti banks to protect all stakeholders' wealth, not only that of stockholders. There are significant patterns in the CG and BE expectations among different stakeholder groups. Most interestingly, banks' client groups show high expectations concerning CG and BE practices.Keywords: corporate governance, CG, business ethics, BE, Islamic banks, IBs, conventional banks, CBs, accountability
Procedia PDF Downloads 155
4214 Towards a Resources Provisioning for Dynamic Workflows in the Cloud
Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem
Abstract:
Cloud computing offers a new model of service provisioning for workflow applications, thanks to its elasticity and its pay-per-use model. However, it presents various challenges that need to be addressed in order to be utilized efficiently. The resource provisioning problem for workflow applications has been widely studied. Nevertheless, existing works did not consider changes to workflow instances while they are being executed, a functionality that has become a major requirement for dealing with unusual situations and evolution. This paper presents a first step towards resource provisioning for dynamic workflows. We propose a provisioning algorithm that minimizes the overall workflow execution cost while meeting a deadline constraint, and then extend it to support the dynamic addition of tasks. Experimental results show that our proposed heuristic achieves a significant reduction in resource cost by using a consolidation process.Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications
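To make the cost-versus-deadline trade-off concrete, here is a minimal greedy sketch of deadline-constrained provisioning. It is not the paper's algorithm: it treats the workflow as a sequential chain (no dependencies), and the VM catalogue, fair-share rule, and prices are all assumptions for illustration.

```python
# Hypothetical VM catalogue: (name, relative speed, $/hour) -- illustrative only
VM_TYPES = [("small", 1.0, 0.10), ("medium", 2.0, 0.20), ("large", 4.0, 0.45)]

def provision(task_hours, deadline):
    """Greedy sketch of deadline-constrained provisioning.

    For each task (longest first), pick the cheapest VM type whose runtime
    fits within a fair share of the remaining deadline. Returns
    (plan, total_cost), or None when even the fastest VMs miss the deadline.
    """
    tasks = sorted(task_hours, reverse=True)
    remaining, plan, cost = deadline, [], 0.0
    for i, base in enumerate(tasks):
        share = remaining / (len(tasks) - i)  # fair share of the time left
        for name, speed, price in VM_TYPES:   # catalogue is cheapest-first
            runtime = base / speed
            if runtime <= share:
                break
        else:  # nothing fits the share: fall back to the fastest VM type
            name, speed, price = VM_TYPES[-1]
            runtime = base / speed
        plan.append((base, name))
        cost += runtime * price
        remaining -= runtime
    return (plan, round(cost, 2)) if remaining >= 0 else None

print(provision([4.0, 2.0], deadline=6.0))
```

Dynamic task addition, in this simplified picture, amounts to re-running `provision` over the enlarged task list with the deadline budget that remains.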
Procedia PDF Downloads 295
4213 Inclusion and Changes of a Research Criterion in the Institute for Quality and Accreditation of Computing, Engineering and Technology Accreditation Model
Authors: J. Daniel Sanchez Ruiz
Abstract:
The paper explains why and how a research criterion was included within an accreditation system for undergraduate engineering programs, despite this not being a common practice of accreditation agencies at a global level. The paper is divided into three parts. The first presents the context and the motivations that led the Institute for Quality and Accreditation of Computing, Engineering and Technology Programs (ICACIT) to add a research criterion. The second describes the criterion adopted and the feedback received during the 2017 accreditation cycle. In the third, the author proposes changes to the accreditation criteria that respond pertinently to the results-based accreditation model and the national context. The author seeks to reconcile an outcome-based accreditation model, aligned with that established by the International Engineering Alliance, with the particular context of higher education in Peru.Keywords: accreditation, engineering education, quality assurance, research
Procedia PDF Downloads 281
4212 Assessing the Financial Potential of an Agroforestry-Based Farming Practice in a Labor Scarce Subsistence Economy
Authors: Arun Dhakal, Rajesh Kumar Rai
Abstract:
Agroforestry has long been practiced in Nepal as a means of subsistence livelihood. Given its potential for climate change mitigation, this practice has been recommended in recent years as a climate-smart farming practice. However, its financial attractiveness is not well documented in a labor-scarce economy such as Nepal. This study examines the financial suitability of an agroforestry-based farming practice in the present socio-economic context of Nepal, where labor is in short supply. A total of 200 households were randomly selected for household surveys in Dhanusha district from April to July 2015. Two farming practices were found to be dominant in the study area: 1) conventional farming (field crops only), in which at least two field crops are grown annually, and 2) the agroforestry-based farming practice (ABFP; agroforest, home garden, and field crops combined). The ABFP was found to be less labor intensive than conventional farming (137 man-days/yr/ha vs. 218 man-days/yr/ha). The ex-ante financial analysis indicated that both farming practices generated positive net present values (NPVs) and benefit-cost (B/C) ratios greater than one, indicating that both are financially attractive farming enterprises under the base discount rate of 12%. However, the ABFP generated a higher NPV and a greater B/C ratio than conventional farming, indicating that the former is financially more attractive than the latter. The sensitivity analysis showed that conventional farming was more sensitive to changes in the labor wage rate than the ABFP. Up to a 24% discount rate, the ABFP generated a higher NPV, and its B/C ratio remained greater even at a 50% discount rate.Keywords: agroforestry, benefit-cost analysis, conventional farming, net present value
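The NPV and B/C figures of merit used in the analysis above can be computed as sketched below. The per-hectare cash flows are hypothetical placeholders, not the study's data; only the 12% base discount rate comes from the abstract.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows occurring at years 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def bc_ratio(benefits, costs, rate):
    """Discounted benefit-cost ratio: PV(benefits) / PV(costs)."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical per-hectare yearly figures (USD); not the study's data
benefits = [1200, 1400, 1600, 1800, 2000]
costs = [800, 700, 700, 650, 600]
rate = 0.12  # the base discount rate used in the study
net = [b - c for b, c in zip(benefits, costs)]
print(round(npv(net, rate), 1), round(bc_ratio(benefits, costs, rate), 2))
```

The sensitivity results in the abstract amount to re-evaluating these two quantities while sweeping `rate` (up to 24% and 50%) or scaling the labor component of `costs`.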
Procedia PDF Downloads 133
4211 Optimization of Topology-Aware Job Allocation on a High-Performance Computing Cluster by Neural Simulated Annealing
Authors: Zekang Lan, Yan Xu, Yingkun Huang, Dian Huang, Shengzhong Feng
Abstract:
Jobs on high-performance computing (HPC) clusters can suffer significant performance degradation due to inter-job network interference. The topology-aware job allocation problem (TJAP) decides how to dedicate nodes to specific applications so as to mitigate inter-job network interference. In this paper, we study the window-based TJAP on a fat-tree network, aiming at minimizing communication hop cost, a defined inter-job interference metric. The window-based approach to scheduling repeats periodically, taking the jobs in the queue and solving an assignment problem that maps jobs to the available nodes. Two allocation strategies are considered: the static continuity assignment strategy (SCAS) and the dynamic continuity assignment strategy (DCAS). For the SCAS, a 0-1 integer program is developed. For the DCAS, we propose an approach called neural simulated annealing (NSA), an extension of simulated annealing (SA) that learns a repair operator and employs it in a guided heuristic search. The efficacy of NSA is demonstrated in a computational study against SA and SCIP. The results of numerical experiments indicate that both the model and the algorithm proposed in this paper are effective.Keywords: high-performance computing, job allocation, neural simulated annealing, topology-aware
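The NSA of the abstract builds on plain simulated annealing. As a grounding illustration, here is a minimal SA baseline on a toy one-window TJAP instance; the hop distances, slice-based state encoding, and geometric cooling schedule are assumptions for illustration, and the paper's NSA further learns the repair operator that guides the search.

```python
import math
import random

def sa_allocate(num_nodes, demands, hops, iters=5000, t0=5.0, alpha=0.999):
    """Minimal simulated-annealing baseline for a toy one-window TJAP.

    demands[j] = node count job j needs; hops[u][v] = hop distance between
    nodes u and v (e.g. 2 inside a fat-tree leaf, 4 across leaves -- an
    assumed toy topology). State: a node permutation carved into consecutive
    slices, one per job. Cost: pairwise hop distance inside each slice.
    """
    random.seed(0)  # deterministic for the sake of the example
    perm = list(range(num_nodes))
    random.shuffle(perm)

    def cost(p):
        total, start = 0, 0
        for d in demands:
            nodes = p[start:start + d]
            total += sum(hops[a][b] for i, a in enumerate(nodes)
                         for b in nodes[i + 1:])
            start += d
        return total

    cur = cost(perm)
    best, best_cost, temp = perm[:], cur, t0
    for _ in range(iters):
        i, j = random.sample(range(num_nodes), 2)
        perm[i], perm[j] = perm[j], perm[i]           # propose a swap
        new = cost(perm)
        if new <= cur or random.random() < math.exp((cur - new) / temp):
            cur = new                                  # accept the move
            if new < best_cost:
                best, best_cost = perm[:], new
        else:
            perm[i], perm[j] = perm[j], perm[i]        # undo rejected swap
        temp *= alpha                                  # geometric cooling
    return best, best_cost

# Four nodes on two leaf switches, {0,1} and {2,3}; two jobs of two nodes each
HOPS = [[0, 2, 4, 4], [2, 0, 4, 4], [4, 4, 0, 2], [4, 4, 2, 0]]
print(sa_allocate(4, [2, 2], HOPS)[1])  # best interference cost found
```

On this toy topology the optimum places each job entirely within one leaf switch, which the search finds quickly; NSA's learned repair operator targets exactly such continuity on realistic fat trees.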
Procedia PDF Downloads 116
4210 Enabling UDP Multicast in Cloud IaaS: An Enterprise Use Case
Authors: Patrick J. Kerpan, Ryan C. Koop, Margaret M. Walker, Chris P. Swan
Abstract:
User Datagram Protocol (UDP) multicast is a vital part of data center networking that is being left out of major cloud computing providers' network infrastructure. Enterprise users rely on multicast, and particularly UDP multicast, to create and connect vital business operations. For example, UDP multicast makes possible a variety of business functions, from simultaneous content media updates to High-Performance Computing (HPC) grids and video call routing for massive open online courses (MOOCs). Essentially, this technological slight against UDP multicast has a huge effect on whether companies choose to use public cloud infrastructure as a service (IaaS). Allowing the 'chatty' UDP multicast protocol inside a cloud network could seriously impact the performance of the cloud as a whole, so cloud IaaS providers solve the issue by disallowing all UDP multicast. But what about enterprise use cases for multicast applications in organizations that want to move to the cloud? To re-enable multicast traffic, enterprises can build a layer 3-7 network over the top of a data center, private cloud, or public cloud. An overlay network simply creates a private, sealed network on top of the existing network. Overlays give complete control of the network back to enterprise cloud users: the freedom to manage their network beyond the reach of the cloud provider's firewall conditions. The same logic applies for users who wish to run IPsec or BGP network protocols inside, or connected into, an overlay network in cloud IaaS.Keywords: cloud computing, protocols, UDP multicast, virtualization
Procedia PDF Downloads 590
4209 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System
Authors: Tahsin A. H. Nishat, Raquib Ahsan
Abstract:
Sensitivity analysis of design parameters can be a significant factor in any structural optimization procedure. The objectives of this study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design method for a real-life I-girder bridge project were considered. For the analysis of the optimization method, cost optimization of this system was performed using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which optimum values of the 14 design parameters were obtained, contains 14 explicit constraints and 46 implicit constraints. For both types of design parameters, sensitivity analysis was conducted on the deck slab thickness parameter, which can become highly sensitive near the obtained optimum solution. Deviations of slab thickness on both the upper and lower side of its optimum value were considered, reflecting its realistic range of variation during construction, while the remaining parameters were kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints was examined, and variations in cost were estimated. It was found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased by only 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional design method. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system
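The one-at-a-time perturbation procedure described above can be sketched as follows. The linear cost model and the feasibility band are made-up illustrative numbers, not taken from the bridge project; they merely show how a sweep yields the maximum constraint-respecting deviation.

```python
def sweep(base, step, span, cost_fn, feasible_fn):
    """One-at-a-time sensitivity sweep: perturb a single design parameter
    around its base value, keeping all other parameters fixed, and record
    feasibility and cost at each point."""
    n = int(span / step)
    return [(round(base + k * step, 2),
             feasible_fn(base + k * step),
             round(cost_fn(base + k * step), 2))
            for k in range(-n, n + 1)]

# Made-up slab-thickness model (mm): linear cost, assumed feasible band
base_t = 200.0
cost = lambda t: 1000.0 + 2.5 * t            # cost grows with thickness
feasible = lambda t: 175.0 <= t <= 225.0     # constraints satisfied here
table = sweep(base_t, step=5.0, span=30.0, cost_fn=cost, feasible_fn=feasible)
max_up = max(t for t, ok, _ in table if ok) - base_t
print(max_up)  # largest constraint-respecting upward deviation, in mm
```

In the study, `feasible_fn` would check all 14 explicit and 46 implicit constraints, and the narrow 0.3 mm margin at the cost optimum reflects constraints that are active there.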
Procedia PDF Downloads 137
4208 Preliminary Conceptions of 3D Prototyping Model to Experimental Investigation in Hypersonic Shock Tunnels
Authors: Thiago Victor Cordeiro Marcos, Joao Felipe de Araujo Martos, Ronaldo de Lima Cardoso, David Romanelli Pinto, Paulo Gilberto de Paula Toro, Israel da Silveira Rego, Antonio Carlos de Oliveira
Abstract:
Currently, the use of 3D rapid prototyping, also known as 3D printing, is being investigated by some universities around the world as an innovative, fast, flexible, and cheap technique for the direct manufacture of plastic models that are lighter and have complex geometries, to be tested in a hypersonic shock tunnel. Initially, the purpose is to integrate prototyped parts with metal models that are currently manufactured through conventional machining, and hereafter to replace them with completely prototyped models. The mechanical design of models to be tested in a hypersonic shock tunnel is based on conventional manufacturing processes and is therefore limited to standard forms and geometries. The use of 3D rapid prototyping offers a range of options that enable innovation in the geometries and approaches used to design new models. The conception and design of a prototyped model for a hypersonic shock tunnel should be rethought and adapted in comparison with conventional manufacturing processes, in order to fully exploit the creativity and flexibility allowed by the 3D prototyping process. The objective of this paper is to compare the conception and design of a 3D rapid prototyping model and a conventional machining model, showing the advantages and disadvantages of each process and the benefits that 3D prototyping can bring to the manufacture of models to be tested in a hypersonic shock tunnel.Keywords: 3D printing, 3D prototyping, experimental research, hypersonic shock tunnel
Procedia PDF Downloads 469
4207 Energy Saving Potential of a Desiccant-Based Indirect-Direct Evaporative Cooling System
Authors: Amirreza Heidari, Akram Avami, Ehsan Heidari
Abstract:
Evaporative cooling systems are known as energy-efficient cooling systems, with much lower electricity consumption than conventional vapor compression systems. A serious limitation of these systems, however, is that they are not applicable in humid regions. Combining a desiccant wheel with these systems, known as desiccant-based evaporative cooling, makes it possible to use evaporative cooling in humid climates. This paper evaluates the performance of a cooling system combining a desiccant wheel with direct and indirect evaporative coolers (called the desiccant-based indirect-direct evaporative cooling (DIDE) system) and then evaluates the energy saving potential of this system over a conventional vapor compression cooling and drying system. To illustrate the system's ability to provide comfort conditions, a dynamic hourly simulation of this system is performed for a typical 60 m² building in Sydney, Australia. To evaluate the energy saving potential, a conventional cooling and drying system is also simulated for the same cooling capacity. It has been found that the DIDE system is able to provide comfortable temperature and relative humidity in a subtropical humid climate like Sydney. The electricity and natural gas consumption of this system are, respectively, 39.2% and 2.6% lower than those of the conventional system over a week. As the research has demonstrated, the innovative DIDE system is an energy-efficient cooling system for subtropical humid regions.Keywords: desiccant, evaporative cooling, dehumidification, indirect evaporative cooler
Procedia PDF Downloads 151